IGF 2020 - Day 4 - OF20 Attention economy and free expression?

The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

 

>> PATRICK PENNINCKX: Good morning, everyone.

Good morning to all of you.  I hope you can all see us, which has become the standard introductory phrase for any online conference.  So dear colleagues and friends, a warm welcome to our open session number 20, "Attention Economy and Free Expression", which is co‑organized by the Hamburg Media School and the Council of Europe.  As we know, freedom of expression is crucial for the enjoyment of all other human rights.  It enables individuals to make informed choices and participate actively in democratic processes, and that's why the European Court of Human Rights and political decision makers refer to freedom of expression as a cornerstone of democracy.  We also have to ensure that this cornerstone is not crumbling.  I think that's a key role that we all have.

Freedom of expression, however, is among the human rights most impacted by digitalization.  Digital technologies have significantly transformed the communication patterns and the behaviors of individuals, communities and societies.  So it has an impact on several levels: the level of communication between individuals, the context of newsrooms and media outlets, and also the broader societal level.  Communication between individuals is obviously facilitated, structured and shaped by online platforms and social media.  But beyond the euphoria around virtually unlimited access to information online, and expanded possibilities to generate content, we now see that freedom of expression is increasingly shaped in multiple ways by technologies that rank speech for profit, seek to grab attention with sensationalist headlines and encourage likes and shares.  Ultimately this crowds out analysis and nuance in favor of often hateful exchanges that silence many voices.

Second, in the context of newsrooms and media outlets, microtargeting techniques have revolutionized the news ecosystem, leading to the emergence and power of new media, including social media platforms, and the prevalence of business models that prioritize clicks over readers' trust.

This has reversed the flow of advertising revenues, prompting a structural shift within media markets and putting into question the sustainability of traditional media, and undermining even the conditions for quality journalism.

Third, at the broader societal level, including political communication, algorithmic systems and data marketing tools shape our social, economic and political lives, affect our governance and influence the distribution of resources.

Faced with these unprecedented volumes of content, it is increasingly difficult for individuals to discern what is true and whom to believe.  This causes confusion, contributes to information disorder and impacts negatively on society's trust in the media and democratic institutions more broadly.  But where does the driving force behind this complex plexus of causes and complications lie?  In the technology itself, or in the way it is put to use?

Today, we will be looking at the attention economy, which embraces various uses of algorithmic systems and processes to manage the attention of individuals and groups in the pursuit of economic or other interests, and its multilayered impacts on freedom of expression and on the information environment.  We will attempt to capture the complexity of the interdependence between the business models employed by major social media and digital platforms and the draining of quality information and public debate.

In today's debate, we will rely on Council of Europe documents and standards, in particular our Committee of Ministers' Declaration on the manipulative capabilities of algorithmic processes and Recommendation on the human rights impacts of algorithmic systems, and of course on the outstanding expertise of our panelists.

I am Patrick Penninckx, I'm head of the Information Society Department of the Council of Europe, and I'm humbled to be able to present our panelists today, and I will give them a short introduction and then afterwards, I will ask them a question to start off.

First, we have Ms. Amy Brouillette, the senior researcher and editorial manager of the Ranking Digital Rights project and a fellow of the Center for Media, Data and Society.  At Ranking Digital Rights, Amy is responsible for planning and managing the research and production of the Corporate Accountability Index and index‑related communications and outreach activities.

Joe McNamee is an independent consultant and Council of Europe expert on the Committee of Experts on Freedom of Expression and Digital Technologies.  Joe has worked on Internet regulation since 1998.  Up to the end of 2018, he was executive director of European Digital Rights, the association of organizations defending online human rights in Europe.  He participated in the Council of Europe expert committees on the roles and responsibilities of Internet intermediaries and, as I mentioned, on the human rights impacts of automated data processing and different forms of artificial intelligence.

Professor Dr. Alexandra Borchardt is a journalism professor at the Universität der Künste Berlin and head of the digital journalism fellowship at the Hamburg Media School, which is partnering with us in the organization of this event.  Alexandra is also the director of leadership programs at the Reuters Institute for the Study of Journalism at the University of Oxford and a member of our Committee of Experts on Freedom of Expression and Digital Technologies, and she previously participated in the work of our Committee of Experts on Quality Journalism in the Digital Age.

Last but not least, Mr. Aurelien Maehl, senior public policy manager with DuckDuckGo.  He was previously with APCO Worldwide and Political Intelligence.

Now, to start off, I will ask a question to Amy from Ranking Digital Rights: what is the attention economy?  How does it translate into the use of algorithmic systems, including targeted advertising, by social media and digital platforms?

Amy, you have five or six minutes to give your introduction.  Thank you.

>> AMY BROUILLETTE: Thank you very much.  So the term "attention economy" is really what underlies, I think, the very problematic business model of surveillance capitalism, which is what we and others see as kind of the root cause of the infodemic, viral hate speech, disinformation and the other types of problems that the platforms are facing today.  I'm not an expert in the term or the scholarship around it, which spans many different fields.

But what I find interesting about the actual term is that it was first coined in the early 1990s to explain how the oversupply of information, which was interestingly seen as prevalent and problematic even back then, was actually creating an undersupply of the public's attention.

So there were all of these channels of information ‑‑ at the time, that was more about the 24‑hour news cycle that upended the traditional print media as we knew it then ‑‑ which was not translating into a diversity of new ideas or information but was also creating a kind of attention deficit disorder among publics.

For the news and information ecosystems, what this has meant is a decrease in quality, in‑depth news and an increase in sensationalist, infotainment‑style news and the rise of inflammatory talk radio ‑‑ all of which we have seen over the past decades, and which has done a number on our political culture and fueled a lot of the social divisions that we see on full display today.

And then, added to that already problematic mix, in comes the tech revolution and social media, which brought unprecedented volumes of information and new channels for obtaining that information ‑‑ which, many have actually argued, doesn't make us any more informed.  In some cases, and to the detriment of democracy, in many places we are even less informed.  So we have more access to news and information than ever before, but a far less informed electorate, is the idea.

And a lot, obviously, has been written and said about disinformation, particularly over the last few years, even though political propaganda as a phenomenon is not especially new.

What is new is the speed at which this type of content can now travel, and the reach and how far it can go.

And this is due primarily to an excessively opaque combination of algorithmic systems and targeting that prioritize and amplify the most sensationalist content based on likes and shares.  And while we know that algorithms are often the underlying cause of this virality, we don't know much more about how these systems work, since companies are obviously fiercely protective of these technologies, despite their immeasurable impact on public life and despite calls from civil society and regulators for more transparency.

So the key question today is really about how to control the spread of harmful content without harming human rights in the process ‑‑ without undermining the same human rights that we're trying to protect.

And so, just to be somewhat self‑promotional, last spring RDR published two reports which we call the "It's the Business Model" series.  Their argument is that instead of holding digital platforms liable for the content posted by their users, regulators and advocates should focus on holding companies accountable for the underlying business model that generates and amplifies the types of harmful speech that are distorting the public sphere and threatening democracy.

I won't go into the "It's the Business Model" reports, but the point I would like to make is that those reports were based on some research that we conducted on indicators that we developed to evaluate how transparent and accountable tech companies are about their development and use of algorithmic systems.  These indicators were developed in part from recommendations by the Council of Europe, which set out standards for how private and public sector actors can develop and deploy human rights respecting algorithmic systems.

From the Council of Europe's recommendations and from extensive feedback from stakeholders we developed a set of new indicators that attempt to set global accountability and transparency standards, grounded in international human rights frameworks for how major publicly traded digital platforms can demonstrate respect for human rights as they develop and deploy these new technologies.

So, circling back to the issue of the attention economy, our new Ranking Digital Rights indicators ultimately try to address this underlying business model of surveillance capitalism, which is based on the mass collection of user information and on targeting users with content in extremely opaque ways in order to generate likes and shares.

And I will just stop there.

>> PATRICK PENNINCKX: Thank you.  Thank you so much, Amy.  That gives us a very clear picture about the speed and reach of sensationalist content on the one side, harmful content, and also the accountability for the systems and the business models that are behind it, and also gives us an idea already about the mass collection of information that comes with it.

Joe, our independent expert and member of our expert committee: attention economy companies curate news feeds, rank content and make decisions on whether content will stay or be removed.  They also collect data, which they then employ to optimize the provision of content, including by third‑party advertisers.  This sounds like a vicious circle to me, but what are the main interests and purposes behind such decisions?  What does this mean for freedom of expression?

I will let you answer that, but I will also tell the audience that they can send us questions via the Q&A box.  So please send your questions to the Q&A box, and after this round of panelists, we will proceed with replying to them.

I already have some questions here.  So, Joe, please.

>> JOE McNAMEE: Thank you, Patrick.  Thank you to the Council of Europe for inviting me to this interesting panel.  The Council of Europe has done some very interesting work, as was mentioned, on algorithmic decision making and on the roles and responsibilities of Internet intermediaries.  And Ranking Digital Rights ‑‑ if you haven't read their reports yet, they are really exceptional and move the debate forward some considerable distance.

I think what Ranking Digital Rights has done is all the more valuable because we as a society have failed to understand the multiple and changing priorities of the attention economy companies.  And this means that we have failed to understand their influence on our information ecosystem, in particular with regard to content moderation and content curation.  And due to this failure, policymakers have consistently ‑‑ I would almost say always ‑‑ overestimated the potential of self‑regulation by Internet economy companies for dealing with the harmful content that Patrick mentioned a moment ago.

So we have seen that the attention economy lives from personal data.  We have seen, as Amy said, that controversial content generates more data, and more data generates more profits, and this is the nub of the issue.  The customer in the attention economy market is the person that is paying for the data, that's paying for the influence ‑‑ it's not the individual using, for example, the social media service to exercise their freedom of expression.

Freedom of expression is inherently in a subordinate role in this economy.  And whenever there's an issue that brings content moderation into the headlines, the call is never to protect our fundamental freedoms.  The call is always from policymakers to delete more and delete it quicker, delete it automatically, delete, delete, delete, quicker and quicker.

It drives harmful behavior and drives calls for more content moderation.  As Patrick said, this is a vicious circle.  Such content moderation is not and will not be voluntarily done in a way that harms the personal data‑driven business model of the company.  So the problem continues.  The cycle continues.  And ultimately, without corrective measures, unbridled moderation prioritizes this over speech.  We focus on deletion as the only solution, as if the attention economy actor were a neutral intermediary, or only had one set of incentives.

As Patrick said, the attention economy companies are multiple businesses at the same time.  They are social media providers, but they are also newsfeed providers.  They are advertising companies.  They are data companies and they are sellers of political influence.

And that takes us to the decision‑making process.  In reality, while intermediaries claim not to want to be the arbiters of truth, they restrict freedom of expression for any number of reasons.  They do it to ensure the business model remains valid ‑‑ LinkedIn might decide not to accept posts that are not professional, for example.  They do it to ensure that there's no content on the platform that would be displeasing to advertisers and would reduce revenue.

They do so to avoid liability for potentially illegal content and they do so because of the public relations value of being seen to do something good in the fight against something bad.

But what we see consistently in this area is that self‑regulatory and co‑regulatory schemes are introduced without targets, without measures of success or failure, without key performance indicators, without independent review, and without the underlying problems being assessed to see if they are changing or need a changed approach.  And transparency data are rarely complete, consistent over time or across the industry, or even in a standardized format.

In sum, when it comes to freedom of expression in the digital economy, we fail to understand the nefarious effect of the business models involved.  We fail to understand the problems caused by amplification, profiling and data exploitation.  When content moderation problems arise, we fail to understand the vested interests of the companies involved, and we never insist on getting enough data to assess the scale, nature and evolution of the problems.

And all of that seems to be an advertisement for Ranking Digital Rights' fine work in this area.  So I will leave it at that.  Thank you.

>> PATRICK PENNINCKX: Thank you so much, Joe.  Joe, I have known you for quite some time now, first as the director of European Digital Rights, when we were already working very closely together at the Council of Europe on the Guide to Human Rights for Internet Users, which we published some time ago.

Delete, delete, delete, quicker, quicker, quicker ‑‑ that is what I gathered from your input, Joe.  Thank you so much for giving us a clearer picture of the mechanisms behind all of this.

Now, I have the pleasure of giving the floor to Alexandra Borchardt, as I said, from the Hamburg Media School.  Thank you so much, Alexandra, also for partnering with us in the organization of this event, which is co‑organized by the Council of Europe and the Hamburg Media School.  So in this context, Alexandra: outside of social media, what does this mean for the business and editorial choices of free media outlets that live from clicks?  And inside social media, what does it mean for them when they rely on engagement?  What is the overall impact on the media environment?  Many questions in one, Alexandra, and I hope you are able to answer them in five minutes.

Thank you so much.

>> ALEXANDRA BORCHARDT: Thank you, Patrick, and thanks for having me.  So I have been a journalist all of my life, and now I'm advising media companies, particularly local media publishers, on digital transformation.  So I'm really close to what's been happening and to the impact of the attention economy on the media industry.  Some of this has already been mentioned, but I will still make a point of it.  The erosion of business models has had a significant impact on the media industry, because the big platforms, Google and Facebook, have absorbed the vast amount of digital advertising revenues.  So media companies can't maneuver as well with the lack of this revenue.

Also the platforms absorb much of the attention that other media used to get.  I mean, people used to just, you know, tune into journalism because maybe they were bored or they felt like, you know, an obligation to do so.  So now they are just resorting to other stuff that's on their SmartPhones.

Most importantly, though, it has affected news consumption behavior, and something has already been said about this.  First of all, the distraction effect is quite massive, because people get so many distractions on their SmartPhones that they don't necessarily rely on news.  Just think about commuting on a train: people used to read newspapers.  Now they are just on their SmartPhones, chatting away or gaming.

The drowning out of news in the news feed ‑‑ in the constant stream of things that grab their attention ‑‑ has a serious effect, and it also leads to a shortening of attention spans in news consumption, because people just click themselves through stuff and distractions.  And that also leads to the erosion of news brands.  People say, oh, you know, I saw that news on Facebook ‑‑ the Reuters Institute did a study on this.  They didn't see it on the BBC or "The New York Times."  They just saw it somewhere on social media.  So the brands have trouble establishing themselves.

And finally, news consumption itself: news avoidance is an increasing phenomenon.  One‑third of the people surveyed admit to avoiding news at times because it's just too much, too negative and too filled with an agenda.  So news avoidance is a serious consequence, because if you lose people on news consumption, then you have seriously lost them.

Obviously, for media companies and publishers, the attention economy has had its effects on content production.  News outlets are really desperate for what works.  So instead of investigating and producing exclusive content, they just engage in copying and pasting what works for others to achieve fast gains and maybe grab some of that digital advertising revenue.  So copy‑and‑paste journalism has really been rampant, and that gives people less reason to pay for journalism, because they find the same content everywhere.

It also adds to clickbait.  Customers click on it ‑‑ oh, I have to click on it ‑‑ but it's basically like consuming sugar: it raises the blood sugar, but you never really feel satisfied.  So clickbait is just something that people click on, and then they feel, oh, I don't really need this, this is really not good for me.  So it's actually eroding the credibility of journalism, and sometimes it's even perceived by people to be worse than fake news.  People actually sometimes label fake news as clickbait.

The rat race among media companies has accelerated, because it's about grabbing attention, about being fast ‑‑ even though in the digital sphere, collaboration is what it should be all about.

Another point: news outlets have lost control over curation ‑‑ that was already mentioned ‑‑ and over news choice, because they depend on the algorithms of platform companies, at least when they display their news on social media.  And finally, news outlets have given in to producing their content for the platforms of others, rather than creating the platforms themselves.  Just think about this.

News companies used to have newspapers or broadcast stations, so they had their own platforms to publish on.  Now they are depending on other platforms, and this is a rat race too, because they are just chasing: where do I have to be?  Do I have to be on Instagram or TikTok?  Where do I have to publish my news to reach audiences?

Misinformation has already been mentioned.  The attention economy drives the proliferation of misinformation, and that has a serious impact, because it leads to a shift in agenda‑setting from news outlets to whoever spreads lies.  Very often, it's politicians who spread lies or misinformation, and just debunking this kind of misinformation absorbs a lot of energy in newsrooms that could ‑‑ or should ‑‑ be spent on research, on investigations, on setting their own agenda.

And the attention economy, last but not least, has influenced the language and tone of publications, because it rewards any kind of speech that stands out.  This can be funny speech, or drastic, or any kind of outrageous stuff that gets grabbed by the algorithms.  So there's actually no reward in this for complexity of explanation.

Then you also have the effect on communities and on participation, because you have dialogue and community management on news sites, and actually social media and the comments sections give voice to the noisy few.  That also means that those who are really interested and who would like to engage sometimes get discouraged, because, you know, hate speech is a serious issue for news outlets, and female journalists in particular are targets of hate speech in massive proportions.

In the end, I don't want to be just negative ‑‑ I want to emphasize some good things that the attention economy has also brought about, so as not to let this get into a very negative tone.  Actually, the attention economy has ignited a debate about quality journalism, because maybe some journalism before didn't get the attention it deserved, and that was for a reason: it probably wasn't interesting enough, or it wasn't made for the audiences it was actually intended to reach.  So I think journalism has gotten much better as a consequence of this debate.  There's actually good news in this.

And the second piece of good news is that the attention economy has also opened up new audiences that didn't consume news before on other media, because they wouldn't subscribe to a newspaper or wouldn't tune into radio or TV ‑‑ even though this is basically a theoretical possibility, since there is evidence that news inequality is rising and is a serious issue; actually, fewer people access journalism than did before.

So let me end with what to do about this.  First of all ‑‑ and I will address what Joe said here ‑‑ there need to be indicators for quality journalism, because it shouldn't be just about delete, delete, delete, but also about ranking up good content, ranking up quality journalism after all.  There's the Journalism Trust Initiative, run by Reporters Without Borders, which develops these indicators for quality journalism.  It's been a huge collaborative effort, and I think this is really a very hopeful initiative.

Also, news media need to attract audiences by focusing on quality content and by building stable and lasting relationships with their readers.  There's actually an opportunity ‑‑ I said that before ‑‑ to really rethink journalism, rethink the quality of the journalism and rethink the relationships with readers and audiences, and to engage much more in debate and dialogue rather than just broadcasting news from the top.

And lastly, news media need to reassert control over technology.  They need to get savvy technology‑wise and develop their own algorithms, technology and platforms to really reach and address the audiences they want to address.

Thank you for this.

>> PATRICK PENNINCKX: Thank you so much, Alexandra.  You gave us a full overview.  And thank you also for drawing attention to the Reuters Institute Digital News Report; I think it is a very interesting source of information as well.  You spoke to us about the erosion of media companies' business models, the platforms that absorb attention, the drowning out of real news, news avoidance as well, the clickbait which leaves us with a bitter feeling at times, and especially also the loss of control over curation, the editorial responsibility.  And the energy taken away from the newsroom in order to respond to quite a bit of disinformation.

Thank you also for pointing out some of the positive effects and the way forward.

Now, I have a million‑dollar question for Aurelien of DuckDuckGo.  Can there be a viable alternative business model outside of the attention economy?  How would it function?  And can you share your company's experience of taking such a different route?  Go ahead.

>> AURELIEN MAEHL: Thank you very much.  Let me answer with many millions.  No, it's a pleasure to take part in this conversation.  As you may know, we are an Internet privacy company, and because of the widespread poor privacy practices that we have just heard about, too many people believe that you simply cannot expect privacy on the Internet.  We disagree with that, and we have made it our mission to set a new standard of trust online, simply by saying that privacy should be the default and not the exception online.

So what people know us best for is our search engine that doesn't track them.  We also offer a private mobile browser and an extension for Firefox and Chrome that has our tracker blocking and protects people's privacy across the web.  So we are giving users all the privacy essentials they need to seamlessly take control of their personal information with just one download.

And from that, I want to make one point clear from the onset: we provide a service that respects and protects people's privacy while at the same time being profitable, and we have been since 2014.  I cannot say enough that it's a false dichotomy to think that a company actually needs endless amounts of personal information to run a profitable business online.  Companies like Google and Facebook have led everyone to believe in their models, but that's just false.  It's just that they choose to squeeze every bit of profit out of their users at the expense of their privacy rights ‑‑ at the expense also of society and democracy, as we heard.

So instead of relying on behavioral advertising like they do, we serve contextual ads to our users in order to make money.  It's a far less invasive way of making a profit, which works by showing people ads that relate to the keyword of their specific search, rather than to which websites they visited the week before or what they searched for yesterday.

And as we just discussed, harming privacy rights is not just a problem in itself; it has wide‑ranging societal impacts.  When you search on Google, you get results tailored to what they think you will be likely to click on, based on the data profile they have on you from all their tracking.  But such results are, in fact, dangerous in their consequences, simply because showing you results they think you will click on necessarily means filtering out results they think you will skip.

This is the filter bubble.  They put people in a bubble by filtering out the content they think those people would not like.  And to take this to the political dimension, which is just so critical these days: if you have political leanings one way or another, you are more likely to get results that you already agree with, and less likely to ever see opposing viewpoints, which in total reinforces the echo chambers that are significantly contributing to an increasingly polarized society.

And we back this up with some research that we did ‑‑ we like to do research with our users.  We conducted tests in the US asking our users to search Google for several simple keywords stirring controversy, like abortion or gun control, and these people received a wide variety of different results that appeared to be personalized by location and other factors.  That was in 2012 and 2018.

>> PATRICK PENNINCKX: Thank you.

>> AURELIEN MAEHL: You know, to conclude ‑‑ yeah, I just had a final word.  We are here to prove, basically, that business can be ‑‑ (Garbled audio) and the dominant model based on behavioral ads.  And so, without sacrificing privacy, a good third of people are already ready to take action online to further their privacy, and we are confident that more people will choose services like ours if given a genuine choice.  But if I take Android, for instance, it takes more than 15 steps to set DuckDuckGo as the default search engine.  So we need regulatory intervention to provide fair conditions to operate, first through a preference menu where the user selects the default that they would prefer ‑‑ that would be a good step.

We have shown through research that, if done right, this could decrease Google's market share by more than 20% and conversely bring more diversity to the market for privacy‑protective services.  These were the points I wanted to make, and thank you again for having me.

>> PATRICK PENNINCKX: Thank you.  Thank you so much, Aurelien.  Thank you also for pointing out that privacy should be the basis of our work and that we can still have a business model that is profitable, and for pointing to the false dichotomy that we would need endless amounts of data in order to produce excellent news and quality information.

I sometimes compare the harvesting of data to the gold rush, that is, get as much as possible, which is what some companies seem to be doing, creating filter bubbles and echo chambers and maybe also fostering an increasingly polarized society.

Now, first of all, we have a very active audience.  So I will be able to ask you quite a number of questions.  But before that, I just wanted to show you that in Strasbourg, the sun does sometimes shine, for the ones who miss the Strasbourg sun, I can tell you that there is some here.

One of those active participants is Mr. Guy Berger, whom you know from UNESCO.  He asks: the notion of the attention economy suggests that human proclivities are the root problem.  On the other hand, different concepts like surveillance capitalism put more stress on commercial drivers and the interest in driving behavioral change.  In this view, it is the system rather than the people that is the issue.  Which, according to your perspective, is right?

Now, I would like maybe Joe to start with this, and ‑‑ sorry to put you on the spot, Joe, but it's easier to be able to point to someone than ‑‑ than ‑‑ than asking everyone to respond at the same time.

Thank you.  So Joe.

>> JOE McNAMEE: Thank you, Patrick.  Well, I have known Guy for quite a long time too, and he's invariably or almost always right, and I think he's right in both of his assertions.  I think it's clear that human proclivities are being exploited in the information economy, and that that is driving an economic model called surveillance capitalism.  So I don't quite see the linguistic nuance that he's implying.  I think both terms have merit and both terms are appropriate depending on what we are talking about.  Attention economy is obviously much more focused on the human aspect of this, but I don't think there's a contradiction that needs resolving, in short.

>> PATRICK PENNINCKX: Okay.  Anyone else who would like to react to ‑‑ yes, Alexandra?

>> ALEXANDRA BORCHARDT: I think it definitely is a structural and systemic thing.  Obviously, there are human proclivities that contribute to that development, but look, for example, at majorities and minorities among news consumers.  Actually, most consumers of news don't want to contribute to or engage in hate speech; as I said, that is usually just the noisy few.  Many people just want to consume the news and live their peaceful lives.  They don't want to polarize and drive people apart.  So the emphasis on the noisy few is something that is amplified by these systemic surveillance capitalism mechanisms that have been described.

>> PATRICK PENNINCKX: Thank you so much.  I have another question from Yannick, who is the head of the digital development unit in the Council of Europe, who says: attention economy in the context of social networks is also based on a cognitive bias, especially likes, creating engagement but also a reward.  It's well known that they have based this system on slot machines.  So how do we limit these effects?

Anyone feels ready to respond to that?

I will give you some time to respond to that.  First of all, I have another question for Amy, which is: accountability for business models is an interesting idea, especially if there are negative impacts on the wider environment, such as with the fossil fuel industry.  Would the aim be to regulate a model or require offsets, Amy?  It's another question from Guy Berger.  So next time, I think we have to invite Guy onto the panel.

>> AMY BROUILLETTE: Thank you.  Let's see, I'm not sure about the fossil fuel offset piece, but I do think that there's probably some overlap with the environmental field.  From the perspective of RDR, we look at addressing the more structural aspects of surveillance capitalism.  In particular, we would advocate for companies to be legally required to conduct human rights impact assessments on the development and use of algorithmic systems and targeting, in the same way that environmental impact assessments are required in other fields.

So our research has also shown that very few companies are actually conducting risk assessments on these technologies, which is a fairly big deal given how much of a human rights impact these processes have across the different platforms.  So in our opinion, that's one of the core ways to address the business model question: to advocate for regulations that would require impact assessments.

We would advocate for other types of transparency‑based regulations that would require companies to disclose how they develop algorithms and what user data they use to develop the algorithms and to give users more control and choice over that information.

>> PATRICK PENNINCKX: Okay.  Thank you.  Thank you, Amy.

I have a question from Ibrahim ‑‑

>> AURELIEN MAEHL: If I may just ‑‑

>> PATRICK PENNINCKX: Yes, Aurelien?

>> AURELIEN MAEHL: If I may add something to Amy's answer.  Another solution is to also tackle the underlying root of the business model.  We have seen recently some good proposals from the European Parliament, or at least some suggestions, to tackle the widespread use of behavioral ads and move towards restrictions on the collection of data for the purpose of serving behavioral ads.  These would be the kind of measures that would actually address the underlying cause of what we are talking about now.

>> PATRICK PENNINCKX: Okay.  Thank you, Aurelien.  Actually, I think the next question is also for you, from Ibrahim, who says: I personally love using DDG and encourage people around me to do so for privacy, but the problem is its accuracy compared to Google Search.  What can explain this?  And do you think communities and users can do something to improve DDG's accuracy?

>> AURELIEN MAEHL: So thank you so much for being a user and a fan and spreading the word.  I think it has largely to do with habits.  I mentioned before the fact that Google uses certain ways to personalize content.  Google also adds large numbers of additional Google‑generated answers to its queries when people are looking for something, so I think it's largely a matter of habit, of being used to having Google serve content that it thinks you would like.  And increasingly, when you are looking for something, the whole first page, and even further down as you scroll, is full of Google‑generated content as opposed to the organic links.

And we think it's also another tactic to keep people inside the Google system and away from the rest of the web.  So, yeah, I think it's just a different approach to how we serve content.

>> PATRICK PENNINCKX: Okay.  Thank you, Aurelien.

I will make use of the panel for my personal interest.  I will be speaking to the international communications and consultants organization this afternoon, which basically gathers the communication industry.  What should be my key message to that industry?  What should I give as a key message to those commercial service providers?  What would you give me as advice?

Alexandra, I see you smile.  So since you smile, you have to answer.

>> ALEXANDRA BORCHARDT: Oh, gosh!  It's all about quality content, really.  I mean, we have been facing this overabundance of news and these misinformation debates, so it's really about making sure that communication is fact based and that it's quality communication.  This is really the key message: not he said, she said, or all kinds of attention‑grabbing, polarizing quotes and snippets, but facts and data.  We have been witnessing this in the ongoing presidential election in the US: just because someone declares victory doesn't necessarily mean it is true.

So really, be very factual and produce quality content.  That would be my key message to the communication sector.

>> PATRICK PENNINCKX: Anyone else?  Amy?

>> AMY BROUILLETTE: Yeah.  I mean, I couldn't agree more.  I do think that the message would be to capitalize on their assets, and their assets really are that they can be the provider of quality journalism.  So it's almost like going back to basics, back to the craft, and trying to foster the original things that journalism does, or should be doing, quite well.

>> PATRICK PENNINCKX: Thank you.  Anyone else, Joe, would you like to react to that?

>> JOE McNAMEE: I'm not sure if you can still see me.  My video ‑‑

>> PATRICK PENNINCKX: Yes.

>> JOE McNAMEE: I think reminding the market of the value of competition would be good.  I think the economies of scale are driving these problems, and they are also driving a market that is not a market anymore.  So when talking to market players, a reminder of the value of competition would be timely.

>> PATRICK PENNINCKX: Okay.  Thank you.

One more question, because we are unfortunately already nearing the end, is for Amy, also from Guy: accountability for business models is an interesting idea, especially if there are negative impacts on the wider environment.  I already read this one, didn't I?

>> AMY BROUILLETTE: Yes.

>> PATRICK PENNINCKX: Sorry, sorry, for this one.

Okay.

Okay.  And then, Alexandra: is journalism a victim of the net companies' business model, or a victim of its own institutional and historical inertia?  And if the latter, how can journalism and news outlets compete for attention without compromising content and without getting into data harvesting and manipulative sales models?  You already started replying to that in the chat box, but I would like to hear you on this.

>> ALEXANDRA BORCHARDT: Yes, it's a bit of both.  And I would suggest that all the parties mentioned address their own issues, as I said in my initial remarks.  Journalism really has to reassert itself and to lead a quality debate: what is quality journalism, and has journalism been good enough?  Has it been interesting enough?  Has it been relevant enough?  And has the tone been right?  To mention the Digital News Report again, in its 2019 issue the audience was surveyed: what do you think about the news media that you consume?  And the lowest ratings that people gave were for the relevance of topics covered and for the tone.  Most people felt that the tone was too negative and that there is just an overabundance of news.  So there should be much more selection and much more thought about audiences.

So there's a lot the media industry can actually do to improve its journalism.  Also, to do less preaching and less broadcasting, and instead to engage more and build stable relationships of trust with their audiences and readers.

Nevertheless, it is obvious that the platform companies have their share of responsibility, so I wouldn't want to let them off the hook.  Everyone, the platform companies on one hand and the media companies and publishers on the other, should address the issues they can tackle themselves.  I wouldn't let anyone off the hook here.

>> PATRICK PENNINCKX: Thank you so much.  So, a last question to all of you: is regulation the solution?  And if yes, what exactly would need to be regulated?

That leaves you ‑‑

>> AURELIEN MAEHL: Yes, on this if I can ‑‑

>> PATRICK PENNINCKX: Yes, please, go ahead.

>> AURELIEN MAEHL: Yes.  Thank you.  I just would like to ‑‑ I mentioned before what can be done on behavioral advertisements, but I would like to come back to what I said on competition.  As a business, we have proved that there can be quality business models that do not rely on the practices we have talked about, but right now the web ecosystem is held by a handful of companies, and it's us competing against Google.

We are in the first line of being affected by their exclusionary practices when they leverage their dominance on platforms they control, for instance Android, to push their own services and give little visibility to rivals.  So I would say that one solution is to give consumers a fair choice when selecting their services and preferences.  That would be a first step towards a more diverse market that responds to the demand for more privacy online.

>> PATRICK PENNINCKX: Thank you.  I have Alexandra and then we'll have to close unfortunately.  Alexandra?

>> ALEXANDRA BORCHARDT: Yes, I mean, of course regulation is important, because there's no democracy without regulation.  It is the very core of democracy to come up with solutions to challenges and problems.  The question is: what do we regulate?  Obviously that is a fairly complicated issue, and maybe let me get back to the slot machine thing, because none of us has addressed that so far.

So actually, regulating the design of platforms, or coming up with standards for how to design platforms so that they don't exploit certain human proclivities, would be an option.  I wouldn't really push regulation in the direction of just taking down content and giving the powers to take down content to platform companies.  There are many ways we could regulate, hopefully not just in the terms of service that no one reads, but maybe also by making it easier for individuals to find redress or to ‑‑

>> PATRICK PENNINCKX: Okay.

>> ALEXANDRA BORCHARDT: To really voice their concerns.

>> PATRICK PENNINCKX: Thank you.  I'm sorry I'm cutting you off.  I want a one‑word answer from Joe and Amy, which is: regulation, yes or no?

>> AMY BROUILLETTE: In a word, yes.

>> PATRICK PENNINCKX: And Joe has disappeared from my screen.  I don't know, Joe, if you are still there.  Anyway, this was also an opportunity for you to give a last round, and I have been moving away with the sun, as you saw.  Thank you to all the panelists for this very insightful ‑‑ oh, Joe is there.  So, yes or no to regulation, Joe?  I can't hear you.  Your microphone is switched off, unfortunately.

>> JOE McNAMEE: Competition is already regulated and transparency should be regulated, in relation to ‑‑ particularly to issues around content moderation.  So yes to regulation, careful regulation and regulation that's reviewed frequently.

>> PATRICK PENNINCKX: Thank you.  Thank you.  We will be cut off shortly, so I will try to sum up.  Thank you so much to all the panelists for these very insightful interventions, and thank you also to all the participants at this early hour for their interest in the session.  I think this was a very rich and fruitful debate, providing a lot of food for thought for all experts and institutions working on this intersection of freedom of expression and digital technologies.

The outcomes of the debate will certainly also feed into the Council of Europe expert committee on freedom of expression and digital technologies, which is right now preparing a draft recommendation on the subject, on the impacts of digital technologies on freedom of expression.

So at a later stage, we will organize public consultations on this draft, where you can further provide your input and expertise.  I also seize this opportunity to invite you to visit the Council of Europe booth in the IGF virtual village, and we will see each other later in other sessions.  Thank you so much.  Thank you for your participation.

>> Thank you.

>> Thank you.

>> Thank you very much.