IGF 2021 – Day 4 – Town Hall #51 Unbundling: Free speech and innovation on social media

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> We all live in a digital world. We all need it to be open and safe. We all want to trust.

>> And to be trusted.

>> We all despise control.

>> And desire freedom.

>> We are all united.

>> Good morning, everyone. Good evening. My name is Martha Tudon, and I'm the digital rights coordinator of Article 19's office for Mexico and Central America. It's a pleasure to be here. I'll be moderating this session, and I'm also going to have the rather annoying role of keeping time. So I'm going to be trying to control the time. And I'm going to ask the panelists that, when they start hearing music on my part, they have one minute to wrap up their ideas and let the other person continue.

So we are here because Article 19 is trying to present a proposal about how we can reduce the power -- the concentration of power of digital platforms, because we all know, for several reasons, the different impacts it has on our lives. Maria Luisa Stasi, from Article 19's international office, is the one who works on this proposal. She's going to present it. She's going to have five minutes for it, which she's going to respect as well. Then we'll have two rounds of questions for our panelists, whom I will introduce later on. I hope you're comfortable; you're very welcome to be here. Maria Luisa, you have five minutes to tell us about the proposal.

>> MARIA LUISA STASI: Thank you, Martha. Good morning, good afternoon, good evening to everyone indeed. It's a pleasure to be here. I want to thank all the speakers for joining the panel, providing feedback and discussing this proposal with us. I am going to be brief, as you have understood. Basically, our main reason to come up with this proposal is all the challenges we know we have with content moderation, algorithms and the way it is performed on the major platforms today. There is a lot of research, and some recent whistle-blowers have brought forth how many challenges we have. The solution we will try to present and discuss today focuses on a market perspective, let's say.

So content moderation is a service that today is offered in a bundle, in a package, together with other services. The main one is hosting. Every time we go on a large social media platform, we create a profile and have the content curated by that same platform. This makes a lot of economic sense for the platform for a number of reasons. It's extremely easy to monetize. But it's not a necessity. These are two services that can well be offered in a separate way, and our thinking is: why doesn't this happen, what would we need to make this happen, and why would it be convenient? So the basics of our proposal are, as Martha said, to unbundle hosting and content curation, to ask the large platforms to unbundle the two services and to allow third party players to come in and provide the services to the users. In order to function, this proposal will need to be carefully designed. What we're convinced about is that this phase of carefully designing it should be part of a very comprehensive debate. It cannot be relegated to a closed regulatory dialogue between the regulators and the large platforms, but should have all the relevant stakeholders involved. For what we can say already, we make a number of recommendations.

First of all, it should be effectively implemented by an independent regulator, because platforms don't have the incentive to do this in the form of self-regulation. Our suggestion is that this should be designed as a form of functional separation. The platform should remain free to provide content moderation to the users that want this. We are extremely aware of a number of biases and nudges. So that's why we believe that this should be presented to users as an option. Users should decide the content curation provider they want to have, and if they want to stay with the large platform, they have to expressly say and choose that. This is a safeguard that we think is the bottom line.

The other characteristic we can suggest is that users should be able to make this choice at the beginning and at any point in time. Every time a new service that they like more comes into the market, they should be able to switch in an extremely easy and smooth way. Those are the main characteristics we can already envision. I'm more than happy to have a discussion with you today.

The last point is why we believe it is important to break this bundle and to open this market. We see that there are a number of advantages, but the three main ones I want to present now are: first, we're going to give some power back to users, which is something they don't have now. They will have real choices and the possibility to exercise those choices. Second, we will have competition on the market, a variety of players that will provide a variety of services. We will shift from a very highly concentrated scenario, where a lot of power is handled by one or a few companies, to a decentralized scenario, and we think this is way more in line with our democratic values. I'll stop here. And I'm very eager to listen to your feedback and comments. Thanks.

>> MARTHA TUDON: Thank you so much, Maria Luisa. Now it's time for the first round of questions. First I'll introduce our experts. The panelists are Marcel Kolaja, a member of the European Parliament from the Pirate Party. We also have Agustina Del Campo. She's the director of CELE, the Center for Studies on Freedom of Expression and Access to Information at the University of Palermo. We have Cory Doctorow, special advisor at EFF, and we have Vittorio Bertola, head of policy and innovation at Open-Xchange.

For the first round of questions, we have five minutes for answers. So when you start hearing the music, it means you have one minute left. The first question is for Marcel. We want to review the role of content curators through a market perspective -- creating fair and open competition, what will this do for us users? You have five minutes.

>> Thank you. What are the rules here? Can I take off the mask when I speak? Thank you, and thank you for the invitation. The European Parliament is finalizing its position on the Digital Services Act, which is a piece of legislation that sets new rules for content curation. Let me start from a broader perspective. The internet has developed into an entity where a few dominant players on the market decide over access to content. In other words, what users see. Google, Facebook and possibly a few others are the main entrance to the web for a majority of people. I'll borrow a term from another piece of European legislation, called the Digital Markets Act, and I will call these the gatekeepers. These gatekeepers build their business models around pervasive personal data collection, around its processing, and then subsequent automatic content curation. In other words, they decide what will be displayed to whom, based on the personal data they have collected and on the willingness to pay for audience. The recommender systems are designed to maximize user engagement and by that maximize profit. Now, how accurate does this statement get in the context of recent revelations from Frances Haugen? She uncovered that Facebook was prioritizing content evoking negative emotions. That boosts anger and hate in society. The power of content curators threatens our democracy: sorting based on data collection, killing diversity of news and information, feeding users with shocking content, leading some of them deeper and deeper into rabbit holes of disinformation and conspiracy theories, and shaping public opinion by that. What can we as policymakers do about that? The first tool is transparency: platforms must disclose why content is displayed and to how many people, and label if it is promoted. If it is promoted, is it a political statement? If it's a political statement, who financed it? These basic obligations need to apply to all, to everyone.

However, this is really insufficient. We need ways to analyze and process the data, access to transparency not only by an individual, not only by the users, but also by NGOs and consumer organizations so the ecosystem can be evaluated as a whole. My very basic political premise when it comes to technologies is to give the control back to people. They should be able to easily define and set their own content curation, what is relevant to them. We need to give the decision of what is relevant into the hands of people.

Now, back to the European legislation. The Parliament's position on the Digital Services Act, which will be voted next week, defines obligations concerning transparency of recommender systems. There's still room for improvement, though. The obligations apply only to the main parameters of the system, and exceptions in the name of protection of trade secrets and intellectual property pose a severe threat of limitation. The Digital Services Act covers very large online platforms, and the Parliament's position is that they need to provide one recommender system not based on profiling. Unfortunately, it doesn't have to be the default one. Also, there is no interoperability obligation that would make it possible for users to supply their own third party recommender system.

My last piece: one of the major priorities of my political group in the Digital Markets Act is to introduce an interoperability obligation for social networks. That made it into the final Parliament's position, which will be voted in the plenary also next week. This obligation will be a major contributor to bringing competition to the market by limiting the network effects that keep users locked in on a major platform. However, I see room for improvement there as well. As you can see, the new European legislative package delivers some improvements. However, to some extent it is also a missed opportunity. Please keep your fingers crossed that we get the best out of it. Thank you.

>> MARTHA TUDON: Thank you so much, Marcel. Our next panelist. The question is for Cory, and we'll talk about the privacy paradox. Cory, can you tell us what gains and risks come with a diverse environment for content curators, and how do we make sure new players comply with human rights standards? You have five minutes.

>> CORY DOCTOROW: Hello everyone, those remote and my fellow super spreaders here in the hall. When we talk about these digital networks, we tend to overemphasize the role that network effects play in their dominance. We do see network effects at play. They grow by the fact that people want to join because people have already joined. You go to Facebook because your friends are on Facebook. Once you're there, that's a reason for someone else to join Facebook. In light of all that, it's worth asking why people stay on Facebook or in any of the other walled gardens. I don't think I've ever met anyone who is happy with any of them. It may not be possible to make a service that serves 3 billion users in hundreds of countries, speaking hundreds and hundreds of languages, whose norms can be captured in a three-ring binder given to a subcontractor in the Pacific Rim. As a result, you have people who stay but don't like it. When you ask the firms why people stay, they say those people are revealing their preference. They're showing you they actually don't mind it at all. If they did, they would just leave. When you look at the firms' private communications, you see something very different. You see in the amended complaint that the FTC brought against Facebook that these firms deliberately engineer in high switching costs. That's what you have to give up when you leave a service. In the amended complaint against Facebook you have executives sending messages to Mark Zuckerberg saying we need to make our photos product really good so people will lock their family photos up with us, so we can abuse them more to our benefit, because who would leave if it means leaving behind your family photos?

All of this is to say that anything that allows people to lower their switching costs, to have somewhere else they can go to get their curation, somewhere with different rules and different policies that more closely reflect what they want and their values and priorities, is going to be to the good. It's going to be a corrective for these network effects, because people can leave without having to surrender their contact with their friends and family. Now, the firms themselves will tell you this will be a catastrophe for privacy reasons. Facebook will say: if we allow people to use a service other than Facebook to talk to their Facebook friends, what would stop Cambridge Analytica from showing up and taking all of those users' data? We need to be there to defend those users, and we can only do that if we have a monopoly on how you talk to Facebook users. Facebook has already disqualified itself from defending us against Cambridge Analytica, by failing to defend us against Cambridge Analytica. Another point that might get lost in there is that Facebook does defend you against lots of attacks, as does Apple, as does Microsoft, as does Google. They just don't defend you against attacks that are good for their shareholders. When it comes to their interests and your interests, their interests win. They'll shift their conduct towards their benefit and to your detriment. Apple will defend your privacy unless you're a user in China, in which case they'll surrender your data because the cost of refusing is too high. Microsoft will defend your privacy if you want to use Bing, but not if you're an Office 365 user, in which case they'll gather data on every single thing you do and tell your boss about it. All these firms make poor guardians of our privacy.
The way to hold them to account is not for the firms to use their own judgment about when your privacy matters and when it doesn't, but for there to be a freestanding privacy law, something like the GDPR or the many proposals we've had in the United States, that -- I hear the music -- allows users to have a private right of action so they can seek redress; they don't have to convince a federal official that their case is worth taking up. That is to say, we need universal standards for when privacy is worth defending, not particularized standards or parochial standards set in corporate board rooms. We need tools fit for purpose. The fact that these firms say the way they'll defend our privacy is by aggressively invoking privacy law or contract law is absurd. That's like the firefighter telling you they're going to put out your fire by aggressively using their wi-fi. If we want to fight for privacy, we should have privacy laws.

I'll close by saying we are at a crossroads in terms of how we relate to these firms. You can see it in the Digital Services Act and Digital Markets Act. There's a large constituency of people that would like firms to be better at being in charge of our digital lives. For them, the problem with Mark Zuckerberg is that he's a bad digital king of 3 billion people's lives, and we'll make him good with the right rules or laws, or replace him and have someone else in charge of Facebook. There's another group of people who want to abolish the king. In the Digital Services Act we have proposals for upload filters. They cost hundreds of millions of dollars. In the Digital Markets Act you have interoperability that would break the model those upload filters rely on. We really have to choose one or the other. Thank you.

>> MARTHA TUDON: Thank you so much, Cory. The next question is for Vittorio. Which technical conditions and standards need to be in place for third party content curators to do their jobs? We're talking interoperability standards and the role of standard setting bodies. You have five minutes.

>> VITTORIO BERTOLA: Thank you. I'd like to provide a perspective from someone who has been involved in designing standards and implementing the technologies for quite some time. The first thing I'd like to stress is that what is proposed here, unbundling plus interoperability, is nothing new. It's one of the basic principles under which the internet was built. If you look at RFC 1958, there are architectural principles like open standardization which put together interoperability and unbundling: the idea that internet services should be built of modules as small as possible, that can be replaced, and that interact with other modules through open standards, so that multiple vendors, multiple service providers can provide the same service. You can choose one, replace one with the other, and still interact with the modules nearby. This is why the idea of unbundling the content curation algorithm is good. It's what was originally considered one of the founding elements of the internet's success. So even the security and privacy concerns sort of fade away if you consider that we already have open federated services like this. I mean email, the web, the services of that first generation, and they are mostly still built on these principles. They are not less secure or less private than this new wave of services like social media or instant messaging. Many of the problems that are inherent in, for example, having multiple different providers handle your data have already been solved. A good example is the European Payment Services Directive, which has basically created a way for third parties to aggregate your personal banking information from the multiple banks you have an account or relationship with, and then arrange the content and present it to you in a way you like. This is exactly what we should do here for content curation on social media. It's already been done, it works, and people are happy about it. There could be problems of implementation, but they have mostly already been solved.
So the sky will not fall on our rights because we introduce this kind of unbundling.

There are already many technical standards. ActivityPub, for federated social media -- Mastodon is already working with it, and it serves millions of people. Again, this is not some weird concept. It's something that already works. There will be the need to define more standards, and to fix and complete the ones that are already there. The internet already has its standardization organizations. The view may be different depending on whether you're from Europe or the U.S., because typically those organizations have been mostly led by U.S. people and U.S. companies. So I understand that in Europe there is a bit of concern that, if we just defer these activities to the global bodies, the result might be that the big tech companies from the U.S. -- the ones we want to constrain -- will still dominate the technical discussion there. Europe has its own standardization bodies, which unfortunately have not been so open to internet technologies, especially to participation by anyone other than big European companies. I think we should find a good model. The message I want to give is that we possibly need both: a multi-stakeholder discussion over these standards, possibly letting the internet standardization organizations work on them and propose them, but with the regulators checking what is done to ensure it meets the public concerns and policy objectives.
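To make the ActivityPub reference concrete, here is a minimal sketch in Python of the kind of JSON document federated servers exchange: a "Create" activity wrapping a "Note", following the ActivityStreams 2.0 vocabulary that ActivityPub builds on. The actor URL and content are illustrative, not tied to any real server.

```python
import json

def make_create_note(actor_url: str, content: str, to: list) -> dict:
    """Build a minimal ActivityPub 'Create' activity wrapping a 'Note',
    per the ActivityStreams 2.0 vocabulary. URLs here are hypothetical."""
    note = {
        "type": "Note",               # the piece of content being shared
        "attributedTo": actor_url,    # who wrote it
        "content": content,
        "to": to,
    }
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Create",             # the activity: 'actor' created 'object'
        "actor": actor_url,
        "object": note,
        "to": to,
    }

activity = make_create_note(
    "https://example.social/users/alice",   # hypothetical actor on a hypothetical instance
    "Hello, fediverse!",
    ["https://www.w3.org/ns/activitystreams#Public"],
)
print(json.dumps(activity, indent=2))
```

In a real federated exchange, a server would POST a document like this to the inboxes of the recipients' servers; the point here is only that the format is an open, vendor-neutral standard any implementation can produce and consume.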

In the end, to close, the last message I want to give is that I absolutely agree with what I heard: there are two types of interoperability, vertical, meaning you separate the services into smaller modules, and horizontal, meaning you can replace the service provider with another one. We should also introduce competition and regulation against the network effect, so you get horizontal interoperability and more service providers. In the end, I'm a long-term Facebook user with like 12,000 followers. I cannot choose anything else, because I would lose the friends I discuss with every day, though I also hate Facebook. That's really the situation we're in. Facebook makes me angry by presenting me with anti-vax content. They make my life terrible. I look forward to having the ability to choose the content I really want to see. Thank you.

>> MARTHA TUDON: Thank you so much, Vittorio. The last question goes to Agustina: how do we escape from -- of usual business models? Are alternative business models even possible, taking into account communities of content creators and moderators? You have five minutes. Agustina was just here. I think she logged out accidentally. Okay. So we have a little problem with that. It doesn't matter. We can go with her after. We can start the second round. The second round is going to be one same question for everyone. Let me see if Agustina is not here. It's going to be the same question for all of our experts. The question is: how would you make this proposal better? How would you improve it? What would you add? What would you remove? You don't have to explain why. You'll have your expert reasons. We want to hear straightforward ideas. You just have three minutes to answer this question. Remember, when one minute is left, I'm going to start playing the song. We're going to follow the same order. So Marcel, how would you improve this proposal? You have three minutes.

>> MARCEL KOLAJA: Thank you. You mean the proposal of unbundling, is that correct?

>> MARTHA TUDON: Yes.

>> MARCEL KOLAJA: Well, I'm not sure we need to improve the proposal itself. I think we need to improve legislation so that it allows this to happen. So, as Vittorio said, and that was very right on his part, we need both vertical and horizontal interoperability. I pushed really, really hard in the Digital Markets Act, because I'm the rapporteur for that, for interoperability obligations, and for having very specific obligations when it comes to interoperability for chat platforms and social network services. That is the horizontal type of interoperability obligation, which would make it possible that, if users are on Facebook, they could also talk to their friends that are on other social networks, like on Mastodon, which itself is an example of a social network that is federated, which means interoperable between the different instances of its own network. If I'm on one instance, I can talk to my friends on other instances. In addition to that, and this is what this proposal is about, it's also important to think about: okay, if we have some really large online platforms that are dominant on the market and concentrate so much power, it would also make sense to introduce vertical interoperability that would make it possible to functionally separate the different components of that particular platform. This means that, if Facebook provides hosting of the social network, it doesn't necessarily mean that they also have to provide content curation. They could, but users should be able to choose their own recommender system, their own content curation, and that would make competition possible on the market of content curation.
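The vertical separation Marcel describes can be sketched in a few lines of code: the hosting layer only stores and serves posts, while curation is any interchangeable ranking function that a third party could supply. Everything here is hypothetical and purely illustrative; the engagement numbers and names are invented to show the split, not any real platform's API.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass(frozen=True)
class Post:
    author: str
    text: str
    timestamp: int
    engagement: int  # likes + shares, a stand-in ranking signal

# The hosting layer: it just stores posts, with no opinion on ordering.
POSTS = [
    Post("alice", "calm update", timestamp=3, engagement=2),
    Post("bob", "outrage bait", timestamp=1, engagement=90),
    Post("carol", "news item", timestamp=2, engagement=10),
]

# Curation is a separate, replaceable module: any function from
# a list of posts to an ordered list of posts.
Curator = Callable[[List[Post]], List[Post]]

def chronological(posts: List[Post]) -> List[Post]:
    """A third-party curator: newest first, no profiling."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_ranked(posts: List[Post]) -> List[Post]:
    """The incumbent-style curator: engagement-maximizing."""
    return sorted(posts, key=lambda p: p.engagement, reverse=True)

def timeline(curator: Curator) -> List[str]:
    """The user picks the curator; hosting stays the same."""
    return [p.author for p in curator(POSTS)]

print(timeline(chronological))      # ['alice', 'carol', 'bob']
print(timeline(engagement_ranked))  # ['bob', 'carol', 'alice']
```

The design point is that nothing in the hosting layer changes when the user switches curators; only the ranking function is swapped, which is exactly the functional separation the unbundling proposal asks for.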

>> MARTHA TUDON: We'll go with Cory. How will you improve this proposal? You have three minutes.

>> CORY DOCTOROW: If we're going to make the proposal sturdy, we have to look at how standards fail and when they fail. You hear about hijacking by big firms. I've spent 20 years having my brains melted by big firms that hijack standards. I know it's true. It happens. It's real. What can you do when a standard is hijacked, or when the future comes along and people have curation needs that were not countenanced in the original regulation, to make sure the law keeps up with the pace of technology? We have to look at technology's past. Historically it has been common for new market entrants to create interoperability against the wishes of incumbents. You had plug-compatible mainframe components that did this to IBM, and Apple doing it to Microsoft with iWork, which reads and writes Microsoft Office files. There's no reason we still couldn't be doing that, no reason a new firm couldn't come up that would scrape your waiting Facebook messages and put them in another service, let you reply and push them out to Facebook. Facebook would use the law to reduce you to a radioactive crater, which is what they've done to everyone who has tried it. We need to engage with liability under a variety of legal theories -- cyber security, contract, copyright -- this huge arsenal that tech firms have amassed that they use to fence off adversarial interoperability. It's a cruelty to ask a non-English speaker to say "adversarial interoperability". We shorten it to comcom, for competitive compatibility. We can use that on the other side of a seesaw, where on the one side you have these mandates that we'll make as good as we can and enforce as vigorously as we can. On the other side, you have the possibility of open guerilla warfare, where if the firm is messing with the standard to exclude you from it, all you need to do is figure out how to make it work on your own.
Now, firms will often have their hands stayed by the threat of that because it represents unlimited potential technical and financial downsides to them because they're going to have to fight off these new interoperators that are going in through a back door they're making themselves. They might just come into compliance as a result. But if they don't, you have a remedy available. You have people who are being frustrated by the shenanigans of these large firms who can engage in self‑help. In order to make that work and in order to ensure that we can distinguish comcom from hacking Facebook and stealing users' data, we need a freestanding regime that doesn't distinguish between whether Facebook is doing you wrong or someone interoperating with Facebook is doing you wrong. It doesn't ask Facebook to make that call, but instead asks democratically accountable officials to make that call. Otherwise what we'll see, even if we get it implemented we'll have to fight forever to keep the large firms from neutering it.

>> MARTHA TUDON: Thank you so much. We're going to have the same question for Vittorio, but before I pass the mic to him, I wanted to let you know that Agustina is facing some internet issues. Vittorio, same question. How would you improve this proposal? You have three minutes.

>> VITTORIO BERTOLA: Speaking of the proposal in itself, I think we should do more unbundling. One particularly important thing that should be unbundled from all these platform services is identity: identity single sign-on. We have a sort of almost hidden oligopoly by Google, Facebook and Apple in how you log into services. This is creating an opportunity for them to track whatever you do on the internet. It's you who tell them all the websites you log into, and they just collect the data. You should be allowed to put your identity, your personal information, somewhere else and use that information within the social media platform of your choice, and also within the instant messaging service and any other website and any other service you want. You should really be able to pick your identity provider. The problem, paradoxically, is that there are too many open identity projects, so there's no real competitor to the closed ones. We really need to find them. By the way, Europe has a great thing, but it only works for public administrations. It doesn't work for everyday users. It never will. We need something else. More in general, I think that we should try not to make these proposals too much based on specific services. Now social media is all the rage and we need to address that. But maybe we solve all the social media issues, and in five years it's something different where public opinion forms and we exchange information and chat. We need to establish these things as principles for all services, all cases. I'm really happy we got the two new obligations on social media and instant messaging. We should, if possible, try to get a general principle saying that any dominant platform in platform services of public relevance should interoperate, so that there is competition. This is the kind of reasoning that we need to have. It also helps prevent the risk of ossifying regulation that prevents innovation. This is an objection you often get.
If you limit yourself to these high-level architectural principles and let the technical implementation deal with the service of the moment, I think the result will be very positive.

>> MARTHA TUDON: Maria Luisa, can you give us a wrap up, then we'll go with questions and answers from the public. If you have anything you want to be set straight or anything, please let us know and we can ask our panelists. Maria Luisa, you have three minutes.

>> MARIA LUISA STASI: Thank you, Martha. The first way to wrap up is to thank you all for these very well-thought-out comments and for flagging a variety of challenges and strengths that are part of this proposal. By the way, we have shared in the chat the link to our policy brief. It's going to be on our web page if anyone else wants to have a look. We're going to be more than happy to receive your comments and suggestions. It seems to me that there is sort of a consensus among us that interoperability is a building block of the digital environment that was created in the '90s, as Vittorio reminded us, but also of the vision we have of the future and of how we want to interact with services and providers from today on -- from yesterday, even better.

I think interoperability, and getting it right, will be an essential component of this specific unbundling proposal for content curation, and possibly for many more markets that are currently locked in wherever gatekeepers are dominant. There are a couple of things that I've noted, in particular this idea of what to do if something fails, if the standard fails, et cetera. I do believe that one of the biggest safeguards is to ensure as much transparency as possible, to engage users as much as possible in this kind of discussion, and to try to avoid that the solutions to fix the problems always come from the same players and always from the same direction; to try to set the conditions for these kinds of choices, for this balancing exercise between a variety of tradeoffs and risks, to be performed by society as a whole and not just a handful of players.

What else can I say? When it comes to more unbundling and trying to get it right, the DMA -- one of the most frustrating things in the past couple of weeks has been when a number of proposals tabled for Article 29 of the DSA, on third party recommender systems, were not sufficiently endorsed and supported. So there I think we are again missing consensus. Having said that, I am absolutely convinced that Article 19 with this proposal is at the beginning. We are happy to have put something on the table. It can be improved, and our call is a call for every stakeholder to contribute to that improvement if they want to. I'll stop here for the moment. Is Agustina back?

>> MARTHA TUDON: Thank you. No, she's not back. If she comes back, we can let her into the discussion. In the meantime, does anyone have any questions for our panelists, or any ideas you would like to share, a reaction -- you love it, you hate it, why? Please share it with us. Otherwise, I don't know if the panelists want to react to the intervention of Maria Luisa. Yes, I see Marcel. There's a tiny camera. Marcel.

>> MARCEL KOLAJA: Yes, thanks. That was me who raised my hand. Not to react to Maria Luisa, but to what has been said before by Vittorio on interoperability. I totally agree that we must not stop at interoperability of social networks and chat platforms, which in the European legislation jargon are number-independent interpersonal communication services. And there is actually, as a matter of fact, a broader, more general interoperability clause in the Digital Markets Act. I'm really happy that it got in there, but I'm also sure that only time will show what it really means, how this is going to be implemented, how this will really work, and I'm also sure that we will need some revision after a couple of years to reflect the development of the market.

>> MARTHA TUDON: Thank you, Marcel. Go ahead, Cory.

>> CORY DOCTOROW: I wanted to bring something up that I didn't work into my main remarks, which I think is a way to optimize standardization and mandates. The standards process includes a part where you decide what the standard is for, what it's going to do, and a part where you describe how the standard works. You can split those two parts from one another: the functional requirements and the implementation. One thing you can do to future-proof things, allow for innovation, and make it easier for new market entrants to determine whether or not the standard is being adhered to is to make the reference implementation of the standard the safe harbor. You know you're complying with the law if you implement it the way the standards body says you could, but you can also do anything you want so long as it fulfills the requirements. That allows firms that have ideas incompatible with the reference implementation, but that still satisfy the requirements, to go ahead, and it forestalls the argument that you're freezing in amber the dynamic process by which large firms make and update services. There's an old saying that an API is a promise. If we're going to ask them to make a promise they'll never break for 50 years, maybe we can let them set the contours of that promise, not the minimum of that promise but the exact framework for that promise. Then we still have the ability to go to a regulator or go to court and argue about whether their implementation meets the requirements. That's still intact. And you're not having a government committee write software for Facebook, which I think is something most of us would prefer to avoid.

>> MARTHA TUDON: Thank you, Cory. Vittorio, would you like to take the mic, or you're okay?

>> VITTORIO BERTOLA: I'm fine, but I'd really like to hear someone from the audience if there's anyone that has anything to say or to suggest. Are we all agreeing on this?

>> CORY DOCTOROW: With something as uncontroversial as ours, I don't think we could expect any disagreement.

>> MARTHA TUDON: I think that's good. I can't actually see anyone from the audience. If you have any questions or reactions, please let us know. I really cannot see. Please hand him or her the mic.

>> CORY DOCTOROW: You there, total stranger. What is your question?

>> AUDIENCE: Cory may have heard this before. I like the proposal. I'm an enthusiastic supporter of these types of interoperability and competitive compatibility mandates. One of the things I think about is whether they solve any of the other problems we've tried to solve on the internet, or whether this is really purely a benefit to users and a user empowerment thing. One of the obstacles I always come upon is what I tend to call the stupid neighbor problem, where we think that, if I have the ability to choose a different service, I will, but I worry about my neighbor who gets seduced by all the fake things they see on Facebook and is still going to make bad choices.

One of the things I try to think about is: does interoperability help with that at all? With misinformation, most people would say misinformation is not a problem for me, because I can sort it out, I can figure it out. If I had a different service to go to that had less misinformation, that would be great for me. But it wouldn't solve the problem, because some people aren't smart enough to go to a different service. This is my unformed question: what are the limitations of this in trying to solve problems? And then the other concern is, is there a problem we're creating or exacerbating if we do this? I don't know the answer to that, but that's what I think about a lot.

>> MARTHA TUDON: Thank you so much. This is such an important question. Maria Luisa wants to answer it. So go ahead.

>> MARIA LUISA STASI: I'm sure each of the panelists has an answer as well. I can maybe just break the ice and say a couple of things. There are a number of relevant points in answering this question. I think the premise is that, when we talk about the unbundling of hosting and content moderation, we're not giving up on having human rights standards for content moderation, no matter which kind of provider we use. So our idea very much complements human rights standards for content moderation.

The second point is, it is true there's always the risk that certain people actually want to be exposed to certain content that we might deem not to be worthwhile. Two elements there: I think we need to keep illegal content separate from what has been variously defined or labeled as harmful or bad content. The latter is part of our free expression. If someone wants to look for harmful content, I think as a society we need to cope with the fact that, yes, they're free to do it, as long as it's not illegal under international or national law.

The other point is, it is true that we can try to, I wouldn't say fix it, but we can try to minimize the risk, if we care about our society being better off. One major difference I see is that this is already happening, but now it's happening in a very non-transparent way. People are not aware, not enough, not sufficiently, and they have no choices. If you implement this unbundling and create a diversified environment with a number of choices, at least we'll have transparency. At least people will be more aware. At least they'll need to make an active choice to go and look for that content.

What I'm trying to say is that, ideally, a diversified environment that empowers users is going to make them more active in their online experience, while nowadays what I see is that the majority of us passively take everything that is provided to us. In the medium and long term, this is ideally what I would like to see happening.

>> MARTHA TUDON: Thank you, Maria Luisa. Cory has a reaction and also Vittorio. I'll give the mic first ‑‑ Cory, Vittorio and then Marcel.

>> CORY DOCTOROW: I guess it depends on what your theory is of why people are looking at stuff we think of as radicalizing or disinformation. There's one theory, the mind-control theory: that Mark Zuckerberg made a mind-control ray to sell your nephew fidget spinners, and Robert Mercer stole it and made your uncle into a QAnon believer.

There's another possibility, which is that your uncle was always a racist. And I don't know that we can solve that by making it illegal to have racist ideas or to go into forums with other racists and talk about racism. Notwithstanding human rights frameworks and what have you, if something is lawful to utter in your kitchen among friends, there will be digital kitchens where people will gather and say those things. I think any attempt to stamp that out with moderation policies, as we've seen, is going to do more harm than good. It's going to capture more counterspeech than speech. After all, there are a lot of people who utter a lot of things that are indistinguishable from racial abuse online but don't actually cross the line. The difference between almost-but-not-quite racial abuse and racial abuse is indistinguishable to the person who is experiencing it.

So there are lots of ways you can violate the spirit of that law without ever committing an unlawful speech act. Really, giving people a place to go where the community is small enough, and the moderation sensitive enough to the local norms, that they can maintain those fine-grained and nuanced distinctions, because the people making those distinctions are part of the affected community, is the best hope we have for keeping people out of harm's way from that lawful-but-awful speech.

>> MARTHA TUDON: Vittorio?

>> VITTORIO BERTOLA: I think there are at least two parts to this story. The first one is that, in regulatory terms, creating competition is not enough if you don't let people actually choose it, and the dominant platforms will do whatever they want, and whatever they can, to prevent people from actually choosing competitors, even ones that exist. We've seen that with cookie pop-ups, where in theory you need to give consent, but you end up giving consent without realizing it. So you don't just need to introduce unbundling, but also principles against, for instance, self-preferencing or defaults or pre-installation of apps. You do need to introduce them at a high level; the details need to be left to an implementation phase with the regulators and the multi-stakeholder technical community.

Then, once you give a practical choice to users, it's true there will be users that make bad choices. There are many people afraid of that. I got that reaction from MEPs, not from the same party as Marcel, but from other groups. When I said we're proposing federated, open, interoperable social media, I got the answer that it's much easier to get disinformation taken down if we only have one social media provider and can tell them what to do. I replied: yes, it's also easy to avoid disinformation if you only have one newspaper in your country and they're the only people you have to talk with, but we decided long ago that's not good, for other reasons. It's exactly the same. Social media are the newspapers of today. We need to apply the same principles at the same scale.

The other thing I'm wary of is that this argument is what is being used by big tech to deprive users of choice. If you talk to people at Apple, they say they have to take over services like DNS, which traditionally were provided by your ISP or third parties, because that way they can look after your privacy and security, because they control everything. They can prevent you from making bad choices by not giving you a choice. Still, this is building, again and again, this kind of centralized ecosystem in which no one has a say except the CEOs of these companies, and now that everything is encrypted, you have no control over your data and your things. The concern is valid, but we have to be careful not to allow other people to use it for their own business interests.

>> MARTHA TUDON: Thank you, Vittorio. Marcel.

>> MARCEL KOLAJA: Speaking of newspapers, it's indeed not a good idea to have just one publisher in the country, even though it's easier to control what is in there. Unfortunately, even in some European countries that joined the European Union as democratic countries, we see this trend of consolidation of the media landscape, with media coming under either state control or oligarch control, and that's a dangerous trend. Completely off topic here, of course.

What is relevant in terms of disinformation, and it's not connected with unbundling, but I think it really needs to be mentioned, is the way the content is spread. The content is amplified by those social network platforms in the sense that they do the content curation and they prioritize the content you see. But, of course, they want to make money out of it. The tool that they have at hand is that they know: oh, these five people here in the room actually believe that the earth is flat, and those five people over there actually think that some other conspiracy theory is very valid. So why don't we throw some other ideas at them, an advertiser may think, and target specifically these groups of people. And this can only work if we collect an enormous amount of personal data and then do so-called targeted advertising, which actually should be called surveillance advertising or tracking-based advertising. I believe we truly need to regulate this, and we actually need a ban on surveillance advertising, because if we do not do this, then it will be extremely easy to amplify such content. If this were not possible, I would even say that it would be extremely difficult to amplify such content, because then how do you do it? Of course there are ways, but then, if you wanted to spread disinformation, you would have to throw a large amount of money at it, and it wouldn't be easy to target. So the tool would be very ineffective.

>> MARTHA TUDON: Thank you, Marcel. I think, like you said, it's not just the unbundling; part of the problem is to analyze what meaningful connectivity entails. We have four minutes and then they will shut us down, so I'm going to be quick. I was thinking about the unbundling and how it relates to other efforts, such as the oversight board, and to situations where there's no net neutrality and you have zero-rated services, which could mean the same platforms still keep the same power, or maybe not? Maria Luisa, maybe you can explain a little bit more. Is this a solution on its own, or do you need several regulatory frameworks or policies at the same time to be able to decrease the power of the platforms? We only have, like, three minutes.

>> MARIA LUISA STASI: Well, just to jump in very, very briefly, I think that, once again, all these concerns that have been raised are legitimate, and most probably we have no one-stop-shop solution to fix them all. I still believe that, if we do open up markets and if we diversify and decentralize the environment, we have many more chances to see a variety of business models appearing and a variety of services that are diverse, and they could provide better responses to disinformation issues or privacy issues and so on and so forth.

So the main point, I think, is to try to figure out a way to have this variety, this diverse landscape, and, complementary to that, to find a way to empower consumers as much as possible. As I said at the beginning, I don't think the unbundling of content moderation can fix all these things at once, but I do believe that if we don't have interoperable services and open markets, if we don't have alternative players, if we don't have user empowerment, then it is going to be extremely difficult to fix any other challenge, be it disinformation, surveillance advertising, the spread of hate speech, and so on and so forth.

>> CORY DOCTOROW: May I say a closing word? I want to say I think the fact that this has more moving pieces is a feature and not a bug. When we talk about media consolidation, we're talking about the same underlying economic policy choices that created tech consolidation. It wasn't an accident that the web turned into five websites filled with screenshots of text from the other four. It was a tolerance for anticompetitive conduct. People suffer under that in every realm. Two companies own all the breweries, one company owns all the eyewear, four companies own all the shipping lines, and they keep building bigger and bigger container ships that get stuck in the Suez Canal. We're all laboring under some form of monopoly. The fact that there are other elements of this problem that are not tech problems, but rather problems with concentration in other sectors, doesn't mean we have to solve all the problems at once. It means we have an army of allies in every sector. Cheerleaders who can only get uniforms from one company, runners who get their shoes from only two companies: all these people are on the same side as us. They just don't know it yet. That is the ultimate strength we have in the fact that there is this big, wicked problem with so many different facets.

>> MARTHA TUDON: Thank you so much, guys. Thank you, Marcel, Cory, Vittorio, Maria Luisa. Thank you. Have a great day. I'm sorry I have to say goodbye this fast. Bye, thank you.