IGF 2021 – Day 2 – Town Hall #59 A Human Rights-Based Approach to Regulating Platforms

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> We all live in a digital world.

We all need it to be open and safe.

We all want to trust.

>> And to be trusted.

>> We all despise control.

>> And desire freedom.

>> We are all united!

>> SEBASTIAN SCHWEDA: Good morning, everyone, from Germany.  And a very warm welcome to this session, entitled "A Human Rights‑Based Approach to Regulating Platforms".  We hope to discuss this topic with you and with at least one expert: we have received information that one of our experts can't make it today, as she's not feeling well, and there's another one who I hope is on the way.

And I will have the honor to introduce them to you in just a couple of minutes.  My name is Sebastian Schweda and I'm representing Amnesty International Germany, jointly with my colleagues Jakob Scherer and Kristian Burghartz, who will be monitoring the remote channels and picking up questions from you.

So the principal question of today's session is:  How can human rights be taken into account when regulating platforms?  Or, more specifically, what are the main human rights issues raised by the business activities of the big Internet intermediaries, and what regulatory tools are available to resolve them?

Before we delve into the subject matter, let me just briefly give you an overview of our housekeeping rules and especially your options to participate.  As you know, this is a town hall event.  So by definition, it should be very interactive, and everyone participating is very much invited and encouraged to contribute.  You can do so in various ways, and I assume you are familiar in principle with most of them, because if you use Zoom regularly, you will know the features that are available to you.  You can use the raise hand feature at the bottom of the Zoom window to take the floor and ask questions live by audio, and if you like, you can switch on your camera as well.

In the chat window, at the right‑hand side, you can pose questions to the panelists, and then there's tweedback.  We encourage you to make intensive use of this tool for side discussions on related topics; the tweedback channel will be monitored by us and brought into the ongoing session in a summarized way.

So that's an additional tool, and the best thing is that unlike Zoom, tweedback doesn't disappear when the session is over.  It will remain available for a couple of hours, so you will be able to wrap up your discussions properly and look them up again if you like.

The link to tweedback, I think, will be shared in the chat window shortly by our online moderators.  Also feel free to share anything interesting on Twitter if you like, and if you do so, please use the Twitter hashtags that, again, will be shared in the Zoom chat.

With that, let's start into the substantial part of our session.  As background, to give you an idea of what Amnesty has been doing so far on the issue of platform regulation, I will just point you to two documents that were published in this context in the past two years.

One of them was published in 2019, when Amnesty came forward with a report on the most pressing human rights issues regarding the surveillance‑based business models of the largest online platforms to date, Google and Facebook.  This report, entitled "Surveillance Giants", is a piece of research that urges both states and companies to act.  On the one hand, it calls on states to give users a choice to opt out of ubiquitous surveillance when using digital services, by adopting strong data protection laws.

It also calls for algorithmic accountability and for ensuring interoperability of services, to allow users to search for and find alternatives and to break the de facto monopolies that are prevalent in the market.

Companies, in turn, are called on to respect human rights and carry out human rights due diligence.  When the European Commission tabled the draft legislative acts of the Digital Services Act and the Digital Markets Act, Amnesty took the opportunity to narrow down the recommendations from the 2019 report and to highlight the need for improvement of the proposed legislation, such as by restricting targeted online advertising based on profiling.

Platform regulation has been the focus of legislators not just in the EU but in other regions of the world, and it will be interesting to hear from the panel what might be learned from that experience.

So let's turn now to our panelists in this session, and I will briefly introduce them to you.  I have heard that Greg is already here as well.  So first of all, we have the honor of welcoming Professor Kazuhiko Fuchikawa from Osaka City University; he focuses on competition law, and he has published a paper on digital platform regulation in Japan.  That will be an interesting perspective that we might be hearing from.

And then we have Greg Mroczkowski, who is here to bring in the perspective of the Interactive Advertising Bureau, of which he's the Director of Public Policy for Europe.

So welcome to the both of you.  And, again, for any latecomers: Nighat Dad won't be able to make it to the panel today.  There's one empty chair, but I think we will be able to have a fruitful discussion anyway.  Without any further ado, I will ask you for your initial statements on which human rights issues you consider the most pressing in the context of online platforms.

Kazuhiko Fuchikawa.

>> KAZUHIKO FUCHIKAWA: It's my honor to be able to speak to you today.  As a professor based in Japan, I can offer the Asian perspective.  The most pressing issue concerning human rights is how to regulate the provision and gathering of personal data in targeted advertising.

We see lots of advertisements on the various Internet platforms when we use search engines or social networks, et cetera.  There are two main types of advertising: search advertising and programmatic display advertising.

First, search advertising is sold to businesses in connection with the particular search terms a user enters.  Second, programmatic display advertising is sold to businesses targeting users by gender, generation, interests, et cetera.  Therefore, the profiling of personal data is problematic mainly for programmatic display advertising.

Big Tech companies such as Google, Apple, Facebook, and Amazon, the so‑called GAFA, make the vast majority of their revenue through digital advertising at the expense of our privacy, especially in the attention‑grabbing market.  Digital platform companies such as Facebook and Google sometimes offer free services to users while they collect their personal data.

On the other hand, they offer advertising services to business customers, targeting customers or viewers by gender, generation, preferences, et cetera.  The structure of the digital platform markets consists of two groups: users and companies.  The platform service connects these two or multiple parties to each other.  These are called two‑sided or multi‑sided markets.  These targeted digital advertisements invade our privacy and human rights.  From the economic perspective, the sales of goods and commodities are enhanced by using targeted advertisements.  This will promote competition in the digital platform markets and will affect consumer welfare, since we will have more goods and commodities through the worldwide marketplace.  In a sense, the targeting will enhance consumer welfare, when the sales of goods increase through the targeted advertisements.
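
[Illustrative sketch: a minimal Python example of how a programmatic display system might match ad campaigns to a user profile and pick the highest bidder, as described above.  The profile fields, campaign names, and matching rules are hypothetical simplifications, not any platform's actual implementation.]

    from dataclasses import dataclass, field

    @dataclass
    class UserProfile:
        # Attributes inferred from a user's behavior (hypothetical examples)
        gender: str
        age_band: str                 # a "generation" bucket, e.g. "25-34"
        interests: set = field(default_factory=set)

    @dataclass
    class Campaign:
        name: str
        target_genders: set
        target_age_bands: set
        target_interests: set
        bid: float                    # price the advertiser pays per impression

    def eligible(campaign: Campaign, user: UserProfile) -> bool:
        # A campaign matches only if every targeting axis matches the profile.
        return (user.gender in campaign.target_genders
                and user.age_band in campaign.target_age_bands
                and bool(user.interests & campaign.target_interests))

    def select_ad(campaigns: list, user: UserProfile):
        # Toy "auction": serve the highest-bidding eligible campaign, if any.
        matches = [c for c in campaigns if eligible(c, user)]
        return max(matches, key=lambda c: c.bid) if matches else None

    user = UserProfile("f", "25-34", {"running", "travel"})
    campaigns = [Campaign("running-shoes", {"f", "m"}, {"25-34"}, {"running"}, 2.10),
                 Campaign("pension-plan", {"f"}, {"45-54"}, {"finance"}, 3.00)]
    print(select_ad(campaigns, user).name)   # -> running-shoes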

The United Nations Universal Declaration of Human Rights provides, in Article 12, that no one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation, and that everyone has the right to the protection of the law against such interference or attacks.

In addition, Article 24 of the draft Digital Services Act imposes a transparency obligation on online platforms.  The platforms need to be transparent so as to enable regulators and viewers to identify the sender of an advertisement.

Furthermore, the platforms are required to disclose the main parameters used to determine the recipient to whom the advertisement is displayed.
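
[Illustrative sketch: a minimal Python example of the kind of per‑advertisement disclosure record this transparency obligation points to, flagging the ad as an ad, naming the party on whose behalf it is displayed, and listing the main targeting parameters.  The field names and values are illustrative assumptions, not a format prescribed by the Act.]

    from dataclasses import dataclass
    from typing import Dict

    @dataclass
    class AdTransparencyRecord:
        # Per-ad disclosure in the spirit of draft DSA Article 24 (illustrative)
        is_advertisement: bool           # the ad must be identifiable as an ad
        advertiser: str                  # on whose behalf the ad is displayed
        main_parameters: Dict[str, str]  # main parameters used to pick the recipient

    record = AdTransparencyRecord(
        is_advertisement=True,
        advertiser="Example Shoes Ltd.",   # hypothetical advertiser
        main_parameters={"age_band": "25-34", "interest": "running"},
    )
    print(record)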

Amnesty International calls for stricter requirements on online advertising; the current rules are not enough to protect our privacy and human rights.  Our fundamental human rights, including privacy, need to be protected.  On the other hand, freedom and economic choice are also important.  We need to think about how to balance the protection of human rights against freedom and economic choice.

That's all I have to say.  Thanks so much.

>> SEBASTIAN SCHWEDA: Thank you, Kazuhiko.  And we pass the floor directly on to Greg.  Welcome, Greg.

>> GREG MROCZKOWSKI: Thank you.  Welcome, and I guess it's mostly good morning to everyone.  Very nice to be here.  Thank you for the invitation.  My name is indeed Greg Mroczkowski, and I'm Public Policy Director at IAB Europe; we consider ourselves the European trade association for the digital advertising and marketing ecosystem.

So in the abstract, we represent really anyone with an investment in online advertising, and the online advertising world, as many of you might know, is quite complex.  So there is a variety of different players, and we try to bring together all of them: advertisers, agencies, technology companies, as well as publishers whose digital properties are ad supported.  We work with companies in our direct membership, over 90 of them, but also with a host of trade bodies around Europe, in the EU and beyond.  There are 25 of these national bodies that are also part of our membership, and all of our network would represent probably over 5,000 businesses; as I said, quite a range of them.

We have been very closely following the discussions in the EU on the new Digital Services Act, on the DSA rather than the DMA, but obviously they do go hand in hand.  From our perspective, I can offer this very niche perspective of online advertising within the DSA, and that's an EU and still very specific angle.  But I guess one can extrapolate and make further observations broadly speaking.

I think what we have been seeing, just very top line, and then maybe we can explore some of these issues further: we have been seeing interesting conversations around how this new proposed regulation, the Digital Services Act, would articulate with other existing legal frameworks.

From our perspective, we have been very, very interested in how it will intersect with the EU privacy and data protection framework.  So that's probably one very high-level observation.  The other high-level observation concerns the very scope of what we are talking about.  We conventionally talk about "the digital platform"; we use that terminology, but there's probably much more to it.  From our perspective, a lot of our members would be, for instance, the so‑called publishers, and a lot can be categorized under this term of the digital platform.

In the EU, as I guess many of you know, the Digital Services Act bill is nicknamed the content moderation bill.  I think that is also something relevant to consider.  And I'm saying all of this going back a tiny bit to my first point, which was about the fact that it's relevant to consider the existing laws, what it is that the proposals or new frameworks are really scoped to achieve, and what it is that they properly cover.

Maybe also to speak to the human rights aspect: of course, we might not be experts on this, but I would say that it is important for any law making, any policy making in fact, to take into account a variety of these rights.  It should be balanced.  So I'm looking, for instance, at the EU Charter of Fundamental Rights, and I'm thinking of respect for private and family life, protection of personal data, also freedom of expression and information, which I think enshrines media freedom and pluralism, which is very important, and also the freedom to conduct a business and the right to property.  I think those are Articles 16 and 17 of that Charter.

This is a non‑exhaustive list.  All I'm trying to say is that one would probably want to take a balanced view of what kinds of rights should be respected, rather than look at one exclusively.

I would probably stop here, and I'm definitely happy to explore and provide more views on what we are seeing in the discussions, and also specifically on advertising.  But I'm thinking maybe it's best to stop here and then we can continue.

>> SEBASTIAN SCHWEDA: Thank you so much, Greg.  Yes, I believe quite a few challenges to human rights have already been named, and before we try to explore some more, let's stay for a moment with what I think is the number one issue that we can discuss here, because it revolves around the very business model that drives the digital platform industry, and I think we have some experts who can speak on that specifically.  It's the targeted advertising model that's based on detailed user profiles containing personal data about the users' interests, their preferences, their behavior, and I think you know Amnesty's position on that.  We believe, and you will notice I'm trying to bring in civil society's perspective here, that it has a huge effect on the right to privacy.  It's a very invasive technology, and when you balance it against the other fundamental and human rights that I think both of you have already mentioned, this balance shouldn't hang entirely on one end.  We should consider that a balance had been struck before, when we look at traditional newspaper publishing, et cetera.  There have been other advertising models that weren't based on targeted advertising, not so much on personalized behavioral advertising, but more on a fuzzier target group: the readers of that particular newspaper.

So maybe we can come back to this and try to figure out a little more what can be done to develop this further in a way that would actually respect the human right to the protection of privacy, and see what can be done by the business side to enhance protection without, obviously, derailing the whole sector that has been relying on this technology.

May I ask you, Greg, again to maybe step in here?

>> GREG MROCZKOWSKI: Yes, sure.  Thank you.  And I have to say, we very much sincerely appreciate the opportunity.  I think very often, again looking at the European perspective, we talk and sort of group ourselves and put ourselves in silos, and there's not enough dialogue.  So this is an opportunity for frank discussion.

I think I would first probably challenge the very notion of surveillance, which I think came up once or twice.  I think it unfortunately mischaracterizes the online advertising, or data-driven advertising, business model.

I would think that if we actually properly look at the definition of surveillance, it points to monitoring that is done by law enforcement, police, the army.  So that's a very different notion.

Advertisers don't really care about, quote/unquote, surveilling their users.  That's not their objective.  The advertising industry is invested in creating value for its customers, whether these are B2B customers in the broader supply chain or, eventually, their potential clients.  And beyond the broader advertising industry's customers, it's also relevant for the web as such to remain open, and for that, advertising revenues are critical.

I think if we look at traditional media and existing approaches, one would quickly discover that, in fact, this strive for addressability has always been there.  This is why, in the past, some advertisers would go to one magazine or the other.  This is why certain advertisers would go for specific slots on TV or on the radio, around specific programs, because they were looking to address their commercial messaging to a specific audience.

With the Internet, obviously, there are more opportunities out there, and indeed the industry is trying to leverage them to achieve that addressability in the right manner.  But that doesn't mean that the law does not apply and that this is what some people would call the wild, wild west.

In fact, from the European perspective, we would look at the existing data and privacy protection framework.  That would primarily consist of the GDPR and the ePrivacy instrument, the so‑called cookies directive or cookies law, with the GDPR covering data protection in the advertising context.

I heard profiling being named here.  I think that law comprehensively covers a host of user rights, in terminology that actually points to advertising, just to name a few things.  So online identifiers such as cookies and device identifiers are explicitly given as examples of personal data under the GDPR; that's Article 4(1).

In addition, the GDPR provides rules on profiling and provides enhanced rights to users when profiling takes place; that's, for instance, Article 4(4).  It also addresses behavioral advertising, where the user's behavior is tracked online, and online advertising is explicitly mentioned in the regulation, under Recital 58.  What we are seeing is that Data Protection Authorities in the EU have been issuing a host of updated guidance on, for instance, cookies and other tracking technologies.  So this is an object of existing law enforcement, which we are very much supportive of.  And we have seen the Digital Services Act as an opportunity to provide an additional set of safeguards, an additional layer if you like, vis‑a‑vis the user.  I think my fellow panelist mentioned Article 24, or maybe that was you, Sebastian; there are transparency provisions there, and we would definitely be supportive of them.

Maybe I'll stop here and, yes, give the floor to my fellow panelists as well.

>> SEBASTIAN SCHWEDA: Thank you, Greg.  Yeah, I will just hand over to you, Kazuhiko.  Maybe you can bring in the perspective of platform regulation in Japan.  If there's anything that you would like to point to, welcome.

>> KAZUHIKO FUCHIKAWA: Thank you so much.  From Japan's perspective, we also have laws like the DSA or DMA, promoting transparency and fair access, or something like that.  And we are struggling with regulating social networking services.  In my understanding, the DSA regulates SNS, so I think the DSA is well established in that sense, and its enforcement is stricter.  So the content of the rules is quite similar, but enforcement in the EU is stricter in a sense, whereas in Japan enforcement is more flexible: the Japanese government, the ministry, issues governmental orders against Big Tech companies.  I think that's the difference, but we share the same kind of approach.  Okay.  That's all I will say.

>> SEBASTIAN SCHWEDA: Thank you very much.  Maybe a second question, if you can briefly answer it: from what you see in the Digital Services Act and the similar laws in Japan, would you say the right balance has already been achieved in these proposals or laws, or do we have to re‑engineer that balance so that it protects human rights in a different or more intensive way?

>> KAZUHIKO FUCHIKAWA: Thank you so much.  Yes, we need to research the negative human rights impact.  We need to require viewers to opt in before advertising is targeted at them.  I think Amnesty International calls for opt‑in‑based advertising, instead of having to opt out, and I completely agree with that.  In Japan, that kind of opt‑in or opt‑out requirement is not expressed in the law.  And I don't know about the EU, whether there's an opt‑in or an opt‑out, whether the Internet platform companies have an obligation to offer the right to opt in.

And the Digital Services Act imposes obligations on very large online platforms; we also have a similar kind of regulation, and companies similar to VLOPs, such as Google, or Apple with its App Store, are registered under it.  So maybe we have similar regulation on this.

And in Japan, the law promoting transparency is a report‑based regulation.  If Big Tech companies are registered, in a way similar to VLOPs, they need to submit a report annually.  So maybe we have the same kind of regulations, but there are some differences, I think.  Okay, I will stop my comments there.

>> SEBASTIAN SCHWEDA: Thank you so much.  I think quite a few interesting points have already been mentioned, and maybe it's now time, since we have to keep an eye on the clock, to bring in the audience and their comments and questions.  I see a question already here in the chat window, but maybe I will hand over to Kristian to see what is happening on tweedback and elsewhere.  Can you give us a brief summary and pose those questions to the panelists?  Thank you.

>> KRISTIAN BURGHARTZ: There are not so many contributions so far.  I would like to encourage everyone to share their views and perspectives.  We have one question in the chat; you may want to elaborate a little on it yourself.  It concerns what governments can do about regulation, and it points to the near future: will we see more privacy regulatory sandboxes, where users and governments work together to build regulations with the right balance between privacy and the enablement of the digital economy?  I find it very interesting, and it probably also points to the question of procedure: how to achieve such balanced regulation.  I would pose that to the panelists.  If you want to elaborate a little more on that, please feel free to do so.  It would be interesting to hear what you think could be done better to achieve balanced regulation, compared to the DSA in Europe and other regulations.

>> SEBASTIAN SCHWEDA: Thank you, Kristian.  Regulatory sandboxes, I think it's very interesting that this is brought up.  There has been a lot of discussion about that.  Who wants to step in and maybe give their view on how regulatory sandboxes should be used, or can be used, in this specific area?

>> PARTICIPANT: I'm not sure my sound is clear to the audience.

>> SEBASTIAN SCHWEDA: Yes.

>> PARTICIPANT: Allow me to elaborate.  We have the ability to make regulations, but sometimes we don't understand the situation very well.  So we started applying sandboxes a lot, in FinTech, and even during the pandemic with the delivery apps we put a sandbox on that, and even in privacy.  When we talk to global companies about it, we can see there's a need for collaboration; we can see that there's a need to build common ground for collaboration.  And with the recent publication by the department in the United Nations on the regulatory sandbox, we find this is very helpful for regulators as well as for businesses.

It has helped us to understand the concept and the concern, whether it is real and whether it has an impact, and it has helped the companies to understand the regulation, and we build it together in a way that does not stifle innovation.  We can have a full set of regulation, and we see that collaboration is important in these things.  This is the concept of the sandbox.  And I wonder if that's coming in the future; this is why I put the question to the panelists, whether there's anything more that governments can do in this area.

>> SEBASTIAN SCHWEDA: Thank you.  Is there anyone who wants to take this question?

>> GREG MROCZKOWSKI: I would be happy to share my perspective.  Obviously, you know, before any legal proposal is put forward in the EU, there's a process where an impact assessment is made, there's an opportunity for consultation, and so on and so forth; I guess it's very similar in other jurisdictions.  For us as the industry, that's a critical moment to engage with the policymaker.  That said, I think there are obviously limitations to what this impact assessment can achieve, even though, you know, it consists of a whole gamut of different activities, including the said consultation, but then also independently provided research, and so on and so forth.

If I understand the question well, you seem to be talking more about actual testing, stress testing some of the ideas, how they could work.  Very generally, it sounds very appealing.  What we are also seeing, trying to work with the Data Protection Authorities, who are to enforce the privacy and data protection laws in the EU, is that it's not such an easy task for an enforcer to understand specific business models.  Again, there will be some limitations to what they can achieve.  And as important as advertising may be, the Data Protection Authorities have a whole host of other business models to examine and explore for potential privacy risk, so it's not as if a Data Protection Authority can devote their time 100% to us.  So I think, in the abstract, trying to reverse this whole thinking and stress test some ideas sounds very interesting, and it helps to avoid solutions being put in writing without understanding the business impact.  It is at times a great struggle from our perspective to try to understand that potential business impact, just because of questions of interpretation.  And I think it's the same for the other side, for the policymaker; it works both ways.

While I'm not providing concrete ideas, the very suggestion is very appealing.  As such, in the abstract anyway, I would support that.

>> SEBASTIAN SCHWEDA: Thank you, Greg.  So regulatory sandboxes as a welcome concept that might be tried, to avoid unintended impact and to test things before they are really put on the market.  Kazuhiko, do you want to add anything to that?

>> KAZUHIKO FUCHIKAWA: Yes, it's very interesting to learn about the regulatory sandbox.  I think some regulators, in the EU and Japan, and the UK as well, have produced reports about algorithms.  I think this experience will be helpful for regulators; it's a kind of, how can I say, sort of testing for other countries and regulators.  So, yeah, that's my comment about that.

And, in addition, the international community needs to share more information about regulating the Big Tech companies, through, for example, bilateral agreements, multilateral agreements, or an international institution.  An international institution would be preferable, but it takes time, so I think bilateral or multilateral agreements are realistic, in my opinion.  Okay.  Thanks so much.

>> SEBASTIAN SCHWEDA: Thank you so much, Kazuhiko.  So the way forward could be to lift this to the international level and maybe get to drafting bilateral and multilateral agreements.  That's a very interesting point as well.

If I go through the chat comments, I see that there is a discussion going on about whether it's realistic that targeted advertising could simply be abandoned, because that might mean that services users don't have to pay for right now would become payable.  And, on the other hand, there's a question from the audience about best practice on getting consent and raising consumer awareness of their data being collected: is the European model working?  We know, and I think I mentioned it a bit, that it's moving more and more toward a consent model.  Is that something that could lead to a peaceful settlement of this issue: users are informed and give their consent freely, or have the choice to use a paid service instead if they don't want to give their data?  That could be an alternative.  So maybe I'll ask you first, Greg: is that something that would be welcomed by the advertising industry as well, or would you say that's not a workable option?

>> GREG MROCZKOWSKI: Thank you.  I think we are actually quite invested in the consent paradigm.  I think there are quite some conversations, and maybe a misunderstanding, as to what we understand as opt in or opt out.

From where we sit, we see that for accessing the device, under the cookies directive, the ePrivacy directive, the only applicable legal basis in the current law is consent.  Whereas under the GDPR, we have six legal bases available.  What is happening in the market?  An advertising business model would mostly rely, under the GDPR, on consent or legitimate interest as the legal basis.

But because of that ePrivacy requirement, where consent is the sole legal basis available, it's very much a consent‑driven approach altogether.

I think it's also important to understand that even for the other legal bases under the GDPR, the bar is quite high, or definitely higher than with the previous Data Protection Directive, because one needs to provide much more transparency about what it is that the data controller will do with the data, and that's prior to any processing of personal data taking place.
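
[Illustrative sketch: a rough Python example of the layered check described here, assuming a simplified model in which consent is required under the ePrivacy/cookies rules to store or read identifiers on a device, and a separate GDPR legal basis plus prior transparency is needed for the processing itself.  This is illustrative logic only, not legal advice or any vendor's implementation.]

    from enum import Enum
    from typing import Optional

    class GdprBasis(Enum):
        # Two of the six legal bases in GDPR Article 6(1); per the discussion
        # above, these are the ones most relied on for advertising.
        CONSENT = "consent"
        LEGITIMATE_INTEREST = "legitimate_interest"

    def may_process_for_ads(device_access_consent: bool,
                            gdpr_basis: Optional[GdprBasis],
                            transparency_given: bool) -> bool:
        # Layer 1 (ePrivacy): consent is the sole basis for device access.
        if not device_access_consent:
            return False
        # Layer 2 (GDPR): some Article 6 legal basis must apply...
        if gdpr_basis is None:
            return False
        # ...and the required information must be given before processing starts.
        return transparency_given

    print(may_process_for_ads(True, GdprBasis.LEGITIMATE_INTEREST, True))  # True
    print(may_process_for_ads(False, GdprBasis.CONSENT, True))             # False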

There is also this tension, or, again, some discussion about what actually constitutes consent, freely given consent, at least under European law, and this is something that the Data Protection Authorities are grappling with.  We have seen a host of data protection guidance in different Member States, and we have also seen some recommendations.  I think there has already been some case law that is making these things a bit more obvious, but I think there's a way to go until we have it, you know, crystal clear, what it is that we mean.

Just to give you some perspective, looking at the traditional news media publishers: 81% of their digital revenue comes from advertising, and a lot of this will be because of the attractiveness of targeted advertising; a lot of these revenues will be from targeted advertising.  And then looking at it from a different perspective, from a user's perspective: indeed, users don't necessarily rush to pay.  We have been asking people this, and 69% of European users say they would never pay for news content online, even if no free content were available.

We asked recently: what is it that you prefer?  Do you prefer the existing business model of the Internet, with the full understanding that the majority of it really is targeted advertising, or paid‑only?  And 75% said they prefer the former.  And there is, of course, this looming question that restricting access and providing only paid alternatives would have major consequences for society at large.  And this is the last thing I would say: we are seeing that publishers are obviously trying to test different business models, including subscriptions, and advertising is not going to be the sole business model of their digital properties.  They need to test different things to create something that is more appropriate for them.  I will stop here; I probably talk too much anyway.

>> SEBASTIAN SCHWEDA: Thank you so much.  It's very interesting that the advertising industry is also open to exploring new models.  You said it's 75% who would prefer to stay with the current model.  That is the majority, according to the survey, but there's still a strong minority that would be worth exploring, to see how they can be offered more alternatives to the targeted advertising model.

But interesting.  Thank you very much.

We have five minutes left, so I think we'll have to draw this to a close.  Before I summarize the discussion, I would ask you each to come up with one action point that, in your view, should be addressed most urgently to make sure that human rights are respected when regulating online platforms.  That way, our session will contribute to advancing the debate at a global level.  Is it the GDPR as a role model that should be taken to the global level, or any particular thing that should be explored further?  We heard regulatory sandboxes.  Anything that you feel should be done; that way, we will meet the IGF's aim of closing the session with a call to action.

So I would invite you to give your final opinion: one action point that you think should be tackled.

Maybe we can start with you, Kazuhiko.

>> KAZUHIKO FUCHIKAWA: Thank you.  I will make a brief closing comment for our panel.  I think the GDPR is very well established.  In Japan, how can I say, data protection and data portability are not protected enough, so we receive tremendous amounts of advertising and we have to opt out.  The European countries are good at protecting data portability.  And I would like to say that data portability is helpful to prevent the concentration of data in the Big Tech companies; Big Tech companies like GAFA collect and gather personal data and make big data.  The GDPR also somehow protects against the acquiring of data from competitors or start‑ups, so we should care about the collecting of data as well.  So I think the GDPR is well established, and we need to care about data gathering.  That will be helpful for our privacy and human rights as well.

Yeah.  Thanks so much.
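
[Illustrative sketch: GDPR Article 20 entitles users to receive their data in a "structured, commonly used and machine‑readable format".  As a minimal Python example of what a portability export could look like, with hypothetical field names, a platform might serialize a user's data to JSON so it can be carried to a competing service.]

    import json

    def export_user_data(profile: dict) -> str:
        # JSON is one structured, commonly used, machine-readable format.
        return json.dumps(profile, indent=2, ensure_ascii=False)

    print(export_user_data({
        "user_id": "u-123",                           # hypothetical record
        "declared_interests": ["running", "travel"],
        "uploaded_posts": ["post-1", "post-2"],
    }))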

>> SEBASTIAN SCHWEDA: Thank you, Kazuhiko.  And onward to Greg.

>> GREG MROCZKOWSKI: Thank you.  And thank you for the discussion.  There were some very interesting insights, and it would actually be useful to continue the conversation.  I do agree with Kazuhiko, and this relates to one of the questions I saw in the chat: the GDPR is quite well established, and we are actually seeing, if you like, a Brussels effect.  Just to name a few: the California privacy bill, Brazil, arguably China, building on the GDPR, and quite a few other places, I believe South Africa and Singapore, and there are probably more.  I think that general alignment of privacy laws is food for thought.  In terms of the call to action, I would mention governments ensuring that regulators have the right resources to actually properly enforce the law.

This is me trying to come back, in one way or another, to the principle that the DSA, for instance, should really respect whatever privacy and data protection legal framework is already in place, in the EU anyway.  And there has been a lot of discussion about the enforcement of the law and the capabilities, even, you know, the human resources, of the authorities on the ground.  So: just making sure that the eventual regulators have everything they need to properly enforce the existing law and future legal instruments, including the DSA.

>> SEBASTIAN SCHWEDA: Thank you, Greg.  And with that, it has to be said that our time is over.  I would like to say thank you to all the panelists and to the attentive audience for all of your valuable comments and questions.  This was a very dynamic and forward‑looking discussion, and I'm very confident it can be a fantastic foundation to build on for anyone who is able and willing to take things forward, as I'm sure our panelists will.

And with that, I thank you all very much.  Thank you.