The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
>> IGNACY SWIECICKI: Thank you and welcome, everyone, to our session called Do We Know How Much Our Data Is Worth?
My name is Ignacy Swiecicki, I'm Head of Digital Economy at the Polish Economic Institute. And we'll start the session with my colleague Krystian presenting the results of our research. And after that we will have a panel of distinguished guests.
Professor Avinash Collis from the University of Texas at Austin. Mr. Tomas Lamanauskas, candidate for Deputy Secretary-General of the ITU and managing partner of Envision Associates. Mr. Marcin Petrykowski, CEO of Atende, who will join us in a minute. And Ms. Katarzyna Szymielewicz, the President of the Panoptykon Foundation, one of the most important Polish NGOs in the field of online privacy, who will also join us in a minute.
Now I give the floor to Krystian to present the results of our research.
>> KRYSTIAN LUKASIK: Thank you. My name is Krystian Lukasik, and today I will present to you the results of research we conducted at the Polish Economic Institute. We wanted to answer the question of what is the value of the data that Polish internet users generate. And we answered this question from two angles.
Firstly, we wanted to know the value of the data that Polish internet users generate for digital platforms such as Google and Facebook. We chose these platforms because they are a good representation of so-called free digital services.
And secondly, we wanted to know how much value users themselves ascribe to their data and privacy. We estimated that using a discrete choice experiment, which I will explain in a minute. And in addition, we ran a survey to determine the views of Polish internet users on privacy, targeted advertising, digital platforms, the platform economy and regulations.
So I will start with the survey, because the results were quite surprising for us. First of all, it turned out that people, especially internet users, have quite a lot of knowledge about the platform economy. 77% of our respondents said that for free digital services they pay with their data. So we would say they are quite aware. Probably Netflix documentaries such as "The Social Dilemma" did their job.
And what was also surprising is that a majority of people, 87%, are afraid or concerned about what companies are doing with their data. So 87% said that big tech knows too much about us. 84% said that technological companies should be regulated more strictly.
But, on the other hand, people are hesitant to pay for better privacy protection. 69% said that no internet services should be paid for, they all should be free of charge. And almost half said that they don't want to pay anything for better privacy protection.
Okay. Here we can see the experiment that we ran. It was a discrete choice experiment, chosen because it helps to deal with some deficiencies of stated-preference surveys. Basically, a group of 944 people was presented with choice tasks like this. Each person saw 18 choice tasks: nine for Facebook and nine for Google.
So here, for example, they could choose between two alternative options. For example, they could choose a Facebook that gathers their data, but only from the platform and not from other websites. Or they could choose a Facebook that does not create their behavioral profile and that shows advertisements, but random ones.
And respondents always had a status quo option as well, which is basically today's Facebook: it gathers everything, creates your profile and shows targeted advertisements, but it costs no money.
And later we ran our econometric model to estimate these values, and the results were also quite surprising, in the sense that Polish internet users are willing to pay approximately $4.90 for Facebook and $3.46 for Google for full privacy. As you can see on the right, this is a similar amount to what they are already paying for some digital services such as Spotify Premium, access to the biggest Polish newspaper, or Netflix, for instance.
And for partial privacy they are, obviously, willing to pay less. Partial privacy means that the platform gathers your data, but only the data that you put in yourself. For no ads, they are willing to pay approximately one dollar monthly on both platforms. And for no profiling, 88 cents for Facebook and 47 cents for Google.
What is really interesting here is that no targeted ads has a negative value, which we can interpret as follows: if we took targeted advertisements away from people and showed them random ads instead, they would expect some sort of compensation. So basically people like targeted advertisements more than regular advertisements.
And what is really important here is the bottom line, where we can see that the status quo also has a negative value, which means that people are not happy with the current situation, in which they pay for their digital services with their data.
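As an aside for readers unfamiliar with the method: in a discrete choice experiment, willingness-to-pay for an attribute is typically recovered as the ratio of the attribute's utility coefficient to the price coefficient. The sketch below illustrates only the mechanics; the coefficient values are invented for illustration and are not the study's actual estimates.

```python
# Hypothetical illustration of how discrete-choice willingness-to-pay (WTP)
# is typically derived: WTP = -(attribute coefficient) / (price coefficient).
# The coefficients below are made up; they are NOT the estimates from the
# Polish Economic Institute study.

def willingness_to_pay(beta_attribute: float, beta_price: float) -> float:
    """Marginal WTP implied by a conditional-logit utility model
    U = beta_price * price + beta_attribute * attribute + ..."""
    return -beta_attribute / beta_price

# Price enters utility negatively; "full privacy" enters positively.
beta_price = -0.50         # disutility per dollar of monthly fee (assumed)
beta_full_privacy = 2.45   # utility of a no-data-collection version (assumed)

wtp = willingness_to_pay(beta_full_privacy, beta_price)
print(f"Implied WTP for full privacy: ${wtp:.2f}/month")  # → $4.90/month
```

An attribute with a negative coefficient, such as replacing targeted ads with random ones in the study, yields a negative WTP by the same formula, which is exactly the "people would expect compensation" interpretation above.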
Here we have the other side of the equation: the value of the data that users generate for Google and Facebook. Basically, we calculated that by taking the global revenues of those companies and multiplying them by the share of revenue that advertising generates, and then we adjusted that for the number of users and for GDP per capita, in order to account for the different value that data has in different geographical locations.
So the average monthly revenue per Polish user for Google is $2.54, which for all users in 2020 amounted to $1 billion, substantially more than the company reported in the financial statements filed with the Polish regulatory office. And for Facebook, the average Polish internet user brings in $2.19 monthly, which in total in 2020 was $0.6 billion.
If I remember correctly, it is three times more than this company stated in their financial statement. But we will come back to this later.
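The adjustment described above can be sketched in a few lines: global ad revenue is converted to a global per-user monthly figure and then scaled by relative GDP per capita. All input numbers below are rough placeholders for illustration, not the study's inputs.

```python
# Rough sketch of the average-revenue-per-user (ARPU) adjustment described
# above. All figures are placeholders, not the study's actual inputs.

def local_monthly_arpu(global_annual_revenue: float,
                       ad_revenue_share: float,
                       global_users: float,
                       local_gdp_per_capita: float,
                       global_avg_gdp_per_capita: float) -> float:
    """Estimate monthly ad revenue per user in one country by scaling
    global ad ARPU by relative GDP per capita."""
    global_ad_revenue = global_annual_revenue * ad_revenue_share
    global_monthly_arpu = global_ad_revenue / global_users / 12
    return global_monthly_arpu * (local_gdp_per_capita / global_avg_gdp_per_capita)

# Hypothetical inputs (assumed, for illustration only):
arpu_pl = local_monthly_arpu(
    global_annual_revenue=180e9,    # company's global revenue, USD
    ad_revenue_share=0.8,           # share of revenue from advertising
    global_users=3e9,               # worldwide users
    local_gdp_per_capita=17e3,      # Poland, USD (approx.)
    global_avg_gdp_per_capita=11e3, # world average, USD (approx.)
)
print(f"Estimated Polish monthly ARPU: ${arpu_pl:.2f}")
```

Multiplying the per-user monthly figure by the number of local users and by 12 then gives the annual country total that the presenter compares against the locally declared revenues.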
And on the right you have a chart that shows our calculation of the annual average revenue per user globally for Google and Facebook. As you can see, the amount is rising, which can be explained by a few factors. People put more data on the internet now. The tools used to analyze and monetize data have also improved. And, obviously, the digital advertising market is growing as well.
Here we have a fun chart; I will try to explain it. We can see, quarter by quarter starting from 2010, the top 10 companies by market capitalization, grouped by color. Red are digital platforms. Blue is other tech. Green is finance, and yellowish is manufacturing and FMCG. And gray is mining, oil, and energy.
We can also see how the platform business model is booming. Right now five such companies are in the top 10, while 10 years ago there were none.
So we can see gray dominating on the left in 2010, and then starting approximately 2014 platform business model is taking it over.
So to circle back, we can compare the four numbers that we got. We have the $2.19 that the average Polish Facebook user generates for the company, and the $4.90 that the average Polish Facebook user is willing to pay if Facebook gathered no data from them. The same goes for Google: the average Polish user generates $2.54 monthly for Google, while a Google user would pay $3.46 if Google gathered no data from them.
What does all of this information mean for us? Firstly, we can see that the real revenue of Google and Facebook in Poland might be higher than the officially declared one. Secondly, people are quite aware of the role of data in the platform economy, and they are also scared about it.
And thirdly, they are unhappy with the status quo, which brings us to the fourth point, connected to this slide: a different business model is perhaps possible, in which people would pay for their services less than they say they are willing to pay on average right now, but still more than those companies currently earn from their data. So maybe some option other than paying with your data is possible.
But we also have to remember that our survey showed that, if you ask them directly, people generally don't want to pay for any services and don't want to pay for privacy, and a majority of people say they don't trust that a paid version of Facebook or Google would protect their privacy better.
So maybe we should think about a more radical change to the platform model than a subscription model. And maybe that I will leave to our panelists. So thank you for today. Ignacy, the floor is yours.
>> IGNACY SWIECICKI: Thank you, Krystian, for this excellent presentation.
And now, without further ado, I will turn to our speakers, starting with Professor Collis, if he's available online. Yes, I think I can see you online.
So I would like to ask you for your comments on our approach, and maybe something a little more general: is it at all possible to measure the value of private data? And what is the economic interpretation of such measures?
>> AVINASH COLLIS: Yeah, thanks for inviting me to participate in the panel. It's a very interesting piece of research which you have done.
I had a few comments on how to interpret these results. The first comment I have is that when you run such online experiments, for example asking people if they would prefer a version of Facebook which doesn't use their data to target ads, these are very hard questions to imagine as a respondent when you answer a survey.
So I think ideally what we want to do is some kind of online experiment where the respondents actually experience the platform, let's say, without it using their data: what kind of ads do they see, experienced ideally over a longer period of time. And once they understand what they are signing up for, then they are in a better position to respond with their valuations.
Unfortunately, as you all know, right, like running such experiments is challenging without collaborating directly with Facebook and with Google and with the platforms. And as academics like we have struggled a lot to convince these tech companies to collaborate with us for experiments. That is the first point.
The second point I wanted to mention, as you might know from the research literature, is the concept called the privacy paradox. Basically, if you ask people if they value privacy, most of them reply that they really value privacy a lot. But when you actually ask them to take real actions, like put real dollars on the table or put some effort into increasing their privacy, then not many people actually follow up with that.
So there is some difference with stated preferences. Even when you give people a choice experiment and ask them to make a choice between two options, if there is no real money on the table, people are more likely to choose the option which offers them more privacy.
But when they actually have to take an action, when they actually have to use their own money, for example, and spend it on a version of Facebook which doesn't use their data, then you won't really see the valuations be that high. In fact, the difference can be quite substantial.
So I think it is important to keep some of these points in mind when we interpret these results. Yeah, those were my comments on the study.
>> IGNACY SWIECICKI: Okay. Thank you very much.
Tomas, moving on to you, your experience as a regulator, and your time at the ITU. Can you elaborate a little on the significance of such results for national bodies and for public regulators?
>> TOMAS LAMANAUSKAS: Thank you very much for inviting me to the panel. The most important insight here, as you said, is that people are educated: they know that their data is used and that it is a form of payment.
But, just picking up on the last point, your study also confirms that a lot of people say, yes, you have too much of our data, we would be prepared to pay. But then when you actually ask them whether the services should be free or paid for, a majority still say no, they should be free. So I think there is a lot of sentiment versus real action here as well, which is interesting to explore.
Another thing, and I will come to the regulatory experience in a moment, is that data is only valuable when it is used. If I keep my data to myself and it is not used to give me a better service, and not used to give others access to me to offer a better service, there is probably not much use for that data.
So again, it is important, and I'm not saying that you do this, not to oversimplify by assuming that you can just move that value easily from one place to another and it will remain the same. Because the products will be different. That relates to the earlier suggestion about testing differently. Let's say Google had to provide results relevant to you without collecting data from you; you would probably see very different results in Google.
The same with other platforms that provide a shopping experience. I have to admit that in recent years I have bought something that was advertised to me on Instagram or elsewhere, and it was actually pretty relevant. On the one hand, I don't want them to have too much data when it is not relevant to me. But when it is relevant, I'm really happy to get that advertisement. So there is also this balance.
That brings me to the experience of telecoms. What we learned from telecoms and the monopoly experience is that you cannot regulate the outcomes. When you ask what the different model will be, you cannot predict it. We had the famous case where Microsoft had to unbundle Internet Explorer. Do you remember which competing browser was there? Who in this audience remembers? And the question was who would benefit. Well, last time I checked, the last version of that browser was released in 2008. And today we are using Google Chrome. Do we pay for it? No. Is Google Chrome so different in terms of business model? No. It's still tied to some of the other services.
So you definitely cannot regulate a specific outcome, paid or not paid, data or no data, into existence just by setting the direction. And here, probably jumping a bit further, we are starting to see a lot of discussions everywhere in the world around data portability, so that you can actually move your data to others.
The question is about the ability to aggregate your data, especially at the sector level: to take data from different banks and move it to one platform. That would actually increase the value of the data, but then you also need someone who will use it.
And there are questions about how much power can be combined in one platform, or in different platforms. You saw the landmark decision, I think just a few days ago, in the UK, actually mandating, or trying to mandate, the unwinding of a merger that had already happened.
So I think that is where we'll explore what can be done. My conclusion is: yes, data is valuable, and it is good that people understand that data is valuable. But the final outcome, what this means for the final design of the product, is not that clear yet. Thank you.
>> IGNACY SWIECICKI: Thank you very much. Marcin, I would like to turn to you and add some business perspective to our discussion.
What is the importance and meaning of measuring the value of private data for business?
>> MARCIN PETRYKOWSKI: Thank you. Great research, thanks for sharing. Maybe first a few comments on the research itself.
I think, A, you guys are a bit underestimating the potential of this business. It is a far greater business than what the research shows, and it is relatively easy to prove just based on the results of the big social media platforms.
If you look at Google itself, its ad arm brought in over $33 billion in revenue in the second quarter of this year. 33 billion. If you look at Facebook, with an average monthly user base of 2.4 billion worldwide, Facebook produces $7 of revenue from each and every user. So every user generates $7 of revenue for Facebook on a monthly basis. Which, in your case, was two point something.
So, well, that should probably be adjusted for GDP. But this is an average figure per user, an averaged-out number across the globe. It is definitely higher. So I think the industry is far greater than what the study shows. That's point number A.
The second point is the old saying: if you are not paying for the product, then you probably are the product. I actually stick very much to one point mentioned in your recap, the different business model. And I'm in a bit of a different camp.
I think we are currently at a point in the evolution of economies, capitalism and the internet where we are still very much tempted by, and still very much benefiting from, the model of free usage. And I think it will change: the upcoming model will be one where companies and platforms actually pay users for the data they give them.
What I mean by that: I can actually envision it thanks to blockchain, that is, decentralized means of sharing data and information, combined with micropayments. I can envisage a world where, if I'm using Facebook, every time I share data with them they pay me via a microtransaction for that data.
And then the competition starts to be about who is able to provide them with better-quality data. It's a fact, and we all know, that even today there is differentiation in the quality of data.
Actually, interestingly, data on males, on men, is priced higher than data on females. Why? Because there are apparently more females in the world, so men's data is scarcer.
B, the best category, the most valuable category of data, is 18-24 years of age. So effectively I could envisage a world where platforms twist the business model.
I don't think the model of using the internet for free will prevail. I think the model will convert into one where, for me as a consumer, it is still free, but I get paid because of my contribution to the system. And today's technology allows that to happen. Back in the day it was not possible; we didn't have microtransactions, so it would have been very difficult.
I think we are now at the point where, together with the regulatory push, the more we push both advertisers and platforms to consider that model, the more I think there is a solid business case.
Why? Because if you look at the P&L of these platforms, it is very illogical. Why? Because they take the full revenue, but the user is not on their cost side.
If I provide them with data, that is not a cost to them. It's not on the loss side of their P&L; it is part of their profit, part of the revenue.
So effectively it is like manufacturing: if I need to produce a product, I need to purchase the raw materials that allow me to deliver it. Data is the raw material. It is the next currency of the world. So I think the model will move towards a set of effective payments given to users for granting access to their information.
>> IGNACY SWIECICKI: Thank you for these very interesting comments. I just want to add that in our research we did adjust the average revenue per user. We adjusted it for the number of Facebook or Google users in Poland and also for GDP per capita.
So I think that the global average is very different from the local average in Europe, Asia or the United States, and those differences really matter. But I already see a lot of different perspectives on the business models that could emerge and to which this research points.
But before getting back to that, I would like to turn to Katarzyna Szymielewicz and ask for your comments on the value of private data: what is the importance of measuring it and of making users aware of the value they generate for digital platforms?
>> KATARZYNA SZYMIELEWICZ: Hello, I joined the panel as a commenter, so it was new for me. Congratulations on the scientific approach to the problem.
What is the value of measuring it? I'm usually more concerned with measuring harm. That is what we as civil society advocates are certainly more concerned with: finding evidence and measurements to prove that the current business model run by dominant platforms is harmful and therefore we should not allow it. Rather than twisting, as you said, the business model and asking what the value would be, what kind of payout would justify consumers accepting it even longer.
So if research like this were to seriously open the discussion about twisting the business model towards payouts, I would not be against the research, because I'm never against research. But I would definitely be against such a value proposition for users.
I strongly believe, and there is also evidence to back this up, that that would simply put a seal on our digital slavery and destroy the rights-based framework altogether. The moment we as citizens accept payouts for our data, large or small, it doesn't really matter, we actually lose any standing to defend ourselves against abuse. And the abuse will continue to happen, because there is absolutely no technical or legal possibility for us to negotiate how our data is used the moment we agree to have it monetized.
This is what we have been observing for the last two decades. No matter what the consent framework is, no matter what you have clicked on, no matter how much you are getting paid for it: the moment it is your behavioral data, that is, to be more specific, the data that comes from observation of your behavior, not the data that you declare, consent is irrelevant, almost useless. And I believe that if we dig deeper into the value that you are presenting, the value is made by behavioral observation, not by the fact that somebody declared "I'm interested in cats" or "I like to buy and use smartphones." So what we are really talking about here is behavioral data.
And if we allow behavioral observation to continue happening because somebody consented or because somebody got a payout, we are actually done with the rights-based framework and we are in serious trouble. So I would definitely caution against that.
But research like this is very interesting, at least with regard to proving the point we have been making for a long time: this data obviously creates economic value, and we should not be fooled by arguments coming from big data companies that they are still too poor to be regulated.
I think it is high time to ask for a tax on the data power that has been created there. Numbers like these should give regulators even more confidence to say, okay, it is taxation time, just as we have already taxed banks on their power and imposed quite serious regulations on them. Clearly the coming year is the time when we have to do the same with big tech. If the EU fails this time, I'm not sure when the next opportunity will come. So maybe data like this could be used to back up stricter regulation of that value.
>> IGNACY SWIECICKI: Let me ask one more thing because you said you are against the model where we are being paid for the data.
But do you see the chance of, or would you accept, a transformation towards a model where we subscribe to services? Right now we have some services which are paid for by subscription, and some which are paid for with our data, like the Google and Facebook that we were researching.
Do you see the possibility to transition to a subscription based social network and subscription-based package of other internet services?
>> KATARZYNA SZYMIELEWICZ: Certainly. And the numbers prove that people are more and more willing to pay for those services with money. Awareness of the value of the data they generate also increases their willingness to use real money instead of data. So yes, that model would certainly clean up the situation.
One of the sources of the crisis of trust we observe when it comes to big tech and how it functions is the triangle-based business model, where the client is the advertiser. And as somebody already said, the user is not even the product. I usually say we are digital biomass: just information processed into marketing profiles and simply exploited.
So we are exploited users, or addicts, yes, "The Social Dilemma" movie coming back into the discussion once again. We are either exploited users or addicts. We are not clients. We have to fix that part: how to finally become clients. Well, probably through payments. I certainly support that. I also think we are wealthy enough, at least in Europe, to do that, and it would be a much cleaner business model.
But in order for us to make that move and start paying with money, we need to be offered a service, which we are not, right? Today we are being offered a manipulation machine, an attention exploitation machine. We are being exploited. Why should I pay for being exploited? It's ridiculous.
So somebody has to start by offering a real service. Therefore, in the whole discussion around platforms, we usually say that authentic personalization is the start of that discussion. Yes, if somebody offers me authentic personalization: I'm offered access to my data, I don't even have to say no, I'm being asked whether I want to say yes to being profiled. And then I can also choose how I'm being profiled. Now we are entering the space of real personalization.
Be that for advertising, or for content personalization like a newsfeed. And then I can pay; Netflix has just proven that business model beyond any doubt.
So yes, I do believe in it, but it has to be reshaped from the beginning so that we are back in the client's seat.
>> IGNACY SWIECICKI: Thank you for this passionate intervention.
Professor Collis, I would like to get back to you. I know that you have been researching this topic for years and from many angles with much more detailed methodology.
Do you see in your research or in your opinion the chances to change the business model that is currently most prevalent?
>> AVINASH COLLIS: Yeah, so lots of interesting points raised by the other panelists.
To add to some of these points: one of the models we are discussing now is people paying for these services. My own research, and also your own survey, shows that when people actually have to put dollars on the table, only a very small percentage of people choose to do that.
Let's say you offered a version of Facebook or Google without ads as a subscription service. My research shows that, at least in the U.S. context, less than 1% of the population will actually pay even a small amount, like a dollar.
So, at least in this online platform space, I'm not sure that business model would work in the first place.
Another point: I personally think it is really amazing that we get access to these services for free. If you take something like a search engine, my own research shows that it creates, at least in the U.S., several thousands of dollars of value for the average consumer, and you get that for free. No matter if you are rich or poor, you get access to it for free. So in many ways it is quite wonderful that you get access to these things for free.
Having said that, many of these platforms have also suffered data breaches or have improperly used your data without your consent, and regulation definitely has a role to play there.
Another point I wanted to make, with respect to using your data for ad targeting, is that it is important to think in terms of counterfactuals.
So let's say one option is that a platform like Facebook or Google uses your data to show you targeted ads. The other option is that they don't use your data to target ads, but in that world, to make up for the lost revenue, they are probably going to show you more ads, right?
So the tradeoff here is not the same platform with or without your data. It is using your data with perhaps fewer, better-targeted ads, or not using your data but ending up with way more ads, because Facebook wants to make up for the loss when it is not showing you targeted ads.
I think it is important to think in terms of these tradeoffs as well. Of course, as a consumer you're going to say that you don't want your data to be used. But the alternative is not a platform with a similar experience minus the data; the experience might be much worse.
So these are some of the tradeoffs to keep in mind. But I fully agree with many of the points raised by the other panelists.
>> IGNACY SWIECICKI: I would also like to ask you about something else: I know you have been researching attitudes in different cultures and different social groups.
Can you comment a little: is our discussion and our focus on privacy just one part of what can be seen? Is it very different in different countries or different social groups, or is everyone equally concerned about their privacy and data?
>> AVINASH COLLIS: So that's a good point as well. We just did a recent experiment with real money on the table: a study where we paid people to give up their Facebook data to us. We actually gave them money and they uploaded their data to us.
So this is not just cheap talk, but rather a revealed-preference approach where money was on the table. And what we found is that there are lots of differences between groups in terms of how much they value data.
For example, in our study we found gender differences: women seem to value their data less compared to men, at least in our sample in the U.S. context. There are also differences based on income levels: richer people seem to value their data more and required much higher payments compared to lower-income individuals. In the U.S. context there were also differences across ethnicities, between Black, White and Asian individuals. So there are lots of differences in how people value their data.
And we can also think of these valuations as how much people value their privacy, because they had to give up their data to us.
Yeah, so the takeaway is that there are lots of differences between demographics. And if, let's say, we move towards a business model where the platform, or a regulator forcing the platform, pays people for their data, this might also worsen income inequalities.
For example, we found that low-income people, and also women and Black individuals, who are traditionally discriminated against in the labor market, would actually get lower payments in a world where people get paid for their data. So it's important to keep these differences in mind as well.
>> IGNACY SWIECICKI: Thank you. It's really very interesting to hear such detailed, different perspectives.
Tomas, I would like to turn to you now. You have a lot of international experience from your previous roles. Can you say a little about the approach towards the topics we are discussing here, the platforms and privacy issues, in different places around the world? You can also see in the ITU, I'm sure, that the approaches are very different.
>> TOMAS LAMANAUSKAS: So thank you for that.
Now I would like to pick up on the other points we discussed. We shouldn't assume that everyone thinks like us. That is always very important in these discussions.
And we don't need to go far. In this part of Europe, just a bit north of here, a single ID you provide to get access to government services is okay, a normal thing.
Across the Atlantic, a single ID issued to every citizen to access government services is something unimaginable and a total affront to their privacy.
So we don't need to go to the places we sometimes think of as more exotic or distant. Just between the U.S. and Europe we have rather different experiences. And within Europe we have rather different experiences of how much we value data, and of where caring or concern about data sits in the category of needs compared to other things.
Compared, for example, with getting access to the internet, let's say, versus sharing your data for that, or getting access to services. So that definitely differs.
So a few other things. Of course this is important to me, and it comes from the discussions about taxation in the telecom sector. That's why I wanted to pick up on your point about taxation.
Taxation is an important topic for many reasons: to allocate the tax right across different economies, to get it to the various governments, and now there is a recent tax deal.
But I would definitely be very skeptical about using that as a way to compensate users, because that is actually taking money away from those same users and funding something else. We haven't had a good experience in the telecom sector with the way the sector is taxed: there are levies, and that money doesn't always get back to the users; it goes to various other purposes, usually introducing a lot of inefficiencies.
I'm not saying taxation is not an important topic, I think it is, but for very different reasons than user protection.
Now, in terms of the eventual solution, and picking up on some points, how we see that is a bit simpler. In any service, there are different ways to pay for it. We had these discussions around net neutrality, around interconnection and all those other topics.
From a basic economic standpoint, these are multi-sided markets. So let's say you have the platform, you have the costs, and you have different revenue sources. One revenue source is the ad providers. Another is the people who want to pay you money for digital insights into the overarching aggregate of user behavior.
And what you are proposing today is a third revenue source, user payments, and some platforms are using that.
What we need to understand is that it is all money coming into the same business. By definition, it is not that the value changes; these are just different ways you can collect the money. So if the user pays, that means someone else pays less; maybe it becomes cheaper for the advertiser to reach you.
There are, of course, more direct questions. If you are directly targeted by someone who uses your data, maybe you get a discount or cash back, but that is already happening. If your data is used for something, maybe you get paid.
But the idea there is not about creating money, or about someone not paying. It is all about the money coming into the overall market and then being used to cover the costs and the cost of capital, that is, the money invested in the market.
And then I think the real discussion is about the extra margins that we believe arise because competition in the market is not sufficient. So the real elephant in the room is not user rights or users being paid; from the regulatory perspective, the real elephant in the room is that there is not sufficient competition. And insufficient competition creates the ability to raise more funds, charge higher prices and capture more value than would be possible in a competitive environment.
So as a regulator, I think maybe that is where the answer is: you use regulation. You create rights for the users and make sure that the users have those rights, but you don't tell them how to use them. That's very important.
Some users will want to use their rights one way, others another way. As long as they are aware of their rights, as long as they are free to use them, as long as they have them, that's the way.
Apple, for example, now lets users decide much more clearly whether they will be tracked by the platform or not, so users are given more rights.
And the second thing is really ensuring that there is a competitive environment, a sufficient level of transparency and a sufficient level of abuse control. Once you have that, you allow the market to sort it out. Some users will prefer to pay, and some will prefer not to pay. And that will depend on a lot of things, as we heard in the previous presentation, in different parts of the world. Thank you.
>> IGNACY SWIECICKI: Thank you very much. Marcin, before turning back to you, I want to mention that there is a microphone in the room if there are any questions from the floor. Please raise your hand and stand in line; we will have some time for questions at the end.
I already see Janek asking for the floor. There is also the different approach to changing the model proposed by Kata, as well as the call for more regulation from Tomas. I'd like you to comment on those emerging topics.
>> MARCIN PETRYKOWSKI: All right, thank you. So I fully agree.
I think taxation is not really the solution, because we are not giving power back to the user. We're just effectively adding an additional layer of payments for a company that is already benefitting from something that wasn't going back to the user.
I'm a big believer in regulation, smart regulation, and I think that is where the devil sits. It needs to be done in a way that won't hinder the development, the freedom and the liberty to develop the digital economy, but it needs to stand behind the user.
Then two additional points. The professor mentioned the inequality issue. Your model, Kata, I fully agree with; it is exactly what happened with Netflix. Netflix is an elite service for the few, and a lot of people are excluded from it; that will always happen in a pay-per-view model, sadly.
Also, think about what the internet has given us. The internet has given us democratization; it democratized things that were scarce and made them available to effectively everyone. So if we use this model, it goes completely against democratization. Hence the model of paying the user is a very democratic model, and I fully agree that it will mean some users will be paid more and some will be paid less.
But if you think about it, that actually creates competition and incentivizes users to improve so that their data is worth more: through education, through effectively improving their living conditions.
That's how the free market works. When you empower people to make a difference by valuing the data that they give, you motivate them to make a difference in their life. And I think that is missing from this discussion.
And last but not least, something we haven't mentioned: I think we are not talking about a very big upside of data. If you think about where the world is today, with this huge issue of environmental risks, data can be used, for example, to decarbonize the world.
Look at how much data we can extract around movement, around transport, around the Internet of Things. This also needs to be added into the equation: this data is not only used to benefit the companies, it can also be used for the benefit of society as a whole.
And I think that part is a bit missing here, because we're focusing on the social media platforms. We're forgetting about, I don't know, smart metering, for example. The more smart metering we use, the better we are at assessing how we use energy and how we add green energy into the mix, and we effectively improve the world.
And that's an additional component of the discussion where data can really add a further layer of value, which I think needs to be included. And that is again why some of it needs to be paid for, because we are actually giving it back to society on a bigger level.
>> IGNACY SWIECICKI: Thank you. Kata, I see you want to comment, so I will ask for your short comment and then I will give the floor to Janek, who has prepared some questions.
>> KATARZYNA SZYMIELEWICZ: Right, since you asked me to comment, I really want to do it now.
It's a huge topic with many aspects, so I wouldn't seriously open a discussion today about reusing the data we generate for societal objectives. I'm all for this, but I just wouldn't conflate it here, because it is yet another discussion.
One comment, and maybe one new concept for the ending. We cannot forget how complex the motivation on the side of the user is; I mean, how different the human stories behind the decision to continue using an abusive service are. Some won't have a choice because they are not educated. Some will not have a choice because they will not have the money to pay. Some will choose Apple, a very expensive, privacy-protective service where you pay for the hardware.
Some people will continue using abusive services because they are addicted to them or because they have been manipulated. So I think we cannot just treat users the way liberal economists sometimes do, and I'm not saying anybody here is one; there is a further complexity that I think we are missing here.
The novel idea, maybe just to signal it: we have been publishing a lot on the Panoptykon website if you want to understand it better. Unbundling the online platforms seems to be the approach that offers space for all of these innovations.
So if we unbundle something like Facebook and make that platform open for alternative services to operate on top of the social network, then you can do what you mentioned: competition of business models without betting on one.
Then you don't really have to choose which one would be best for consumers. You let them choose differently depending on their motivation.
One will go for a subscription, another might go for a payout, and a third will go for something that I like a lot, which is a publicly funded, BBC-like model for data running on Facebook. Or nonprofits offering protective services, right? So we could really open a whole array of business models in a space where there is now only one exploitation machine. Thanks.
>> IGNACY SWIECICKI: Thank you. At least here in the room I saw some nodding at your comments, so obviously we are moving towards a consensus.
Now I give the floor to Janek for his question or comment. Please be brief; we have 10 minutes left in the session.
>> AUDIENCE: Hi. Do you think that maybe we should not only look at the regulatory side, at consent and at rights, but also look at institutions? So, design a totally novel data institution?
There is talk about data trusts and data co-ops, and there is successful research on data co-ops. We know, and there are studies of this, that if we just allow people to take decisions, they might not take the best ones.
What do you think about these data institutions? Could they be BBC-like public data commons, so to speak?
>> IGNACY SWIECICKI: Thank you. Professor Collis, would you like to answer first, or leave this to the panelists in the room?
>> AVINASH COLLIS: I will let the other panelists answer first.
>> IGNACY SWIECICKI: Tomas, please.
>> TOMAS LAMANAUSKAS: So I think we are already on that road.
As you see, the Data Governance Act that was just adopted in the European Union has the concept of data intermediaries, where third parties will help exchange the data and hopefully will also help users understand where their data sits and access it.
Then you have different sectoral approaches, with specific arrangements; in Europe especially, PSD2 for the financial sector actually created those data exchanges. So if you move to a new bank and your new bank wants to access your data from other banks, it can do that.
So I think we are starting to see that. It is just too early to have a final design; we are now in a process of experimentation, and there will definitely be some version of that. Thank you.
>> KATARZYNA SZYMIELEWICZ: Yes, I'm also a big fan of experimenting with institutions. I trust institutions more than human beings on their own to make the right choices.
It's an interesting direction, but we cannot ignore the power play within institutions, and the problem of creating access in the first place for the institutions that would be entitled to protect consumers.
An example: if we discuss data trusts, we might imagine wealthy consumers sitting in Silicon Valley with their own pressure-measuring devices, knowingly collecting and managing their own data and making deals with private hospitals. That is a very different story from the Facebook or Google story, where we are digital biomass processed for advertising profits without any awareness whatsoever.
If you want to introduce an institution here, you need to break the dominance first. You have to either unbundle platforms or enforce real-time data portability. I mean, do a revolution in terms of who has control over data.
And that is the biggest issue, right? So first I think we need to reimagine how the data is collected and stored, with data pods or something like that, something in between, for these institutions to even come into the game.
So whenever I hear "data trust," I usually think: oh, another very liberal concept coming from another universe, where people are aware of their data, able to manage it, and have the time and money to do that. It is very abstract. This is niche, this is elite.
So in order to make that response more democratic, we need to stop the data dominance on the side of the big platforms in the first place.
>> I would only add that I think the issue with institutions is, and will be, how to assure objectivity, which is a very difficult matter; you will end up with different types of politics driving institutions.
So first, objectivity; and second, empowerment. How do you really empower institutions to make a difference? There are so many institutions out there that have zero empowerment. And again, that's where I think we should look at the regulator. The regulator, supported by institutions which are objective, can really make a difference.
>> IGNACY SWIECICKI: Thank you very much. If there are no more questions from the floor, and I don't see any at the moment, I have one more question for the panelists.
In our research, apart from the value of data, we also asked about the views and perceptions people have of how the platform economy works. And we see a number of, let's say, paradoxes there.
For example, people generally agree that the big tech platforms know too much about them. But on the other hand, they don't believe that paid versions would serve them better.
So I would say that, on the one hand, the discourse promoting more regulation is already trickling down, and people are starting to see there is a problem. On the other hand, the way I read the results, they are quite confused about what the direction should be and what the possible solutions are.
In your view, maybe starting with Tomas, is it possible for a general consciousness of the changes that need to happen, and of the new business model, to emerge among internet users? Or does it have to come first through regulation or institutions?
>> TOMAS LAMANAUSKAS: First of all, I think your research very clearly shows that sentiment is not sufficient.
Actually, I liked the remote panelist's approach: you have to really test that sentiment with real money, when people are actually asked to pay.
And is it because they get the value, or because of learned helplessness? You know: whatever, they have my data, what will I do? Probably a little bit of both.
But I think the general sentiment that is now channeled into the political discussions is real: people want to feel a bit of control over their own data, their own information, their own lives, and that is important. But it doesn't necessarily translate into direct changes in product design; it can translate into general policy and regulatory discussions.
Just as a concluding remark, I think what is most important here is to really ensure that people have rights and that those rights can be exercised, and then to allow them to exercise those rights.
And in that context I would include the questions that were mentioned here: do people understand those rights, and for what reasons they have them. Of course those rights should be meaningful, but for me that would be the main focus, and I would not presume to predict what outcome that focus would lead to, as long as people can make those choices in real terms. Thank you.
>> IGNACY SWIECICKI: Marcin, I would like to ask you for your summary comments. Can we count on people understanding more and changing their behavior, or do we need some pressure or nudging from the side of regulation?
>> MARCIN PETRYKOWSKI: Well, I think it's a mix. I like the concept of a hybrid approach. I think it actually makes a lot of sense to give people an option to choose, and some are already going down that way.
If you look at LinkedIn, it gives you the opportunity to use it for free and the opportunity to pay for premium services. So in a way, I think that is probably the solution. I would mix it with education: there is a lot we need to do in terms of educating people to understand how much their data is worth, how they can use it to their own benefit, and how to make sure they don't use it against themselves.
And second, I would stress the point that I think we still do not realize how much power data can bring to improve some conditions of the world as we know it today.
Decarbonization is one. Environmental protection is a second. Social unrest, politics; there are so many things. And I think that part is missing not only from this discussion but from the wider perspective on data as, effectively, the fuel of the digital economy.
>> KATARZYNA SZYMIELEWICZ: Yeah, I do hope we can discuss that maybe next year, here or in even better venues. I know an institute that is about to publish a big report on rethinking data, including exactly how to generate or grasp that social value of it.
I think this is coming, and coming big time. Will it come from the people? No. Will it come from the elites? People like us have to reimagine the internet; we have to propose these models and then hopefully they will trickle down. Maybe visionaries like Apple in the past will help. Maybe the really innovative startups, though I hate that term because it has been so worn out by big tech telling us "innovation" when they do exploitation.
But I still think innovation can be reclaimed for something truly beneficial. So maybe we will see great innovations proving these points, but somebody has to start.
The people who are now the digital biomass, completely passive and exploited, will not run from the internet, that is clear. So that revolution has to come from other places.
>> IGNACY SWIECICKI: Thank you very much. We are exactly on time.
I just want to use the last five seconds to say that you have seen the preliminary results of our research; the report will be published early next year.
It will be openly available on our website, PEI.pl. Thank you very much all panelists for this excellent discussion.
Professor Collis, thank you for your remote participation and comments on our research. And have a good time at the IGF. Thank you very much.