IGF 2021 – Day 1 – OF #12 Digital Selfdetermination: next steps

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> ANDRIN EICHIN: So I think we're going to start soon.

>> THOMAS SCHNEIDER: So when I sit here, people see me?  Do I need to wear a mask sitting here?  Hello.

>> ANDRIN EICHIN: A few people have joined in Katowice, which is great.

>> We all live in a digital world.

(Audio is not loud enough on Zoom for the captioner).

>> ANDRIN EICHIN: Great!  I think that was the intro.  So I guess we will be starting.  Very well, welcome to this Swiss open forum on digital self‑determination.  Thanks for coming.  My name is Andrin Eichin, I'm from the Swiss Office of Communications and I will be moderating the open forum today and I'm joined by Alice Weiss, who will be moderating the online debate in the chat.

Today, we will be looking into the concept of digital self-determination and, in particular, what it can bring to the current discussion on international data governance.  Three years ago we started, and like all good policy work, it started with a few hard questions.  We asked, first, how can we ensure that more data is being used and reused in Switzerland?  Second, how do we make that happen, especially in the context of growing skepticism about the way that data is being used and handled?  And third, how do we handle the problem of increasing data centralization, where a few strong actors control key data, and how could this start to impact society?

We very quickly realized that, to solve these questions, neither of the existing answers was sufficient.  Strengthening data protection rules or mandating data localization would bring problems with regard to innovation and data flows, and not doing anything, on the other hand, would only reinforce the existing tendencies of skepticism and data concentration.

So our answer to this problem was to create an entirely new approach to data, what we call digital self-determination, and our vision is that individuals, companies and society as a whole have more control over and access to data.

And during this open forum, you will hear and learn more about this approach, so my short intro should only be a first scene setter.  Some of you may have noticed that this is already the second time that we are doing an open forum on digital self-determination at the IGF.  We were already present at last year's virtual IGF and had a great discussion about it, and this year we would like to focus more on the discussion around international data governance and how self-determination could offer a fruitful solution to how we approach this on the global level.

We have a great panel here, but before I introduce them, I would like to remind all of you that we want to have a session that is as interactive as possible.  Use the chat to contribute, ask questions, or raise a hand so that we can bring you in at a later stage.

After the first introductory statements, we will have ample time for open discussion, and if you would like to say something, just raise your hand or indicate it in the chat and we can give you the floor.

I will also ask Thomas who is on site to look out for questions in Katowice and I will regularly check back to see if there are questions there that we might want to take on.

And now to the panel.  I will introduce them all at the same time, before we hear a short statement by each one.  We are honored to have Gayatri Khandhadai, who joins us from the Association for Progressive Communications.  And then we have Torbjörn Fredriksson, with UNCTAD.

And we have Nydia Remolina, a research associate at the Singapore Management University Centre for AI and Data Governance.  She is an academic specializing in AI and data governance.  And Ambassador Thomas Schneider is the head of international affairs at the Swiss Federal Office of Communications.  He is a regulator in the area of communications and media, and he is one of the originators of our work on digital self-determination.

And last but not least, Roger Dubach, who is the deputy director of the Directorate of International Law at the Swiss Federal Department of Foreign Affairs.

And without further ado I want to give the floor to Roger for the first intro.

>> ROGER DUBACH: Thank you very much, Andrin.  Hello, everybody.  I will give a very quick intro into the idea of digital self-determination.  I will focus on the individual component of this idea, and Thomas will then elaborate on the collective component.  As Andrin already said, we started three years ago, and the starting point was the realization that the current way of collecting and using data has two fundamental flaws.  First, as all of you know, data largely remains in silos, so we have to find a way to better use and also reuse data.

Restrictions on data flows sometimes have good reasons, namely data protection, but very often the lack of sharing is also motivated by a lack of awareness about the potential of shared data uses, or by unfounded fears about losing a competitive advantage or infringing intellectual property.

The second flaw we have identified, the one I would like to focus on today, is the fact that individuals have virtually no control over their data.  So we tried to find a way to transform individuals, who are currently only users in this space, so that they finally become drivers of the digital transformation.

And so we see, on one side, that the digital economy allows us to realize tremendous social and economic benefits through new and innovative services, but on the other side, the way individuals have to act in the digital space might also put their freedom of choice and action at risk.  Therefore, we believe there is a need for more agency for individuals in the digital space.

Now, how do we counter this, and how do we create or realize this idea, and especially, as Andrin said, a governance framework that allows individuals to maintain their freedom of choice and action?

We believe individuals should always have access to their data and understand its value, as well as the impact it can have on their lives.  That is what digital self-determination stands for.  Our vision of digital self-determination is to empower individuals to become proactive citizens and co-creators of the digital environment.

The key to self-determination at the individual level rests on three factors: first, knowledge; second, freedom of choice; and third, the ability to act.  This means that individuals should understand what happens to their data.  They should also know where the data is.  They should be in a position to form an independent opinion on what is going on and where their data are, and be able to make decisions and be in control of how and by whom this data is used.

And this means also that digital services and service providers should enable them to do so by technical means as well as through transparency and adequate governance mechanisms.  I would like to illustrate this approach with an example from Switzerland.  We are currently in the process of developing an electronic patient record in the area of health.  So the electronic patient record is the personal collection of the patient's treatment‑related documents.  This might include regular test results, x‑rays, prescriptions, blood pressure measurements or a hospital discharge report after an operation.

By having access to the electronic patient record, healthcare professionals can get important information easily and quickly.  The likelihood of a correct diagnosis and therapy increases, and the risk of dangerous wrong decisions decreases.

The advantage of this approach is that patients are in full control of their electronic patient record.  To open an electronic patient record, the patient's consent is needed, and they can choose their preferred level of security for every item, ranging from normal access to restricted access to secured access.  They can choose who is able to access their data and get informed when somebody uses their data.  In addition, every access to an electronic patient record is logged automatically.  For maximum transparency, this means a patient can always see who got access to their data, at what time and for which purpose.  Patients can access their log files at any time on their computer or cell phone, whether at home or traveling abroad.
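To make this access model concrete, the following is a minimal sketch in Python of how per-document access levels and automatic access logging could be represented.  It is an illustration only: the class names, level names and identifiers are assumptions made for this example and do not describe the actual Swiss electronic patient record implementation.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from enum import IntEnum
    from typing import Dict, List, Set

    class AccessLevel(IntEnum):
        NORMAL = 1      # readable by any accessor in this sketch
        RESTRICTED = 2  # only accessors the patient has explicitly granted
        SECURED = 3     # only the patient, in this simplified sketch

    @dataclass
    class AccessLogEntry:
        accessor_id: str
        document_id: str
        purpose: str
        timestamp: datetime

    @dataclass
    class PatientRecord:
        patient_id: str
        document_levels: Dict[str, AccessLevel] = field(default_factory=dict)
        grants: Dict[str, Set[str]] = field(default_factory=dict)
        access_log: List[AccessLogEntry] = field(default_factory=list)

        def set_level(self, document_id: str, level: AccessLevel) -> None:
            # The patient chooses the security level for every item.
            self.document_levels[document_id] = level

        def grant(self, document_id: str, accessor_id: str) -> None:
            # The patient decides who may access a restricted document.
            self.grants.setdefault(document_id, set()).add(accessor_id)

        def read(self, document_id: str, accessor_id: str, purpose: str) -> bool:
            level = self.document_levels.get(document_id, AccessLevel.NORMAL)
            allowed = (
                accessor_id == self.patient_id
                or level == AccessLevel.NORMAL
                or (level == AccessLevel.RESTRICTED
                    and accessor_id in self.grants.get(document_id, set()))
            )
            if allowed:
                # Every access is logged automatically, so the patient can see
                # who accessed which document, when, and for which purpose.
                self.access_log.append(
                    AccessLogEntry(accessor_id, document_id, purpose,
                                   datetime.now(timezone.utc)))
            return allowed

    # Hypothetical usage: the patient restricts an x-ray and grants one professional.
    record = PatientRecord(patient_id="patient-123")
    record.set_level("xray-2021-10", AccessLevel.RESTRICTED)
    record.grant("xray-2021-10", "dr-mueller")
    record.read("xray-2021-10", "dr-mueller", purpose="pre-surgery check")  # True, logged
    record.read("xray-2021-10", "dr-unknown", purpose="curiosity")          # False, not logged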

Thus, with the electronic patient record, patients are able to take an active role in the treatment process.  With this illustration, I hope you get a quick overview of the individual aspects of digital self-determination.  Thank you for your attention.  I will hand back to you, Andrin.

>> ANDRIN EICHIN: Thank you very much, Roger.  I will now give the floor to Thomas on site, who will give us a short intro on the collective component of self-determination.

>> THOMAS SCHNEIDER: Can you hear me?

>> ANDRIN EICHIN: The sound is quite ‑‑ can you say something again maybe?

>> THOMAS SCHNEIDER: I say that I like the color of your suit.  Is that understandable?

>> ANDRIN EICHIN: Yes, it is.  It is now getting better.  You might have to have a bit more distance.

>> THOMAS SCHNEIDER: Closer or farther away?

>> ANDRIN EICHIN: Much better.

>> THOMAS SCHNEIDER: Hello, everybody.  Happy to be part of this discussion again.  I will go right away to my points, which are a bit more about the societal or collective side with regard to data spaces.

As we all know, data transforms the lives of individuals and companies, but it also transforms the way societies work as a whole.  Particularly relevant is the impact on the public sector's ability to provide its basic core services, in all the areas where people today rely on universal services.

In many areas, platform providers have become an essential feature of our societies' infrastructure, and we only realize this when they are not working.  The recent blackout of Facebook and Instagram, and what it did to our ability to communicate and interact, is just one example.  Another is the increasing push of new start-ups into various sectors that, at least in Switzerland and much of Europe, were part of core government services and are now being transformed.

Let me give you an example from the area of transportation; we could give examples from health and other sectors as well.  Already today we see a massive development of mobility as a service.  Many of us rely regularly on services like Uber, or on services that lend or sell electric bikes or e-scooters.  What will likely follow next is the integration of these different services into one single channel, one single platform, and ultimately the integration of these services within, or their merging with, the public transportation system.  In fact, Uber has recently released a position paper setting out its vision of how it intends to reform and improve public transport in cities, but also in rural areas.

In many ways, these developments are good news, because they will likely bring innovation, more efficient and sustainable use of resources, as well as a better customer experience for many users.

However, at the same time, we need to be aware of the risks that these developments could have for society and individuals, especially if the current trend of lacking public control over private services that have come to operate parts of our infrastructure continues.

If we return to our example of mobility: on the one hand, new, innovative mobility service providers can extend, personalize and make more efficient the offer from which we benefit, and close gaps in public transport.

On the other hand, there is the danger that these services come to replace existing services, not just taxi services but public transport services.  This can have a number of consequences, like worse or fewer connections in less well-served areas, and increased use of individualized modes of transportation.

So, innovative and improved services: yes, of course, we should use the technology available to innovate, be more efficient and use our resources better, but this needs to come along with all the necessary precautions to ensure that public interest objectives like sustainability, affordability and accessibility can be guaranteed.

So if you ask yourself now what all of this has to do with data and digital self-determination, the answer is actually fairly simple.  Access to and use of data are not only the basis for the development of new and innovative services, but in many infrastructures also the basis to maintain and develop the system.  In these instances, the data itself becomes part of the infrastructure.

And this is another aspect of digital self-determination: there needs to be access to, and control over, data that is fundamental for the functioning of socially relevant systems.  By control, I don't necessarily mean control by the state, but control by the people themselves, or by the municipality they live in, or by whoever is the recipient of the services, so that people can define and have control over what kind of services they are getting, know that they can decide themselves, and are not dependent on a third party that provides the service and over which they have no control.

So what are the potential solutions currently being discussed in different international circles?  One solution could be regulation of the relevant services and sectors.  There is, for instance, the German professor Christoph Busch, who has proposed an infrastructure law approach.  This could include elements like universal service obligations for certain providers, or pricing regulation, because, as we all know, in many of these areas there are scaling effects and network effects that create the chance, or the risk, that in the end you will have a monopoly, at least locally.  And of course, if it is a private monopoly that people depend on, there should be ways to somehow control prices or services and so on, as we have in our traditional world.

Another solution is to develop trustworthy databases that allow data sharing in a decentralized way: where necessary, key data should be labeled as infrastructure and made publicly available, so that different services can be delivered on the basis of this data, creating competition and a mutual incentive to innovate and, over the longer term, to serve people's needs.

We are aware, of course, that a lot of this discussion relies on the fact that people and societies are used to having functioning public services, be it water treatment, energy delivery, public transport or a health service that works.  There may be areas in the world where these services are functioning less well, and for those areas, new and innovative systems are even more promising because they may actually help bring them to the next level.  At the same time, the weaker the current systems and their governance models are, the higher the risk that, in certain areas, people would depend even more on whoever is delivering this so-called public, or maybe in the future private, service.  I'm interested to hear from people from different regions about the pros and cons of using data and creating data spaces for keeping our societies working more efficiently, in line with the idea of digital self-determination.  Thank you very much.

>> ANDRIN EICHIN: Thank you very much, Thomas.  We will now go to Nydia, who will talk about SMU's work on digital self-determination; in particular, you have been looking at what digital self-determination could look like beyond the European perspective.  So Nydia, the floor is yours.

>> NYDIA REMOLINA: Thank you so much, Andrin, and thanks, Roger and Thomas, for the introduction to the topic.  This is a very relevant and pressing topic that affects every one of us as data subjects in the digital space.  At the SMU Centre for AI and Data Governance, we see digital self-determination as an important concept that we need to discuss beyond policy implications and changes in regulation.  Why?

Because we are talking, as Roger mentioned in the beginning, about how to empower data subjects, but this empowerment goes beyond data protection regimes.  For some, these discussions at an international level could be a bit difficult if we only subscribe to data protection regimes because, for example, in Asia we have a different perspective on privacy, a very different one to what regulators, policymakers and civil society have in the European Union.

In Asia, privacy is not a constitutional right in all countries.  However, that doesn't mean that the development of society and data subject empowerment are not important, or not important to discuss, compared with jurisdictions that consider data protection to be very close to a human right.  We all care about our empowerment, about our autonomy, and about how we determine ourselves in these digital spaces.

So the trustworthiness of a digital space is a very important concept, from a conceptual and theoretical perspective and from a societal perspective, in all jurisdictions.  It is definitely a concern, as is the use of artificial intelligence and its implications for humanity around the world.  That's why, in policymaking debates around the world, regardless of how we legally conceive of privacy and personal data, we are all talking about data subject empowerment, data access, and now digital self-determination.

So I wanted to highlight this because often we tend to be locked into discussions such as data ownership, or what a particular country considers to be personal data, or secondary use of data, and we want to bring the discussion to self-determination, which goes beyond a personal data regime in a specific jurisdiction.  That's one thing.

And the other thing is that data subject empowerment, from the international perspective, should in our view be taken beyond data portability and similar rights that are circumscribed only to personal data.

At the SMU Centre for AI and Data Governance, we talk about self-determination beyond personal data, and so we talk about secondary use of data, opinion data, and the information and data that is the outcome of an artificial intelligence model.  This affects how persons determine themselves.  In order to create trustworthy data spaces, we need to talk about the implications of the use of data beyond the personal data regime, beyond a concept of empowerment circumscribed only to consent and the legal uses of data in legal regimes, and talk about information and data in general, how they affect the individual and how the individual determines themselves in the digital space.

I will leave it at these points for the introduction.  As for how we have tried to conceptualize digital self-determination, we have put together a theoretical framework that has been discussed with some of the participants in this discussion, particularly Roger and his team.  The feedback has been really useful for us in developing what we want to produce in order to incentivize companies, governments, civil society and academics around the world with regard to what digital self-determination is and how to put it into practice in specific use cases.

I will hand back to you, Andrin, and I will be happy to discuss further during our panel.  Thanks so much.

>> ANDRIN EICHIN: Brilliant.  Thank you so much, Nydia.  Thank you for sharing some of your research and how you have looked at this at SMU.  This is really useful, and I'm sure we will have a few questions for you afterwards.  But let me turn to Torbjörn now.  He has been working with his branch on the Digital Economy Report that was released a few weeks ago, and he will introduce us to UNCTAD's view on data governance.

>> TORBJÖRN FREDRIKSSON: Thank you very much, and greetings from Geneva.  Let me first thank the Swiss government for inviting me and UNCTAD to speak at this open forum, which is extremely well timed, I think.  It's very topical.  We see, of course, the growing importance of data and data flows as a key driver of the digital economy, together with increasing platformization, which was also stressed by Thomas.

In fact, these two phenomena are closely interlinked.  In our Digital Economy Report, and I have put a link to it in the chat for those who are interested, we note that the top platforms of the world are becoming increasingly important at all stages of the global data value chain, not only with regard to their role in collecting data, but also data transmission, data or cloud storage and data use.  Amazon, Microsoft and Google now operate over half of the world's hyperscale data centers.  The same three companies plus Alibaba account for two-thirds of all cloud infrastructure revenue.

Alibaba, Amazon, Facebook and Tencent account for 70% of all the world's digital ad revenue.  So we see really very high levels of concentration here around data.

These top platforms, as we know, originate mainly from two countries, the US and China, and as much as 90% of the market capitalization value of the world's largest platforms is linked to these two countries.

The same two countries also account for 94% of all funding of start-ups in the artificial intelligence field, which is a stunning number.  So we see a need to explore new ways of governing data and data flows, and how we handle data will, we think, greatly affect our overall ability to achieve the Sustainable Development Goals, all of them.

It will have implications for both economic and non-economic outcomes.  For instance, our ability to harness data can generate huge social value, for example by enabling the sharing of data to develop new vaccines more quickly, or in the fight against climate change.  It can also generate huge private profits, as we have seen from the performance of the digital platforms.  The handling of data is also relevant from various human rights perspectives, including privacy, but also law enforcement and national security.  So it really has many different dimensions.

And at the same time, if badly handled, the use of private data can generate various social harms for the individual.  Ultimately, we think we need to ensure that data can flow securely across borders as freely as necessary and possible, while ensuring a more equal distribution of the benefits of harnessing data between and within countries, and allowing countries to address various risks related to, for instance, human rights and national security.  The current landscape with regard to data governance is highly fragmented, and the main players in the digital economy have opted for quite different models of data governance.  We don't think that extreme positions on cross-border data flows can be very constructive moving forward.  It's very hard to envisage a world with totally free data flows, and it's also highly undesirable to have very strict data localization requirements across the world.  We need to balance the various concerns of countries in terms of their development objectives, but also their levels of digital readiness and ability to harness data, which really vary considerably around the world.

And, of course, finding a middle-ground solution would be important to avoid further fragmentation of the Internet, which we are really heading towards right now, and which will not be in the interest of anyone, really.  That is why we are stressing that we should find ways to go global when it comes to data governance.  We need to think about approaches that can reflect the multiple and interlinked dimensions of data.  I like very much what Nydia was saying here: we need to go beyond protection of personal data and try to balance the different interests in a way that supports inclusive and sustainable development.

This will, of course, require the full involvement also of those countries that are currently trailing behind in terms of capabilities for harnessing data.  All countries, not just the major players, should have a seat at the table when we discuss how best to govern data at the global level.  As of today, we think it's fair to say that no one knows exactly how best to deal with data for development, so I really applaud the efforts of Switzerland here to launch the self-determination concept and see how it can be applied.

I think what is much needed today is to come together across nations, across disciplines, and even within the UN: we have various parts of our big system that deal with various aspects of data, but we seldom come together to talk about it in a holistic way.  We need to find future work opportunities across stakeholder groups and look for ways to engage in structured policy dialogues that can lead to sustainable progress in this area.

Even something as simple as agreeing on definitions and taxonomies: many countries mean different things when they talk about personal data or personally identifiable data.  We think that the UN should offer a very appropriate and inclusive platform to advance this multifaceted agenda, but it will of course be essential to ensure that the involvement of all relevant stakeholders is secured in the process.  We recognize this will not be easy, but we think it's absolutely necessary.  So we really need to bring the brains together and see how best to make progress in this area.  Let me stop there for now.  Thanks a lot.

>> ANDRIN EICHIN: Thank you, Torbjörn and thank you for this great call for international collaboration.

I would now hand the floor to Gaya, who will introduce us to another, or let's say a different, take on why data governance is needed, looking at other sectors that have worked through or struggled with governance issues and come to a different outcome.  Gaya, the floor is yours.

>> GAYATRI KHANDHADAI: Thank you, Andrin.  I would like to add to what my colleagues said and bring the discussion back to how, as far as digital self-determination goes, the way we look at data is the starting point, because oftentimes, among states or experts, data is discussed as if it were something separate from us.  I would like to draw on the work of Anja Kovacs.  Our bodies don't exist outside of our social context.  Therefore, when someone else is in control of our data, or is able to do what they please with our data, essentially that is happening to our bodies.

There is a clear intertwining between the physical and the digital self, and for us to discuss digital self-determination, we need to perhaps be on the same page when it comes to how we look at data: not as a resource or a thing or a commodity, but as an extension of the self.

The reality is that we are discussing all of this against the backdrop of an entire industry and significant monopolies that are built around the commodification of data and are fueled entirely by data.  That is simply the reality and the context in which we are talking.  And it is not only the private sector; it is our governments too that hold huge troves of our data, especially through biometric ID systems, health systems or other official systems.

Now, if we are going to talk about self-determination, there are three key concepts that have to be central to a human-centered and rights-based approach to data and data governance, most importantly control over the uses of our data, depending on which context we are talking about.

But in the current environment, we have little to no knowledge of what happens with our data: what data is collected or stored, how it is processed, who it is shared with, how they are using it, and what happens thereafter.

These questions remain unresolved, and they are coupled with arguments that this is not really something users will fully be able to understand, and therefore not something we need to provide them extensive information about.

Arguments around the need for low levels of regulation to fuel innovation, or for low levels of control because these are development solutions, are in my opinion problematic, because they have larger implications.

While the data subject or the user is at the center of the discussions around digital self-determination, given that I only have four minutes, I would rather focus my attention on two significant powerhouses: the state and the private sector.

In our histories of industrial revolution, and perhaps in some cases industrial revolutions, all major industries have gone through phases from no control to being regulated; that is the trajectory of history.

Now let's look at the pharmaceutical industry, for instance.  Lower levels of regulation or control would obviously favor the development of new drugs at a much swifter pace, but at what cost?  What would be the risks associated with it?  And is that really what we are hoping for when it comes to the digital industry?  Why haven't those concerns translated to the digital industry?  Perhaps it has to do with what I started off with, which is how we look at data.  Perhaps not enough people in power have been affected by the consequences of data being abused.

Now, let me quickly move to the governments.  Governments all around have entered into a data frenzy.  Unfortunately, in jurisdictions including India, data belonging to 1 billion people is collected without the necessary infrastructure or regulatory frameworks to ensure its safety.  They are essentially making up the rules as they go.

This has become even more problematic around the pandemic, where our health data and our movement data are collected in apps, or in apps that are meant for the safety of women; how little agency is really provided in terms of what happens to our data, and when it will be deleted?  Another part of the problem is the belief that data cannot go wrong; as a result, data has become a gatekeeper when it comes to essential services.  The solution proposed by states is data localization or data sovereignty, and there was a question about sovereignty that came up in the chat as well, without paying much attention to the lack of regulatory safeguards, or of practice around the safeguards, that Nydia pointed out.

When we talk about it, ultimately we need to ask the question: is it going to benefit the people?  And if we talk about sovereignty and self-determination, it belongs to the people as against the state.

Therefore, I would say that, for the new models we are thinking about, we need to look back as much as we need to look for innovative solutions.  To start with, I think we need to find a way to strengthen the UN framework on business and human rights so that it is significantly enforced.

We are still figuring out a lot about our data, and this needs to improve drastically.  We need to force companies to give us more accurate information on how our data is commodified and how our data is being marketed.  This is something that can only come from people who are working within these companies, and therefore we need partnerships and mechanisms to get this kind of information.  I feel the opt-in or opt-out approach is an outdated model.  Users need to have more choices, not only in the kinds of services that we can access, but also choices built into service options that allow us greater control over the settings we want.

At another level, we need secure access to infrastructure and devices to be something that is not out of reach for states and individuals who are not able to afford it.  In other words, data security and digital self-determination that are not the privilege of the rich.

So, to wrap up my intervention, I would say that many states and regional groupings have digitalization strategies; what we need is for multi-stakeholder spaces, including the existing ones, to take up these issues more seriously, bring more transparency and push for better standards, especially on what is happening with our data.

I would really just say that, when we are discussing future models, we mustn't forget the existing data that is with different actors and may have already been compromised; we need to address that as well.  Thank you.

>> ANDRIN EICHIN: Thank you very much, Gaya.  We have a really interesting discussion starting here.

As we are already quite advanced in time, we will move directly to the open discussion, and we already have some questions in the chat.  First, I would like to invite Nicholas Kurek, the Swiss youth representative, and I will give you the floor for your comment and any questions you might have for the panel.

>> NICHOLAS KUREK:  It's a great pleasure to be here.  Our job is to participate in various international conferences, represent the youth of Switzerland, and then, back in Switzerland, explain to young people how the UN works and how international cooperation works.  I found it interesting listening to you all using the word "data" in quite an obvious manner, as information that is stored on a computer, in the digital space.  A few weeks ago, I was speaking at the UN World Data Forum in Berne, which was something totally different.  It was mostly about public statistics.

I asked people what data is, what they mean by the word "data", and again, listening to you too, I realize that within the public sector, within government, you have two branches which focus on quite different issues.  They are related, but they are roughly two big issues described by the same word.  On the one hand, you have the information that is used by policymakers to make policies; that's what the statistical offices are doing.  On the other hand, you have all the information stored on computers, which you are working on.

And I realize that the challenges are different in both worlds, but they are branded under the same name because of how language is structured.  I figure that element is something for you to take into consideration.  Who do you think are the different actors, the different people, that aren't yet part of your discussion but that you should include?  Here on the panel we have the public sector, national and international, and academia.  Who do you think are the other people you should have on your panels?  Thank you.

>> ANDRIN EICHIN: Brilliant.  Thank you so much, Nicholas.  We also have Gaya, a representative of civil society, here.  We have almost all of the stakeholder groups, but not entirely all, I agree with that.

Who wants to react to what Nicholas highlighted or mentioned?

Any thoughts?

>> TORBJÖRN FREDRIKSSON: I think the comment from Nicholas is very, very valid, and we also, of course, participate in the Road to Bern and the Data Forum.  All of this is data in a sense, but what is particularly different now is the immense increase in digital data.  In 2022, we expect that data flows on the Internet in that one year will be bigger than all the data flows on the Internet up to 2016.

So this is really still growing exponentially, and it is going to continue to grow as we roll out 5G, as it becomes more widely used and as more people come onto the Internet.  And this presents completely different challenges from a governance perspective.  It's not new that we have data and information flowing across borders, but the magnitude and the implications for all aspects of development are so enormous now that we really need a much more ambitious way of addressing this challenge moving forward.  And I think, in terms of the players that should be involved in this process, it is all of the above.  We need the knowledge and understanding of the scientific community to really understand what is feasible to do with data in the digital space without causing more harm than good.  We also need the government side, we need the private sector involved, and we need academia from all the disciplines, from trade and development and human rights aspects to law enforcement and so on.

I think this is really the huge challenge that we are faced with, and we sometimes compare it with the challenge we are facing in climate change.  In 1992, when we met at the Summit in Rio, we brought a lot of different people together, and one of the common findings was that we didn't know enough to really know exactly where to move forward, how to move forward.  So we set up this international panel on climate change.  Maybe we need another international panel, on data governance, that can bring the various disciplines together.

>> ANDRIN EICHIN: Thank you very much, Torbjörn.  I saw Roger raise his hand, and then Thomas.

>> ROGER DUBACH: Thank you very much.  My reaction is that I recently had a discussion with some people working in a statistics office, and I think their biggest challenge is to produce real-time statistics.  They said that they have to adapt in this direction, and there you enter our topic somehow: the big platforms have the ability, with this kind of social media data, to come up with statistics that are much closer to real time.  And so the question is: who has access to this data, and what can you do with this data?  Therefore, I think that even statisticians end up asking questions similar to the ones we are discussing right now.  There is the old world, where you have a very clear basis to collect data to produce statistics and then, one year later, you come up with the statistics for the public, but I think there are real challenges to this model.  So thank you.

>> ANDRIN EICHIN: Thank you Roger.  Thomas?

>> THOMAS SCHNEIDER: Funny, I was going to say more or less the same thing as you about real time, and the word statistics also implies the word static, which is probably not that relevant anymore.

Just one other point is that, of course, we come from different worlds.  The big issue in statistics is how to organize data with metadata and how to structure an infrastructure that actually finds the right data, and so on and so forth.  And in fora like this, we talk more about the technical issues and the implications for societies of the use of data.  But I think these two discussions will grow together.  Thank you.

>> ANDRIN EICHIN: Thank you very much, Thomas.  We are already in full discussion, and I have two hands being raised.  I will give the floor to Amir Mokabberi, who put a question in the chat.  Maybe you can also highlight the questions that you put in the chat, because I was coming to them, but hopefully you can ask them now in person.

>> PARTICIPANT: Can you hear me?

>> ANDRIN EICHIN: Yes, perfectly well.

>> PARTICIPANT: Hello, can I jump in here?

>> ANDRIN EICHIN: Sure, absolutely.  Please.

>> PARTICIPANT: Hello, thank you, everyone.  Distinguished panelists, thank you very much for giving me the floor.  I'm Amir Mokabberi.  My question to the distinguished representative of Switzerland, Mr. Thomas Schneider, is: what is the relationship between self-determination and national sovereignty in the digital space, and what do you think are the main barriers to achieving digital self-determination?  Don't you think that a global legally binding framework, like a treaty for cyberspace, could contribute in this regard?  I think that having a treaty can promote trust at the global level.  Thank you very much for your attention.

>> ANDRIN EICHIN: Thank you very much, Amir.  I think the question was directed to Thomas, but I would be really interested to hear, especially on the second part, whether Gaya and Nydia have a contribution on what they think the main barriers to achieving digital self-determination are.

>> THOMAS SCHNEIDER: Thank you.  I think these are two very good questions.  With regard to digital sovereignty, the way the discussion is going internationally, by digital sovereignty people understand, let's say, the autonomy of a country's or a government's infrastructure from influence, disruption or attacks by a third party or another country.  Of course, this is one of the preconditions for digital self-determination.

If you have no control over your infrastructure, if it can be shut down or manipulated, of course that has effects on your digital self-determination capability as well.  It's like, if you have no electricity, there's no need to talk about Internet freedom.  It's a basic condition.

And the digital self-determination concept is about, given that you have an infrastructure you can use, how to organize yourself, how to organize society, so that your healthcare system works and can use the benefits that these technologies allow without being dependent on a platform, a big international company from another country, that tells you what you can and cannot do on its platform.  And it is about giving control to the municipalities, so that they can define themselves what the local transport system, the health system and the energy supply system should look like.

These are complementary and they necessitate each other.  They are related to each other.

A treaty would probably be good, but given the tensions about where this world should go and what roles governments and the private sector should have, I guess we would not be too successful at this time if we went for a treaty.  Many people have already tried, but so far, at least to me, there doesn't seem to be enough agreement and shared vision for it to have a chance of succeeding.  But we should never give up.

>> ANDRIN EICHIN: Thank you so much, Thomas.  Maybe Nydia or Gaya.  There's also a question from Ajith: when personal data is no longer personal data, what does that mean for digital self-determination?  Gaya.

>> GAYATRI KHANDHADAI: I wanted to address sovereignty and self-determination.  Sovereignty, from my understanding, has more to do with control and non-interference, but self-determination is a much broader concept, in terms of being able to shape and have agency over your data, right?  So I think, in that sense, they are slightly different but connected concepts.  Also, when it comes to sovereignty, at least the way it is currently being discussed in terms of data sovereignty, it often comes from a protectionist place.  Digital self-determination is about a much broader culture.  That brings up the question: what are the impediments?

In my opinion, observing from the civil society perspective, I would say there are many impediments, but two stand out to me.  One is the lack of a culture that respects data at the same level of importance as people and their rights; I think that is the number one problem.  Number two, I would say, is the lack of consequences for abuse of data; there are no serious consequences, at least not in the global south, there are very few consequences and very little that might follow from it.

In my opinion, personal data never ceases to be personal.  Data that emanates from me doesn't stop being personal, in my opinion, but that is precisely the issue: we assume that data, just because it doesn't have my name attached to it, is different, and that is not a concept that sits well in my head.  But maybe I'm wrong.

>> ANDRIN EICHIN: Great.  Thank you.  Nydia.

>> NYDIA REMOLINA: Just to connect with what Gaya was saying on the last question, about this important issue of how the concept is circumscribed: in the context of digital self-determination, we want to go beyond talking only about personal data and look at how data affects the determination of a data subject in the digital space.

I will give an example: when companies or governments use artificial intelligence to produce other information or other data, depending on the jurisdiction it is given different names, such as opinion data in some jurisdictions.  In most data protection regimes, this is not considered personal data.

In that context, if we circumscribe the question of digital self-determination to personal data, we will leave this type of opinion data and the outcomes of an algorithm out of an action-oriented view of digital self-determination and how it affects the individual, and we don't want to leave this out of the picture, because it affects the individual in many relevant ways.  In a specific use case, in financial services, we are observing implications that go beyond just asking the person if they want to move their data from one company to another, or ensuring this data-portability type of right; we think about the implications of the secondary use of data, of opinion data and of the outcomes of an algorithm, even if these are not protected or not being discussed from a personal data perspective and within a personal data regime.  We want to go beyond that.

And that's why I talked about going beyond the regulatory implications, or beyond thinking about changing the regulation in a specific jurisdiction.  We want to provide an action-oriented type of practice or guidance that will ensure digital self-determination, which does not necessarily mean that we need to amend data protection regimes now, because data protection is intended to address issues about personal data, and this goes, as the question implied, beyond personal data.

Thanks.

>> ANDRIN EICHIN: Thank you very much, Nydia.  I see that there's still a hand up.  I'm really sorry, I don't think we can take that question anymore, but Allan Magezi, if you write your question in the chat, we will get back to you in written form, and we will stay here for a few minutes.  But to wrap up the session, I would really like to give the floor to each one of you again for a very quick, I think it has to be a ten-second, statement on what needs to be done next to move this into the future.  Gaya, I would like to start with you.

>> GAYATRI KHANDHADAI: Well, since it's ten seconds: I look forward, I'm optimistic about solutions, and I think we should keep these discussions going.  I really like the idea of connecting this conversation to the climate change trajectory.  I think it's a very interesting context.

>> ANDRIN EICHIN: Thank you very much.  Torbjörn.

>> TORBJÖRN FREDRIKSSON: Yes, I think this was really good.  It's always too short, but one thing to keep in mind, as we talk about data and value, is that just having data will not be enough.  We also need to consider the capabilities needed to turn data into value, social value and private value, and that's where we are really concerned about the divide growing within and among countries.

>> ANDRIN EICHIN: Thank you, Torbjörn.  Nydia?

>> NYDIA REMOLINA: Digital transformation and trustworthiness are related to self-determination.  If governments, companies and society in general don't see this as an opportunity to make things right, we could harm innovation and the competitiveness of different jurisdictions.  So this should be the view we need to have when we talk about digital self-determination and how to protect ourselves in this digital society.  We are talking about how to shape the future.

>> ANDRIN EICHIN: I am aware we have to end, but we have 20 more seconds.  I will give the floor to Thomas.

>> THOMAS SCHNEIDER: The notion of opinion data, which may be based on aggregates of personal data but is no longer personal data, is an important concept that makes people understand that this is not about personal data only; it's much broader, but it still impacts the individual.  I think that's a very helpful concept.

>> ANDRIN EICHIN: And last but not least, Roger; the last ten seconds go to you.  Thank you.

>> ROGER DUBACH: Thank you very much.  First of all, I think it was a very nice discussion we had.  I take many things with me, and the challenge I see is that until now we have tried to be very action-oriented, and now the challenge is how to bring the discussion to a policy level and into the data governance discussion.  So I'm looking forward to this next step.

>> ANDRIN EICHIN: Thank you so much.  Thank you, everybody, for being here; it was a great discussion.  Apologies to the organizers for overrunning, but I think it could have gone on for another half hour.  We have much to discuss, and this will certainly be continued in some shape or form.

>> THOMAS SCHNEIDER: Thank you, Andrin, for the moderation.

>> ROGER DUBACH: Thank you very much, yes.