IGF 2020 - Day 5 - OF28 Swiss Open Forum on Self-Determination in the Digital Space

The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 




     >> MODERATOR: We will start. If all of the panelists could put themselves on mute, that would be great.  So we have everything -- Dominique, I see you are still unmuted; if you could mute so we don't have any feedback.  Perfect, brilliant. And then I have some background noise, I don't know where that is from.  Now it's better, perfect.

Great. Well, thank you all very much for being here, you IGF participants.  It’s great to have you all here.  Many thanks for tuning in to this Open Forum on Self-Determination in the Digital Space.  My name is Andrin Eichin, I’m from the Swiss Federal Office of Communications, and I will be moderating the session today. 

As you can see, we have a number of speakers lined up and at the end we will also have some time for your questions and inputs.  But before we start, it would be great if you could in the chat quickly say who you are and where you are from and what your interest is so that everybody can see who else is in this conversation.  And before we start with our speakers, let me give you a little bit of background on what we are working on within the Swiss government. 

As anywhere else in the world, the use of data has grown rapidly in almost all sectors in Switzerland.  We see new and innovative data-based services being developed, for example in mobility, health, finance and others.  Those services can enhance efficiency and improve well-being in both the private and public sphere.  On the other hand, an increasing concentration of data in the private sector or in governments can also create risks and dependencies for individuals, companies and the general public. 

So the questions we have been asking ourselves are how can individuals be at the center of this development, how can we further empower individuals and enable them to be in control of their data?  And how can we at the same time enable the use and reuse of data for the benefit of the economy and the society as a whole?

With this open forum I will first give you some insight into the work that is currently going on in the Swiss administration, and then open up a discussion on how we are approaching the issue; that is where you will come in as well.  You can see at the bottom that you have the Q&A section, so please use that.

And in the last segment we will also allow people, if you want to intervene, to raise your hands and give a short contribution.  But let's kick off with the first speakers.  I will introduce them as we go.  The first speaker is Ambassador Roger Dubach, who will give us an introduction to what digital self-determination means.  He is Deputy Director of the Directorate of International Law in the Swiss Department of Foreign Affairs.  You have the floor.

     >> ROGER DUBACH: Thank you very much.  Hello everybody, ladies and gentlemen.  I'm pleased to open the session and to present you some thoughts about digital self-determination.  As Andrin described, we think that the digital reality increasingly shapes the way we exercise our fundamental rights and liberties, and we are aware that data is the most valuable raw material in digital societies. 

For this reason, we believe that in democracies, citizens should have access to data and understand the value and effect it can have on their life in order to make self-determined decisions.  The way data is made available, accessed, exchanged and used is not only paramount for meaningful political participation but also constitutes the basis for a flourishing digital society.  We think that today, in the digital world, citizens are primarily users with a very limited say with regard to their data.

     They either give up personal data by consuming digital products and services, or they choose not to use the digital service at all in order to protect their personal data.  That is where our idea of digital self-determination comes in.  So the question is: how could we turn a user into a proactive citizen, into a co-creator of his or her digital environment?  In other words, how do we bring people into the driving seat of digital transformation?

     So while this is very much based on individuals, there is very much a collective element as well.  Digital self-determination applies to businesses, to associations, for instance to a district in a town, so there is definitely this collective element in the notion of digital self-determination.

     And we think that in the current environment there is very little space for that.  There is consent, informed consent, but it has become very much a blind clicking exercise.  So there has to be a new way to deal with these issues.

     Our approach is to try to establish trustworthy data spaces, and such data spaces should be ecosystems.  These data spaces have to be designed in an inclusive and fair manner, allowing equal participation.  In doing so, the trust between those who use digital services and those who process data will be strengthened.

     So, trust is absolutely crucial in our concept.  And we think that with trust and transparency in the data space there will be more interaction; participants will be more willing to share data and thus to take an innovative and new approach to data use.

     Such data spaces also have to be developed with the whole data life cycle in mind: digital self-determination needs to be thought through the whole data life cycle.

     We are very much aware that this is a big issue and that there are a lot of questions.  That is also why we would like to discuss this issue with you today.  In Switzerland, we started thinking about the concept two years ago.  We have put in place a kind of national network with people from the public sector, the private sector and academia, and there is an increasing number of people joining this discussion.

     And in this spirit, we would also like to engage in a discussion at the international level.  We very much think that it makes no sense for a country like Switzerland to have this discussion only at the national level, and that is why we are here today: to present our first thinking, in the hope that we will engage in a common discussion, identify the challenges and see how we could take this discussion forward.  Thank you very much, and I give back to Andrin.

     >> MODERATOR: Thank you very much, Roger.  Our next speaker is Dr. Matthias Galus.  Matthias Galus is the head of the Innovation Office at the Swiss Federal Office of Energy.  He is responsible for digital advances and technology diffusion in the sector and works on the adaptation of the regulatory regime for efficient renewable energy integration, energy efficiency and decarbonization.  He will talk about how the concept could be translated into the energy sector and what challenges that entails.  Matthias, you have the word.

     >> MATTHIAS GALUS: Thank you very much for this nice introduction.  Of course, we do see that the future of the energy system is very much digital as well, wrapped around the questions of how to use data and how to access data much better in order to use it for a more sustainable and renewable future in energy provision, which is then of course used for different applications in other sectors.  So let us have a look at the energy sector as it is today; I will give you a very brief introduction in order to make you a little bit more familiar with the system.  Next slide, please.

     So basically, and I think that is the case for many countries in the world, it is dominated by incumbents, by rather big national energy companies.  If you look at different countries, of course, there are more or slightly fewer of those companies, but they are still there.  And looking at the transformation in the energy sector, more and more new players come in, especially from the renewables side.

     If we look now at questions of digitalization and data access and so forth, we often face a kind of aging infrastructure for data exchange.  At least in Switzerland, I am tempted to say that most of it is unfortunately Excel-based when it comes to data exchange, which is quite disturbing, of course.  However, that is the case.

     And, of course, we do see a large momentum to change that.  But, in general, data exchange is based on FTP-like solutions where each player sends messages to all other players, so it is highly decentralized, and the manual part of processing the data is still quite significant.

     So we will soon see more and more metering devices at the consumer level, introduced by federal regulations.  I think that is the case in many countries in Europe, driven by the European Commission, but also internationally: so-called smart metering devices that are connected to a digital infrastructure in order to make the data on consumption and production much more readily available than it is today.

Just a brief comment here: data from the household level is often available only, let's say, every month in a good case and every six months in the normal case.  So there is quite a big gap compared to the digital world.  Consumers and others deliver data to companies that are natural monopolies, which is also quite a special case for the energy sector.  The data is then managed and shared on behalf of consumers, without the consumers having much access or any say in what is done with the data so far.  Of course, there are privacy rules in place, but it is not obvious to the consumer right now what happens to the data.

     We see a large momentum for digitalization, and more and more platforms are about to be built and created for different services, all of them trying to use that data much more efficiently and to get a hand on it.  That, of course, raises the question: how do we look at the data and how do we handle it?  As you can see, I created a very simple picture: the data comes bottom-up from the so-called prosumers and consumers, the data owners, and goes to different incumbent companies.

And then what is planned here in Switzerland is a national data infrastructure in the middle that, in a sense, orders the data exchange with all other stakeholders that currently need the data for secure system operation, but also with new players.  So the data infrastructure will play a very central role.  Let us have a look at the next slide. 

This is somewhat different from the digital space.  If we talk about infrastructure, we talk about the lower bottom area, where we really build the physical connections for sending the data.  Distributing it to all other players, the gray part, then becomes more of a digital space, where people use data in different forms and aggregations to do something with it, to offer services or to keep the system up and running.  And there we can pose the question.  Next slide, please.

     How do we ensure that this digital space, if we want to call it that, is actually a kind of ecosystem with digital self-determination, where consumers can enforce who is doing what with their data and have a kind of transparency about what is happening to the data?  Another question is then, of course, if we want to have this, what kind of rules do we need, and what is the role of the data infrastructure, which will certainly be regulated in Switzerland?  Because at the interface between the data infrastructure and the digital space, we could enforce certain rules on how different data ecosystems use the data and enforce transparency for the consumer.  Next slide.

     That brings me to the end.  Very quickly, I just pose a couple of questions, because it is not clear what a digital space and digital sovereignty are.  Do we really have the same understanding, also when talking about the difference between the national data infrastructure and who is to operate it, in order to have a digital space that ensures digital self-determination?  And what could such a concept look like for individual consumers?  Thank you very much.

     >> MODERATOR: Thank you very much, Matthias. And thank you very much for bringing the whole concept a bit more to life in the energy sector. 

Next on we have Kerstin Vokinger.  In her research she focuses on questions at the intersection of law and technology: how to regulate technology, and how technology can serve society with respect to individual rights.  She will talk about how Swiss law has reacted to data-based developments in the health sector.  Kerstin, the floor is yours.  You are still muted at the moment.

     >> KERSTIN VOKINGER: Thank you very much for the kind introduction.  I would like to highlight legal aspects of digital self-determination and insights from the network the Ambassador mentioned at the beginning of the session. 

To summarize again, what does digital self-determination mean?  Citizens have control of and access to the data they have provided, or to data that is relevant for their decision making.  And digital self-determination is a way of enhancing trust in the digital transformation.  In order for this to be successful, we believe that legal principles that enable and enhance digital self-determination are crucial.

     And one of my key messages is that the Swiss legal system already provides such a legal basis and such principles.  So even though our Swiss constitution is more than 20 years old, and even though the digital transformation has evolved especially in recent years, we have, among other things, two fundamental rights, the right to personal freedom and the right to privacy, that include the right of digital self-determination.  For example, the right to privacy includes the right of every person to be protected against the misuse of their personal data.

     And digital self-determination is not only embedded in our constitution and our fundamental rights but also in laws and acts.  Not only in theory but also in practice we can see that digital self-determination is valid and respected.  And I would like to share two specific examples in that regard.

     The first example is the Swiss COVID app, a digital tracing tool surely familiar to you.  This digital tracing tool has been developed to help control the spread of the coronavirus: when within Bluetooth range, the phone exchanges random identification codes with other mobile phones that have a compatible app installed, and these codes are stored for 14 days before being deleted automatically.  If users test positive, they receive a code that allows them to activate a function in the app, thereby warning the other app users.  We can see that digital self-determination is being protected on different levels.

     First, using the app, and also sharing information with other users, is purely voluntary; the individual decides if and which information he or she would like to share.  Second, no data is saved on a central server; it is a decentralized system.  Third, no more data than necessary is collected.
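     The mechanics described here (random codes exchanged over Bluetooth, kept on the device for 14 days and then deleted, with exposure matching done locally rather than on a server) can be sketched roughly as follows.  This is an illustrative simplification, not the actual SwissCovid implementation; all class and method names are hypothetical.

```python
import os
import time

RETENTION_SECONDS = 14 * 24 * 3600  # codes are kept for 14 days, then deleted


class TracingStore:
    """Minimal sketch of a decentralized proximity-tracing store.

    Observed codes never leave the device; matching against codes
    published by users who tested positive happens locally.
    """

    def __init__(self):
        self.observed = []  # list of (code, timestamp) seen via Bluetooth

    def new_ephemeral_code(self):
        # A fresh random identifier is broadcast instead of any stable ID.
        return os.urandom(16).hex()

    def record_contact(self, code, now=None):
        self.observed.append((code, now if now is not None else time.time()))

    def purge_expired(self, now=None):
        # Automatic deletion after the retention period.
        now = now if now is not None else time.time()
        self.observed = [(c, t) for c, t in self.observed
                         if now - t < RETENTION_SECONDS]

    def check_exposure(self, positive_codes, now=None):
        # Local matching: did we see any code later reported as positive?
        self.purge_expired(now)
        return any(c in set(positive_codes) for c, _ in self.observed)
```

     The design choice the speakers emphasize is visible in the sketch: `check_exposure` runs entirely on the device against a downloaded list of positive codes, so no contact history is ever uploaded.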

     Another, probably less prominent, example of digital self-determination is the electronic health record, which is regulated in a federal act.  Electronic health records help improve health by enabling more efficiency, structure and data sharing: data sharing between patients and healthcare personnel, be that nurses or physicians, but also hospitals.

     And here too, with regard to data sharing, we can see the protection of digital self-determination in the federal act.  For example: first, an electronic health record is established only if the patient or individual consents.  Second, the patient may upload data, and the patient decides who may access his or her data.  Even in emergency situations, if the patient decided beforehand that only someone specific, or no one at all, may access his data, that has to be respected.
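     The access rules just described (a record exists only with patient consent, the patient decides who may access it, and a prior refusal holds even in an emergency) amount to a small consent model.  A hypothetical sketch under those assumptions, not the actual federal-act implementation:

```python
class HealthRecord:
    """Sketch of patient-controlled access to an electronic health record.

    The patient grants and revokes access; an explicit prior decision
    to deny access is respected even in an emergency.
    """

    def __init__(self, patient_consented):
        if not patient_consented:
            # A record may only be established with the patient's consent.
            raise ValueError("record requires patient consent")
        self.granted = set()      # identities the patient allows
        self.denied = set()       # identities explicitly blocked beforehand
        self.emergency_access_allowed = True  # patient may opt out entirely

    def grant(self, who):
        self.granted.add(who)
        self.denied.discard(who)

    def deny(self, who):
        self.denied.add(who)
        self.granted.discard(who)

    def may_access(self, who, emergency=False):
        if who in self.denied:
            return False          # prior refusal holds, even in emergencies
        if who in self.granted:
            return True
        return emergency and self.emergency_access_allowed
```

     For instance, a physician the patient explicitly blocked beforehand is refused even with `emergency=True`, mirroring the rule that a prior decision has to be respected.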

     I would like to conclude my short presentation with three key messages.  First, to enhance trust in digital self-determination and the digital transformation, the law must provide a compatible and trustworthy legal framework.  Second, in Switzerland this law does not have to be reinvented: digital self-determination is a fundamental right that we have in our constitution and laws.

     And third, two practical examples where digital self-determination has been successfully implemented are the Swiss COVID tracing app and the electronic health records.  Thank you very much for your attention.

     >> MODERATOR: Thank you very much, Kerstin.  So this gives us a first overview of how we have so far approached the concept of digital self-determination.  But as you can see, we are so far very much centered on the Swiss perspective.  And as Ambassador Dubach has mentioned, this problematic is not only national, and the next segment and the following interaction with the audience aim to open up beyond just Switzerland. 

So we have a few commentators lined up that will reflect on the elements that were raised.  As I mentioned, after that we will open the discussion wider so please share your questions in the Q&A section and later on you will be able to raise your hand if you wish to intervene. 

The first commentator is Professor Mark Findlay of Singapore Management University.  In addition, he holds honorary chairs at the Australian National University, the University of Edinburgh and the University of New South Wales, and he is an honorary fellow of the law school.  Thank you very much for being here.  Mark, you have the floor.

     >> MARK FINDLAY: Thank you very much for the invitation to join this excellent panel and give reflections from an international perspective.  Can I say, perhaps somewhat provocatively, that digital self-determination depends on people knowing something about their data.  And from the work we have done recently, there is a significant amount of your data that you know nothing about.

The production of secondary data, and of data which in fact comes back to you through platform preferencing, covers a range of different situations where you are totally unaware of the fact that you are a data subject.  This makes the formal protections of digital self-determination quite problematic.

     Secondly, digital self-determination, from where we are sitting in an Asian context, raises three issues associated with that phrase.  The first is the digital.  And the digital obviously refers to a space: a space related to communication, a space related to technology.

     But that is not a separate space.  It is not a space which is cordoned off from the physical so the majority of people who use social media day in and day out cut across the physical and digital daily.

     Secondly, self.  I would like to make a point about self.  In the Western, Northern notion of the self, digital self-determination is about the protection of individual rights, individual privacy, and a range of issues like that.  But if you look particularly from an Asian cultural perspective, the self is largely determined by the family and by the society in which you live, so it is a much more communal notion, even in a business sense: Asian business is about communal concepts of association.

     And so when talking about determination, as Roger opened up earlier on, the self is a concept much more directed towards the way you sit within society.  And finally, the issue of determination, and I will move through that very quickly: determination assumes some power.  We have been doing recent work on COVID and COVID interception, and in many situations, with groups like migrant workers, the institutionalized elderly, or other sections of society that are discriminated against in a structural way, the question of how they can use the technology is much limited by their powerlessness in society.  When talking about digital self-determination, we must consider questions of the power to choose.

     And we need to build those into any framework of protection, any framework of security and assurance that we believe digital self-determination provides in a very active and vigorous way. Thank you for that.

     >> MODERATOR: Thank you very much, Mark.  Very insightful.  If I quickly summarize from my perspective: you mentioned the aspect of knowledge, the fact that we often don't know what kind of data is going around and that we have a massive blind spot in that regard. 

You mentioned the point about the audience: who are we approaching when we talk about the self?  Is it the individual or is it a larger entity?  Which I think is a really valid point, and something that Roger mentioned before as well: the way we have approached it so far is not just about the individual; it includes the individual, but it goes a bit wider, and maybe we can discuss that later on. 

And last, but not least, the point about power relations and the power to choose; I think that is at the center of this whole discussion.  Very, very useful.  Thank you very much for the contribution.

     Next on we have Marco-Alexander Breit.  You have the floor.

     >> MARCO-ALEXANDER BREIT: Thank you very much.  When it comes to the energy and health sectors, I can assure you that the German COVID app works in quite the same way.  I would like to first answer a question from my point of view, namely: how are digital sovereignty and digital self-determination related?  I think that digital self-determination is something that is based on digital sovereignty.  Why is that so? 

I think that digital self-determination is something that individuals can do.  But me being a state representative from the Federal Ministry of Economic Affairs and Energy in Germany, and being the head of the project GAIA-X, which you might have heard of, I think that what you first and foremost need are digitally sovereign data infrastructures, digitally sovereign technologies, and the possibility to use European technologies wherever needed.  That doesn't mean that we need to do everything ourselves, but we need to be in the position that, if we want to choose between a European product and a product that comes from abroad, there is a European product.

     So I want to emphasize this point: European large-scale companies and industry giants have data relations from the United States to China, but as we see, technology and the use of technologies and services are more and more being dragged into the power struggle.  So what we see is that unless we are in the position to use our own infrastructures, our own edge computing infrastructures, or our own AI services and data management services, we can still be caught in the middle of this conflict. 

That is even more important when you take into account the political side, which brings a lot of attention and awareness to data sovereignty and digital sovereignty in corporate businesses, especially as a lot of these large-scale infrastructure providers, we call them hyperscalers, go deep into the domains of their former clients and into other domains.  And that is not just cloud infrastructure: some of them even want to build cars, some want to build automation platforms, and some want to go into artificial intelligence that helps businesses improve their CRM and all these things.

     So what we see is that these companies are very big, they have a lot of access to data and capital, and they have a lot of success with good and competitive products.  And as they work with their customers, they see how the customers operate and say: well, you know, we could do this too, because we have the data and the money and the people, so why should we stop?  This is another point to add to the list of where we are not sovereign.  You have to be in a position to choose; it has been mentioned already.

     We need to be able, and I emphasize again, be able, to choose a European technology, based on the fact that it is firmly grounded in European data protection laws and regulation.  And if you want to use an American company or a Chinese company, that is perfectly fine.  But if there is something that you do not want to share, for instance personalized data about the people in your own country, then you need to have an alternative.  And that is what we are working on.  The importance of platformization has been mentioned already, and this is of utmost importance, to be honest.  If we want to sell our regulation and our data protection, then we need good, competitive and strong products.

     And these products are more and more platformization products or hardware.  Take the iPhone: it is hardware, obviously, but it is a platform.  So the more people use the iPhone, and I really love my iPhone too, the more people use the health applications of Apple.  And if they are good, why not?  But it is the same people who, when asked by insurance companies if they want to use the insurance company's app, say: oh my God, no, I do not want to give you my data, because this could have detrimental effects on my health insurance or something like that.  But you give it to Apple for free. 

This shows that awareness and platformization belong closely together, because all of us who have been on Facebook, or are still using Facebook, know we pay with our data.  When I joined Facebook in 2006, I had no awareness of the data issue, no awareness of what I was publishing to a Facebook account that is free, and of what kind of effect this would have 10 years later on how transparent I, as a customer and as an individual, am in relation to the platform providers.  Thank you.

     >> MODERATOR: Thank you very much, Marco-Alexander.  Interesting to hear the same elements that Mark mentioned before, about the right to choose, now intersect with the more infrastructure-, hardware- and platform-oriented products that you mentioned.  So definitely something for us to take on board as well.  Thank you very much. 

Next up, we have Anriette.  We are looking forward to your thoughts on this as well.

     >> ANRIETTE ESTERHUYSEN: Thanks for inviting me.  It is interesting to reflect on the conversation from a developing country perspective, and I will start off by really agreeing with what Mark said: digital self-determination requires people knowing about their data.  And I think it also requires a particular type of access to infrastructure, levels of skill, and a certain type of citizenship and relation between citizen and State which is characterized by trust.  And that simply doesn't exist for most people in the world.

     I'm interested in the relation between digital self-determination and digital sovereignty, because there is an active conversation in many countries about digital sovereignty.  If there isn't trust and a reliable legal framework, as was pointed out, you have immense fear of abuse.  And therefore, even if people have the capacity to comply and participate in digital self-determination, they might be reluctant to.

     And the other thing, and I'm interested in how the speakers would feel about this, is how extreme social inequality would actually impact the notion of digital self-determination.  But to me, an immediate application that I do see in developing countries is perhaps at a more decentralized and localized level.  In many very poor rural areas, people are creating their own internet access networks and community networks, and creating their own renewable energy networks using solar, and I think those bottom-up initiatives could be immensely enhanced by digital self-determination.

     And perhaps, if one looks at implementing digital self-determination at a more localized level, that can over time, through partnership with local or regional government, actually begin to contribute at a more state or national level.  Digital self-determination, as innovated in Switzerland, can be applied in developing countries, but probably more at the local level than at the centralized level.  But many countries are very different.

     So these are general remarks.  I am also thinking about data sovereignty as applied by the state, with oversight over nonpersonal data.

     If anyone has comments on that or looked at that, I would be curious to hear.

     >> MODERATOR: Thank you very much, Anriette.  Really useful.  Two things stand out, and one that I'm grateful you mentioned is this element of trust that we put so much importance on; it is something that has come up very often in the national discussion, so it is really good to hear you translate it into the perspective of a country from the global south as well. 

And the other aspect that I really enjoyed hearing from you is that approach of seeing digital self-determination as coming from a local context in a localized environment.  This is something that we have discussed as well based on the fact that we are a federal country where we have a lot of different levels of governance and government.  So from that perspective, quite interesting as well to highlight that point.  Thank you very, very much for this very insightful contribution.

     So last, but not least, we will have our final commentator before we open up the discussion, and I would like to give the word to Dr. Urs Gasser, the Executive Director of the Berkman Klein Center for Internet & Society.  His research focuses on information law, policy and society issues and the changing role of academia in the digitally networked age.  As always, thank you very much for being here, and the floor is yours.

     >> URS GASSER: Thank you so much for such an important and thoughtful conversation.  I was listening carefully, and instead of adding my own comments, what I will try to do is distil maybe four questions that mark productive tensions as we are, I think, still wrestling with the concept of digital self-determination and what it means.

     And we'll see: if the answer is that it is all of these together, it becomes harder to say what digital self-determination is.  The first challenge is: are we taking a protective stance, or is the concept about empowerment? 

I think we heard, also in Kerstin's presentation on the background of digital self-determination, that self-determination has a legal history as a concept from constitutional law in the realm of protection, actually a defensive right.  And I think one question is, as it is now translated into something like digital self-determination, which is not a right per se but a concept or position: is it still the same connotation? 

In some of the position papers out of Switzerland, isn't the idea rather around empowerment, or, probably here in the U.S. context, agency: giving people the ability, the skills and tools necessary, the infrastructures, not only to control their data but also to make meaningful use of data, to evaluate data?  That is less protective and more in the category of empowerment.  That is the first tension. 

And the second tension I'm hearing is: is this a technical infrastructure conversation or one about human capacity building?  The concept starts out with the description and the promise that it is committed to human-centric issues.  But immediately, even in the Swiss context, we talk about a Swiss cloud and cloud infrastructures, and other examples come up as well.  We tend to start with human-centric notions, but immediately we move towards infrastructure conversations.  And maybe that is part of it, yet I think we also have to consider: what is the mindset requirement?  What is the human software part of the story?  The data life cycle was mentioned, but what about the human life cycle?

     How are we thinking about the changing degrees of self-determination when we look, for instance, at young people or elderly people?  What is the human software part?

     The third tension, which Mark introduced and Anriette added to: I think by definition the concept of digital self-determination starts with an emphasis on the individual.  Now there are at least two issues.  One is the one that Anriette pointed out helpfully, the question of structural inequalities and asymmetries.  And I would add that even in developed parts of the world, in so-called advanced economies, we see that relying on the individual hasn't worked out that well in the tech and data context.

Second, as another dimension to complexify things: isn't it the case that the world's biggest problems, from climate change to public health and the like, may actually need more data sharing and something like data solidarity?  And I think there it is about more than just the label and the semantics, whether you emphasize digital self-determination or solidarity; these are fundamental questions about how we position our policies and our own mindsets.

     The fourth tension I am hearing is: is this a story about personal data, or is it a story about all types of data?  Of course, the question was already asked what the relation is between the concept of digital self-determination and privacy.  Even without answering that question, just going through the examples and use cases we heard, I think there is a tendency at least to focus on personal data.

     And that is fine and understandable.  Yet, if we go back to the first question, if this really is a story about agency, then I think the real potential, and the challenge for us as policy makers, is: how can we unlock the potential of the data age for individuals so that they can participate in the wealth of data that is accumulated in so many different ways?

       How can we empower people to become data scientists?  Citizen data scientists who shape their own lives, create their own environments and make better decisions based on all data, and not only on control of their personal data?

     So these are a few reflections.  And my fear is that we say yes, yes, all of the above, and the concept of digital self-determination becomes a Rorschach test where everybody can read into it what they want, which might be fine but is not necessarily helpful in a policy context.  Thank you.

     >> MODERATOR: Thank you very much.  I'm incredibly grateful for that breakdown into four tensions, which you highlighted so well, and I would have my own views on how to resolve them.  But we are already running quite late into the session, and I want to open up for questions as well.

So you will now be able to raise your hand if you would like to speak and ask a question.  We still have a number of open questions in the Q&A; many of you, and we ourselves, have been very busy answering questions.

     But maybe we can pick up some of the unanswered questions as well.  One that I see is: how does digital self-determination apply to an individual who wants to create a new global online service?  She will not have much agency to innovate if she and her co-workers have to rework the product to comply with 37 different sets of national regulations in the name of digital sovereignty.  That highlights the tensions that Urs raised: is this about empowerment or about protection?  And also the difference between a technical concept and human capacity, and where we find ourselves between them.  Does anybody want to react to this question?  Maybe Matthias.

     >> MATTHIAS GALUS: Maybe I will give it a shot.  I tried to come up with an analogy here.

     So if we look at data as a resource for economic processes, let's say for new products, it is kind of the same as with other resources in different countries.  If you think about how coal is produced, you have different regulations in each country, of course, on how to run a coal power plant or a coal mine, how to deal with the working staff in those production facilities, and so forth.

     So I believe the economy will find a way, especially such a strong tech economy, to deal with different sets of rules when it comes to data and using data, especially if we look at the empowerment of individuals.  Because that for me is a very central point: when it comes to self-determination, data access can be granted quite easily once the consumer is empowered.

     But we have to come up with rules for how this empowerment would actually be realized, I guess.  That's my point of view.

     >> MARCO-ALEXANDER BREIT: Can I add to this?  In Europe we are talking about regulation, regulation, regulation.  It is good that we do this, but nevertheless the regulation, the data protection and the values are sold with the product.  That is what I meant with the Apple example.

I'm not turning against providers or companies here.  But the thing is, Apple has a good product.  So what happens is Apple sells a lot of devices, then they bring out apps, people buy the devices and use the apps, and then somebody, the United States, Europe, forces them to adhere to some rules, and they stick to the rules.

But if it goes the other way around, first setting rules and then trying to invent services and applications that follow the rules but are still competitive, it's a billion times harder, because you have no devices and no applications first, no platforms and no customer base, and then you are supposed to take a regulatorily feasible approach for something that is just an application.  So what I tried to say is: innovation sells standards, and not standards sell innovation.

     >> MODERATOR: Thank you very much, Marco.  We already have two raised hands.  I would first like to give the floor to Michael Nelson, and I would ask you to keep to 20-30 seconds max so we can fit in as many interventions as possible.  Thank you very much.

     >> MICHAEL NELSON: Thank you for taking my question from the chat.  I apologize for not going on video, but it is dark here in Washington.  I want to disagree strongly with what Matthias just said about data being like coal or oil.  That is completely the wrong way to think about this.  Data is global.  Data is the new air.  To say, oh, we are going to have different regulations in different countries, no problem, and to say digital sovereignty is consistent with self-determination, is to miss the whole point of digital sovereignty, which is to assert at the national level rules that are different from everybody else's.

     And to limit what individuals can do in that country.  Thank you again for taking my question.

     >> MODERATOR: Thank you very much, Michael.  Before we react to that, maybe we take the second intervention, from Humayra Rabab, and then we can react to both.  You have the floor.

     >> HUMAYRA RABAB: My question is about the most recent comment, about allowing innovation to take place first and focusing on regulation afterwards.  Wouldn't that open up situations where innovation drives changes in legislation, which may bring in security or privacy concerns, where, for example, an innovative product allows more surveillance than the current climate requires?  That is really the basis of the question.

So if we don't apply more regulation to innovative products or companies, can we then see surveillance becoming the product sold to governments, because regulation isn't a priority at that stage?  Thank you.

>> MARCO-ALEXANDER BREIT: This is exactly why it is hard to find the right sweet spot between innovation and regulation.  For example, in Germany we have laws under which you are hindered from exporting certain goods to states without a democratic regime.  We have phrased some of our laws so that AI-based facial recognition is something we need to think twice about before exporting.  Nevertheless, you also see industry giants from the United States that said, oh yes, we don't want to foster any kind of autocratic regime, so we stopped building and exporting facial recognition software too.  What I'm trying to emphasize here is that there is a thin line between innovation and regulation that we need to walk.  If you go too far in the innovation direction, you have the dangers you mentioned.  And if you go too far in the regulation direction, you will have no competitive goods on the world market, and maybe then the autocratic regimes will have the opportunity to flood the world with their services, because those services are running so well and, well, they don't care about regulation right now.

This is what we can see especially in the area of AI, where China is making a lot of progress when it comes to facial recognition and its use, and obviously for causes that we wouldn't necessarily endorse in Europe.

     But if we don't step up our game in facial recognition, will we end up in the position of having to buy these systems in the end?  I don't know.  But the danger is looming.

     >> MODERATOR: Thank you very much, Marco.  We are slowly getting to the end of the session.  I feel like there are still so many questions that we could discuss, but I would like to slightly abuse my privilege as moderator and ask two questions to the group before we wrap up with Thomas Schneider.

And the first is to the Swiss contingent of experts.  I thought the framework that was set out for us was really useful, so I would like to ask you to react: what are your first thoughts on the four tensions that Urs Gasser mentioned?  And then, from our international commentators, it would be quite useful to hear a quick reflection on how we can go from here and start the international discussions around those four tensions.

And, Anriette, from you we heard that there are so many different approaches and so many different challenges in different countries and different contexts.  How do we go about addressing this in an international and meaningful way?  Maybe we start with our Swiss experts, and I ask you to keep it short and quick so we wrap up with only a slight delay.

>> KERSTIN VOKINGER: I couldn't agree more with what Urs Gasser said.  I think he raised valuable and important points.  

On the one hand, Urs emphasized that self-determination is something we had very much embedded in our Swiss understanding and Swiss system already before digitalization, and that helps: we don't have to reinvent the wheel.  We can build on the principles we already have, which have grown out of trust and are actually functioning.  I think that is a very important point.

And on the point he raised regarding data sharing, with regard to climate change and the example of a public health crisis where data sharing is crucial: that is what we are discussing in our network.  What we further have to discuss is how to balance the sometimes divergent interests of the individual and the public, which can sometimes also align.  We have to think more about how we can further enhance data sharing while making sure, of course, that it is not misused.  That is a valuable input and an emphasis we should have in the network and in upcoming conversations.

     >> MODERATOR: Matthias, you have 20 seconds.

     >> MATTHIAS GALUS: 20 seconds, wow.  These were crucial points for me as well, and I don't have a clear answer, so we have to keep up the discussion on those points.

What I do see absolutely is that if we talk about self-determination concepts like this, it is about the right to choose, and if we want a little bit more of this right, we have to talk about digital skills, which, as Anriette mentioned, are not necessarily broadly available in communities.  We also have to talk about infrastructure that actually gives transparency to people so that they can choose.  So it is a skill set combined with infrastructure questions, of course, and then a lot of measures to sensitize people in order to build the skill set to deal with those questions.

     But I think that is a long way to go right now.

     >> MODERATOR: Brilliant.  Thank you very much, Matthias.  Roger?

     >> ROGER DUBACH: Thank you very much.  First of all, thank you very much for this very lively discussion.  I very much enjoyed it and learned a lot for what we are going to do.  Perhaps let me quickly react to the four tensions mentioned by Urs Gasser, which are helpful for our own thinking.

     On the first one, I think it is very much about empowerment.  Self-determination is a right we have in everyday life but do not have in the digital world.  The question is how to bring the concept into the digital space, not as a defensive approach but very much as empowerment.

     And on the second one, I'm very much on the side of human capacity building.  I see that, of course, there is the infrastructure part, but the discussion about digital self-determination is already very broad and I would not like to engage in a technical infrastructure discussion as well.  But that's my personal view.

     On the third one, the individualistic versus the collective concept: the easy answer is to say both.  I think it is about individual rights and data control, but, of course, with a collective dimension.

     And on the fourth one, we started our thinking very much with personal data, and, as Matthias Galus also pointed out, it is hard enough to deal with personal data.  We have tried to see how personal data can be used in data spaces.

     That is where we are with our thinking.  Of course, I very much see the point about all types of data, but I think we have to think a little bit about this tension as well.  Thank you very much.

     >> MODERATOR: Thank you very much.  And I really would like to quickly give the floor to our international commentators as well.  I know it is difficult and it is a big question, but please make it as short as possible so we don't overrun too much, because afterwards we will have Ambassador Schneider say a few words as well.  Anriette, your thoughts?

     >> ANRIETTE ESTERHUYSEN: I will be quick.  We need to talk about human rights and the accountability of States as well as corporations.  I think the points about human capacity and technical capacity are spot on; how they interact with one another needs a lot more thought.

And then finally, the point about data solidarity and how it relates to digital self-determination.  My question is: do we have the legal concepts and the regulatory tools to build data solidarity?  So much of what we rely on to achieve self-determination has been personal data protection laws and principles.

     And so my question really is: is that enough?  Or do we need to build new principles to achieve the kind of data solidarity that Urs was referring to, which can, I believe, co-exist with digital self-determination?  Really interesting session.

     >> MODERATOR: Brilliant.  Thank you so much.  Mark, your thoughts?

     >> MARK FINDLAY: Two things.  We should stop thinking about data sovereignty, because we live in a global world, and the problem with pandemic control is that states focus on their own interests and not on public health as a global concept.

The second issue associated with that is to realize that in the Global South the majority of individuals don't live in functioning states, and they don't have the power or the capacity to assert their sovereignty or even their human dignity.  A final point to think about: the world is now largely run not by nation states but by multinational corporations, and they run the platforms that produce and manage the data.  So concepts of sovereignty make little sense in such a multinational world, in which 70% of people do not live in functioning states.

     >> MODERATOR: Marco, I know you might have a slightly different view.  Please go ahead.

     >> MARCO-ALEXANDER BREIT: Yes, I do, but I don't want to end on a negative note.  Mark zoomed out to the structural view, and others zoomed in to the individual-action view; if we want to determine how they work together, we need to understand this: if you want to be free, you need to live in a country with free structures.  And if you want to change the structures towards freedom, you need collective action.  This is not a separation or a tension; these are two sides of the same coin, and this is how we should understand it.  I want to be self-determined, but the discussion also needs digital sovereignty, because we need to know what kind of data is going to flow between you and me and on what kind of infrastructure it is going to run.

     >> MODERATOR: Thank you so much.  Your five cents, Urs.

     >> URS GASSER: Maybe two thoughts about how to have conversations about digital self-determination. 

The first one builds very much upon Mark's and Anriette's reminders that we have to be careful when using a term or a concept like digital self-determination.  And I do think, especially in international arenas, as we mentioned the South a number of times, it is extremely important to have these normative assumptions in mind.

     The second is, on the output side, the normative consequences of introducing new terminology and new concepts, particularly if they come from, or are related to, the government of a country.  I think we have already seen in our space of technology and society issues that there is a danger, a risk, in introducing highly ambiguous concepts, because they are so open-ended that they can easily be hijacked by politicians and tweaked to their perspective.  Being aware of the normative assumptions and, second, of the normative consequences of introducing new concepts: it is perhaps wise to keep both in mind.  Thank you.

     >> MODERATOR: Thank you very much, Urs, for those words of caution as well; that is really useful.  It has been interesting.  I'm sorry that we were not able to answer all of the questions.  The chat has been lively, and we posted our e-mail address, which we will repost right now (thank you very much, Dominique), where you can contact us and send questions; we are more than happy to react.

And before we end here, I would like to give the word to Ambassador Thomas Schneider from the Federal Office of Communications to wrap it up and give us a vision and a way forward.

     >> THOMAS SCHNEIDER: Thank you all.  Can you hear me?

     >> MODERATOR: Yes, we can.

     >> THOMAS SCHNEIDER: We hope this is the start of an expert exchange throughout the world that will grow and help us, and everybody, to promote empowerment and digital self-determination, and to reduce the risk of people being controlled by others than themselves or their society, whether that is a government or a company.

One element is the legal protection of human rights.  It is not enough that the GDPR led to us clicking "I accept" ten times when using a search engine; that is not complete.  We need to think about technical standards and economic and regulatory incentives that help individuals, businesses and societies move towards self-determination, which was a key point raised by many of us.

We also need people to be empowered to realize what is at stake, to see the risks but also the opportunities, and to be enabled to create services on their own, at a local level, with whatever partners they trust, to help them improve their lives.  Two sides: one is protection and regulatory measures, and the other is empowerment and the identification of opportunities and incentives to get people to become active, to use data, to use the opportunities.

And in particular in developing countries, and I'm very glad for all those who brought this in, there are additional challenges around legal security and social inequality, things that make it historically more difficult in some countries or cultures to get to digital self-determination, because already analog self-determination may be more challenging in one way or another.

Also there, I think, as Anriette said, the concept of digital self-determination can boost and support local initiatives in particular, and can bring people together with a shared understanding so that they develop their own services rather than just using what is there, over which they have no control and without knowing what is happening to their data; it can encourage people to get a perspective, to become more self-determined, more free and independent, and also to see the economic opportunities.

     And, of course, my last point, the one I want to end on: we are all aware that local initiatives are important, but in the end this is not a regional or local or national issue, this is a global issue, and we should think across borders.  On digital sovereignty: if you have no control over the infrastructure, it gets very difficult to make free decisions, if you don't know whether the information you get is accurate or whether the tool you use is actually doing what you ask it to.  So I think this is not a contradiction, but a mutual dependency.

The question is just: on what do you rely to create this digital sovereignty?  Does it have to be your national government, which you are forced to trust to regulate everything for you, or a more sophisticated, differentiated network of trusted partners in your country, but also in other countries and among other actors, that help you create this digital sovereignty at the infrastructure level and the regulation level?

This is not the end of the story; this is again just the beginning.  We need to discuss digital sovereignty, but also think about what it means: do we really just depend on our national administrations to take care of it, or may there be other ways?  In the end, we are convinced this is a big issue for us, and it is, or will be, a big issue for basically every human being in the world who is going digital or already is.  How can we work together and create an informal exchange structure, an informal network, as a first step, where people working on these issues in different fields, health, traffic, mobility, environmental issues, support each other and share knowledge to move towards digital self-determination and digital sovereignty, in the sense that we also need to know what the infrastructure is doing and need to be able to have confidence in it?

From our side, this was an invitation to show what we are doing, and to see what others are doing and thinking; now we need to bring a network together.  Contact us, share information with us and help us with your ideas on how to move this important issue forward.  We are very happy to have had this discussion and see it as a good start for something that will hopefully keep us busy, in a positive way, for years to come.  Thank you all very much, and I look forward to continuing the discussion.

     >> MODERATOR: Thank you very much, Thomas.  Thank you to the audience for the active and lively interaction.  Apologies for taking an unscheduled 15 minutes of your time, but I think it was worthwhile.  It was a superb discussion and really insightful.  As Thomas mentioned, if you would like to continue this discussion, do get in touch with us.  We very much look forward to all of your points.  Thanks again, and hopefully we can all see each other again next year to continue the discussion at IGF 2021.  Thank you very much and hope to see you all soon.

