IGF 2020 – Day 12 – WS352 Digital Human Rights: Digital integrity of the human person

The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***


>> GREGORY ENGELS: It's 12:50.  The official start time.  Should we start?

>> BAILEY LAMON: I guess we're going to have to.

>> GREGORY ENGELS: Okay.  Welcome, everyone.

Are we ready to go?  Question to IGF.  All right.

Welcome to the IGF 2020 session WS352, Digital Human Rights: Digital integrity of the human person.  This workshop will introduce this new legal concept, its possible implications for data protection, and how the concept could be introduced into the current legal framework.  The right to digital integrity is to be understood as the general justification for human digital rights, including the right to data protection.  Every human evolves in a multidimensional physical and digital environment; if each individual is to keep their individuality and autonomy in daily choices, they must be protected and given effective tools to defend that autonomy.  Recognizing that life has been digitally extended means asking ourselves what makes us human today.  Should personal data be considered a component of the person rather than an object owned by whoever collected it?  Should our digital integrity be protected?  If each human person already has the right to physical and mental integrity, for example in constitutions and charters of fundamental rights, should they also have the right to digital integrity?

We have panelists here today who are going to go deep into the discussion, and I forgot to introduce myself.  I'm Gregory Engels.  I'm with Pirate Parties International, an NGO with consultative status, and I'm glad to be able to host this session today.

Questions can be asked from the auditorium; everybody not speaking, please mute yourself.  I'll give the floor first to Alexis Roussel, and after that we'll promote everybody to panelist and have a group discussion.

Thank you very much.

>> ALEXIS ROUSSEL: Hello.

I'm Alexis Roussel.  I used to work for the UN a long time ago; I was an eGovernment specialist for the International Court of Justice and then for the United Nations Institute for Training and Research, and then I left the UN for the private sector.  I'm very active in cryptocurrency and crypto adoption in Switzerland, and since leaving the UN I have been an advocate of data protection, mostly in Switzerland, and have been working a lot on digital rights in Switzerland.

What I'll present to you, it is the summary of this work and how we are pushing digital rights within the Swiss framework and also externally within the European Union.  I'm going to share my screen with my presentation with you.

Like Gregory said, I will present the concept of the digital integrity of the human person.  We see this as a fundamental new right that is going to be, I hope, integrated into most of the legal frameworks, which would then be adapted to this vision.  Of course, it pushes a specific vision, and we're going to see that there are different visions of the digital representation of the human person.

To give some context, we're really in a generational gap here, and there was a key moment in a trial that gives a taste of this gap.  Peter Sunde was one of the people managing The Pirate Bay, one of those big websites where you could share music, movies, content.  It was seen as copyright infringement, but a lot of people in the newer generation saw it as a way to exchange culture.  There was a big clash between creating a new type of culture and respecting copyright.  We saw this during the trial, when the judge asked Peter Sunde whether he was meeting the other people IRL, in real life, and Peter Sunde said: no, we don't use the expression "in real life", we use the expression "away from keyboard", AFK; for us, the internet is real.  There was a big clash in this conversation between the judge and Peter Sunde.

The judge was representing the old system, old rules and was applying those old rules but the clash was still there.

Another big clash: a lot of people at very high levels, in governments and companies, use the expression that personal data is the new oil.  Why are they using this?  Because they're trying to make sense of what personal data is, and they see that the business models around personal data allow a lot of value creation.  Once you make this observation, a lot of countries try to mimic what's happening in Silicon Valley and say: we also want a thriving industry built around computer science in our country, so we need guidelines to build that industry and put the key elements in place for it to grow.  They base this on the belief that personal data is like oil, the new oil of the 21st century.  That leaves a bad taste for a lot of people; basically, people say: but we're not oil.  And a lot of regulation right now is being pushed by this.

What we can see first is that there are three main approaches in the world, and we'll talk about them further in the discussion panel.  There are different approaches to personal data, and I'm going to link them with regions, though this is not very clear yet; some regions are mixed, it is not just about region.  We can say, for example, that the first one is more European and African, the second more American, and the last more Asian.

The first approach is that personal data are linked to inalienable rights, linked to the human person.  That's the European or African approach.

The second approach is that personal data are available to the market; it is a free market.  This is the U.S., this is what's driving Silicon Valley and what we call GAFA.  They're actually using data as a commodity: they can take it, acquire it, create it, enhance it, they can sell it, or they can feed it back to the person in ways that manipulate the person for their own benefit, for selling products.

It is a very libertarian view.  It is completely free, and everything is left to the responsibility of the individual.

There is a third vision, which we could call more Asian in general: personal data are part of the common good, meaning that the personal data everyone produces is actually a set of data shared by everyone.  It is up to the state and its institutions to take care of this data and manage it for the greater happiness of the whole society.

We see these three different approaches, and it is important to understand that they are going to be the new lines of friction.  We already see these lines of friction appearing in the international multilateral system, and you see countries moving from one group to another.

Here I'll propose the vision of the first one, the humanist approach, and how it links personal data to the individual.  For 20 years now there has been a search for a core digital human right, a core digital right that can be understood by all and enforced by all.  The search is very hard.  Until now, the best answer we had was the right to digital self-determination, the right to digital autonomy.  But these are terms that are not easily understandable and don't relate to things that already exist in the current legal framework.  Rights to digital self-determination exist in Germany and also in Switzerland, and there have been some court cases presenting them, but it is still very hard to express them.  There are many reasons for this.

But to give you a historical perspective: in Europe, human rights have mostly been shaped through a history that runs from the innovation of the printing press until the 18th century, and then the industrial revolution.  A technological innovation that allows information to be shared differently in society allows more people to have access to information, and then the rights of the people are transformed by this innovation.  The old regime, based largely on oral tradition and its institutions, changed into a new form that we see in Europe as democracies.

In the same way, the internet is an innovation that changes how information is produced and shared in society, and it is shaping a different world in which there are questions about the rights of individuals.  The way these rights take shape in the future will then shape the new institutions we have, because our institutions will evolve.

Using this comparison between the two big movements in history, we can understand how these rights are integrated into our legal system.  One of the rights that showed up, one of the most fundamental, is the right to life.  The right to life appears in most constitutions and in basically all declarations of human rights.  Why?  It is the first step for a state or institution to recognize the person over whom it has jurisdiction, and the right to life is necessary to allow the person to be autonomous, to be free in the decision process.  But even when it is written in a constitution or a charter of rights, that doesn't mean it is enough.  Saying there is a right to life means you can remove the death penalty, which is already a good step, but it doesn't guarantee the full autonomy of the person.  So we usually see more specific rights appear to give shape to how this autonomy is going to be preserved, and what we see most is the right to life combined with the right to have physical and mental integrity preserved.  We see this in constitutions; we see it in the European Charter of Fundamental Rights.

What it means is that if the state does not respect the physical integrity of the individual, it does not allow the person to be autonomous.  A very blunt example: if I'm going to an election, entering the voting booth, and I'm beaten up by an official, I will not vote the same way as if I'm not beaten up.

That's an aggression against my physical integrity.  The same with mental integrity: if I'm being subjected to some kind of subliminal propaganda, for example, where I don't even see the images I'm being shown, it changes my mental state and I can vote differently.  In the same way, and we saw this in the U.S. and in Europe, you can actually manipulate the digital space of the person, and by manipulating it you can change the vote of the person, change the way the person makes their choices; you are infringing on their autonomy.  We have to recognize this and say: yes, indeed, our life has been digitally extended, and our right to life should also be extended.  That means that next to my physical and mental integrity, I also have a right to digital integrity.  By respecting the whole integrity of the person, including the digital part, we keep the individual autonomous and free.  This is really where the example of Cambridge Analytica comes in.  Some people would say that because the data was available on public Facebook, which is sometimes considered a public forum, it was not an infringement on privacy; but it is still an infringement on your personal integrity, which actually created a distortion in your own autonomy.

What are the consequences of taking such an important step?

The first consequence is that we can actually define what personal data is.  Until recently, even still today, there is very little definition of personal data in the sense of how you categorize it.  Some say it is a commodity, as we saw on the market side; in the U.S., the Silicon Valley dogma would say it is a pure commodity, we can do what we want with it and sell it.

In Europe and Africa mostly, the data protection institutions, and especially the French-speaking data protection authorities, which are together in an association, made a declaration in 2018 saying that personal data are elements of the human person.  That means they are treated exactly the same way as body parts, and because they are part of the person, they have inalienable rights attached to them and they cannot be sold.  That's the critical thing.  As soon as you consider that data is part of yourself, the data cannot be sold; it can be transferred under certain conditions, there can be exchanges of data, but the relationship you have based on the data must rest on informed consent.  That's the critical thing: you can't take data just because it is available, as a commodity or as part of the common good; there has to be informed consent to provide this data.

We can argue, for example, that harvesting data is by nature something which harms the digital integrity of the person; it is not something that should simply be acceptable.

Another consequence: there is a dogma, in Europe and in most places that consider data a commodity, a "responsible" approach that says: if you want to protect yourself, you can; you are the one who is supposed to defend yourself and know where your data is.  That's a very bad assumption.  It assumes you know exactly where all of your data is.  You can hear it in discussions when people say: if you don't want problems with privacy rights, don't publish on Facebook.  That's a very common thing.  But it is not because I'm not publishing on Facebook that Facebook doesn't have my data; those are two different things.  People have no awareness of how much data is actually harvested and stored everywhere.  The rights we have over our data should take into consideration that it is impossible for someone to know where all their data is, so we need a blanket protection, not a protection that applies only when we discover where the data is and then tie it to a specific institution.

No law should assume that we know where our data is.  In reality, no law assumes that we understand where our body parts are: even if someone is in a coma and has no mental capacity to understand where their body is, the body of the person is still protected by law.  You're not allowed to chop it up in the hospital, for example.

I'm taking very blunt examples, but that helps us understand.

Sometimes people will ask: how does this go with the GDPR?  The GDPR is a standard in the world right now.  Would it fit with it?  Basically, I would say most of it would.  The right to digital integrity is a fundamental human right; it sits at a high level with the right to life.  The GDPR is a law that sets out in detail how data can be treated, what is allowed and what is not.

That's the case.

In many ways, we have a fundamental right, and then we have specific laws that apply it.

What the right to digital integrity allows us to do is go beyond simply trying to protect the right to privacy.  It is not just a right to privacy that we're protecting; it is the right to the life, and the digital life, of the human person.

The GDPR is not incompatible, but there is one part that is rather incompatible: its material scope.  The GDPR has a very broad exclusion article for states, not for private companies but for states.  Basically, as soon as a state institution can claim there is a security reason behind its action, even in a merely prospective manner, just invoking security is enough to exclude the application of the GDPR.  That would be an infringement of the right to digital integrity, because if a law or regulation makes an infringement on the right, it has to be much more precise than a blanket exclusion like the one here.

Why do we need it?  There are a few elements.  Basically, the right to digital integrity applies to the relationship between a state which has added it to its own constitution and its citizens.

It forces the state to have a discussion, to interact with its own citizens in ways that do not harm their digital selves.  For example, states have to be careful in how they manage databases; they shouldn't allow things like Cambridge Analytica; they shouldn't do what the French government, for example, is doing right now, allowing the tax department to go into social networks to analyze a person's data to see whether, tax-wise, they're behaving well or not.

These are the kinds of things that infringe on digital integrity.  In Switzerland, we have laws that touch on digital rights, but the debate is not being held.  We have had censorship tools, like the censorship related to online gaming and online gambling that was set in place.  While we can say it is reasonable to protect the industry, to block external countries from competing with the regulated casino business, maybe that's a reason, but the debate did not touch on the fact that a censorship rule set up in Switzerland could be an infringement on digital rights.  The right allows a discussion we didn't have before.  And when you have it at the state level, between the state and the citizen, it forces the state to impose it on the rest of society.  Think about the right to life: the state says, we do not allow the death penalty, but it also makes sure there is a criminal code to prevent murder, because a murder is not between the individual and the government but between two individuals, and it is not in the interest of society to allow it.  In the same way, if the state respects digital integrity, it must also put in place laws so that individuals between themselves respect each other's digital integrity.

I really see the right to digital integrity, as we said in the introduction, as a general justification for the right to data protection, but it is actually a bit higher.  The right to digital integrity is really about protecting the autonomy of the person, and the right to data protection is how we are going to implement that in practice.  This is the link to the autonomy, to the digital self-determination, that we are looking for.

As for possible implementations, I'm working on two; these are the ones I would point to.  The first is the Charter of Fundamental Rights of the European Union, Article 3(1): everyone has the right to respect for their physical and mental integrity.  We have a similar article in the Swiss Constitution: every person has the right to personal liberty, to physical and mental integrity, and to freedom of movement.  Even personal liberty and freedom of movement are rights aligned with autonomy and with the capacity of a person to be a citizen who can vote in a free and informed way.

What I'm aiming at is adding the digital part here: everyone has the right to respect for their physical, mental, and digital integrity.  We have three implementations that are ongoing.  Luckily, Switzerland is a federal system: each canton has its own constitution and its own constitutional processes, and there are three places where the work has started.  In the Canton of Valais, they are creating a new constitution; they have an assembly working on completely renewing their constitution, and the committee specialized in fundamental rights has unanimously approved the proposal.  It will be presented to the whole assembly and, hopefully, voted on by the people of that canton.

In Geneva, a political party had started a local initiative to hold a public referendum; because of COVID they changed strategy, and now they are making a law to change a fundamental law in Geneva.

In Neuchatel, the work is just starting right now in the political sphere.  These cantons are pretty small and open to these kinds of ideas, and we're pushing that.

In each of these places, the constitution has the same article I showed you before; the right to physical and mental integrity appears in those constitutions, so we can actually update the constitution easily.

I'll leave it here.  That was the main presentation of this theme.  I leave it to Gregory.

>> GREGORY ENGELS: Thank you.

You can stop sharing now.  We're going to go over to Alexander Isavnin from the technical community, who will speak about the technical aspects of this, and then we'll open the floor, promote everybody to panelist, and take questions in person.  We have some questions in the Q&A and we'll answer those later.  Thank you.

>> ALEXANDER ISAVNIN: Gregory, thank you very much.

I want to augment Alexis a bit.  Back in 2013 there was still a chance to be away from the keyboard; now, with modern smartphones, smart watches, and digital assistants, you are always at the keyboard.  It is really, really difficult to get out of it.

I want to remind you of a little joke about a person new to the internet: "Okay, I downloaded the file from the internet; how do I upload it back?"  The internet puts us in a state where, once you have uploaded something, it can be downloaded again, forever.  Even legislation on data removal, the right to be forgotten, will not work: if it gets to the internet, it is usually there forever.

I originally wanted to tell you about digital integrity and personal data as a common good.  Yes, in the Russian Federation, our government tries to declare that our personal data belongs to the state, and our personal data regulation is written the same way.  In parallel, they have always tried to propose legislation that would allow the government to sell it to corporations.  Luckily for us, such legislation cannot pass yet, but this combination of the eastern common-good approach to personal data, to a person's digital footprint, with the Western, U.S.-style market approach leads to really bad results.  As an example, during COVID-19 the personal data of Russians was used as a common good: the digital footprint that gives, for example, children and senior persons free usage of public transport was used to restrict their movement.  So the digital part has been used to restrict rights.

I don't feel that is the major problem yet.  The problem is that a serious part of our integrity belongs not just to us but to corporations.  I'm old enough to have lived in the pre-internet era; when I look at my childhood, the data is just Boy Scout badges, nothing more exists of me from that time.  Nowadays, we use services held by corporations as part of our digital life.  For example, I think we all use email providers: somebody uses Gmail, others use local mail services, and a serious part of our lives is linked to this digital aspect.  But this interaction between us and a corporation is usually not considered part of the rights which need to be protected by the government.  Imagine if you suddenly lost access to all of your email: how difficult it would be.  You would need not just to create a new mail account; you would have to recreate a lot of links and relations that are set up in your digital life.  You would need to recreate your internet presence as well.  That's really, really difficult.

There is no legislation for this kind of thing.  For example, I'm not sure that Google will keep your data: they can suspend your account instantly.  Or if you publish your photos and treat Facebook not just as communication but as your diary, where you put up photos and achievements of your children, your travel reports, Facebook can suddenly invoke its terms of use if they decide you do not belong, or uphold false claims against you.  I think regulated protection of data ownership needs to be set up in law.

Google is one of the few corporations, from the times when they declared themselves a corporation for good, that lets you download all your data.  That's reasonable.  I'm not sure that Facebook allows that; others do, some don't.

In addition, in discussion with Alexis: this part of our digital integrity needs to be protected not just against governments, but also against those digital services that did not exist at the time of the creation of the Universal Declaration of Human Rights or when the legislation that exists now was set up.  We need something governing not just our data, but the distribution of our data by big corporations, and also our rights of power and ownership over the data.  Again, with Facebook: remember, folks, the service agreement declares that everything published on Facebook is the property of Facebook itself.  That is really dangerous if we talk about human rights.

Back to you, Gregory.

>> GREGORY ENGELS: Thank you.

We had an additional panelist preparing for the session, Jesselyn Radack, who is apparently not here, I think due to technical problems.  But Bailey Lamon had prepared the session with her, has taken notes, and can take over, delivering some insights from her point of view.

Bailey.

>> BAILEY LAMON: Hello, everyone.  My name is Bailey Lamon.  I'm a long‑time activist and a Pirate, I'm the Chair of Pirate Parties International.  You're not actually to be hearing from me so much on this topic today as Gregory said, Jesselyn Radack, going to be our panelist and unfortunately last minute she was not able to attend.

It is unfortunate.  Jesselyn Radack, if you don't know who she is, is a whistleblower attorney from the U.S., and a whistleblower herself actually.  She has spent her entire career supporting and defending whistleblowers, including Thomas Drake; Edward Snowden is also a client of hers.  So she would have given you a perspective on all of this as a lawyer, as somebody who deals with these issues in a legal sense every single day.

Apologies, because I myself am not a lawyer.  I'm an activist, like I said.

I'm coming at this from more of an activist perspective, and I can start by touching on Jesselyn's work and how it relates to all of this.  When we think about digital integrity, this is something that for years has been part of the discourse around whistleblowers and the transparency movement: the fight for transparency and defending those who bring us the truth.  Sometimes that involves breaking the laws that are in place, but the information that comes forward is, and I can't think of a time when it wasn't, in the public interest.

It is directly related to issues of human rights.

If we look at WikiLeaks, for example, with their releases, we can look at the Afghan War Diary, the Iraq War Logs, the Collateral Murder video, which revealed human rights abuses on the part of the U.S. government and military that none of us would know about if Chelsea Manning had not sent that information to WikiLeaks and it had not been released to the public.

So that's one thing that comes with the digital age and with the internet: this ability to freely share information, and in some cases information that we really need to know about.  We wouldn't have access to it if it had not been revealed to us outside of government institutions, outside of the legal system.

Another thing on top of that: in the digital age we see grassroots social movements around the world being strengthened by this ability to freely share information and to connect with people around the world, activists able to get their message out there.  We can look at the Arab Spring, at Occupy, at Black Lives Matter today, at Save Your Internet.  We see the capabilities that come with online organizing and having a free, open internet where people can come together and work on common causes and common goals.

Human rights, digital rights: when these things come up in conversation, I notice they tend to get talked about as if they're separate things, as if we're looking for ways to connect them.  But the truth of the matter is that they're already inherently connected.  In this day and age, you simply cannot have one without the other.  You don't have human rights, or at least full human rights, if your digital rights are not in place.  Your digital rights are part of basic human rights nowadays.  So we need to stop separating these things and start talking about them as the same thing, because they are.  They're inherently connected.

We live in a time when the overall system we live in, and obviously we come from different parts of the world, so it may look a little different depending on where you are from, but I think most of us understand that the overall system, society, the political system, is becoming more and more driven by inequality.

At the same time, we see the digital world becoming in a lot of ways more unequal in its governance, and social issues get brought into the digital sphere.  We have a small percentage, a ruling class, owning most of the world's wealth and resources, and at the same time we have tech giants like Google and Facebook, which are actually very small portions of the internet itself.  That's something a lot of people don't realize either.  The social media aspect, the big tech giants, the websites the majority of people are using nowadays: that's not the internet.  Facebook is not the internet.  The internet is a very big place.  These big-tech-owned websites, the social media giants we use all the time, are the tip of the iceberg when we think about the whole internet.  Just as a small portion of the rich own so much of the world's wealth and resources, we have these big companies, which take up small portions of the internet, owning more and more of our personal data and trying to control more and more of our internet.

You see that play out, you see how it benefits powerful people and institutions in lots of different ways.  For example, if you're on these websites, you see posts and discussions about certain topics censored.  You see pages of activists and activist groups being taken down, and you see people getting censored if they criticize political figures or political decisions.  Information on Human Rights violations by, let's say, law enforcement will get censored, domains get blocked so that you can't even share links on these websites.  Just, you know, prohibiting the sharing of documents that contain information that these companies, the people behind them, don't want you to have access to, sometimes at the request of law enforcement directly.

We have an issue of big tech deciding what information is legit and what isn't.  Right?  Big tech encompasses a lot, but who actually makes the decisions?  It is a few people sitting at a boardroom table.  Those of us who are trying to share information, well, you know, that's just what we're doing, right?  We're being treated as if the information we're trying to share, often in the public interest, you know, isn't legit, and these companies just have that control.

At the same time, they collect so much information about us, you know: our names, our email addresses, our social media connections, in a lot of cases our home addresses, and very personal information about where we live and what we do gets collected by these companies.  What websites we visit, which tells them so much about us as individuals, including things that maybe we consider very personal.  Right.  In some cases, secret to us.

Big tech knows about it.  That information is shared across the board.  That completely contradicts our digital rights, it takes away our digital integrity, and it is a violation of our Human Rights, of our right to privacy.  We know intelligence agencies and law enforcement use big tech to spy on us.  If you're involved in Human Rights work or grassroots social movements in any way, you know that the police and the spy agencies are probably monitoring your Facebook page, your Twitter account, any online accounts that you have.  Probably, no, not even probably: they're watching what you're doing and making a profile on you.  It contradicts so much of our basic Human Rights: our right to assembly, our right to protest, our right to free speech.  This results in self-censorship, and what Human Rights do you actually have if you can't even speak your mind because you're afraid of what might happen to you if you do?

Human Rights Watch, I've got a good quote from them about how, on one hand, technology and the digital age can be very liberating, and at the same time can be oppressive to us: "Digital technology has transformed the means through which Human Rights are both exercised and violated around the globe.  The internet has become an indispensable tool for the realization of a range of Human Rights and for accelerating economic development, yet every day there are new examples of how digital technologies play a role in undermining Human Rights, whether through a Prime Minister banning Twitter in Turkey, a death sentence for posting on Facebook in Iran, bulk electronic surveillance of American citizens by the NSA, a court ruling on the right to be forgotten in Google searches in Europe, or a requirement that internet users supply real names to service providers in China."  This dual-edged aspect of technology was conveyed well by a Tibetan Human Rights activist speaking to Citizen Lab in Toronto: technology is this funny thing where it is a lifeline, and then maybe it is your ticket to jail.

I think a lot of us here probably agree: we love the internet, we love technology, we think the digital age is a great thing, but at the same time we're dealing with some serious issues.  Political and legal systems everywhere are still trying to catch up with this new age of technology and its implications for human beings.

I think Alexander touched on it briefly: no matter where you live, you have a constitution that is supposed to guarantee your basic Human Rights.  These constitutions, in most if not all cases, were written before the digital age.  They're outdated in that sense, and we're still figuring out how the rights we're supposed to be guaranteed translate to the digital sphere and what needs to happen to uphold those rights.  You know, we talk about privacy, and that's a word that gets thrown around a lot as a Human Right, and as we see in lots of places, it often doesn't actually get treated as a Human Right.  These companies, like Facebook, Google, whatnot, they'll say that they care about your privacy, but they're still collecting everything that they can about you.  They say they won't share your information or do anything harmful with it, and you want to believe that, to make yourself feel better about it, but then you see the personalized ads on Facebook that are directly related to your last Google search.  They say one thing, but how is our privacy actually being protected?  That's something else.  That's a big thing here.

Privacy actually has to mean something.  It can't just be this word that gets thrown around by those who want us to believe that they're taking it seriously when in a lot of cases they're probably not.  We need to update our legal frameworks around this matter and ensure that digital rights and Human Rights are treated as one thing, not separate things.

You know, in a lot of ways, our data is an extension of us.  We need to treat it like that.

I work in the social work field, at a women's shelter, and we deal with data all the time.  We have our online program, we take our case notes, we have our files on our clients, and we take their privacy and confidentiality very seriously.  They're allowed to have their files destroyed at the end of their stay with us if they want to.

There is always that concern about the software company, you know, the one that made the program where we write our case notes.  We can take all of the precautions that we want, we can do whatever we want on our end to try to ensure the privacy of our clients, but there's another aspect there: whatever we do, they still have their own terms of service, their own policies for how they're going to deal with that data.  How do we know that it is actually being taken care of?

I live in Canada.  The Canadian government has actually invested millions in digital literacy, to help people understand the digital world a little better, how to engage with it safely, how to identify the risks that come with using the internet.  This is an important thing, especially if you're new to it.  Think about elderly folks who didn't grow up with the internet, just starting to use it for the first time to keep in touch with their families, especially during COVID times.

You know, people really need to be taught what the risks are, how to protect your privacy the best that you can as an individual, and what to be careful of, things like that.  In Canada, the government is investing quite a bit in digital literacy for exactly that reason.

The Canadian government is also working on a digital charter to protect privacy, and there are some principles there.  Universal access is one: all Canadians should have an equal opportunity to participate in the digital world, and that's a big part of Digital Human Rights.  The right to life, you know, we all have the right to exist and to participate in the world; well, we also need the right to participate in the digital world to be protected, and to have the tools we need for proper access and, you know, proper literacy around it.  Safety and security is another.

Being able to trust the services that you use, that they're going to keep you safe and they're not going to misuse your data and violate you in the ways that we see happening and that we worry about happening.

Consent is another one: control and consent.  Having control over what data you are sharing, who's using it, and what they are using it for; knowing that your privacy is protected.  At the same time, transparency is important: making sure that companies are transparent about their practices with the data, that we have access to it, and that we're free to share or not share as much as we feel is appropriate.

Another thing: with everything moving online as opposed to offline now, it is important that people can access the digital services that come from governments nowadays; a lot of the services for Canadians are online now.  A big part of ‑‑

>> GREGORY ENGELS: Excuse me, maybe finish the sentence, but we have 20 minutes left, and we have some excellent questions from attendees that I would like to spend some time addressing.  Finish the sentence, and then we would like to move on a little.

>> BAILEY LAMON: I'll stop there.

I was just going through the Canadian Digital Charter and the principles of that, you can look that up pretty easily.  I apologize for talking too much!

>> GREGORY ENGELS: No.  You're on time.  We have 20 minutes, that was it, that's good.  I was just a little concerned because we have a lot of good questions.

>> BAILEY LAMON: Absolutely.

>> GREGORY ENGELS: If you open the Q&A panel, you see the questions coming in and we have been busy trying to answer them.

I would suggest that we promote everybody to panelist.  It will take some time probably.  If somebody could maybe help me.

There's a button to promote everybody.  Great.

There have been some very good remarks.  One said that there have been court sentences in the past that barred people from using computers and digital tools, and in light of digital integrity being part of human life, sentencing somebody to stay away from the internet is kind of like a sentence to digital death.  When such sentences were issued in the United States around the year 2000, they were upheld by the courts, but by 2003 the courts already agreed that this went too far: people needed to be on the internet, and finding a job without it was already almost unthinkable back then, though probably still possible.  Today, I don't think courts actually sentence somebody to stay off the internet; maybe in other countries, I'm not aware.

This is what's very interesting.

>> ALEXIS ROUSSEL: There are some cases that are more in the private sphere.  Instead of a state, think of someone who has a very public profile on Twitter, Facebook, Instagram, and then suddenly it is cut off by the social network, for some reason; it can be a good reason, a bad reason.  We had cases, and it is anecdotal, of course: we saw a young lady who had a million followers on Instagram, and when she was cut off from her profile, she went on YouTube and made a video, and she was crying.  She said she had called the police, that she had been murdered digitally; she said that to the police, she was crying for help.  People were making fun of her.  The police couldn't do anything.  Maybe her reaction was a bit much, but you can see it is natural, and you will see more and more that it is not acceptable that people are cut off.  If they relate to their social network as really part of their life, being cut off like this is like a murder.

That's the closest we have in the private sector.

>> ALEXANDER ISAVNIN:  That brings us to the importance of state protection power.  There was a question on whether we need to oppose Human Rights or upgrade them; I think it is either one or the other.  The only truly new right appeared when reproducing something material became possible; nowadays we can share information much, much faster and much more cheaply, and still the regulations come from print press times.  That is part of what we are trying to change about the situation.

What do we need?  We need to force our states to look at frameworks and methods for protecting these rights.  In many cases, our states prefer to take a hostage position, in a sense, ignoring the digital acts of huge corporations.  A lot of money has been spent by Facebook on something like pressuring parliamentarians over new legislation against them; it is already happening.  In this case, the United States government should pressure the corporations, and the Europeans and the European member states should be protected from those corporations.

As was mentioned in the questions, in privacy regulations, personal data regulations, there are state security exemptions everywhere: exemptions in the GDPR, exemptions in the Russian personal data regulation, and we really don't know what could help.  Even in fair Western countries like the United States, a surveillance program against their own citizens can appear.

We need to increase our literacy and using this literacy to improve our control over what governments do.

>> GREGORY ENGELS: We have around 50 people here and we have 25 minutes left ‑‑ 23 minutes left.  We have plenty of time to take some questions and if not, somebody raising their hands, I'll just read out some of the Q&A.  They're actually very good.

>> ALEXANDER ISAVNIN: We have had a great discussion in the chat while attendees were not panelists and had no ability to voice their thoughts.  Now don't be shy, please.

>> GREGORY ENGELS: Yes.

One question: the collection and sale of personal data is the business model for many corporations; should there be compensation for the removal of these income sources in order to achieve the goals of Digital Human Rights?  Something was answered about that already, and it will be answered again.  And there is a remark: she hates to be so negative, but the social media companies set their own rules, and there is a difference, in jurisdictional terms, between legality and accessibility.

Definitely, it is not about nationalizing the social networks, no, absolutely not.

I think it is important that the economy, the businesses, everyone who is doing business, does so in a way that respects individuals and respects the autonomy and free will of the person engaging in a business relationship with them.

This is how our economy in general works.  In the case of those services right now, they're based on acquiring data, processing data, and pushing the data back in a way that changes your behavior without you noticing it, pushing you to buy something.

The impact on autonomy is actually larger than what people expect, and if you talk to people who are on social networks, most of them will say that they're fine, they are still free, and they decide what they want to do.  Cambridge Analytica is a good example of this: there is a disconnect between what most people think and the reality.

Also, we start to see tools and models appearing that allow creating the same services without this data part, which is unnecessary: to serve a video stream, you don't need a full analysis of the person.  Until now, this has been the only model, the way to monetize, but there are different ways to monetize.

I hope this answers the question.

Just to go a bit further, there is an analogy we can make.  It is a bit harsh, but I'll still do it: think about slavery.  Slavery was taking possession of people to have a workforce, to do something else; here you take data and use it as a workforce to actually create revenues.

There is, for example, a model which exists and is being researched at a university in France, I can put the link if you're interested, where, because data is part of the person, we can maybe see the relationship with the data differently: if the data is working for a company, then maybe it is more like a labor relationship than an ownership relationship, and then you can still pay the price of labor while having a more respectful relationship between the two.

>> ALEXANDER ISAVNIN: I can also answer that question.

Collecting and using personal data is not the internet corporations' main source of profits.  They started to use personal data as a way to monopolize markets and to get an advantage over services that don't collect personal data.  I can't believe that I'm saying this, but if this is regulated away by the states, these corporations will not disappear: there will still be search engines, as they were founded, and the extra multibillion profits, which I don't think go toward the common good, will just decrease.  Nothing disappears.

>> GREGORY ENGELS: We're discussing here that people do not see the Q&A content, because since they have been promoted to panelists, they no longer see the older Q&A questions, which is strange.  I'm busy copying them back into the chat.

Go ahead, please.

>> ALEXANDER ISAVNIN: Yes.  An interesting question: regulation of data protection may vary between countries, some countries may not have much awareness of this issue, and citizens' data is leaked, shared, and used by third parties without their rights being respected; how do we promote a sound legal framework and implement it worldwide?  First of all, the Internet Governance Forum is an excellent place for promoting the ideas of such a framework, so we're exactly in the right place.  As members of local communities, we should promote not just digital literacy but knowledge of what you're doing, and maybe make more studies.

For example, my personal data has been on the internet since 1996, I think, when I first registered a domain name and my data went into a public database.  Now that it is open, I am trying not to share additional data and really caring about where I share my data.

In general, you learn that if you put your fingers into an electric plug on the wall, that would be bad.  In the same way, I think additional education needs to happen from childhood on which data can be spread and how you can use the internet for communications.  That's something all of us should really promote.

>> GREGORY ENGELS: Yes.  Thank you very much.

>> ALEXIS ROUSSEL: I would like to add, you talked about literacy, and I think digital literacy is actually very important.  What I see, at least in Switzerland, is that digital education right now, and I see it with my own kids, is delivered first by people who were not born with this.  It is normal; adults don't always have the digital literacy themselves, so it is a bit hard.  It is something that takes time.

Most of the digital literacy, the digital education right now, has the purpose of teaching the tools you're going to use in business, you know, knowing how to use Word, Excel, such tools.  It has to evolve toward how to manage your tools for your life: how to manage your social networks, how to manage interactions, how to manage cryptography, which will be important in terms of signing and proving contracts.  All of these elements of how to live a digital life are not taught at school, and that's critical.  These changes will take time, of course.

>> AUDIENCE: I had a question.  You mentioned that when it comes to data regulation in Europe, France has made some adaptations in terms of property law, adapting it to data regulation, and there are so many issues with that.  When you try to adapt property law to data regulation, it can lead to data localization, which is in a way a nationalization of the data, and that is another problem, because when data localization is enacted, it impacts science development, investments, and other kinds of science-based initiatives.  I wondered what you really think about what laws we should adopt for data regulation.

Thank you so much.

>> ALEXIS ROUSSEL: Thank you.  Thank you.

So there are many aspects, and someone else's question was also linked to that.  There are several things.

First, the issue of using property law for personal data: I view it as a bad idea.  Some people see it as a way to define a value for data, so that people realize that personal data has value, but I think this goes in the direction of treating data as a commodity, and if you give it a value and attach property rights, you can sell it; and if you sell yourself to someone else, it usually ends badly.

I wouldn't go into that direction.

The second thing you mention is about a phenomenon we see, the balkanization of the internet.  That happens in different aspects, and it is not only about personal data, though personal data will sadly push it further; we saw it with moral concepts.  I shared with someone a very old piece of jurisprudence, from the year 2000 I think, the Yahoo case between France and the U.S.  There was a clash because selling Nazi-related objects was prohibited in France but allowed in the U.S., where it was free speech; for French people, the objects were available on the internet, and the internet has no border.

So here we see a clash of jurisdictions, and we have seen this between the Middle East and Europe, between China and the U.S., all of these clashes of jurisdictions, and with the approach to personal data that I described, we will see this grow further, I think.  Even with the approach of protecting and limiting personal data the way I hope will happen, we will see what we have already seen happening: states organizing data collection on other states, trying to understand the individuals, the whole society, of other states.  They say the U.S. is trying to listen to everyone; we just discovered that the U.S. was asking Denmark to spy on the social networks in Holland, and they were hoarding data.

All of this will indeed lead individuals to reduce their digital footprint under such an approach, and hopefully, if we do this, we won't have to ‑‑ it is a real debate.  I see the fragmentation moving on.

>> ALEXANDER ISAVNIN: In the Q&A there was a question, directed at Bailey: will Civil Society, academia, and other entities working together at a distance and equally sharing digital materials and data be successful?  First of all, it is successful because, as we know, the multistakeholder approach was not legislated or settled by the United States but just written down as an observation; the internet is already approached by academia, the technical community, Civil Society, and NGOs.  And we have achieved things on the way to a free-press society: even in academia, you know that old-style print press publishers try to hide scientific research behind corporate paywalls.  The situation is changing; academia is fighting against this, and as far as I remember, successfully.

Is the same true of the force against the Big Data evolution?  No, it is not, yet.

Actually, if we are not doing it ourselves, well, keeping in mind the cooperation during this forum, it would be an illusion.  As we saw, after the pressure on Facebook over the very famous Cambridge Analytica case, they started to act; and Cambridge Analytica was not the only one of its kind: there was a company set up by Russian teenagers that was harvesting data from social networks, including Facebook, and selling it to the Russian government.  Facebook took measures, and they were expelled from the social network as well.  We have to do something, otherwise it will be an illusion.  We should not be shy about going against these corporations and governments in a legal and peaceful way.

>> ALEXIS ROUSSEL:  I just want to quickly ‑‑ sorry, I used the word balkanization earlier; I used both, fragmentation also, I'm sorry.

It is indeed a word that's been used from the beginning and for a long time, and I'm trying to move away from it.  Thank you for pointing out that we should move away from using that word, which has a very strong meaning; I lived with that situation very close to my heart.  Thank you for this.

>> GREGORY ENGELS: A short last one?

>> Thank you for your answer, Alexis Roussel.  Another, different example: there is a new application called Upti, and the application allows end users to sell their individual data to social media companies and other entities.  It is an application, UBCI, in the United States, United Kingdom, elsewhere.  I think such applications can often be seen in the future, but there's also an open door for clashes between national law regulations: some national regulations will allow selling individual data, while other countries cannot allow persons to handle their individual data that way.  It is a complicated issue, as we all see.  Thank you for encouraging us not to be so shy toward governments and other strong entities.  As I am a resident of Turkey, we have so many cases of censorship, and I am actually afraid of any kind of targeting; we don't have very much space for free speech.  But I hope that sometime in the future my people in the Middle East and Africa, in the southern parts of the world, can achieve something for Human Rights, et cetera.  Thank you very much for hearing me, for making this available to me.

>> GREGORY ENGELS: Thank you.

We have 2 minutes until the end.  Do you want to answer that?  I also have two things to say.  We'll go a little bit over time.

>> ALEXIS ROUSSEL: Two things.  First, I don't like the distinction, but definitely there is a gap between countries which have power on the industrial and technical side and countries which don't.  I think that if, in the countries which are weaker in that sense, the population keeps control of their personal data, and there is so much power in your own personal data, power that is now taken away by Facebook, Google, and other institutions, if you make sure that this stays closer to the people, you get the most value locally.  I think that's one thing.

The other thing is about the companies allowing you to sell data.  I think this is a very hard debate, for sure.  What I don't want to see happen is those companies selling this data on to another, third company which you don't know.  That should not be allowed.

Usually this is where you lose control.

Are you allowed to sell data to one specific company for a specific service?  This is a moral balance that will be found in each society, and it will differ: you know, even between France and Switzerland, we already have different views on some specific laws.