IGF 2016 - Day 1 - Room 4 - WS38: Security, Privacy, and the Ethical Dimensions of ICTs in 2030

The following are the outputs of the real-time captioning taken during the Eleventh Annual Meeting of the Internet Governance Forum (IGF) in Jalisco, Mexico, from 5 to 9 December 2016. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

RAW FILE

 

2016 INTERNET GOVERNANCE FORUM

SECURITY, PRIVACY, AND THE ETHICAL DIMENSIONS OF ICTS IN 2030

DECEMBER 6, 2016

9:00 AM CT

>> Good morning, everybody.  Thank you so much for showing up bright and early at 9 a.m.; this is a very pleasant surprise.  What we are here to do today is discuss security, privacy, and the ethical dimensions of ICTs in 2030.

Today we have quite a diverse group of panelists.  The important thing, from our view, is that this session is about the participants.  So we have no slides.  Our speakers will each speak for only about five minutes, and then we will open it up for questions.  This is your chance: I get to talk to them all the time and have these discussions, so this is the chance for all of you to ask the questions you want to ask them.  And believe me, you will really enjoy what they have to say.  First I would like to introduce Meher here.

>> Good morning, everybody.  I just wanted to ask you to describe a little bit of what you are doing in your work, some of the challenges you have faced, and how the things you have found, such as education along with connecting people, play into a more beneficial future for your country.

>> All right.  Thank you very much.  I'm very honored to be here today.  I feel really privileged to have the opportunity to talk about the project we are currently doing.  For those of you who don't know Tunisia, it's that little country on the top of North Africa; yes, you do know it.  Good to know that.  So, yeah, one thing you have to know about this country is that we don't really have a lot of natural resources, and since our independence, about 60 years ago, we have focused on two main points: education and health.

Since our independence, though, that path has deviated a little bit.  A digital gap emerged between the cities and the rural areas, so a lot of people don't have even minimal knowledge of the internet or internet connectivity.  That is why we started our main project; it's called ‑‑ in Arabic, it means connectivity.

So, what we are doing is trying not only to pay attention to rural areas that don't have access to the internet, but also to revolutionize education in this country.  In these five minutes, I'm going to try to walk you through how we envision our country in 2030, as far as ICTs are concerned.

We are going to start, as I said, by connecting schools in this country, especially in rural areas.  We set a deadline of connecting two schools in 2016, and then we are going to expand, connecting 24 schools, one school per state, before the end of 2017.

But the project doesn't stop there.  We are going to expand and reach out to the whole community.  And why do we start with education?  As I said, education is an important pillar in this country and we have to focus on that.

If you introduce a revolutionary technology to children, then through their enthusiasm and their passion they are going to pass it on to their parents, who may be community leaders or businessmen, so internet connectivity will expand to different parts of the country and into many fields.

What we are going to be doing is provide a simple solution that consists of periodically updating educational content and sharing it in schools, focusing on STEM education.  We went to a school in a rural area in this country and gave a little Scratch programming workshop, and there was this 14‑year‑old boy who didn't have a computer, but he talked to me about Python and C++, and I'm an engineering student and only recently learned these things myself.

Imagine if these children had the right circumstances, the right technologies and so on, to learn.  Imagine what they could do.  Not to mention those in rural areas who don't have computers; imagine what great minds might be hiding over there.

This is basically what we are doing.  And as I said, the project doesn't stop there; we are also going to build innovation hubs, places for children and for everyone who has great ideas as far as ICTs are concerned, to provide them with the right tools and everything they need to start their projects.

This is an interactive workshop, so feel free to interrupt or ask questions at any time.

>> We will get through the panelists first and then we will get to that point.  One of the things I know we had talked about is that not only are you connecting the schools, but the gender balance is very close to 50/50, sometimes even better than 50/50 in certain situations.  Could you maybe talk a little bit about how you all are doing that?

>> Absolutely.  Gender equality is something obvious for us; women have had the right to vote for a very long time.  Through our project, we are trying to focus a lot on gender equality.  And in fact, we are not really pushing this; it's more that people, that children, are pulling it.  And especially girls.

When we gave this workshop, the children, and the girls in particular, didn't really have a clear idea about the internet and the opportunities and possibilities it offers.  We talked to one girl about how she could create a website, and she said, okay, I can sell my homemade products online.  She is already a social entrepreneur, and with the right skills, imagine the future she could have and the change she could make in her community.

>> That's great.  That definitely goes toward achieving the SDGs on gender equality and learning for girls and women.  That's amazing.  Next I would like to turn it over to Greg here.  Greg is a cyber security expert; I don't know if that does it justice.  In the context of cyber security and what it's going to look like in 2030, how do we weave in that fabric of trust so it will be beneficial and productive down the road, and what do we need to do now?

>> Thank you, Justin.  I'm Greg Shannon, one of the chief scientists at the Software Engineering Institute, and I chair IEEE's cyber security initiative.  The fabric of trust is what people are discovering as they engage with ICTs; it extends their boundaries and their ability to engage.

It's a much more nuanced, complex challenge when we are dealing with things like denial of service attacks coming off of the internet of things, worrying about fake news, worrying about whether we can trust our systems, whether we can trust the way the infrastructure is put together.  There are four components that go into that: security, privacy, resilience, and accountability.  And I think various cultures and societies are recognizing that accountability is one of the key things.

From a technology point of view, which is where the IEEE sits: what science and technology can we provide for this ethical challenge, in this policy area?  It's about creating technologies that ensure there is accountability, so you don't necessarily have to take someone's word that, yes, I went down the checklist; it's baked into the technologies.

So there's the notion that things are secure from the start, private from the start.  There are going to be things that we don't yet understand, but we need to be resilient to that.  And then accountability: as a society, when we have issues of right and wrong, we can go back to those who break that trust as they use ICTs.

We are seeing this evolution, and I expect it to continue over the coming decades.  But it's really about us all being mindful of that and recognizing that we have to critically question: can I trust this?  What does it take for me to trust this more?

There have been some cases in Africa, in particular, in terms of delivering ICTs to the population, where properly engineered systems that take security into account, the notion of prepaid cards, using proper encryption and proper protocols to ensure that someone can't steal your prepaid card or create fake prepaid cards, make an ecosystem based on ICTs that everyone can trust, whether it's the incumbent organizations, the government, or the consumers.

So really, this interweaving of security, privacy, resilience, and accountability is essential to having something that we all can trust going forward, especially as we bring on the next billion individuals and the next billion devices.
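
[Editor's illustration: a minimal sketch, in Python, of the kind of integrity protection Greg describes for prepaid cards, using an HMAC tag so that altered or forged cards fail verification.  The card format, key handling, and function names are hypothetical, not any deployed operator's scheme.]

    import hmac
    import hashlib
    import secrets

    ISSUER_KEY = secrets.token_bytes(32)  # hypothetical secret held only by the card issuer

    def issue_card(card_id: str, value_cents: int) -> str:
        # Bind the card ID and its value together with an HMAC tag,
        # so neither can be altered without invalidating the card.
        payload = f"{card_id}:{value_cents}"
        tag = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
        return f"{payload}:{tag}"

    def verify_card(token: str) -> bool:
        # Recompute the tag and compare in constant time; forged or
        # tampered cards fail the check.
        card_id, value_cents, tag = token.rsplit(":", 2)
        expected = hmac.new(ISSUER_KEY, f"{card_id}:{value_cents}".encode(),
                            hashlib.sha256).hexdigest()
        return hmac.compare_digest(tag, expected)

    token = issue_card("TN-0001", 500)
    assert verify_card(token)                                 # a genuine card verifies
    assert not verify_card(token.replace(":500:", ":900:"))   # inflating the value breaks the tag

A real deployment would keep the key in secure hardware and add replay protection; the point is only that trust can be checked by the technology rather than asserted, which is the "baked in" accountability Greg describes.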

>> Thanks, Greg.  That's really interesting.  And when you talk about accountability, whose responsibility is it to hold people accountable?

>> I'm going to punt on that.  I'm going to say the role of science and technology is to help identify who should be accountable: what part of the trust process broke?  Who holds them accountable is up to society, to different nations and different cultures; that's part of the governance issue, how you resolve it.  But for there to be accountability, you have to be able to point at some entity, some rule, that failed in providing that trust.

>> Thank you, Greg.  I'm going to shift to Louise now.  Yesterday we were having a good conversation, and you were discussing the different perspectives on this issue, and so I would just like to let you pick up that conversation that we were having.

>> Great.  Hello, everyone.  My name is Louise Marie Hurel, from the Center of Technology and ‑‑ first of all, I would really like to thank the Youth Observatory, which gave me this opportunity to be here.  It's a pleasure, thank you so much.

First of all, I would really like to raise some questions so that we can have a nice dialogue over here; I think that's the whole point of this session.  Talking about the relationship between privacy, ethics, security, and ICTs is already challenging.  Thinking about the complex interaction between these issues in 2030 takes the debate to a whole new level.

Which brings me to the point that I want to make here, which is about perspective.  I think we can see 2030 through at least three different perspectives.  The first one I would like to highlight is the temporal perspective, and in this case there are so many questions that arise when we think about it.  How are technologies being built nowadays?  Is it mostly to feed a supply-and-demand relationship?  Is it to be competitive?

Are we taking the time to build while taking into account privacy as a fundamental asset rather than a purely economic one?  The second perspective is the spatial perspective.  This is a trend I would also like to highlight: through the internet of things, for example, we no longer see this spatial difference.  It's not just about shortening distances, like talking to someone in another country; we see ICTs as part of our lives, taking over our homes, our hospitals, every little aspect of our lives.

So it kind of shifts our perspective on the relationship between ICTs and this spatial dimension.  And the third perspective, the most complex of all, is the governance perspective, which Greg was already talking about a little.  It's undeniable that these are multifaceted dimensions.

Databases, AI: rather than thinking of one solution for all of the privacy and security problems, we should think about how to promote dialogue and confidence between stakeholders, how to engage and include the different stakeholders.

Just as we are trying to make sense of technology, private companies are trying to make sense of how to promote technological innovation amid economic dynamics, and governments, on the other hand, are trying to make sense of, and juggling, the challenge of drafting normative frameworks that can withstand the shift of technological change.

Both the temporal and the spatial perspectives are normally taken for granted, especially when we talk about internet expansion and globalization.  When looking at these perspectives, we are seeing less a temporal and spatial reduction and more a process of translating the world around us into bits and pieces of computer‑managed information.  And as individuals, we are more distant from what is actually being decided.  Sometimes we are rendered to a system or an algorithm as fragmented bodies of information, collections of what we are.

It seems as if we are part of this big wave, this undeniable trend of an interconnected life in the making.  The bottom line for me is that it isn't simply because it is out there that we should blindly accept it, nor treat it as something purely technical, devoid of social, political, economic, and subjective interactions.

But the question I would like to pose today, also in concluding, is: are we building sustainable standards and policies capable of upholding trust between individuals and ICTs?  Because what I see today is a future of misconceptions regarding privacy and security, with surveillance not only as a means to collect and store data but as a process of consolidating what I believe is a trend for the next few years, and unfortunately it's kind of a pessimistic point of view: surveillance capitalism.

With this in mind, I would like to finish with this brief collection of ideas, because I think that's what is here for us to debate.  Perhaps there is no end to this in the making, because there is no absolute control of the multiple dynamics in these three perspectives and the many other perspectives we could highlight.  However, we can think critically and act to change the paths through which we navigate and consolidate collective understandings of security and privacy in their multiple dimensions.

>> Thank you, that was excellent.  And is it fair to say you are optimistically pessimistic?

>> I think so.  When I hear experiences such as Meher's, it brings us hope, another perspective in which including privacy and security is bottom‑up rather than a matter of decisions that are taken for us and that we have no control over.  So in some way I'm in this gray zone between optimism and pessimism, so, yeah.

>> I think a lot of people are.  And we do have a remote panelist.  Are we ready ‑‑ is the audio working for them?

>> Hi.

>> Hi, John.  I'm going to let you introduce yourself, but first I just wanted to set the stage for you: you do a lot of work on the ethics of artificial intelligence and autonomous systems, and I know you are doing extensive work right now.  My question to you is, pretty much: what do we need to do now, right now, to ensure that the world in 2030 is productive and beneficial for all of humanity?

>> First of all, let me know if you can't hear me, or you can text me if there are any technical difficulties.

>> Will do.

>> And second of all, I'm honored; thank you so much for inviting me onto the panel.  I also really, really appreciate the comments the other panelists have given so far.  A lot of what you will hear me say in answering your question, Justin, is "yes, and" to the great comments that have come before.

In terms of my background: currently I'm the executive director of a new program from IEEE, the Global Initiative for Ethical Considerations in Artificial Intelligence and Autonomous Systems.  I'm speaking for myself as John today, not formally as a representative of that organization, as it were.  I think the thing about 2030 which is really critical to think about is that there are a couple of paradigm‑level aspects of life today that we need to take into consideration with regard to how we want to move forward as a society.

One is the idea of how we determine value in general.  I have been working a lot in the past few years in what's called the beyond‑GDP movement, beyond gross domestic product, in the sense that GDP, while an excellent measure of the financial aspects of value around the world, of money, was not built to be a measure of holistic well‑being when it was first designed by Simon Kuznets in the '30s.

This is something he specifically said: don't let this become a measure of happiness or well‑being.  And it is now so deeply entrenched in our mind‑set, I think around the world, that GDP is sometimes considered the be‑all, end‑all of value.  That's really important to remember, especially when issues of autonomous systems or autonomy in general come up; and again, I'm not in any way trying to sound pejorative or negative, I'm trying to make a statement of fact.

When there is an economy, in whatever part of the world, that is using GDP as a primary measure of value, a lot of the modern idea of GDP has to do with exponential growth.  By exponential growth, I'm not talking about year-over-year normal growth, adding in cost of living.  I'm talking about a sort of IPO, quarterly-shareholder-profit mind‑set, which is not wrong or evil; again, I'm not interested in using those terms.

However, it is critical that as a society we realize that if an organization, whether corporate, non‑profit, or any other, or even an individual, focuses primarily on exponential growth, they will likely choose, say, a machine or system with autonomy that could serve that growth faster.  In many cases that may be great, that may be the right choice, meaning the ethical, moral, and economically sound choice.

However, the point of my work and the work I'm doing for IEEE is that it should not be the de facto choice.  By that I mean that now is the time for humans to determine what values, and what measures and metrics of value, we want to incorporate into the machines and systems that we are building, so that by 2030 we can look back and say: great, we imbued the right values in these systems.

And so, for instance, I love what Greg said earlier about accountability.  A lot of times you hear talk about transparency with algorithms, which is important, but I would say accountability is even more important, to echo what Greg said.  Like the idea of the internet of things, where there's technological interoperability, accountability matters especially in the standards world: when you present a standard, you can also present a certification system around that standard.

And what that simply means is that you are translating, as it were, how that standard is important to the users, but also providing a specific way for that standard to be interpreted.  So when it's interpreted, the certification doesn't just give a checklist to say, hey, I did the things I'm supposed to do; the certification is designed to let you know why I'm doing the things I'm supposed to do.
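
[Editor's illustration: a minimal sketch, in Python, of the distinction John draws between a bare compliance checklist and a certification that also carries the why.  The field names and controls are hypothetical, invented for illustration rather than taken from any IEEE standard.]

    from dataclasses import dataclass

    @dataclass
    class ControlAttestation:
        control: str     # what the standard asks for
        status: str      # "pass" or "fail"
        rationale: str   # why the control exists -- the part a bare checklist omits

    certification = [
        ControlAttestation(
            control="Encrypt personal data at rest",
            status="pass",
            rationale="Limits harm to users if storage media are stolen."),
        ControlAttestation(
            control="Log every access to user profiles",
            status="pass",
            rationale="Makes misuse traceable, supporting accountability."),
    ]

    # A checklist would stop at status; publishing the rationale lets users
    # and auditors see the intent behind each control, not just its presence.
    for item in certification:
        print(f"{item.control}: {item.status} -- {item.rationale}")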

Two other quick things I will mention.  One: you hear a lot of talk about algorithmic bias.  As we move toward 2030, bias is not necessarily bad.  We are all biased, because we are human.  By that I mean we all have our subjective truth.  We come from different places.  We are male or female, we are a certain age.  Again, these are not, quote, bad or wrong things.

But transparency and accountability systems, and work on algorithms, can help you recognize when a bias toward the group you are targeting or working with through those algorithms is negative.  These things are established beforehand, before you put the algorithm into place: that there may be racial aspects, meaning negatively racist aspects, to these systems.

It may be that aspects of the algorithm target one specific, wealthy part of a population or of the world, which means the algorithm is not given universal value.  And there are certainly a lot of issues with algorithms around male and female issues, meaning gender equality.

We have to take the time beforehand to establish, whether through IRBs or other mechanisms, the types of tools we would like to use to help the programmers, engineers, and scientists creating these things.

The last quick comment I will make now, again somewhat mirroring what Greg said, is about the work we are doing in the global initiative that I'm the executive director for: to really help provide a new way of innovating in how systems and technology are built.

And in general, and I will close here, I have about another 60 seconds and then I will stop talking, but you got me on my soapbox, so thank you.  Again, I'm not trying to be pejorative, but in general, the way a lot of technology is designed today, the de facto way, goes like this: the idea for a new phone or a new widget comes from senior management; we want to have a new version of phone X.

The design for that phone is sent to engineers or programmers or technologists to start building.  Where there can be ethical difficulties is when a programmer or engineer working on the phone discovers aspects of how that product or service will be used by the end users of that system, and realizes there's an ethical problem.

Say we made this phone for moms and there are no privacy settings for moms.  The engineer goes back to management and says, what about this issue for moms?  Now, because of the paradigm of GDP, in general there is pressure from management to say: hey, listen, make the phone the way we designed it; we have to get our quarterly numbers in; the phone has to get made.

When that happens, the engineer or programmer generally has a couple of choices: build what they have been told to build, or become a whistleblower.  That system doesn't work.  Management is pressured to get the job done but may be risking a lot in how it gets done.  It doesn't help the engineer, the person in the middle, who feels threatened.

And it doesn't help the end user; the mom doesn't have the privacy she wants.  The work we are doing is about the fact that there are now whole design methodologies, called things like value sensitive design or values‑based design, which are really modeled after the privacy‑by‑design paradigm that you have talked about.

And on top of the privacy, security, and identity issues for individuals being sorted out, and I'm happy to talk more about this later, Justin, the logic here is this: alongside the existing codes of ethics that all of these organizations have used for years, which are most of the time very helpful, there is a new process for actually designing technology.  It means that management, along with marketers, programmers, whoever, before a product design is sent to programmers or engineers, uses these rigorous applied-ethics methodologies, like value sensitive design, et cetera, to really understand the values of the end users of the technologies or products.  That way, the design that goes to the engineers, et cetera, is one that you have done your professional best to align with the values of the end users.

If that's the type of thing we can get going universally for anyone building these technologies today, and that's our mission in the initiative, it takes pressure off everyone in the ecosystem, including, at the end point, the shareholders: you can build products that build trust and that still make a profit.  And this is a term I have been using to talk about sustainability: ethics is the new green, in that sense.

Ten years ago, companies realized that making products that were going to help the environment could be something that defined their brand, not through spin or PR, but by letting their customers, stakeholders, end users, and employees know: we care about the environment, we care about the planet.  That became a huge part of defining their brand.  Now that same message, instead of being just about the planet, is about people.  We want to sustain the values and the trust of the people that we care most about.

Anyway, I hope that helps.  Thank you, I'm really honored to be here today.

(Applause).

>> Well, as you can hear, they enjoyed what you said.  I did have a quick question, for a sentence or two: is it fair to say that you think corporations may want to incorporate a chief ethics officer into their ‑‑

>> Yes, and thank you for saying one to two sentences, that was very smart of you.  Yes, although I would probably say chief values officer, because sometimes I have found in corporate circles, and again, I totally understand this, I worked as an EVP at a top-10 PR firm, that the word ethics is associated only with things like compliance and risk.

So when you say the word values, I think that's a more holistic, exciting way to move that individual, the chief values officer, to the same level of importance as, say, the chief marketing officer, which is where I think they definitely need to be in the organization.

>> Thank you.  I just want to ask the remote moderators, do we have any questions ‑‑ no?  Okay.  I know I said I was going to get to the questions by 9:30, and I'm a little bit over that.  So I would like to open up the floor for questions.  I challenged you yesterday at the booth.  So, Julia, go ahead.  Press the center button.

>> It's not a question, but it's food for thought for everybody.  My name is Julia, I'm from the youth IGF program, and I'm friends with Louise Marie.  One thing that got my attention is how these issues you present, education, ethics, security, and innovation ‑‑ how innovation is driven and how it should be driven.  This is curious, because usually when we talk about technology, and specifically about technologists, we treat it as a neutral institution, a genius institution, not ‑‑ culturally, but in our speech as we do it; but actually, this is not true.

We have innovation not as an end in itself, but as a means by which humans ‑‑ issues.  And this is kind of arduous, but at the same time it is not.  For technology to work toward the empowerment of the whole of society, it should have the whole of society represented and taking action over it, and not only a part of society: not only the state, not only the private sector, and sometimes not even a part of a specific society.  So, to be short, in order to ‑‑ public policies and responsibilities toward the ethical issues of technologies, we should instead ‑‑ this is a part of the stakeholders, and finally, I believe that ‑‑

>> Let's stop there.  And ‑‑ 

>> Okay.  Sorry.

>> That's okay.  I know you are very passionate -- could you turn your mic off?  Thank you.  Greg?

>> So, I will pick up on the innovation angle that you mentioned: yes, it's not its own end, it's toward a particular purpose.  I think what is interesting about ICTs is that, with trust, with a trustworthy environment, they can facilitate innovation, because really what is key there is diversity of thought.

And as you are finding in Tunisia, by giving people from diverse backgrounds access to technologies, you come up with solutions to problems that society has.  I think that's what is exciting about ICTs: they provide a mechanism to get there.  Diversity of thought is really the key.

>> Anybody else want to comment or I will take another question from the floor?

Okay.  We have two.  I saw your hand first.  Please keep the questions brief and let's have it be a question for the panelists, thanks.

>> In other fields, like law, there are standard rules: you go to see a professional and they will do some things for you and not others; they won't ‑‑ cook the books for you.  What's the equivalent in the technology space for a technologist?

>> I hope others have answers to this too, and John might want to join in, but I will rephrase the question: what shouldn't technologists do?  I think the short answer I have found professionally is: do not let non‑technologists be deluded about what the technology can and cannot do.  It's a kind of honesty about the technology, and this plays out in important ways.

If you look at the position that Europe has on genetically modified crops, a lot of that was fallout from a lack of honesty about mad cow disease and other threats that were prominent at the time, and it caused society to say, no, we don't want GMOs, even in places where there wasn't an issue.  So being honest about what you know about a technology, what its limits are, what its constraints are, and not letting non‑technologists be deluded, is really an important role.

>> That's really interesting.  And I guess it kind of gets to another point ‑‑ yeah.  But ‑‑

>> Justin?

>> I knew you were there, John.

>> Well, I guess one thing, and this may not directly answer the question, so forgive me if it's a bit tangential.  When you say artificial intelligence with practitioners in the room, that could mean a world of different things: cognitive computing, deep learning, machine learning.  But as of today, when I say artificial intelligence, to me it's a synonym, like the internet.

When someone asks whether there will be artificial intelligence or autonomous technology in X in 2030, it will be hard to find things where they are not there.  Maybe grass; but even grass, or trees, will certainly have sensors in them.

Anyway, all that to say one thing I think this panel especially should consider thoroughly, particularly for average citizens, and I'm from the states, so I will talk about the states: when people hear the word privacy in terms of personal data, it can turn them off, because they are tired of the conversation.

Whereas with AI and autonomous technology, everything starts with the data.  What that means, in my opinion, is that we have to get out of the framework of thinking about the word privacy regarding personal data as something that is only left up to an individual's preference.  Because I can prefer to share my data with everybody, advertisers and whoever, while a different person may think, I don't want to share my data.

That's great; again, it varies based on regions and countries around the world ‑‑ westerners in general tend to be a little more individually focused about their data, while in the east it's sometimes more community focused.

Nonetheless, all that is a preface to saying that, fundamentally, today there is a massive asymmetry between how data is gathered by systems that track people and how individuals, whether they prefer to be seen as an individual, a community, whatever, can access that data, talk about their data, and set conditions on how they would like that data shared.

This conversation usually gets a lot of emotion going, especially because people think you are anti‑corporate if you want to control the data; that could not be farther from the truth.  Individuals need to help corporations and organizations define what their data means as it is attached to their identity.  Because in general, there are two ways to track, right?

Or I should say, right now, today, we are tracked thousands of ways: in the real world through CCTV cameras, online through chatbots and algorithms; we are tracked a thousand different ways.  It means that other actors and organizations are, in one sense, literally defining our identity for us to people beyond those we interact with.

And why that is so critical in the next five years, let alone by 2030, is that when virtual reality and mixed reality become ubiquitous, you put lenses over your eyes or speakers over your ears; you are literally putting that lens in front of everything you see.

Where individuals have not defined their data, I'm John, I'm male, I'm this religion, I live in the United States, et cetera, and attached it to their identity with a proven identity source, whether a US passport or the UN, then it is not done.  It's binary; it is A or B.

We are going to have a really interesting and tough time in five years, when a lot of people put on these lenses and glasses and realize: wait a second, I'm not in control of my identity.  This is a refugee issue of a new kind, a data refugee issue.  We will not have access to the data.  This is important for us to manage; it's a paradigm shifter that I want to make sure we talk about.

>> Thank you, John.  I do want to spin this question to the two of you over here.  We heard that the question was based on how lawyers can do certain things and not others, such as on the stand, and things like that.

From your perspective of educating people on ICTs, and from yours on governance and everything, how important do you think it is that technologists explain the actual technology as best they can, so that, A, people feel comfortable using it, and B, policy makers have enough understanding of the technology to put forth proper regulations?

>> Well, from my experience, we are working with education, educating children.  So providing the technological means, like the computers or whatever technological solution we are using, is not enough.  We have to educate the teachers who are going to be using it to teach the children.  You have to teach them very well what this technology is and how it works, because we are talking about something very delicate here: educational content.

You have to provide something very secure, something that you understand very well, where you are aware of everything that might come up on it.  With educational content there shouldn't be any interference; the information has to be very secure and controlled by the government and whoever is responsible for it.

So yes, the technologist must explain everything regarding the technology, especially in this field, because we are dealing with something very delicate, which is, of course, education.

>> So, that is a very challenging question.  From personal experience: we had this meeting in the middle of the year, the youth ‑‑ IGF, and we got some students who were still at school, and we talked to them and had this great discussion.

And one of the first things, before I answer the question: I'm quite skeptical of reinforcing the divide between technical and non‑technical; I think it creates such a huge gap in practical terms when we start talking that way.  But on the other hand, in terms of making it comfortable for both sides, I don't know if I have an answer or a solution for that.

But I think it is part of a process of building trust between both, because this dichotomy, this relationship built between non‑technologists and technologists, pushes them so far away from each other that when they sit down at a table, sometimes they don't even know where to start.

So I think one possible way is simply to promote dialogues.  I know universities try to bring together people who actually work on the side of writing code, who are part of the making of ICTs, and people who don't know anything about ICTs; the technology is a black box for them, for many policy makers.

I would say from personal experience, there are remote states in Brazil where it's difficult to get information, not only because we don't have internet there, it's partly because of that, but because it is actually difficult to communicate sometimes with these policy makers.  There's no funding for the ICT people to go there, and no funding for different initiatives to travel around the country and start promoting these kinds of dialogues.

Before this question of promoting a comfortable space, I think we should think about practical measures on the ground, ways of getting them to the table and just talking.  Because I think we are lacking so much on that.

>> Okay.  Jim, we have a remote question, so hold on one ‑‑ no?  Okay.  I have four hands that I have seen so far: here, here, there, and here.  We are going to go in that order, and then, depending on time, I will open a new queue.  You have the floor.

>> Hello, it's Yolanda.  I just have a question; it's very simple, really: what is privacy?  Is there such a thing as privacy?  The minute you have an e‑mail account, you say you accept the terms and conditions without even reading them.  Is there such a thing, and what do we mean by privacy?

The second question is, we are talking about personal data and all these things, but where are Google and Facebook at the table?  At the same time, we need to talk to the people, the companies, who are holding our data.  So if there is anyone here who could speak to that, it would be great.  Thank you.

>> We are going to go to Louise, and I know John just texted me, he wants to answer this too.  So, believe me, I'm not the ‑‑ I'm just texting with John over there.  Go first, please.

>> Okay.  So, the big question, what is privacy: I don't have an answer for that, definitely not.  But when you talk about terms of service and being more accountable, that is right on the spot about accountability and transparency, because we don't have any idea; most of these terms change whenever the companies like, so there's no accountability.  We don't actually know what is happening, what the practices behind those standards and regulations are.

I think it's part of a process of really trying to build trust.  I don't know how to get them to the table, because there are so many problematic aspects in the middle.  For example, in Brazil, with the WhatsApp case, it's difficult because it created a very huge gap between the policy makers and law enforcement agencies on one side, and the private companies on the other; there's no dialogue.

It's really difficult when you have these kinds of situations.  This is one point.  And the other one, I don't know how many of you saw, but yesterday night, I think it was 11:00 UTC, Google, Facebook, Twitter, and Microsoft, if I'm not mistaken, all the huge companies, decided that they are going to start trying to take down content related to terrorists.

But there's no transparency about what standards are being applied, so it's kind of ‑‑ I was talking two days ago with people from the ‑‑ they call it shadow regulation.  There are so many opaque spaces.  Just trying to add to the point.

>> That's great.  Thanks.  And I'm going to go to John briefly and then Greg.

>> Yeah, I think it's critical we stop using the word privacy.  I'm not interested in defining privacy anymore.  Here is why: there are 700 different definitions of privacy, and they all may be valid depending on the user.  I'm not trying to be facetious; it's an important word.

For example, in the United States a few years ago, Google, and I use a lot of Google, I'm using Gmail as I speak, well, not literally as I speak, but it's a great company, they have a wonderful program called Street View that maps the planet.  However, in the states they struggled, and I think are still struggling, with a legal case because their cars, as they drove around, could mine unencrypted Wi‑Fi data.  Now, a lot of people, at least my friends in the states, don't know what that sentence means.

It may be a bad analogy, but I can decide to leave my house unlocked anywhere in the world.  In the United States, in New Jersey, I can leave my home unlocked.  That does not mean it's okay for someone to steal my things; those are separate legal questions.  My house is unlocked; you may call me a fool, you may say, hey, maybe lock your door, but it is still illegal for someone to walk in and steal.

I think that's the metaphor: I may not encrypt my Wi‑Fi, but that doesn't mean ‑‑ I still need to be protected.  Privacy, a lot of times, gets us back to preference.  We need to move beyond preference.  If there's a 50‑page document that someone needs to read to get the new version of iTunes, that means the model needs to be updated.

We are talking about machines having human ‑‑ it is also time to update how people understand and connect to the companies they want to generally trust.  Google, Facebook, all of those folks can be the leaders in this area.  And again, I do not mean to sound like I'm condemning Street View, but it's a great example: to move beyond the privacy debates, we protect rights first, and the preferences come after.

>> Thank you, John.  Greg?  And then ‑‑

>> So, I don't have a definition for privacy either, although one thing I think is helpful to consider is that if you look back a thousand years, ten thousand years, and try to understand what privacy meant, you see it's a fairly modern concept.  If you are living in a tribal community, everybody knows what you are doing, they know everything about you, and you are not going to escape that, for better or for worse.

And it's not clear to me that the anonymity that often goes along with privacy is anything but a modern experiment, this degree of anonymity.  I think ultimately it will be a social contract, a set of norms that is going to evolve, and in 2030 it may be quite different from what we think of today.

>> Louise, I know you wanted a quick follow up.

>> A really quick follow‑up to John.  I would like to work with the metaphor you were talking about, John.  When we think about our lives and our homes, we think that when we walk into our home we are in a safe space, with a lot of locks and doors, and we feel comfortable there because we are safe.

But just adding to this metaphor: when we think of IoT, there's no in and out, there's no boundary.  It exceeds the notion of a private space.  So when we think about shifting ideas and notions of privacy, we should also think about how diluted it will become.  It's no longer a spatial thing; it's much different.  It's not home as a safe space anymore, and if we don't try to tackle the question of trust ‑‑ I think that is really critical.

So I just wanted to point out this idea of home as a safe space.  As we continue on this undeniable trend of hyper‑connectivity, I think we won't have these spaces.  People who are being born now, living in this hyper‑connected world, may not have the same view we have today; it may be okay to them that home is no longer a safe space, that everything is connected, and that privacy is fragmented across different gadgets and devices.  So, just food for thought.

>> Okay.  Meher and then ‑‑

>> A quick remark.

>> Louise was talking about how everything is connected to the internet and how the boundary fades away because you decided to put your belongings on the internet.  In 20 or 30 years, we are going to be talking about nano‑IoT, which means parts of your body will be connected to the internet.  If you have taken the decision to connect parts of your body to the internet, then I think there's no longer any privacy whatsoever.

>> Okay.  And then last, very, very, very briefly, John texted me.

>> Smart to give me three verys.  No, I get to choose where my identity goes.  In the future, identity is what we should be thinking about, versus privacy.  To the last comment, I couldn't agree more: I may be sharing aspects of the neurons in my brain, sharing aspects of my identity.  As for privacy being dead, the home space is a great example: the home space has Nest, it has all the different connected devices; the home space is not, quote, private.

However, when I put on virtual reality, going through ‑‑ a personal cloud, a personal data management system, I am allowed to, and I believe I have the right to, create an algorithmic version of my identity.  It's not necessarily that I own all the data, but I do have a voice, a subjective voice.  I get to be a citizen of the, quote, World, capital W, in any environment, at any time in the future, where I have said who I am.

>> Okay.  Thank you, and first off, that was a great question.  And you, over here, you have a tough follow‑up.  Also, I neglected to mention: please state your name and affiliation before asking your question.  Thanks.

>> Hello, I'm Nicholas, from Germany.  Nowadays a lot of people provide personal information, and this can be exploited in the sense that people ‑‑ are not emotionally attached to their private data somehow.  How do you think we as designers of technologies ‑‑ can build an emotional tie between people and their personal data, if there is a chance of ‑‑ such a thing?

>> One approach is to give people examples.  I think it also goes back to education.  It's about educating students early about what the consequences of that data are, building stories around that connection, and showing how the mechanisms of the internet take your data and create something you may viscerally be concerned about.  It goes to John's comments about identity, about how sharing your data does define your identity.

Whether or not you have control over that is another issue.  But we need to help them understand that the choices they make will influence that, and how that actually plays out.

>> Thank you.  So, let's see, I have a queue here ‑‑ oh, wait.  I was waiting for you to text me back.  I know you wanted to touch on this briefly, and then we will move to the next question.

>> Yeah, I know I'm overly passionate, and I get to be remote and be passionate, so thank you, everybody.  When I talk to people about this: right now, today, I get on a plane in February to travel to London, and I live in the states.  I have a blue document called a passport.  It doesn't ask me about preferences or whatever, but it gives me a tool that right now, today, is recognized around the world.  Is it perfect?  That's another discussion.

Attaching aspects of my identity to that type of recognized document is what I'm saying we need to consider and implement for the future.  Because, for instance, what can be tracked about me in the states through third‑party data brokers, a lot of times inaccurately, are things like where I live, my gender, my race.  This is why I bring it up: I work in the AI space, and erroneous data is one of the problems in creating these systems.  An individual should be able to say: hey, all these companies want to build this great stuff for me, but if I can't access the data and be allowed to correct errors, you are going to be building off erroneous data.

The second thing is, you may track everything I do, but you still may not know, unless you ask me, how I identify: am I male, female, do I identify a different way?  You will not know my faith.  You may track me going to church, but I may do that because I have a job there as a janitor; I may not actually believe in that orientation.  These are things you cannot learn algorithmically.  If we don't start asking people values‑oriented questions, we build systems that are based on tracking from the outside in and never ask from the inside out.

>> Thanks, John.  And to everybody here, I'm going to close the queue for now because we have a bunch right here.  But first, we do have a remote question, I believe.  Okay.  Great.

>> So, it says: recently in the Portuguese internet governance initiative it was discussed that the argument that users must have full control over their own data implies, alongside it, better preparation of users and consumers for this process through proper skills and training.  It is clear that greater responsibility and accountability is required of actors who provide services and products over the internet.

And privacy by design and by default should be a private sector priority, which could reduce government regulation.  In the meanwhile, are we living, as ‑‑ once said, in a fear-and-convenience internet environment with regard to privacy and security?  He has a question mark at the end of that; I guess he is asking for comment on that point of view.

>> On the notion of controlling your own data, and this probably disagrees with John's view: one, we do not have the technology today to give the consumer that choice.  It does not exist, unless you more or less create a very outdated, old, broken system, and I think most of us would not want to use it; it would be so constrained.  The technology simply does not exist yet.  The second thing is the question of what it means to control data.

If I have a camera here and I accidentally take your picture, does that mean you control my picture?  I don't think that socially we are going to accept that.  So there are some hard questions we have to ask when someone says, I want to control my data: well, do you really understand what that means?  I don't think we do.  That said, it's important we have that conversation, but we can't assume the answers are easy or obvious.

>> Just a quick follow‑up.  I totally agree with the points Greg raised.  Another thing I would add, and I'm not saying I'm against some control, it does have beneficial aspects, but just to challenge the question: if we had total control of our data and decided to use it for economic purposes, if we wanted to sell our data, to share our data and receive money in return, I don't know if that's the way, I don't know if that would be beneficial.  I think it's a very challenging question.  But yes, I just wanted to add that.

>> John?

>> First of all, again, I couldn't agree more: it's challenging.  But maybe it's the nature of my work over the past couple of years.  I will have a phone call with someone, an expert, who will say to me, and they mean it: within 20 years, this device will have human‑level ‑‑ and that to me sounds like a challenge.  By the way, I'm not trying to be facetious in any way.

It is a challenge to make these updates to how we think about data.  However, to one of the last points: instead of asking, can I control my data, I think the more important question is, will I be allowed to inform my identity?  What that means is that I can, right now, today, it's hard, and people won't necessarily listen to me, but I can, even at an algorithmic level, and by the way, ‑‑ has done a lot of wonderful work in this space.

I can set up terms and conditions about how I would like my data to be used and shared.  In the states, I can say: here is how I want to share my data.  I can attach it to my identity; it can be algorithmic, so it lives in the cloud; when I go to another country, I can say what I want.  It doesn't mean it will happen, but it's still there, out in the ether: these are the terms and conditions under which I would like my data shared.

Especially with commercial usage, because laws right now in the states largely favor me not having access to my data; it was only two years ago that I gained access to my medical data in the states.  So let's also move the conversation beyond privacy and talk, literally, about life or death.  Only two years ago did HIPAA say I could have access to my data other than via a fax or a written piece of documentation from my doctor.

We must move beyond the idea that just because these things are difficult, we shouldn't deal with them now, and I'm not saying any of you are saying that.  Again, let's move the conversation from controlling data to teaching individuals the essential paradigm shift of understanding that their data is a primary asset of their lives.  Corporations are already empowered around data.

It's a great way to say: if they are doing it, how can citizens be empowered with something similar?  But again, the privacy conversation, to me, is something we need to evolve, helping citizens have tools to identify themselves and have some say over how their data is used.  Just as you were saying about accountability: if my data is used here in a way I have shown you I don't agree with, then the accountability is traceable and actionable.
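
[Editor's illustration: a minimal sketch, in Python, of the machine‑readable, user‑authored sharing terms John describes attaching to an identity.  The policy structure, data categories, and requester labels are hypothetical, not an existing standard; enforcement, as the panel notes, is the hard part.]

    # Hypothetical terms a person could publish alongside their identity,
    # stating who may receive which categories of their data.
    MY_TERMS = {
        "owner": "john",
        "rules": {
            "location":  {"share_with": ["health_provider"], "retain_days": 30},
            "gender":    {"share_with": [], "retain_days": 0},
            "purchases": {"share_with": ["advertiser"], "retain_days": 7},
        },
    }

    def may_share(terms: dict, category: str, requester: str) -> bool:
        # Unknown categories default to "no": nothing is shared unless
        # the owner has said so explicitly.
        rule = terms["rules"].get(category)
        return rule is not None and requester in rule["share_with"]

    print(may_share(MY_TERMS, "purchases", "advertiser"))  # True: granted by the owner
    print(may_share(MY_TERMS, "location", "advertiser"))   # False: not granted
    print(may_share(MY_TERMS, "faith", "advertiser"))      # False: never declared at all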

>> Louise, a quick follow up on that one.

>> I completely agree that just because it's difficult, we shouldn't give up on it; it's important to keep thinking about the challenges and opportunities that come with these debates.  And just to bring another perspective on the subject of controlling data, and then make a quick comment on what John said: first of all, one thing we should probably think about is data portability.

In thinking of controlling data, of having some kind of control over our own data, we could think of it this way: I'm using this app or this platform, but I really want to go to another platform, another service.  So I just want terms of service that allow me to get my data and shift it to another platform.  I think this would bring a lot of economic competition.  On that hand, I think it's a nice idea to have control over your data.

But on the other hand, adding to what John said, really quickly: the idea of individual terms of service is interesting, but I think about education, about people who don't have the slightest idea what terms of service are and who don't have any kind of digital education.  I think this is another challenge that comes with connecting the next billion.  So, yeah, it's just difficult.
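
[Editor's illustration: a minimal sketch, in Python, of the data portability Louise describes: exporting a user's records from one service in a neutral format that another service can import.  The two services and all field names are hypothetical.]

    import json

    # Hypothetical internal records of service A.
    service_a_records = [
        {"uid": 7, "handle": "louise", "posts": ["hello", "privacy in 2030"]},
    ]

    def export_portable(records: list) -> str:
        # Map service A's internal schema onto a neutral interchange format.
        portable = [{"username": r["handle"], "content": r["posts"]} for r in records]
        return json.dumps({"version": 1, "users": portable}, indent=2)

    def import_portable(blob: str) -> list:
        # Service B reads the neutral format into its own, different schema.
        data = json.loads(blob)
        return [{"name": u["username"], "items": u["content"]} for u in data["users"]]

    blob = export_portable(service_a_records)
    print(import_portable(blob))  # the same user data, now usable by service B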

>> You know, and this is what I find interesting.  I keep hearing in this conversation, and we have had a couple of questions on people's data and privacy: what about children?  Guardians and parents are posting data, and perhaps some of them truly have no idea what's done with that data.  They just sign up for Facebook because it's fun, and they are posting a lot of information about their children basically from the time they are born.  I don't want to steal questions from the audience, but I would like to throw that out for a brief answer, if you have any thoughts.

>> I think it depends on the level of awareness.  For example, in my country, most people are not technologically savvy; they don't know much about the internet.  As you said, when they join Facebook or anything, they just join it for fun, or for an e‑mail account or something; they just do it for the service.  But they don't really know what's going on behind the scenes.

But since the revolution in this country, the revolution of 2011, there have been lots of problems: lots of websites were closed, the famous 404 Not Found, and people started learning what goes on behind the scenes and became more careful about posting things.  So I think it all depends on the level of awareness people have.  In our case, I think it's our responsibility, in bringing this project, to make sure people understand all the terms, as we said earlier, and understand that we are going to be providing them with ‑‑ content that is controlled, not something random.

>> Okay, John.  Moderators hate me; they say things like "be brief."

There is something called COPPA, the Children's Online Privacy Protection Act: kids under 13 are supposed to be addressed in very specific ways.  That's law.

>> We all know that kids under 13 get around that one, so the law is loosely enforced, let's say, right?

>> Exactly.  Exactly.  I think one thing to think about with the term children ‑‑ and another reason why I think it's so critical to inform adults, especially parents, about data issues ‑‑ is that by the time you put on a pair of virtual reality goggles, someone can say they're 30 when they are 13. 

This, again, is why, for me ‑‑ and I'm not in any way using this term lightly ‑‑ when we put on these goggles ‑‑ because literally, if I'm John, I would put on mixed reality goggles, where I can see the real world overlaid with digital data ‑‑ I could potentially see other Johns as I walk around. 

I could say I'm the real John, but if I don't have a formal way, through a government or different agencies, to point to something ‑‑ yeah, look, here is my passport and driver's license ‑‑ and say I'm the real John, then we won't have that.  And it's the same for children.  And again, I don't use this term lightly, but I quite literally think it's the data version of human trafficking, when our identity is taken.  Right now we think of identity theft as people using our credit cards. 

It's going to be much more severe in five to ten years in the nations or places where people have these advanced mixed reality tools.  It's going to be pervasive: people will start to say, that isn't my identity, it's been hacked ‑‑ that entity is not the real me.  That is why we need to do these things I'm talking about ‑‑ not controlling your data per se, because people think that means intellectual property, controlling the ownership, but being able to say: this is my identity, and I get to control, or at least inform, how it relates to the world.
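
One technical building block behind a formal, checkable "I'm the real John" is a digital signature over an identity claim.  Below is a minimal Python sketch using Ed25519 keys from the third-party cryptography package; the claim format and the notion of a trusted government issuer are assumptions for illustration, not any existing identity scheme.

    # An issuer (e.g. a government agency, hypothetically) signs a claim about
    # John; anyone with the issuer's public key can verify it later.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    issuer_key = Ed25519PrivateKey.generate()  # the trusted issuer's signing key
    claim = b'{"subject": "John", "assertion": "real John"}'
    signature = issuer_key.sign(claim)         # issuer vouches for the claim

    # A verifier holding only the public key checks that the claim is intact
    # and really came from the issuer; a "John" without this proof is unvouched.
    public_key = issuer_key.public_key()
    try:
        public_key.verify(signature, claim)
        print("claim verified")
    except InvalidSignature:
        print("forged or tampered claim")

Any impostor "John" in a mixed reality space could assert a name, but could not produce a signature that verifies against the issuer's public key, which is the gap John is pointing at.
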

>> And I know we have a queue ‑‑ I had a question right down here.  You raised your hand a long time ago.

>> I still remember the question.  I'm from a data protection authority, and we are very much connected with the data protection authorities around the world ‑‑ about 50 of us, even more.  We have a network of regulators. 

I was very interested to hear the panelists' views on this: what's the most crucial, urgent, effective thing that we regulators can do in order to prepare for the future that you have described here so brilliantly and movingly?  I would really like you to pick one ‑‑ perhaps legislate, enforce sanctions, raise awareness among the public, help the industry towards incorporating technological standards that have privacy protections built into them, or anything else you can think of ‑‑ but taking into account that we always have limited resources: what's your number one most important thing that we should be doing right now?

>> Okay.  That was a very good question.  Please give your answer in a sentence or two, and then we will move on.

>> I'm writing this down, actually.

>> Since you are thinking about a real answer, I will give you a meta answer.  What are the criteria of a good policy, of a good regulation?  I think that's one of the things I see as a challenge: what's the objective of the policy, and how do you know when it's successful?  What's the evidence ‑‑ in terms of pilots, experiments, research, science ‑‑ that says this policy is likely to achieve its stated objectives? 

Many policies are stated with the objectives implicit, so it's difficult to assess, before a policy is implemented, whether or not it's going to be effective.  It's kind of a meta issue from my point of view.

>> Okay.  I have a question right to my left here.

>> So, my name is Marcia Hancock, I'm with the ‑‑ my role in this is to look at children and the implications of these connections.  And I just loved hearing John repeat, over and over, that people will make choices about our children because of what data has been collected about them and for them.  People make choices about us ‑‑ so, about our opportunities. 

And as a follow-up to your question of where we could start: the one place where the Venn diagram of our cultures overlaps is that we care about our children.  People are choosing commercial vendors to help teach their children how to use data.  If we can agree on creating some transparency and accountability around children's and students' data, we can start there.  Because we don't have to boil the ocean; we can take a small piece.  Unlike adults ‑‑

>> I'm going to ask you to come to a question ‑‑

>> The question is: what would it take, where you are, to be able to begin with transparency and accountability for the data used in schools?

>> A teacher picks ‑‑ some random teacher will just pick something.

>> So can you rephrase the question, please?

>> So, another way to say that is: how are you, in Tunisia, selecting a process that uses your students' data in a way that you understand?  We know when they go on a field trip, but do you know where the data goes?  People make choices.

>> Well, I think the very simple way to do that is to work with the government, because everything that is official has to be done in relation with the government.  So we can use data centers or something like that which they provide. 

We are not going to be improvising or handling the students' information ourselves; we take the initiative, and we expect and push the government to give us that kind of support, because even parents don't really trust that easily ‑‑ like one time we did training in a school, and the parents were really kind of concerned; they didn't know what their children were studying.  And even when we had summer camps, they didn't trust so easily. 

They have to go out there and see what's going on.  So I think the easiest way is to do it with the government.  If your child is heading for a field trip or something with the school, then that's okay, that's not really a problem. 

And the information that we envision and plan on gathering ‑‑ the children's results, their advancement, their marks ‑‑ is not really that kind of delicate information.  It's a way for a parent to see how their children are advancing: their marks, their absences, stuff like that.

>> I really like the way you scoped the question in terms of the schools.  I kind of agree with you in terms of government, though I think in the US, the notion of school boards and local governance creates a real opportunity: the school itself could be the repository of the data, and be the one that dispenses the data, that gives parents access and control, and that gives the local administration ‑‑ it's not necessarily at a national level; it could be at a local level.  That's a really interesting way to really ‑‑ (inaudible).

>> Yes.  Yes.

>> So the school could become the agent.  Yes.

>> And can I just add, here, a resource: at least in the United States, there's a company called Personal, PERSONAL.com.  They actually have a program for students and educators where they teach kids how to create what is essentially a personal data cloud for themselves, and that way the kids, in conjunction with their parents, can release the data in ways that everyone feels comfortable with.
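
As a rough illustration of that "personal data cloud" idea ‑‑ the data lives with the person, and each field is released only with explicit consent ‑‑ here is a small Python sketch.  The class and method names are invented for this example; this is not PERSONAL.com's actual product or API.

    # A toy consent-gated data store: fields are released per requester,
    # and only after the child and parent have explicitly granted access.
    class PersonalDataCloud:
        def __init__(self, data: dict):
            self._data = data
            self._consent = {}  # requester -> set of fields they may see

        def grant(self, requester: str, fields: set):
            """Child and parent together decide which fields a requester may see."""
            self._consent.setdefault(requester, set()).update(fields)

        def release(self, requester: str) -> dict:
            """Return only the fields this requester was explicitly granted."""
            allowed = self._consent.get(requester, set())
            return {k: v for k, v in self._data.items() if k in allowed}

    cloud = PersonalDataCloud({"name": "Sam", "grades": "A", "address": "elided"})
    cloud.grant("math_teacher", {"name", "grades"})
    print(cloud.release("math_teacher"))  # {'name': 'Sam', 'grades': 'A'}
    print(cloud.release("advertiser"))    # {} -- no consent, no data

The default here is denial: a requester with no grant gets nothing, which inverts the usual platform model where collection is the default and opting out is the exception.
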

>> Thank you.  Okay.  Well, I thought we were going to have to wrap up, but I think we can take one more quick question.  No, we have ‑‑ I had the queue going; unfortunately, I'm so sorry, I saw you earlier.

>> Thank you very much.  I'm really glad to be here, and I will try to be very brief.  It's a very interesting discussion that I have been following, and I would like to join our ‑‑ colleagues.  From the data protection authorities' point of view, it's really important to give us some feedback, because we could serve you and we could serve other users ‑‑ we have ‑‑ a data protection ‑‑ which is international ‑‑ which is open, and we now have a rising number of states joining. 

It's getting to be international.  So, some weeks ‑‑ issued a recommendation on big data; it will answer, or at least try to answer, all your questions ‑‑ what is privacy, what is personal data, and ‑‑ environment, and so on and so forth.  But what we are really looking for is dialogue with your community. 

My intervention was twofold.  I wanted to thank you for this interesting panel, and I would really recommend that you use ‑‑ as a recommendation of an international organization.  It can be very useful for you, and it can enhance dialogue between the communities.  Thank you.

>> Actually, since it's about time to wrap up, I'm going to exercise the moderator's prerogative.  That was one of my wrap-up points: cooperation amongst all the stakeholders, in a meaningful way, to come to these conclusions and make these decisions ‑‑ including, in schools, how the data is used.  That's one of the reasons why sessions like this are vital.  The key is to keep the conversation going. 

The topics I saw that really need to be discussed further, and in quite a bit of depth: accountability; transparency; educating people on the aspects of technology so they can understand it better; educating the policy makers so they can make better decisions; and the fact that we really need to reconsider what the definition of privacy is moving forward, considering the IT aspect at the nano ‑‑ these are the things that we have to do. 

And this conversation must continue afterwards.  I encourage everybody here to take what you learned ‑‑ I hope you learned something and got something out of this ‑‑ take it back home, use it, and build on it, because we need to have a solid 2030.  We don't want the world to be a mess. 

And now is the time to act on that.  So I would like to really thank the panelists here ‑‑ Greg, Louise, Meher, and John ‑‑ and I would also like to thank the moderators and all the staff here, and I would like to thank everyone here.  Your questions were phenomenal and your answers were phenomenal, and I truly appreciate it.  Thank you.