IGF 2016 - Day 3 - Room 10 - WS187: Smart Cities and Big Data: Boundless Opportunities?

 

The following are the outputs of the real-time captioning taken during the Eleventh Annual Meeting of the Internet Governance Forum (IGF) in Jalisco, Mexico, from 5 to 9 December 2016. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

>> Yeah, it is working.  Thank you very much for coming.  I'm sorry about the lack of space.  We were meant to be in a much bigger room, but at the same time, it shows there's a lot of interest in this topic, which is very encouraging for the next hour and a half.  This session was proposed by an organization from Colombia, an organization that promotes the right to privacy across the world, working with international partners. 

I'm really happy to be with this incredible panel, with great diversity in terms of origin and stakeholder groups.  So we're going to be hearing from these different speakers about the work that they have been doing around Smart Cities and big data.  Before we start, I wanted to provide two definitions to make sure we're all on the same page in terms of what we're talking about.  So Smart City being an umbrella term used to describe the use of technology and data to improve the functionality and functioning of a city.  And big data being the application of analytical techniques to search, aggregate, and cross‑reference large data sets in order to develop intelligence and insights.

And so, when it comes to Smart Cities, the topic is emerging in a lot of countries, in different sectors, with different propositions and different actors being involved, including public/private partnerships.  So today we're going to be exploring a little bit the impact of these different initiatives: how they're designed and developed, for what purposes, who's involved, and what might be some of the concerns that are already being discussed, or, actually, some of the concerns that are missing from the current debates.  So we're going to start with our remote participant, Gemma Galdon Clavell, from Eticas Research and Consulting, based in Spain, who's going to be talking about the work that they have done on this topic.

Okay. Gemma, we don't have the sound yet.  We're trying to fix that.  Gemma, can you say something so we can check?  No.  Okay. Well, we'll try and fix the sound.  She might stay on the screen.  But we'll get somebody else to start.  Maybe we can start with Amber, who is from the Centre for Internet and Society based in India.  They have been doing research on this topic in India as well, and he'll be sharing some of their observations on this topic.

>> AMBER SINHA: Hi.  Thank you, Alex.  My name is Amber Sinha.  I work for CIS India, and my research over the last year has been focused on big data, Smart Cities, and national ID projects in India.  And I wanted to begin by talking about ‑‑

>> GUILHERME CANELA DE SOUSA: Hello?

>> AMBER SINHA: Sure.  Is that okay?  So I wanted to begin by talking about a few assumptions in the discourse around Smart Cities which is prevalent locally, and very quickly try to examine some of those assumptions.  So let me introduce the Smart Cities context in India itself.

In 2014, the central government in India announced the Hundred Smart Cities ‑‑ is this better?  Oh, it's not on.  Yeah.  So in 2014, the central government in India announced the Hundred Smart Cities mission, and the particular thing about the Smart Cities mission was that it failed to provide a definition of what Smart Cities were.  But it talked about various initiatives which we can generally see as part of the Smart Cities discourse globally, some of them being smart parking, intelligent transport systems, smart lighting, waste management, telecare, citizen safety, and smart grids and meters.  So the first assumption that I wanted to talk about was the idea of data‑driven technologies being politically neutral.  And that has been a large part of the discourse around big data.  However, the very limited examples we have of actual applications of big data show that they're not neutral by any stretch of the imagination.

Several factors have a great impact on the inferences we get from big data: the design logic of the algorithms, and the inequity of the data sets.  To give an example, a lot of Smart City initiatives rely on crowdsourcing of data, and the idea is that if we have enough data, then it is exhaustive in nature and in some sense self‑correcting.  However, I may point to the example of the Microsoft chat bot which, upon its introduction on Twitter, within a very short period of time turned into a misogynistic, racist chat bot and absorbed the worst of the internet.

The second point is when we're looking at solutions to drive decision‑making for urban planning purposes.  In most cases, aggregate data will be sufficient.  However, we see a complete lack of discourse on the need for solutions to ensure that individual data is not retained, or not collected at all.  And the final point that I wanted to make was with regard to how Smart Cities impact citizen participation itself.  So, much like other global initiatives, there is some notion of citizen participation built into the Smart Cities mission in India. 

However, I think we need to remember that any efforts to create systems which involve everyday participation need to be cognizant of the fact that uses of public spaces evolve with time.  So the ability of citizens to negotiate that space is very important, and any technological solutions that form the interface of citizen engagement have to keep that in mind, and that has been sorely missing from the Smart Cities discourse so far.  In conclusion, I will say that one should be wary of one‑size‑fits‑all technological solutions without clear examination of what the local contexts are and how technology can best be used to solve those social issues.  Thank you.

>> ALEXANDRINE DE CORBION: Before going to the next speaker, just a quick follow‑up question, Amber.  As we're seeing, these Smart City initiatives are often embedded within a very dominant narrative, whether from government or a specific stakeholder.  How has the narrative around Smart Cities and the use of big data developed in India?

>> AMBER SINHA: I think in India also, much like the rest of the globe, most of the discourse around Smart Cities and big data has been fairly uncritical.  The Smart Cities mission, for instance, is still at its early stages, but we already see the power of the discourse being used to circumvent certain processes.  The Smart Cities mission is a central government initiative, and we fear it might be used to wrest power from the local and state governments, power that has been devolved to them over the last 50 years.  We also see industry participation without really going into the question of what Smart Cities are.  So in our research, one of the things we've pointed out time and again is that before we actually embark on initiatives of this large a scale, we need to truly understand what we mean by Smart Cities and how different stakeholders could then participate in that.

>> ALEXANDRINE DE CORBION: Thank you.  The next speaker is Jamila Venturini from FGV in Brazil.  She's been doing research in Brazil and will be sharing some observations from that work.

>> JAMILA VENTURINI: Thank you.  Well, actually, we haven't dedicated so much effort to researching Smart Cities specifically, so I'm going to be sharing some things we have been thinking about based on other research we have been doing.  So the first thing I would like to highlight is that Brazil is marked by great inequality, and that includes geographical inequality, meaning that different cities in different regions are at different stages of development in terms of information policies and digitization policies.  We also observe that the Smart City narrative, to pick up a little on what Amber was saying, sometimes gets confused with other digitization initiatives: electronic government initiatives, transparency initiatives, open data and participation initiatives.  All that is kind of unclear in the general discourse, right?

The other thing we observe in Brazil is that it seems like we as a country, at the national level, still have a long way to go in terms of policies for access to information and digitization in general.  We do not have a unified data protection law.  We have some laws that are insufficient to deal with this new scenario, and at the same time, despite all that, the narrative of the Smart City gains power within this context of a lack of proper policies and a lack of clear debate on the impact these initiatives might have.  The big events, the World Cup and the Olympic Games, were a great catalyst for the development of these initiatives.  We can observe several actors talking about these smart initiatives, and international consulting firms doing rankings of the smartest cities in Brazil and holding up examples, while we still have a long way to go before we have smart policies. 

We see some push for that, and I would say that we have two interesting examples.  One is the installation of unified command and control centers in all the cities that hosted World Cup games; the one in Rio is a center that unifies data from different sources, private and public.  And what we have right now, among state and local administrations, is a set of sparse initiatives; it is not a coordinated effort like the one Amber said exists in India.  There are sparse initiatives promoted by local administrations together with the private sector that allow the state to collect information and data directly from citizens, through their interaction with the smart devices that are distributed in the city, and at the same time indirectly, through partnerships with private companies. 

Now, moving a little bit to the challenges that we see: we see a great lack of transparency regarding these initiatives, even the ones that are promoted by the state itself for the collection of data and the ones that result from public‑private agreements.  I have several examples; maybe I'll leave those for the general discussion afterwards.  But just to stress that despite the fact that transparency is not enough to balance or to solve all the problems that we see with this type of initiative, it is fundamental to elevate the level of the debate and to allow different stakeholders to participate in it on equal terms, because nowadays we know little about these initiatives, especially what exchanges are happening between public administrations and private entities and what terms and rules are governing the processing of this data, right?

>> ALEXANDRINE DE CORBION: Maybe just a quick follow‑up question.  You mentioned the public/private partnerships.  Who are the different private actors involved?  Is it Brazilian companies or is it foreign companies and how is that an additional challenge?

>> JAMILA VENTURINI: Yes.  Just to focus on the partnerships: the command and control center was developed with IBM, and we have record of a partnership with Waze for the exchange of traffic information.  The information was all anonymized, but we know that anonymization is no protection nowadays.  We made several requests for information to the public administration in Rio.  Rio publicized itself as a Smart City with a lot of solutions for urban problems, but at the same time they have not responded to any access to information requests we made, on any topic.  So there is a big challenge there.  That's what I mean when I say we have this contradiction: we don't have the basic type of information policy implemented, and we are already talking about smart solutions and big data, collecting more data without the proper legal framework that would protect it.

>> ALEXANDRINE DE CORBION: Thank you.  We're going to try and bring in Gemma in Spain.

>> GEMMA GALDON CLAVELL: Hello?  It works now?

>> ALEXANDRINE DE CORBION: It's working.  Perfect.

>> GEMMA GALDON CLAVELL: Okay.  You'll have to excuse me because I don't really have a voice.  I'm running out of energy with the year and I'm running out of voice, but I hope that you get the message, so thanks for the invite.  I'm very glad to be able to follow you and contribute to the debate.  Just to give you a bit of background, I've been working on cities and technologies for the last eight years, initially from a more academic perspective, quite critical of what was happening between cities and technologies and how technologies were being used not so much to solve problems but as an end in itself: buying technologies for cities just because they existed. 

Not because there were problems that could be solved through technology.  But lately I've been more and more involved in working with cities on trying to improve precisely that: thinking about how we can make technology for the people, how we can stop doing data despotism, which is doing everything for the people, with the data of the people, but without the people.  And interestingly, in Barcelona there was a change of government two years ago, and technology played quite a big role in that.  In Spain, technology has failed to win the hearts and minds of the population.

Just to give you an example, four years ago there was a plan to build a fabrication laboratory, a Fab Lab, for makers to use.  This Fab Lab was supposed to be installed in a poor neighborhood in the city, and the neighbors stormed the place because they did not want a Fab Lab.  They wanted a food bank.  It was hard for the population to see how technology would solve their everyday problems, so that raised a lot of discontent.  There were several Smart City developments that gathered a lot of protest against them, so one of the mandates of the new government was to rethink the smart strategy of the city.  In the last few months we've been working to create new paradigms and new ideas around technology and cities, making sure we put citizens at the center of innovation. 

Making sure we do responsible big data, if we have to do big data, and making sure that the use we make of data is always in line with the city's priorities and the citizens' priorities.  What's very interesting in this process is also to see how the Smart City worked beforehand: getting into government and actually opening the databases, seeing what was there, looking at the contracts with the private contractors that were developing the Smart City beforehand.

We've seen how the city failed to secure public ownership of the data that the citizens create, and how companies failed to give the city any kind of assurance on how the data was going to be managed.  We've seen contracts with no mention of data; contracts on sensors, for example, where personal data was not mentioned.  And we've seen, basically, a mess.  How can you do responsible big data when you are unable to map the data architecture of your department?  How can you do anything responsible with data when you are unable to map the life cycle of the citizen data that you deal with?  And that's not something that happens only in Barcelona.  That happens everywhere.  That's been our experience with every public administration we've worked with.  Data strategies do not exist.  What we have at the public level is a series of practices with data: different people working with data ‑‑ thank you ‑‑ different people working with data at different levels but without talking to each other, without making sure that systems are interoperable, without ensuring that we have overall control of the data that we manage.

So the challenge we have now in Barcelona, and in the different cities we work with, is: can we create responsible data strategies that win back technology for the citizen?  And when we use the resources of multinationals ‑‑ because some companies have a lot to offer ‑‑ can we make sure that whatever they have to offer, we put it in relation to the city's needs, and that the city's needs and the ownership of the data by the citizens can lead technological innovation at the local level?  That's where we are, and hopefully you'll be seeing us move beyond the corporate‑driven definition of the Smart City to talk about a Smart City that works for citizens.  So we talk a lot about digital sovereignty, for instance.  That could be one of the paradigms you might want to play with or explore.  Definitely we need to move beyond what we have at the moment, which is not working.  Thank you.

>> ALEXANDRINE DE CORBION: Thank you, Gemma.  A quick question, as well.  I think it's really interesting how you're working with the city and the city departments so I was just wondering, are you seeing, then, the emergence of specialized offices or departments within city authorities to work on these issues?

>> GEMMA GALDON CLAVELL: Could you repeat that?  It's very hard for me to follow.

>> ALEXANDRINE DE CORBION: Sorry.  So you mentioned you were directly working with city authorities which I think is a really interesting approach to the problem because often it's just the government or private companies so are you seeing the creation within city authorities of specialized offices or departments working on these topics?

>> GEMMA GALDON CLAVELL: We are doing this at the Barcelona level.  I haven't seen anything similar done anywhere else.  That's something we want to explore, but we definitely see a lot in terms of thinking about data strategies.  From this perspective, we see cities don't really have the resources or the knowledge to approach Smart Cities or technology in a responsible way, and I think there's going to be a lot of help that needs to come from outside to help local departments deal with technology better.  So we see efforts here and there, but we feel very much alone at the same time, so it would be great to have more spaces to cooperate and share experiences on how this can be done.

>> ALEXANDRINE DE CORBION: Great.  Thank you very much for joining us.  Please stay on the line so that if there are questions from the audience later on in the session, you can take those.

The next speaker I wanted to go to is Max Senges from Google, who's been working on the Internet of Things program at Google, to share some of the work and the perspective from industry on this issue.

>> MAX SENGES: Good morning, and thanks for the invitation.  As you mentioned, I work on the research and development side, so I have to point out that I'm not one of the people working on our Smart Cities efforts, but I've read up a little bit and I'm happy to share our general approach and then some examples that I think will inform our discussion.  Most importantly, I want to thank the speakers that already spoke before me.  I think this is a great example of a multistakeholder dialogue.  I learned a lot about these issues; a responsible data strategy is the kind of thing we need to figure out together.  I think the most important message I want to bring to you is: Google doesn't eat your data. 

We're trying to develop projects that serve you and that win customers and users over.  And I think what the IGF is a great example of is that we can identify the emerging issues together and then work on them together to solve them, which sometimes gets lost in the debate about "they're failing to do that, they're failing to do the other."  We're trying to innovate, working as well as we can with the different stakeholders to understand what's going on.

The second thing I want to point out is that Smart Cities, and the Internet of Things, which is my field more generally, have a bit of a bad name, and that's really not correct.  They really have the potential to transform people's lives and to help us work on really big problems like energy, better water use, and other things like that.

Now, a little bit of a view into the machine room at Google, what we are working on and what we see as priorities, to give you a bigger picture.  Of course, the shift toward the assistant is what's really driving our company.  That means machine learning and, yes, using data to make better decisions and to help individuals and organizations make better decisions.  There are a number of efforts under way on unbiased machine learning, to get a better understanding of how we can make sure these learning algorithms are indeed as ethical and balanced as possible.  Similarly, and we'll come to an example later, differential privacy techniques are something we use to make these data sets accessible and useful while acknowledging that data, even when anonymized with traditional methods, can be traced back and de‑anonymized.  It's a revolution, I think, and we're working with some of the best researchers externally and internally to move the field forward.
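The differential privacy technique mentioned here can be illustrated with a minimal, generic sketch of the Laplace mechanism (a textbook construction, not Google's actual implementation; the data and function names below are hypothetical):

```python
import math
import random

def dp_count(records, predicate, epsilon):
    """Differentially private count of records matching a predicate.

    A counting query has sensitivity 1: adding or removing one person's
    record changes the true count by at most 1.  Adding Laplace noise
    with scale 1/epsilon therefore gives epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    # Sample Laplace(0, 1/epsilon) via the inverse-CDF transform.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Hypothetical mobility query: how many sampled trips crossed zone "A".
trips = [{"zone": z} for z in ["A", "B", "A", "C", "A", "B"]]
noisy = dp_count(trips, lambda t: t["zone"] == "A", epsilon=0.5)
```

Smaller `epsilon` means more noise and stronger privacy: analysts still see useful aggregates, while no single record can be confidently inferred from the released output.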

With that, I wanted to point to a couple of examples that Google, or better said the Alphabet holding company, is involved with.  Sidewalk Labs is a company in New York working internationally to bring Smart City solutions to cities.  They have the LinkNYC project, where they're setting up kiosks around the city in New York to give high‑bandwidth internet access to people, let them recharge their phones, things like that.  A company they just recently bought is called Flow.  They try to match supply and demand for parking spaces, as well as de‑silo and improve the understanding of city data and assets.

A bit more detail on two examples.  One is from Nest.  Many of you will remember the Aliso Canyon gas leak that happened at the beginning of this year and the end of last year.  They had to shut that facility down, so the energy company in California was actually facing a shortage of energy, and it contracted Nest and offered subsidized Nest thermostats in order to optimize heating and cooling in that area.  That saved up to 50 megawatts overall, about 1 kilowatt per home, of reliable load reduction in that area.  Just one example of how it can work.  Now, the example I want to go into a little more is the Google mobility project, because I think it actually shows what can be done and how this can be done well, and I'm interested to hear if you have any ideas on how to do it any better. 

The idea is to gather traffic data.  Obviously, many of you have Android phones.  You get these little updates: traffic is high, traffic is low, these kinds of things.  What we do there is gather the data, aggregate it, and make it accessible to universities in Denmark, Sweden, and the Netherlands that use it to optimize the flow of traffic and the use of their infrastructure.  With that data you get improved travel times and less congestion, which means less pollution, and for the cities, reduced cost of infrastructure and fewer outages of tunnels, et cetera.

The team actually, I think, has been in contact not only with researchers but also with civil society to make sure we only share aggregate, anonymized statistics, using privacy‑preserving techniques to make sure those data sets are not going to be de‑anonymized.  I think it's a good example of how we want to be part of the solution and how these things matter as well.

Just to give you a little sense of how I think it can go wrong and right: we had the cookie directive in Europe some time ago, and now we all have to click when we go to a website: yes, I want the cookie.  Well, Google is working on macaroons, which are smarter cookies: better definitions of how long they last and who is authorizing them, so the trust level is higher, things like that.  So I think if we work together, we can actually come up with better solutions than trying to put it in your face and telling you, this is going to happen.  That's not really helpful for anyone.  One last point, just to get the conversation started: I want to challenge the colleague from CIS.  You said we need more local solutions.  I don't think that's really an option in the times of globalization and the internet.  Whatever solution you have will be accessible and used elsewhere.  So, while of course you have to work locally, I don't think local solutions are the solution.  Thank you.

>> ALEXANDRINE DE CORBION: I'll give Amber an opportunity to respond in a second; I just wanted to hear from the last two speakers first.  Max, just a quick follow‑up question as well.  We're not getting to the final stage of auditing or evaluating any of the initiatives being pushed forward, so we don't have the lessons learned; we don't have the information to understand what worked, what failed, and how to improve future systems.  So I was just wondering if that's something that's being included in the process as you're designing, developing, and rethinking some of your initiatives.

>> MAX SENGES: Of course all the projects are evaluated.  What I have to admit is that Google is a very decentralized place, and Alphabet is even bigger, so I don't think there's one technique used across all these different parts.  We can certainly benefit from knowledge sharing, not just inside but also outside the companies.

>> ALEXANDRINE DE CORBION: I'm going to hand over to Niels ten Oever, from the Internet Engineering Task Force, to talk a little bit about the challenges arising from the lack of discourse on these issues.

>> NIELS TEN OEVER: Hi.  Let me first clarify that I'm not speaking on behalf of the Internet Engineering Task Force, before I get into trouble with a lot of nerds out there; I don't want that.  I'm with Article 19, a freedom of expression organization.  And I think the question here is: can we build a Smart City with dumb, invasive, and pervasive things?  How can we do that?  So, for example, I have a smart meter in my house.  This smart meter, which is simply an electricity meter that sends data to the energy company, did not use transport layer security, did not authenticate properly with the energy company, and was not encrypted.  So I could do a distributed denial of service attack on my smart meter and send information to my energy company on its behalf, saying that I did not use energy.  So, is that so smart?  Well, I guess you know the answer.
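The failure mode Niels describes, readings accepted without any authentication so that anyone can report on the meter's behalf, can be sketched in a few lines (the message formats and key handling here are hypothetical; a real deployment would layer this inside TLS):

```python
import hashlib
import hmac

# A naive meter sends a plaintext reading.  Nothing ties the message to
# the physical device, so anyone can forge "meter-42 used 0 kWh".
def naive_report(meter_id, kwh):
    return f"{meter_id}:{kwh}"

# A minimally hardened meter authenticates each reading with an HMAC
# over a per-device shared key, so the utility can reject forgeries.
def signed_report(meter_id, kwh, key):
    msg = f"{meter_id}:{kwh}".encode()
    tag = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return f"{meter_id}:{kwh}:{tag}"

def verify_report(report, key):
    meter_id, kwh, tag = report.rsplit(":", 2)
    expected = hmac.new(key, f"{meter_id}:{kwh}".encode(), hashlib.sha256).hexdigest()
    # Constant-time comparison avoids leaking the tag byte by byte.
    return hmac.compare_digest(tag, expected)

key = b"per-device-secret"                      # provisioned at manufacture
forged = naive_report("meter-42", 0)            # trivially spoofable
genuine = signed_report("meter-42", 312, key)   # verifiable by the utility
```

Even this sketch only covers authentication; confidentiality and replay protection still need encryption and nonces, which is exactly why constrained devices that cannot run such primitives are a step backwards.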

So, in many ways Smart Cities and the Internet of Things are a step back from the internet we have, because systems are implemented in a very limited way, with devices that have no way of being updated or of adding security measures.  And this is the case almost by definition, because smart objects are devices with constraints on energy, bandwidth, memory, size, and cost.  So they are very primitive computers, which is sad.  And this is bad for the internet itself, as we've seen with the botnet attack where digital video recorders and CCTV cameras attacked a domain name system provider.  That's a problem with surveillance at a whole other level, when the CCTV cameras attack internet infrastructure. 

In the early days of the internet, we had this meme: on the internet, no one knows you're a dog.  That is morphing into: on the internet, no one knows you're a fridge.  And it is quickly morphing into: on the internet, everyone knows you're a fridge, which is problematic.  But these problems are very similar to problems we've already dealt with before in industrial control systems, where we operated bridges with Windows 95, and because it worked, we never updated them.  And now people are like, hmm, maybe not such a good idea.

So we should approach critical and potentially deadly systems, such as cars, differently from toothbrushes and light bulbs.  And this is something that is masked in the Internet of Things discussion.  I think we should be very careful with this language of big data, open data, e‑government, and smart cities, because it is language that potentially hides the problems underneath.

We already have great standards for authentication, communication, security, and privacy, and they are defined in bodies such as the Internet Engineering Task Force, the Institute of Electrical and Electronics Engineers, ICANN, et cetera.  And these bodies are slowly starting to understand that they have an impact on human rights, especially as our lives are increasingly mediated through the internet.  In ICANN we now have a human rights bylaw, to see how the properties of the domain name system impact our lives.  In the Internet Engineering Task Force, we're currently working on human rights protocol considerations, and in IEEE there's a great standard coming out on ethics in artificial intelligence.  But this is obviously just the beginning.  I really think operators should assess how they impact human rights.  And it's all going to get more complicated when things are really getting smart, in the age of virtualization and deep learning, because then we go beyond algorithms: the algorithms are changing themselves with data sets, as was already mentioned, and these models are not necessarily understandable to us. 

And in times of quantum computing, where things are not binary but we calculate with probabilities, changes are fast and we're not able to understand these complex systems anymore.  Actually, we're already designing robots that we did not program but that have a model that works pretty well, so let's see how it goes.  And if we mediate and make our policies based on that, we might be outsourcing quite a lot of what we do, and we should not outsource our ethics to algorithms, companies, or anyone else.  Everyone has a responsibility in this, and I think everyone should take up that responsibility; we cannot look only to governments to do that.  Every professional and every user should think of their responsibility for their impact on the infrastructure and on human rights.

>> ALEXANDRINE DE CORBION: I was just wondering, from your perspective in civil society, what's currently missing from that discussion?  Who's missing in those rooms, and how can we address that moving forward?

>> NIELS TEN OEVER: What I think is ‑‑ who here is not technical?  It's not true!  It's not true.  You all have a device with you that is probably the last thing you see in the evening and the first thing you see in the morning.  You probably spend more time with it than with your primary partner.  You are all practically cyborgs.  You have integrated technology so deeply in your life that you need to take responsibility for it.  So you are technical, and you need to take control of that.  That is what we need to change.  We need to take back control of our devices and of our lives, the same way in which we take control of our democracy and our governments.  That way we can regain control of our digital lives.

>> ALEXANDRINE DE CORBION: Thank you.  I'm going to go to our last speaker, who's been waiting patiently for his turn.  We've got Mr. Guilherme, and I'm just wondering: what is UNESCO's role as an organization in this field?  That might surprise some.

>> GUILHERME CANELA DE SOUSA: Thank you.  Good afternoon already.  I think it was interesting that we were moved to this tiny little room and it's so crowded, because this is exactly part of the problem of cities, and big cities, all over the world: people are filling these cities, and this raises issues about security, accessibility, and education.  And regardless of whether they are smart or not, the issue of cities is perhaps one of the most important issues for the next 50 years.  If we don't address the urbanization questions, and if our big dream about the Internet of Things is making our wine refrigerator better or whatever, it won't change anything.  No?  So, of course, Mayors all over the world are scared by those challenges, and under other names, digital, wise, innovative, accessible, creative cities, these are being sold all over the place to those Mayors.  If you go to those big Mayors' meetings, you will see companies selling them all those marvelous things.

So, what is the big issue here?  The big issue is that we perhaps have nice projects, but we don't have policies for this.  You can have this wonderful technology, a computer from Mars, but if you don't know which city you want to have, it's not going to work.  And we don't know what we want those things for from a policy perspective.  So I will give you three examples that illustrate this.  First, governments all over the world are spending billions of dollars on these things they call ICTs for education, but they don't have a clue what education they want to offer to those children. 

So what's going on?  On the relation between connecting -- offering ICTs in schools -- and improving the quality of education, all the research that is available is saying there is no impact.  And that's not because technology is not important.  Of course it is important to offer it in education, but if you don't know why you want to offer ICTs in education, it's not going to work.  So first we need to understand what kind of public policy we want for this.  Second, as was abundantly mentioned here: privacy, ethical issues, et cetera.  Lots of those solutions being sold to Mayors are about safety and security, but no one is really asking what it means to have all those cameras all over cities and people following you all the time.  So what is the policy, again, for that?

Third, participation.  This should be a two-way thing.  So you have those e-government initiatives all over the place.  So, for instance, if Amalia is speeding and she receives a fine, she can be monitored.  She can be sanctioned.  She can receive her fine, and she can even pay the fine through internet banking without leaving her place.  However, if she says, look, I was not in Bogota that day.  I was in Guadalajara and I want to complain.  That wasn't me.  She needs to go walking to the place to complain, and this is not smart.  This isn't participation.  This is about raising more money more quickly.

So the third element: we need real participation, as someone already pointed out, and a two-way policy in those areas.  So, big issue: we need to go from projects to policies, and this brings challenges for scaling up.  We're talking about challenges for cities like São Paulo with 25 million people, so you have these tiny little projects in one school that are super beautiful, but how do you scale them up for school systems with 350 million students?  So that's the real challenge.  And then of course things mentioned here.  Monitoring and evaluation: are these things really working?

Participation, access to information.  And I will finish with this.  I think Jamila mentioned all this about open data, big data -- of course it's all important.  But this does not change the reality that citizens must have the right to ask for information, regardless of the information governments are already putting forward freely.  These are two different issues.  So, this idea that open data and open government will diminish the importance of access to information is simply not true.

The right of access to information is important regardless of how clever, smart, and big-data-driven we are.  So, UNESCO is trying to dialogue with our member states to offer these kinds of concerns, and perhaps the good news is that, talking, for instance, with ministers of education all over Latin America, they are concerned about that.  They want help, because they are seeing that the policies they have been investing billions of dollars in over the last years are not really working, at least not in the way they expected they would work.  Thanks a lot.

>> ALEXANDRINE DE CORBION: If I can follow up with a question, Guilherme: in terms of Smart Cities, we're seeing other actors playing a role as well -- here, actually, in the state of Jalisco, to develop a Smart City.  I'm just wondering how that fits with the work of UNESCO, and what role other intergovernmental entities are playing in this field.

>> GUILHERME CANELA DE SOUSA: ITU has developed work trying to coordinate the different UN agencies' efforts, and I think we could do better, but at least we are trying to coordinate the efforts, and of course the different agencies are acting under their different mandates.  So in our case: education, culture, and also all the issues related to freedom of expression and access to information; and then UNICEF, on how children are connected to this issue.  It's a new thing, but at least we're trying to coordinate and not overlap, though a lot of that still happens.

>> ALEXANDRINE DE CORBION: Before we go to the audience for questions, I wanted to get maybe the panelists ‑‑ and Max has already seized on that opportunity, to challenge and ask each other questions.  Because often you're on a panel and you get all the questions from the audience and not from the other panelists and that can be frustrating.  So I'm going to let Amber respond and then open up to the other panelists if they want to respond to anything that has been said.

>> AMBER SINHA: Thank you, Alex.  I think it was an interesting point that Max made.  Let me elaborate a little more on the need for greater local context, which is especially acute in technologies like big data.

So I think there is some research done by Taylor and Schroeder which talks about, in particular, mobile practices in India, which are a source for big data.  It notes that in India there is very often a practice where mobile phones get passed across family members over short periods of time.  Another thing in India is that there are multiple SIMs registered under a person's name, and the mapping between a person and the number being used changes more often than it does globally.  So that was one example where study is lacking in terms of understanding particular local practices and how big data algorithms should account for them.

Especially in the case of big data, because of this whole discourse that when you have exhaustive data sets you can do away with domain-specific expertise, I think in that case it becomes much more important to look at those factors and make sure there is no imposition of homogenized standards.

Another point I'd like to make is about the kind of behavioral science being used to draw inferences from certain situations.  Those are also areas where there is not a lot of research in various countries, and how those insights can be extrapolated to various local contexts has to be very carefully looked at.

>> NIELS TEN OEVER: Just very, very briefly, I think we were on the same page.  My point was that the technology standards have to be the same.  What we can learn from the data is highly -- Human Rights, I think, are the universal standard that we all have to be held to.  If we start to say, yeah, well, we have this special practice in India, we have to account for that, I don't think we're having the right argument.  So while I'm all for understanding local practices and best practices and then hopefully feeding that into global solutions again -- that's a great thing -- to excuse individual practices that might not be okay by a global standard I think is not the right thing.

>> ALEXANDRINE DE CORBION: Anyone else would like to respond to any other fellow panelists?  If not, we'll go to the floor for questions.  Amber?

>> AMBER SINHA: I think this is a question for both Guilherme and Max.  With regard to -- I think you mentioned certain initiatives within Google itself towards unbiasing algorithms.  I'd like to understand a little bit more about that and what kind of approaches are being taken, because the limited study of existing data sets that we've undertaken in India, for instance, shows extreme sorts of structural inequities.  And especially when we're talking about learning algorithms, as you mentioned, it becomes much harder to audit code in that sense.  So what sort of approaches, if you can elaborate more, are being used to unbias algorithms?

>> MAX: Let me take a two-pronged approach.  One in the direction of Niels, who said, we don't understand the models.  I don't think it's that simple.  I am with you, Niels, when you pointed out the ethics of professional bodies.  I think that's right.  This is a world with a very distributed labor system, and everybody contributes something to society, and everybody has to be held to the same high standards, and then to the particular practices that are put forward for that individual profession, right?  All doctors should have the Hippocratic Oath, and if not, we are in trouble.  So I think that is the right approach.  I would argue that the colleagues who actually are the experts certainly do understand the models.  They don't understand each outcome individually, but they do understand the models, and of course they understand them much better than all of us do.

And in direct response to your question: I am not a machine learning expert.  I am fascinated by it, but I'm certainly not one; this is such cutting-edge stuff.  What I can say about the machine learning piece is that I'm very positively surprised, given how competitive that field is, by how much is getting published and really discussed out in the open.  There's a great paper about safety risks in machine learning that Google has contributed to, where I think they're really talking out of the machine room and trying to socialize and find solutions, because there are obvious risks.

>> NIELS TEN OEVER: Well, I'm so happy that Max tells me there's a solution to this problem I've been working on for a really long time, because I don't understand -- let me be clear.  I've been working with computers for a while, but I have no idea how the full Linux operating system works, because that is millions of lines of code.  If we look at a networking stack, there are so many standards, and they are not policed.  Their mission is innovation; everyone can do what they do.  So the claim that we really understand how this all impacts Human Rights -- I think, Max, you should become a special adviser on that, because maybe we have some understanding of security, but if we understand security so well, then why is it still such a mess?  Why haven't we implemented perfect forward secrecy, right?  And this is obviously security.

Then many people have mentioned privacy, which is potentially an even bigger mess, and then the right to anonymity online.  I think we can almost agree that full anonymity is hardly technically possible, even though it's been acknowledged that it's very important for human expression.  We can take it further.  Algorithms.  Nondiscrimination.  Airbnb.  Uber.  We've seen violence online.  The right to security, but also the right to freedom of association.  I think these are all things that are currently under threat through existing protocols and their use online.  So if Max and Google can help us understand that and also mitigate it better, I'm very happy to put my hopes on Google.  Until then, I'm not sure you're not sucking up all my data.

(laughter)

>> MAX: So the two points I made were about the division of labor -- nobody understands the whole world.  Not that I understand everything, or that we do as Google.  But on your point, I think the most important piece we should never forget is that it's getting better.  Compared to ten years ago or five years ago, the practices are getting better.  We're learning more about how to do it.  Yes, there are big mistakes all over the place.  Human history is full of them.  First it goes wrong, then we fix it.  But if you are an optimist, which I hope many of you are, the world is improving, and the systems and technology are getting better over time.

(laughter)

>> JAMILA VENTURINI: Just to take the conflict out of here -- I'm in the middle of it.  No, I would just like to comment on some things.  One is that in the past few years we have seen the collection of more and more data, so of course you can learn how to deal with that data better and better, but it's not like it's better than a hundred years or 15 years ago, because the model was completely different, right?  It wasn't based on data that much -- on the collection of data that much, of users' data and personal data.  So that's one thing I wanted to mention.  The other thing, trying to dialogue a little bit with what Niels was saying before: we have done research, and we can see that.

I was talking about transparency from the side of the state, but if we look at the context where these private-public arrangements are growing, you see that from the private side there is little transparency, including in the existing documents that we have, the existing legal documents that we have.  So, for instance, we find that it's even difficult to identify which documents are relevant for each of the services that we interact with; that's the first difficulty.  But if we do, and if we find the time to read all of them, we will find language that makes them very difficult to understand -- not only legal language but also computer science language.  Big companies have good practices on that, and we could say that it is getting better somehow.

But at the same time, these contracts are written in such a generic way that they allow a range of uses of data without the need for specific consent.  So, virtually, you give consent to several things whose impact you don't even know -- what is the impact of the use of these technologies?  I wanted to bring that up because we should think about private or public services being offered through private platforms, or about implementing solutions from these platforms, such as analytics, for instance.

That was one of the questions we asked: the use of analytics in public services or web pages.  We found they are being used, and this means transfers of data that sometimes may not be included in the policies someone is making, or in all the structure -- I guess it was Gemma who was commenting about this -- you don't have an idea of the workflow: where is this data going to, where is it coming from, which are the entities interacting and under which conditions?  Is it the law itself?  And, as I was saying, Brazil lacks several rules on that, so it is the contracts that are ruling them, and how do you deal with that?  How do you identify that?  Just to add another challenge to the discussion.

>> ALEXANDRINE DE CORBION: I'm going to give Niels one minute and then we're going to the floor so prepare your questions.

>> Oh, that's not fair!

(laughter)

>> NIELS TEN OEVER: To speak with -- I'm a pessimist because of intelligence but an optimist because of will.  I think what gets us together here at the IGF is the idea that the internet is not done.  The internet is an opportunity, and hopefully it will always remain that: never to be finished, and something we create.  But that means that while we're creating it, we should keep in mind what we actually want it to be.  Do we want it to be this economic money-making machine?  Do we want it to be a spying platform?  Or do we want it to take a Human Rights-based approach and be a part of our public space?  If we want the latter, then we should take this with us in all design options, and we should discuss more with engineers, with companies, with operators, to ensure that we really can instill this thinking in the design and operation of the whole network.  And I'm so happy that I can discuss and work on this together with Max, so we can create a great future.

>> ALEXANDRINE DE CORBION: Great.  So, if there are any questions -- and please do say who you're directing your question to.

>> Good afternoon, ladies and gentlemen.  My name is Jerry Ellis.  I'm here from Dublin, Ireland, from the Dynamic Coalition on Accessibility and Disability.  I'm blind myself.  I'm interested in ensuring that new technology, such as Smart Cities, doesn't make the same mistakes as we did with traditional buildings and traditional cities.  We build them for people who are young and fit and strong and at the top of their health, and then everyone in this room has two choices: you either grow old, or you die.  Which would you prefer?  So, what we need to do is build cities which accommodate people with different needs at different stages of their lives, whether they were born with disabilities or acquired them as they grew older.  80 percent of disabilities are acquired.

How you do that is you use recognized design criteria like universal design.  That means getting people with different needs in at the early development stage; then you build according to standards which include accessibility; and then, when you're testing, you use testing personae or real people with those needs -- older people, people with disabilities.  And then you don't have to retrofit.  If you retrofit, it is really expensive.  If you design in the needs of the population you are serving, it is cheap.  So, that is my wish for Smart Cities.  Thank you.

(applause)

>> MAX: Well, I couldn't agree more.  Accessibility is good design.  It actually makes it accessible for everybody, and I'm working with Vint, who himself is hearing impaired, who has always stressed and promoted accessibility and is championing that inside of Google and the business community.  And to connect that to our earlier conversation, it's also a right.  So I think you do have a good argument here.  And just to explain what I meant with, it's better than a hundred years ago: a hundred years ago we didn't have human rights, so you couldn't even have a decent discussion about what we are having a discussion about, and that is what I mean with, it's getting better.  The whole concept of privacy -- if you think about how it was a hundred years ago in the small village, everybody knew what everybody else did.  The whole idea that we can protect our privacy, and that we have such a thing, is new.  And I wanted to point to some of the initiatives that you pointed out that, you know, are failing, that are not good.

I think that community has talked about Creative Commons for privacy for at least five to ten years, probably, at this point.  Why is it not happening?  I mean, that is really -- there are very concrete things that Civil Society and scholars can do where companies are waiting for it, so to say.

>> I don't want you to be the only one here on this panel saying all the things, but I think we also need to understand the path dependence of history.  So when you say a hundred years ago we didn't have Human Rights -- the first freedom of expression law was established 250 years ago in Sweden, so that's not necessarily true.  One of the things with Smart Cities is precisely -- and I'm not saying you are saying that -- this idea that people think the world started after Star Wars, and that's not true.  It did not start with Apple and Google; it started a long time ago, and those things have a long history of people who died for them.  I think we need to be very careful in understanding those processes.  Of course, I fully agree with the comment on accessibility.  I think this is a gigantic issue, but we need to address it also taking into account the perspective of the poorest countries.  Sometimes the solutions on accessibility are built for rich cities, for high-income countries.

So, we need also to look out for those who are not able to pay for studying and thinking about those solutions, hmm?

>> AMBER SINHA: I also wanted to sort of add to that a little bit.  I think it's sort of interesting that we took the discussion towards Human Rights and how they have developed.  I think another thing to make note of is how the emerging technologies have an impact on Human Rights itself.  So the very idea of Human Rights as we understand it today, and as we've understood it for the past hundred years, hinges on the idea of informed consent.  We now have technologies that strain that, to the extent that there is automated decision-making in which it is very hard to include procedural fairness and which cannot be questioned.  So, in that sense, when we're talking of the tradition of Human Rights, we should ask in what way it is impacted by technological solutions.  I'm not saying we shouldn't have these initiatives -- the opportunities of Smart Cities and big data are immense, and we should definitely use them.  However, I think it's equally important not to be in this hurried stage.

It seems to be a global phenomenon to implement these solutions right away without understanding what their implications could be.

>> JAMILA VENTURINI: Yes, that's perfect.  That was what I wanted to stress.  It's not a yes or no question.  It's just that the way it's been introduced and discussed until now, at least in our countries, in developing countries, has been complicated in terms of participation, and of the conditions we have to evaluate the impacts that these technologies might have on our cities, on several other aspects of city life, and on the exercise of our Human Rights and our collective rights, also, right?

And just to stress, there are several initiatives, as you mentioned, on Creative Commons for privacy.  They continue to be discussed.  There are several challenges for that, as I imagine you might know.  Otherwise, we would have better models.

And I am aware of that as a problem.  I just wanted to point out that we have serious challenges ahead.  There's no simple solution, but it seems like the idea that the private sector and companies are more and more responsible for, and impacting on, the exercise of Human Rights at various levels is getting more evident.  And it seems like we have to think about solutions, and the adoption of the principles that were already mentioned by Niels -- how we implement that, how we think about that -- seems to be more and more important in debates about Internet Governance.  That's what I wanted to add.

>> NIELS TEN OEVER: I'm really enjoying the discussion, and so many great things have been said.  I'd like to go back to the question from the audience about disabilities and giving access.  I think we have a bit of a fetish, or a preference, to only think of users, but as long as we do not ensure that all people, from all genders and all geographical locations, become part of the developing community, we will keep reproducing the white, male, privileged power structures, because we simply do not have the perspective that others have.  That's also why -- I do not understand it -- such a small part of the world has English as their first tongue, but the language for internet protocols is English.

So, it's not only about ensuring that the internet's contents and visible parts are not just in English, but also that the underlying parts become available to other people, so that the internet really becomes a true global network that is built, maintained, and used by us all.

>> ALEXANDRINE DE CORBION: Okay.  We've got Gemma who wants to jump in to respond to the discussion as well and then we'll go back to the floor for the question at the back.

>> GEMMA GALDON CLAVELL: Hello?  Can you hear me?

>> Yes.

>> GEMMA GALDON CLAVELL: I just wanted to follow up on the issue of accessibility, because I think it's really interesting what happens now from a policy perspective.  If someone comes to us and says, let's use technology to improve the city for people who are blind or have hearing disabilities or whatever, and we go to a corporation, they will -- they may offer us something, but they will have a double business model.  They will sell us a service, but also collect the data of that person for other purposes.  So, we have a problem, because from a policy perspective we are not being offered solutions that only address the problem that we have.  Anyone dealing with data refuses to give up on the possibility of having a second business model that is completely non-transparent, unaccountable, and that we do not control.

We have a massive market problem.  The market is failing to provide cities and governments with the technologies that we need.  And we need those technologies to solve people's problems, but companies refuse to just do that because, since they're at it, they will get all your information, and they'll do big data with your health information and then give you a different premium on your insurance.  It's so hard for companies to give up on that that we cannot buy the technologies that we need.  And I would call on all the entrepreneurs out there, all the SMEs, the small companies, all the people working with technologies: start working with technology to solve problems, and give up on data.  Make money on selling the services, not on reselling the data of the users of your service.

We have a ‑‑ really, I can't stress how big this problem is.  We have a massive problem that we cannot solve with the current market solutions because large corporations refuse to give up on the double business model.

>> ALEXANDRINE DE CORBION: Thank you, Gemma.  Question in the back?

>> HENRY CRUZ: Hi, everyone.  My name is Henry Cruz.  I'm part of the youth IGF programme and a law student at the University of São Paulo.  I am using Google Translate, so sorry for any mistakes, okay?

(laughter)

I want to make a comment and a question.  The comment is about the concept of the Smart City.  What do we mean by Smart City?  And why do we need a city like that?  I ask this because I have the impression that the proposal to build an intelligent city is often used not to improve the experience of the city for its inhabitants, but to promote social control and surveillance through the deployment of technological surveillance apparatus.  The hosts of the Olympics are representative of this.  Athens, Beijing, Sydney, London, and Rio de Janeiro have undergone huge changes and been equipped with surveillance equipment for the security of the games, according to a report from the NGOs Access Now and Article 19.

In addition to the disproportionality of the means employed, the proposal is not well justified.  There is little accountability around the surveillance at these events.  Most important is that the technical apparatus continues to be used, collecting data in a non-transparent way.  And this is serious, because this surveillance and data collection are done without the legal protection of a general law on the protection of personal data, which has not yet been promulgated in the country.

I do not doubt the intentions of those who advocate Smart Cities, but I believe we have to be careful about how this concept is operationalized.  In the recent case of data collection in the context of the World Cup and the Olympics, I see at least three categories of data: data collected and released by the government -- open government data; data which are collected and not disclosed for reasons of security; and data collected that, while not being security-related, are not disclosed by the government.  On this, I would like to ask Jamila, especially, if there is any initiative in Brazil trying to identify the reasons why data is kept confidential, since secrecy is not usually justified by the authorities, and -- okay.  I am finished.  Trying to find the reasons why confidential data is not -- and measures on data that are not confidential but are not disclosed.

>> JAMILA VENTURINI: Thanks for the question and comment.  One thing is that the classification of information in Brazil allows data to be treated as secret in specific situations.  The access to information law in Brazil was passed five years ago, if I'm not really wrong -- Guilherme can correct me.  I think it's from 2011.  That means we have a long history of secrecy, and the implementation of this law is itself very difficult.  We have researched this several times.  We have been trying to assess how access to information is being complied with by different levels of the government, and we see that it's being poorly complied with.  This is one thing.

The second thing, very quickly: in the context of the mega events that we have had in Brazil until now, specific administrative norms have authorized little or less transparency regarding the acquisition of surveillance technology.  That's why we have difficulties in accessing that information; the same administrative rules allow this to be secret.  And I don't have information as to what the uses of these technologies are now, considering that the state already has access and that we have some history of abuse in that regard.

>> GUILHERME CANELA DE SOUSA: Very briefly, there is one element of his question that is very important.  Smart Cities is what we call a buzz concept.  Everybody wants to be smart.  You won't say, I'm a stupid city.  Same thing with democracy: no one says, I'm a dictator.  So the point is not only the definition on paper but the indicators we use to define or understand whether a particular city is actually smart.  Again, all the things we have been saying during this session -- privacy, accountability -- we need to really work more on the set of indicators we want to use to assess whether a particular city is smart or not.  Thanks.

>> My question is about a gap that I think exists and makes things very difficult to solve: the gap between the speed of innovation in technology and the speed of regulation.  So if Guilherme, especially, could try to give a glimpse of whether there is anything we can do about that.  Given the complexity of things around the world and technology, I really doubt regulation will ever, ever be able to keep pace with innovation.  And if that's true, how can we mitigate this gap?

>> MAX: So, while the mic is getting to Guilherme, I'll keep it very short.  Ideally, we do not need to regulate.  Ideally, we would instill these values into the people that produce the technology, and we need to understand it's a balance between architecture, law, the market, and public opinion.  So if public opinion and architecture already carry strong values, we do not need to come in with laws and regulations -- so maybe it's good that laws and regulations are a bit slower.  If that really doesn't work, then we can come up with laws and regulations.

>> And that's why it's called multistakeholder governance: you get together and talk about those issues, identify what's wrong and how it can be solved together, and companies and citizens implement it, and everybody gets it done without the need for long, very specific laws.  As pointed out, if we had had a human right to a horse, I don't think we would have been all that happy today.  So ask yourself what you really want from the technology and what rights can be extracted from that.

>> Yeah, I'm not sure about this.  I mean, I think the people involved in the subprime crisis also said, ideally, we don't need to regulate.  And then what happened?  So I think it's real that technology is moving faster than regulation, but I agree with Niels on one thing he replied: the principles are basically the same.  Access to information, transparency, Human Rights.  So it's hard to regulate all these different things, and that's part of life.  Life is hard, I'm sorry to say to you guys, but I think we should stick to the principles when we have doubts about specific regulations.  When it comes to the principles, we can say, okay, freedom of information laws are important, and cities must follow them as well.  Privacy and data protection, in the same line, are important regardless of what we are talking about, or of a new technology that we still cannot envision.  So I think the response to that is that we need to stick to the principles we have already agreed on.

>> ALEXANDRINE DE CORBION: Are there ‑‑ sorry.  I was going to say, are there any more questions?  We'll take this last question unless there are any burning ones after.

>> MARIA PASCANALES: Hello, everybody.  Maria Pascanales.  Listening to all this conversation from the Human Rights perspective, one piece that I'm missing in the discussion is the power of all this collection of data for the Smart City to empower the citizens themselves.  It's not just about how this big data can be used for making better choices in favor of the citizens, thinking that the city, or the authorities of the city, are the ones that have all the information and have the final say in deciding what is good or bad for the citizens themselves.

But all this data has huge power to provide some of the benefits that we have said can come from these technologies and from this data harnessing -- to empower citizens to make choices and, in some way, change the way in which the city is run -- and there is all the necessity of providing some mechanism for citizens to get back their data, to move their data to other uses that can be important for them.  So, I'm missing, in all the initiatives that involve the use of big data in the city, how to find a way to make the data fully accessible, fully portable, and fully interoperable.

>> ALEXANDRINE DE CORBION: After that, I'll be asking for closing remarks from the speakers, where they'll have a 30‑second elevator pitch with an unlimited budget to create the Smart City they want to see.

>> He already has an unlimited budget.

>> If I only had.

>> GEMMA GALDON CLAVELL: On the issue of regulation, I just wanted to stress something that I think had to be mentioned.  Can you hear me?

>> Yes.

>> GEMMA GALDON CLAVELL: On these regulations, don't forget that privacy and the issues we've been talking about are not just individual rights, they're also collective issues.  The reason why you can't have the company and the individual coming up with the regulation is because neither is thinking about the collective.  When you share your data on your phone, you exchange your data for the data of everyone you connect with.  So someone has to be thinking about how to protect the collective, and that's why self‑regulation doesn't work: because it continues to ignore the collective of all those involved in technology.  Think about an automated car.  Who is to make the decision of whom to kill first?  The driver, or the passer‑by?  Who decides this?  How do we make that decision?  If you let the company and the buyer of the car make that decision, they will decide that the algorithm always kills the passer‑by.  That's why we need someone thinking about societal issues.

>> This is my 30 seconds.  So I do not think that governments necessarily have the best solutions; we've seen a just horrible IP Bill passed in the United Kingdom.  If we give governments the opportunity, they do not all necessarily have the best principles for their citizens at heart either.  So I don't think regulation is the solution.  It may be one part of the solution, but we need to keep working on this together, and money is also not the solution.  Money got us into the economic crisis, and regulation may have partially helped us out of it, but we're definitely not there yet.  So it's something that we as mankind need to think about: the future that we want, instilling those values in every step that we take there, and not getting distracted by money or regulation.

>> Very quickly, on the car example: that was not helpful, because the company will not make that decision by itself but in dialogue with the stakeholders, so we will not simply decide to kill all the bystanders.  That framing is not going to be helpful either.  I really think it's too simplistic.  It's an open discussion, and the most important part is that a lot fewer people will get killed or have traffic accidents if we have self‑driving cars.

>> ALEXANDRINE DE CORBION: Okay so I'm going to bring it down to 30 seconds because we're low on time.

>> MAX: I can do it very fast and similarly to Niels.  I don't think we need all the money; we just need to do what we always do: trial and error, with different experiments.  I think money would be helpful to support Civil Society and researchers to ensure that we get all the different perspectives, and that's it.

>> GUILHERME CANELA DE SOUSA: I think the challenge is we need to go for development, democracy, and civil rights at the same time.  And that's the difficult thing.  If you manage to do this, we got it.

>> AMBER SINHA: And I think what any Smart City needs to address fundamentally is what I would term the big data divide, where data collected from the subjects is not used for their benefit.  So I think that value has to be instilled in a Smart City: every decision has to be taken keeping in mind the data subjects, as individuals and as a collective, so that their interests are taken care of.

>> JAMILA VENTURINI: Okay, in line with Guilherme, I think a Smart City should be Democratic and environmentally responsible, and I would hope some money could solve that, but I don't think it will, so it also needs participation.

>> ALEXANDRINE DE CORBION: You have 20 ‑‑ no.  No response?  Oh, she can't hear us.  Okay.  Okay, just to wrap up, because we've been told we have one minute here, so I won't do justice to the amazing expertise that was shared by the speakers and the audience, but I wanted to pull out two to three things that came to mind and need further discussion.  One is around understanding the systems, models, and codes from an economic, technological, legal, and sociological perspective as well.  Going back to the basics: why do we want this system in place, for whom, created by whom, and what do we want it to look like?  Bringing the citizen back to the center of that, the human perspective in all this decision making.  And then, when thinking about regulation: yes, there are legal frameworks, but there is also the need for ethical standards in how the technology and systems are developed.  And I think what's really relevant to that last point around regulation is that we need to take stock of what already exists that we can pull together, rather than drafting new laws that will take a while to draft and implement: to understand what's already within our reach, what already applies to the concerns we have, and how we move forward with that.

So, on that note, hopefully a positive one, we're going to close this session.  Please feel free to reach out to the different speakers afterwards as well.  Thank you.

(applause)

(Session was concluded at 1:05 p.m. CST)