IGF 2018 - Day 3 - Salle VIII - WS346 Refugee Rights and Emerging Technologies: Building Digital Futures for all

The following are the outputs of the real-time captioning taken during the Thirteenth Annual Meeting of the Internet Governance Forum (IGF) in Paris, France, from 12 to 14 November 2018. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

>> MARIANNE FRANKLIN:  Okay.  I would like to call the meeting to order, if I may.  Good morning, bonjour, Mesdames et Messieurs.  Good morning, ladies and one gentleman.  My name is Marianne Franklin and I'm chair of the Internet Rights and Principles Coalition.  Now that we've gotten past security, I would like to begin this session, Refugee Rights and the Online Environment: Building Digital Futures for All? With a question mark.

     We have a high-level panel, and this panel is comprised entirely of women.  Sadly, our U.K. representative has been taken ill and is unable to join us.  So our governmental stakeholder community, which had the right to reply, is not able to join us.  This is a politically sensitive issue; I just note that for the record, if I may.  We wish Andrew Toft a speedy recovery.

     So we have a small group here.  I'll introduce the panel, and then I'm going to ask everyone at the table to quickly say their name and affiliation so we can have an open discussion, which I think will be very nice.  It won't take too long.

     But just before we get started, I would like to introduce Astrid Van Dyck from Google.  Thank you for joining us as our technical community voice.  You are not expected to speak for all.

      Okay, and Valentina Pellizzer, APC Women; and Jean Guo from Konexio from France; and Eimear Farrell from Amnesty International. 

     I'll give a brief introduction as to why we are here and how this connects to what the Internet Rights and Principles Coalition has been doing.

     Please say your name and affiliation into the mic; you need to turn it on.  Thank you.

     >> Hello, my name is Valentina.  And I'm a student in communication.

     >> Hello, I'm Faith.  I am from the Hong Kong youth Internet Governance Forum.  Actually, I run a student campaign that advocates refugee rights in my school.

     >> I am Esther and I'm from a French agency.

     >> Hi, everyone, I come from Ecuador.  My name is Danella.  I started the Youth Observatory programme.

     >> Hi.

     (Garbled audio.)

     >> I'm from the -- University and I am a student.

     >> I'm Elena and also from -- and a notetaker from this session.

     >> Hello, I do information on human rights and we are here to state our position.

     >> MINDA MOREIRA:  Hi, I'm Minda, here representing the Internet Rights and Principles Coalition, and I am currently the co-chair with Marianne.

     >> Hello, everyone.  I'm the product manager for an NGO fighting against sexual exploitation of children, and an Internet project about refugees.  That's why I am here.

     >> MARIANNE FRANKLIN:  This is a fantastic range of interests and a strong representation of young people.  Elena, I would like to note that for the record.

     >> Hi, I'm Alex Baldwin with Google.

     >> MARIANNE FRANKLIN:  Two Googles!  That's great!

     (Laughter.)

     >> MARIANNE FRANKLIN:  Look, I just want to start formally a little bit and then I'll explain why we are here.  Today, according to the UN High Commissioner for Refugees, over 65 million people are refugees or internally displaced people.  Another report looks at how the Internet can transform humanitarian action.  This report found that Internet access has become, I quote, as important as food or shelter.  It is on record as the most important non-food item, in the context of natural disaster and global warming.

     The so-called refugee crisis raises connectivity and accountability issues, very much integrated with the issues around first contact and the emergency levels of human displacement at refugee camps.  There are a variety of digital tools that can collect data, and we need to take note of this data collection.  I want to flag that also, if you can.  In order to respond to --

     These become life-long needs, as children have been born in these particular kinds of so-called temporary dwellings.

     So we have an issue here about how human rights that are recognised by the UN work or do not work in the case of anyone who has had to flee their home.  I am using general terms because "refugee" is a specific term.  There are those who seek asylum, those who have the status of refugee, and all those others who are not yet granted any status, who in some political discourse are called unwanted, terrorists, and all sorts of other terms which I will not indulge in right now.

     So we have 65 million people who have human rights under international law.  However, if they are designated as refugees they fall under the international Convention Relating to the Status of Refugees.  We have a double problem here.  Our reports, and evidence from the ground, are showing that basic rights people rely on, keeping their phone to be able to access social media platforms, contact family and friends, download photos, keep in touch with legal counsel, the ability to enjoy themselves, are being withdrawn within certain parts of the European Union, the U.K. and other parts of the world.  In other words, you become a refugee and it appears your rights have been de facto denied.

     I'm saying this because this is actually on record.

     So we have a difficult task before us today: the way that these tools and social media platforms (garbled audio) the double life of these technologies.

     They are labeling people who do not fall easily under the citizenship rubric.  What happens if you are not permitted to be a citizen?  We are actually getting to the heart of human rights and values, looking at the heart of what has begun, and we have data on the speakers.

     This follows on from the two introductory workshops we held in 2016 at the European Dialogue on Internet Governance.  We were able to bring in a number of refugees, former asylum seekers who now want to be regarded as newcomers because they are making new lives in new countries.  Some of them are working and some are activists, and they were able to let us know that these people have lives and hopes for the future.  And that is an important point.

     Because of time, costs and overhead we were unable to bring anybody here today.  I feel that is unfortunate.  There may be online participation to permit (garbled audio) -- we have people working with grassroots groups there.

     Without further ado, I'll let you know the order of speaking, if I may.  Please welcome to the table: we will start with Valentina Pellizzer because she has some visuals for us.  Then I would like to move to Jean, then Astrid, and then Eimear.  We will have three minutes each for anyone at the table.

     Quickly, tell us who you are and what your -- the newcomers.  I want to have your name and affiliation on our record.  Thank you.

     >> Emil, I work for the Danish Institute for Human Rights.

     >> MARIANNE FRANKLIN:  Taking your jacket off.  Tell us your name and affiliation, okay?

     >> I am Audi.  I work for the public policy agency, central, defending human rights in the digital environment.

     >> MARIANNE FRANKLIN:  A very good start.  Two to three minutes for the short round.  Valentina?

     >> VALENTINA PELLIZZER:  Okay.  Good morning. 

      (Poor audio.  Captioner switched to Webex)

     >> VALENTINA PELLIZZER:  I came back to working with refugees when we had a new refugee route in the Balkans: people from Syria, Pakistan, Afghanistan.  They were all coming, and from Africa too, through the Mediterranean.  People drowned in the Mediterranean, the biggest cemetery we have on earth.  And they come.

     So refugees are a very complicated conversation because of how we try to define them.  When I say we, I say we with this skin; this skin is white.  So I want to take up what I think is a conversation that is behind everything: the conversation around whiteness.  Europeans don't talk about whiteness; they talk about sexism, but they don't talk about how whiteness is at the core of why refugees exist.  Before talking about the technology of today, we need to understand the causes of yesterday, and that is what we call colonialism, and capitalism now.  If we do not acknowledge the responsibility of white Europeans in colonial times, we cannot talk about what is happening today.  Because what exists today exists because of the greed of white people and all that those colonies generated.  I will not go through all of that, but I think it is important.

     Two years ago I participated in an event talking about the future, where we were talking about infrastructure and connection.  We were talking about drones, migration and digital care.  If we can scroll down to see the full title: I suggest that if you just search for this title, you will see.

     What we discovered all together, as a mixed group of people, was that the infrastructure of data surveillance, surveillance through data, is paid for with tax money, with the money of the public, to surveil and to collect data about people, about the refugees to come, because while people are moving from one place to another in mass migration, they are still not refugees.  Freedom of movement is what I enjoy every moment.  It's what each and every one of us enjoyed coming here.

     The freedom of movement for mass migration doesn't exist.  It is criminalised.

     There is an infrastructure of data surveillance, collecting with drones, monitoring each and every step of the people.  Each and every data about themselves.  So their shadow is so, so heavy that it is a cloud.  It is above us all.

     But when we go to the other side, the infrastructure of connectivity, the infrastructure of care, is fragile.  The infrastructure of care is based on civil society organisations, on activists who put up wifi hotspots to help people on the move to be in contact with their friends, with their family, to try to save their lives, because we need to be aware that people do not move because it is a pleasure.  They might reach pleasure where they arrive, but when African people walk on foot to reach Europe, it is more than a one-year walk.  It is not the one-year walk we follow on social media because one person decided to walk the entire world with the passport of privilege.  So I think it is important that we reflect on these two extremes, this polarity of the fragile, and refugees and people on the move.

     Let's call them people on the move, who need to continue to be informed, to understand where they are and how they can arrive where they want to arrive.  It is very fragile.

     Then we have public-private partnerships in big camps.  Camps are not a place of choice.  Refugee camps are places where you cannot live.  There are people who live in refugee camps their whole lives.  I remember when the camps in Palestine were founded in '67.  I am 51; I was born in '67.

     In those camps we have public-private partnership, and public-private is a polarity: a big corporation and the government.  Where is civil society?  Where are the citizens?  Where are the refugees themselves in all this polarity?

     We cannot have a bilateral partnership without having a third party monitoring it.  Of course, it is not that civil society has the resources, the skills and the knowledge.  But civil society should be the provider of solidarity, and civil society means the refugees themselves.  They are citizens and they have agency.  Some of them walk from Sub-Saharan Africa towards the future with far more urgency than any of us, who go crazy if our plane, bus or train is one minute late.  Thanks.

     >> MARIANNE FRANKLIN:  Thanks, Valentina, you've thrown down the gauntlet.  We are moving now to Jean.  Yes.

     >> JEAN GUO:  So I am happy to be here today to exchange with everyone and present a little bit of the work we do at Konexio.  Our mission at Konexio is to make sure that those who are most vulnerable, including refugees, have the opportunity to have access to digital skills and rights.

     I can start with some of the facts that we see.  We do studies with the students that we work with, and there is also a lot of data from outside as well.  We find that up to one-third of disposable income amongst refugees is spent just on connectivity: to pay for phone plans, to pay to have access.

     If you ask them, what is the one possession that you would take in extreme circumstances, a lot of them will say: my mobile device.  Because for them, that is a connection to the world, connection to resources, connection to information.

     And then, just within the student population that we have worked with, three-quarters of them do have smartphones.  I'll get into the different levels of usage and what the functionality behind that is as well.

     But I think what is important for us, and I'll go into some of the important distinctions and then into what I think are three key rights refugees should have in terms of their digital rights, but the first one is to make a distinction between levels of usage.  Oftentimes when we think someone has mobile connectivity, they are connected, that's great.  But we also need to go into the nuances and realise there are evidently different levels.  Some of our students who are a little bit older only use their smartphones to make calls.  They don't use them for researching information.  They don't use them in the same way a young 20-year-old person might.

     There are obviously differences there, but I think the first step is to make a distinction about to what extent people are really using that connectivity.  When we ask questions, when we do surveys, there needs to be a little more nuance about usage as well.  Not simply are you connected, but in what ways are you using those types of services and connections.

     Second, I would say, and this has been something that has also been thrown out when I have attended conferences and other events, it is important to start talking about, as you were mentioning, protection of personal data.  For example, we were recently at a hackathon based in Rome at the Vatican, where one of the issues they were working on was migrant and refugee rights.  There were groups who had come up with devices or applications they built at the end of this hackathon that were supposed to really think about how we can hack solutions.  But a lot of that needs to come along with a first layer of education and information: some of the information you are able to provide needs to go through a filter of, would this put someone in danger, for example?

     There was, for example, an application using facial recognition, where they would take photos of someone and try to match them to reunite family members, but I think there is a layer of data privacy and data protection that needs to be thought through when we develop solutions.

     Also, when we attend hackathons and conferences where some of the goals are to create solutions, there also need to be people there who have gone through these experiences to tell developers what is important to take into account: what are the dangers and risks associated with connectivity.

     Finally, in regard to rights, I think there are three key ones for us.  The first is the right to information.  For example, our students are based here in Paris; we work with hundreds of students in the local communities.  The first right is to make sure that they have access to information: to know where they can go for basic services, where they can find shelters, where the NGOs that work with them on the ground are, where they can find meals, including donations.  Having them connected is very important.

     The second is education.  To go a little bit further than just mobile connectivity and think about what is available as well.  We talk about upskilling today.  For us, a core part of what we do at Konexio is, through digital skills and digital literacy, to inform and upskill individuals to make that bridge towards long-term integration.

     So giving them opportunities to gain these skills, where they become productive members of society and really work in the ecosystem to stay and to actually be fully integrated, is very important.

     The last one for us, a step further for those who do obtain asylum, is to open up more opportunities for professional integration.  In France, unfortunately, there are barriers in the labour market that make it very difficult for certain individuals to gain access.  For example, when we talk to a lot of companies and recruiters, they say, we don't think they speak French, and we often get the question, are they able to work?  We wouldn't be sending them candidates if we didn't know that they had the necessary status, but there is a lot of doubt.  That is probably not representative just of the community here but also of other communities, where that is always called into question.  But providing more pathways for them matters, because professional integration is at once social, in that it opens up another network, and economic, in the sense that if you are not able to earn a stable living, you are not able to go out, meet people and become part of the community.

     >> MARIANNE FRANKLIN:  Thank you very much.  Very informative, thank you.  And inspiring.  Thank you.

     Moving to Astrid Van Dyck from Google.  Thank you for being here.

     >> ASTRID VAN DYCK:  Okay.  Hi, good morning.  Thank you so much for including Google on this panel, and I'm glad that you brought up partnerships immediately.  That's a huge part of how Google is contributing to this massive problem and to finding solutions.  I'm eager to get into the conversation about data, protection and integrating refugees.  I want to start with basics about why Google is part of this important conversation.  Thank you again for including us.

     Google's basic business mission and function is to provide information, through our basic product, Search, but also Maps and Translate.  That's the heart of what our business is: providing information.

     And we are committed to doing that for everyone.  Over our 18 to 20 years of existence, we have been re-evaluating what that means for the most vulnerable and the broadest populations, and how we can make our products accessible for everyone, including refugees.

     Since 2015, Google has been providing support for 800,000 refugees, which sounds small compared to the overall numbers we are talking about, through a few different avenues in terms of access to information.  Immediately after the crisis, as you mentioned, people want their smartphones.  So with some of our basic products, like Maps, we made low bandwidth maps so that you didn't need a lot of data to have maps.  We have a tool called Person Finder to help people find each other when they are displaced.

     Google Translate is also a basic product that we found we could immediately deploy and improve in low bandwidth environments.  Those are some of the things we did immediately in 2015 as the crisis was emerging.

     In addition to that, we provided access in some of the camps, and Chromebooks, through partners.  We work with NetHope, with the IRC, with another organisation called Libraries Without Borders, and with Kiron.

     In addition to access: educational resources.  We found these were really important, and an area where Google could add value.  We found, and we are interested to hear your perspective a few years later, that when some of the refugees were actually getting an education or taking classes, there was no way to get credit for it.  You would lose a whole generation even if they were taking classes.  Online was a space where you could have accreditation and organisation of the education that they did get.

     So that was a way that we found we could partner and provide value.

     I think that we are committed to finding solutions to this massive problem, and we think technology can help find them.  Moving on to the massive power for good that we think data can have: of course, there are huge other elements to that discussion in terms of protection of rights and protection of privacy, but we are finding in a number of global challenges, such as famine prediction or health, that the power of the data we can use to predict patterns and therefore find solutions is an incredible opportunity for society.  We need to be part of getting that right without the downsides, the problems for privacy and rights.  We would love to talk more in this panel about AI and how it can help.

     Then another part of the discussion from Google's perspective is, as you mentioned, integration.  A lot of what we are doing on a variety of our platforms, but in particular YouTube, which Google owns, is to provide a space for what we call counter-narratives, positive narratives about immigration, lifting up voices on the Internet.

     So to counter some of the more negative narratives out there, you mentioned terrorism, we are finding refugees' voices.  We have a programme called Creators for Change: people within the community who are trusted, lifting up their voices both online and as voices for change around the world.  We think the best solution to negative, hateful narratives is more narrative, positive narrative.  That is another part of the discussion where we think Google can add value.  Thank you for including us.

     >> MARIANNE FRANKLIN:  Thank you very much.  Very informative.  Turning to our final speaker, Eimear from Amnesty International.  Thank you for being here.

     >> EIMEAR FARRELL:  Thank you, and thank you for the invitation.  I work for Amnesty Tech, the technology and human rights programme at Amnesty International.  I wanted to start off by saying something positive; actually, I was going to quote from the UNHCR report that Marianne already quoted for us.  You will perhaps hear more of the negative perspective from me, but I will end on a positive note.

     We have heard many examples of tech for good and how technology can really be used to improve the lives of refugees and migrants, but I think it is also really important to look at the need to do no harm in this area.  So maybe I can first focus on that part.  You probably know this quote from William Gibson: the future is already here, it is just not very evenly distributed.

     I actually think the future is already here in refugee camps and reception centres around the world, because what we are seeing there is the testing and trialing of technologies on very, very vulnerable populations, where there can be quite serious, sometimes life-or-death consequences for individuals.

     So we do need to approach these questions with care.  To give some background on why Amnesty is interested in this topic: at the UN level, two very important global compacts were negotiated this year, the Global Compact on Migration and the Global Compact on Refugees, both including language around data and around biometrics.  Together with a number of other humanitarian organisations and NGOs, we advocated for improved language around privacy and data protection in these areas, especially in the case of biometrics.  And what we are seeing increasingly is the use of technology and biometrics around border management, around management of migration flows, and around management of life and registration in the camps.

     So what we are advocating, and have been advocating, is a human-rights-based approach to the use of these technologies.  That means looking at questions such as participation of the people who are affected, and equality and nondiscrimination.  I'll explain in a moment why those are particularly important in the context of biometrics.  Also transparency, the right to remedy, and accountability.

     What we realised during our advocacy efforts was that actually not many NGOs were looking at these questions.  So what we are hoping to do, and we are at the very early stages of asking these questions, is to start organising a coalition of interested actors in this space, to start trying to unpack and understand what the gaps in knowledge are, what we can do in terms of having more practical information and examples of implementation, what the risks and harms are, and also potentially proposing other solutions.

     Some of what is driving this: politically, we are seeing the securitisation of immigration and migration.  We are seeing a drive towards using technologies, towards innovation in humanitarian contexts.  And we need to bear in mind that digital transformation is not only about tech; there are other aspects we need to be paying attention to.  Where refugee camps become testing grounds for these technologies, it is not a marginal issue.  Once these technologies, for example biometrics, have been tested, they are often rolled out in other contexts.  We see governments around the world implementing national ID systems, and in some cases, as with the system in India, we have seen quite serious privacy challenges to those.

     So this is actually a question that affects all of us.  On digital identity in particular, there is momentum towards providing digital identity.  That can be a good thing in many contexts.  It can give refugees and migrants access to important services that they need.  It can empower them in various ways.  But ideally we need to put digital identity in the control of the people whose identity it is.  There is a push at the UN level with the SDGs, one target of which aims for identity for all.  We see many vendors entering this space and pushing certain solutions which may not really have been tried and tested.  So I think we need to exercise some caution in this area.

     So what we really want to do is look at how refugee camps are becoming labs for some of these experiments, for some of these technologies.  In many cases in the camps, governments are partnering with private sector organisations.  Often there is no transparency about the types of contracts in place or the data sharing agreements, and we don't always look at what potential uses of these data may emerge in the future, for example in the potential convergence and linking of these data sets.

     One thing we would be interested in doing is trying to create transparency around the contracts that are in place.  There have been audits of the World Food Programme's biometrics and of UNHCR; those audits were independent and critical.

     Also, just to remember the private sector: states obviously have responsibilities in this area, but the private sector is also responsible, and the UN Guiding Principles on Business and Human Rights outline a lot of those responsibilities.  But we can't -- so the business model of some of those private sector organisations may be quite different from what the government might intend, or what the humanitarian organisations might intend, with the data storage and how it is used.  Afterwards, we don't know how secure some of the data storage is.  With UN biometric data, the biometric samples are stored centrally in Geneva and in the cloud, but often there is not sufficient training of the staff who do the uploading, so there are potential security risks around that.

     From the perspective of surveillance, these are obviously very vulnerable populations, some fleeing on the basis of their identity.  So their identity is critical to their protection.

     And we know anecdotally that in numerous cases in the camps in Jordan, the government has been putting pressure on the people running the camps for access to that data.  There is again anecdotal evidence of refugees being tracked afterwards.  We know that has happened in other countries.

     We would like to explore this more.  It would be great to have your views and your inputs.

     >> MARIANNE FRANKLIN:  Thanks.  Very substantive contributions.  We have about 15 minutes, so we'll open it to the floor.  Keep comments brief, please.  It would be good if you have a question.  We are trying to come up with some concrete action recommendations out of this workshop, because this is the third workshop we have hosted, and we would like them on the record of the IGF: this is what we are asking governments to acknowledge or to do; this is what we are asking the tech community; and this is what we are asking civil society groups to do.  This is our challenge today, in 15 minutes.

     Before we get there, we have ten minutes of questions or comments from the floor.  I believe we have remote participants; is that correct?  No?  Just checking.  Do we have any comments?

     Okay.  Let them know they can comment.

     Anyone from the floor like to comment?  Please.

     >> AUDIENCE:  Emil from the Danish Institute for Human Rights.  I am pleased that the Guiding Principles were mentioned just now.  I have a question for Google but also perhaps for the other panelists.  First of all, when you develop your programmes, Person Finder, for example, do you already, maybe you're not the person to ask, but do you already apply a kind of human rights lens when you develop these programmes, as a start?  Or is it more your internal ethical kind of standards?

     Also for the other panelists: do you think a human-rights-based approach to all these technologies is a good start, rather than going with the old Facebook mantra of move fast and break things?  Rather, slow it down and start by thinking about what the consequences are?

     >> MARIANNE FRANKLIN:  We will just gather some questions, but this is slow tech.

     A couple more.  Could we have a couple more questions from the floor before the responses, because of time?  Yes, please.  Thank you.

     >> AUDIENCE:  Hi.  I'm Faith.  I'm a representative from the Hong Kong Youth Internet Governance Forum.  Where I'm from, in Hong Kong, even though it has one of the smallest refugee populations in the world, the acceptance rate is astonishingly low: the acceptance rate for refugees is 0.6 percent, compared to some European countries where it's 75 percent, a huge disparity.

     Also, because not all asylum seekers are granted status in Hong Kong, they are only granted $3,000 a month, which I think is completely inadequate for them to obtain basic necessities.  Because of that, I think a lot of the time the government might not be doing enough for refugees.

     So I would just like to ask a question of all the panelists: how would you collaborate with the government?  What exact methods are you going to implement, in collaboration with the government, to ensure that the rights of these refugees are protected, including their digital rights?

      Thank you.

     >> MARIANNE FRANKLIN:  That's the big question.  Bear that in mind.  Thank you very much.  One more question from the floor.  I think I saw another hand somewhere.  No?

     Oh, yes, one more.  Oh, over that way.  Go for it.  Two more, yeah?

     Go, please.

     >> AUDIENCE:  Sorry, it is not really a question.  It is much more a remark, because I heard, I think it was from Google, about the positive material that we can find on YouTube.  I want to mention the negative aspects of these narratives.  From our association, we know from experience that narratives can be used by traffickers, for example, to attract people from Nigeria to France.  You have girls saying that France, or Europe in general, is wonderful and that you can find work everywhere, et cetera.  Then they are moving, flying to France, and when they arrive, they are being trafficked.

     So I would like to mention the negative aspects of the narratives just to -- sorry, I'm an advocate.

     >> MARIANNE FRANKLIN:  Important point.  The first question is directed to Google.  I would like to give Google the chance to answer.  Yes, please, Astrid.

     >> ASTRID VAN DYCK:  Thank you for the question.  Yes, we apply a human rights lens to all of the products we launch.  We consider many obligations we have.  But human rights is a big part of that for all of our products.

     This past June, our Google CEO announced seven principles around AI that Google is committed to.  They are all worth talking about, and they are being applied to every product.  They are evolving.  We know this is a dynamic space that is going to continue evolving, and we all want it to evolve.  As you said, it is not just about technology but about people.  But we are committed to these principles, and they also include things that we will not use technology to do.  We will not use technology for surveillance or for anything unlawful.  These are guided in large part by UN principles and our commitment to those.

     So the applications we will not pursue include war or harm, or anything that contravenes international law or the public good, including human rights.  I would point you to those principles, if you haven't seen them; there are seven of them.

     Again, it was almost a two-year process to work out how to distill those so that we could have the benefits of the technology.

     There's a lot, it's always evolving.  We need your input.  We need partners to keep making sure we learn in this dynamic space and keep growing moving forward.  Yes, we take a human rights lens to all our products.

     On YouTube, it is a very important point on trafficking.  We've done a lot.  I can speak mostly to the U.S., but of course there is a global context, on what we've done to fight trafficking.  We have one person in Washington focused entirely on trafficking and on everything we have done to keep it off our own products.  That is an important point I want to make when we talk about data and the risks that come with data.  Our first approach is to make sure our own systems are secure, so that everything we have, and all the data we have, is as secure as possible.

     I think that's a really important part of this.

     In terms of suggestions, Marianne, there has already been talk about principles of data minimization, which we announced this summer as privacy principles, in addition to our AI principles.  So those are also worth looking at: our thoughts on data minimization, only using data when necessary, limiting its use to its most important function.

     But for suggestions, thinking about group privacy, and testing best practices around it, would be a suggestion I would make.  That is the term we would use, group privacy.  Maybe there's a better term.

     But thinking how you can make it context specific and weigh the benefits with the risks and how to have best practices around that.  That's a suggestion I would make.

     >> MARIANNE FRANKLIN:  Thank you very much.  Other responses?  We have, of course, the ongoing call for concrete suggestions.  We would like everyone on the panel to offer one from their point of view.  And also, if you have a moment, on the slow tech challenge: is it time to slow down and put the brakes on this crazy AI-driven, Robocop-style out-of-control car we're in, if I want to put it in doomsday terms?

     But the slow-down and concrete suggestions.

     Jean, off you go.

     >> JEAN GUO:  Yes.  So I will respond to that and also to the government collaboration question.  Government is a huge obstacle no matter what community you're in; you have to work with it.  On how we should think about developing solutions: being in the digital space, in the tech community here, one of the things we don't see enough is the inclusion of refugees at the table.  Making sure that when you are designing solutions you're asking, is it actually useful for them?  Taking very much a user experience perspective, because sometimes there is a tendency to go for what looks cool, what sounds like it incorporates innovative technologies.  At the end of the day, whether it's innovative or not, it has to respond to the basic need: is this actually helping to improve or facilitate communication, or some other function?

     So sometimes the solutions we see that are best implemented are those that are quite simple to actually use, that do not require a lot of code or a lot of technical sophistication.  I think testing those, doing beta tests, and making sure before you deploy matters.  For example, there is an interministerial delegation for refugees here in France, and we are working with them.  They are launching platforms, they launch tools, but we tell them: you have to slow down and test this before you launch it at full capacity.  What if the take-up rate is very low because it is not that useful?  So I think that is one point.

     A quick point about government collaboration.  I think sometimes it comes down to stereotypes, too, and to the political and economic pressure that governments face, from their perspective, in welcoming a lot of newcomers, essentially.  So I think what helps, coming back to perspectives, is showing more positive ones, because sometimes in the media, in France at least, there are a lot of negative portrayals.  You see people in (French phrase) camps by train stations.  Those images are not positive; they foster a negative image in the media.  Part of it is thinking about how you counter them and show more positive examples, including the economic productivity of the people there.

     >> MARIANNE FRANKLIN:  Thanks so much.  We need to hear from Eimear and Valentina, and we have a remote participant who wants to contribute.  Keep it brief.  We have five minutes, and I have a horrible feeling we are going to get kicked out of the room.

     >> EIMEAR FARRELL:  On the slow tech, maybe it's more responsive tech which may amount to the same thing.  And also bringing in more voices, bringing in the voices of the affected stakeholders and involving them in the design process.

     Looking at how people are already using tech, looking at whether there are low tech solutions, working with donors and funders to make sure that they are funding the right type of solutions.

     Also disrupting technology, bending it towards justice.  So using technology in our own human rights work and for solutions that will help further the rights of refugees and migrants.  You know, there's this idea about technology being magic and people can often be quite critical about that, but technology can be magic if we work collaboratively and if we are reflective about it.

     Then on the concrete recommendations?  Yes, basically: to continue the discussions around global standards on these issues, to make sure that refugees themselves and migrants and the advocacy groups that represent them are setting the agenda in these discussions, and to frame data protection and innovation as mutually reinforcing and not as in contradiction to each other.

     >> MARIANNE FRANKLIN:  Thank you so much, very helpful.  Valentina?  We would like to hear from the remote participant to finish the round.

     >> VALENTINA PELLIZZER:  I think with the responsible use of data there are two corners.  One is the corner of the user.  There are many people who collect data on refugees, and there is no reflection.  Oxfam has done that reflection on responsible data: how it is collected, what the policy should be.  Because people use their own phones, and they take pictures, they take information, they take things and then share.  If we don't go from the higher level, the expert level, to the understanding of the user, and the user can be the international NGO or the UN officer, then without a real understanding of what responsible data means, data ends up all over the place.  That is one thing.

     Then I talk about the social contract of auditing all these things.  The auditing has to be done by refugees themselves, by civil society.  We need to understand this is not an issue between governments, because governments most of the time criminalize refugees.  So we cannot have this idea of happily hosting governments.  We need a third party, and the third party are the parties themselves involved, the refugees, who know their own lives.  Then we have the audit of the tools and the audit of the process, and then we have something.

     Something about the positive narratives.  As long as we are in a click economy, the top news will always be the news that continues to criminalize.  We need to have a discussion about how news is indexed and what pushes it up, because we can create narratives, but if those narratives are not pushed up to the top, then we stay at the bottom and no one will know that the migrants are good people.

     >> MARIANNE FRANKLIN:  Lots to report.  Even if we have to end the question on a question, let's hear from the remote participant.

     >> ONLINE MODERATOR:  It is a question from Ruth Hamill, who asks:  I have a practical question from the Kurdish asylum seekers in the U.K.  When is Google Translate introducing Sorani Kurdish?

     >> (Speaker away from microphone.)

     >> AUDIENCE:  Sorani --

     >> MARIANNE FRANKLIN:  The Kurdish language?

     >> I don't know that dialect.

     >> ONLINE MODERATOR:  She is saying that at the moment only one Kurdish language is available, and this is a big problem for Kurdish asylum seekers.

     >> I didn't know that.

     >> MARIANNE FRANKLIN:  That brings us back to the issue of complexity and nuance that even large language groups have specific dialects that need tailoring.

     Yes, I think this has been an incredible session.  I would like to thank our panelists for the high standard of contribution.  Clear challenges to governments and corporations.

     >> I want to throw some questions to the panel which they cannot answer now but --

     >> MARIANNE FRANKLIN:  Let's have another round of questions to the panel.  No one is kicking us out.  Just questions; I think this is an open topic.  We need to end on an open note.

     >> My name is Jean Francois.  These are questions for reflection, maybe, more than for getting answers right now.  Are we training the teams behind those software development products to look into human rights themselves? 

      I'm not talking about the policies of companies, but about them themselves, as coders: do they have human rights training?  Are we bringing those members of the team into the field to see what they are trying to address and what the outcome of what they are developing is?  Do they get an emotional connection to what they are developing?  Are we integrating any of the refugees, training them to begin with, training refugees to become coders to get them into the solutions?  Are we running hackathons with them?  Or is it always the same people from the tech companies trying to figure out solutions from the outside?  Thank you.

     >> MARIANNE FRANKLIN:  Any other questions to end with?  I think that's a great way to end.  Thank you very much.  I would just like to thank you all once again.  The Charter of Human Rights and Principles for the Internet engages with these topics, and Google is taking it further.  Amnesty has it on track.  Our booth is at the back, behind the wonderful African stalls, if you come that way.  We have copies and pamphlets.  Keep up the good work, everyone.  Thank you so much and have a great day.  Thank you.  Goodbye.