
IGF 2020 - Day 6 - WS128 Global crises and socially responsible data responses

The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

>> MODERATOR: We're at the top of the hour.  Just checking that we're broadcasting live.

>> Yes. 

>> TIMEA SUTO:  Thank you very much.

Hello, good morning, good afternoon, good evening everyone, wherever you may be.  Welcome to WS #128, Global crises and socially responsible data responses.  This session is organized by ICC BASIS, in partnership with the European Telecommunications Network Operators' Association and with great support from Telefonica.  We're lucky to have our session preceded by the main session on data, which was a rich discussion on how data is used, how it was used before the pandemic, and how the pandemic has changed that situation.

I hope that many of you have engaged in that conversation already this morning.  If you have any questions or would like to introduce yourself, please feel free to use the chat and the Q&A functions as explained by my colleague Victor just earlier: the chat for social discussion, the Q&A for anything that you would like to direct at the speakers.  Before we start and pass on to the moderator, I would like to put a quick question to the audience to see how you feel about your data.  I ask my colleague Victor from the IGF technical team to launch the poll we have for you.

One, how do you feel about sharing data, and two, what barriers there are for you to share your data.  Victor, if you could launch that poll for the audience please.  Question one, we'll leave a couple of minutes.

I don't see responses yet.  I hope they're coming in.  I will just count to 6 silently.  I'll ask Victor to share the results for the first question.

There we go.  Thank you all who responded.

It looks like most of you are somewhat willing if there are special circumstances like the COVID‑19 pandemic crisis, 11% of you are not willing at all, a little less than a third of you are mostly willing to share your data regardless of the circumstances, and nobody is very open to sharing their data.  Very, very interesting.

Let's go to the second question, and I will ask you to let us know why it is that you think you're not so willing to share your data: what are the barriers that make you think that way?

I think we need the second question.  This is the same question again.  Let's do the second one, after the introduction one.

We'll close the poll now.  We'll see the results.

Okay.  An overwhelming 92% of you think that trust and security barriers are the main obstacles in the data sharing process.  A lot of you see policy barriers, and some of you have mentioned capacity and technical barriers as well.  That's a little bit of information for our speakers and our panelists here today, but for all of you as well, to discuss in the chat while the discussion is going on.  Good food for thought; thank you for your responses.  Without further ado, I will hand over to our moderator Mila Romanoff to formally start this session.  Before I hand over, there is a slight change in our plans that I have to inform you of: one of our speakers cannot make the session this morning, so Christoph Steck will be replaced by his colleague Gonzalo Lopez‑Barajas from Telefonica.  Thank you for being here, jumping in at the last moment.

Without further ado, let me introduce the moderator for the session today, Mila Romanoff, who is a specialist on data privacy and governance at UN Global Pulse, an initiative of the United Nations Secretary‑General, responsible for establishing enabling mechanisms for public‑private data partnerships.  She leads the data protection and risk program across the Global Pulse network, including in Asia and Africa.  On behalf of UN Global Pulse she leads work on ethics as part of the High‑level Panel on Digital Cooperation, was part of the drafting of the UN data strategy, and coordinates the global expert group on the governance of data.  Thank you so much for being here with us, especially this early in the morning.

Please, take it away.

>> MILA ROMANOFF: Thank you.

Good morning, everyone.  Good morning to our speakers and to all attendees and participants.  It is a pleasure to be with you here today.

I'm very happy to introduce our session on global crises and socially responsible data responses.  With that, I would like to make a few remarks.  The polling has indeed shown that we do need to make sure that we can share data, and that lack of trust and policy barriers, as shown in the poll, are among the key obstacles to ensuring that we have access to data.  In line with the work we have been doing at UN Global Pulse, those seem to be the main barriers, along with the technical ones.  With the really rich panel of speakers that we have today, we'll try to tackle some of these issues, answer some of these questions, and potentially even provide some recommendations on how we can build trust, how we can address the challenges with regard to data access, and how we can build a future where we all feel more comfortable sharing data on a regular basis to benefit the public good and be better prepared for future crises like COVID‑19.  Experience has shown we were not prepared with data access and the use of AI, so let's try to dive into the discussion today with the help of our speakers and the audience, to understand what mechanisms we can build and what policy frameworks we should have so that in the future we can have more responsible, trustworthy access to data to address crises like the COVID‑19 pandemic.  I would like to introduce our first speaker, the head of internet governance for the Federal Ministry of Economic Affairs in Germany, and he'll discuss policy perspectives on how governments work to harness the power of data for good while protecting the data and privacy of citizens.

Rudolf Gridl, on to you.

>> MILA ROMANOFF: Let's go to our next speaker if this is not resolved in 5 seconds.  We'll come back to Rudolf Gridl.

Okay.  So with that, I would like to introduce our second speaker.  We will certainly come back to Rudolf Gridl right away.

Carolyn Nguyen, Director and Technology Policy Lead at Microsoft; she will discuss private sector perspectives on public‑private partnerships to provide evidence for informed policymaking and data for scientific research.

>> CAROLYN NGUYEN: Thank you so much, Mila.

Good morning, good afternoon, good evening.  It is good to see everyone and to see everyone doing well.  Thank you for the opportunity to participate in this discussion.

As stated by Mila, COVID‑19 has demonstrated the value of data and, moreover, the need to share data responsibly and globally.  Collaboration among governments and organizations from the public, private and non‑profit sectors worldwide has been critical in enabling rapid and informed responses.  I'll describe some of the ways in which data is being used to address COVID‑19, some of the gaps that have been exacerbated by COVID‑19, and some learnings and recommendations.

Firstly, we know that information about infection and spread, combined with other available data, can really help to inform plans and actions by governments.  For example, the data presented by Johns Hopkins University in its interactive dashboard is accessed more than a billion times each day to do just that.  This data is helping the public and governments, as well as optimizing the provisioning of medical resources, especially critical resources such as ICUs, doctors, beds, et cetera.  Shared data is needed to calculate when it is safe to reopen schools, shops, restaurants, offices and other facilities, and also when they need to be closed.  Another way in which data is being used is to speed up research related to the virus: the COVID‑19 Open Research Dataset is a growing, extensive resource of scientific papers on COVID‑19 and related coronaviruses, available for data and text mining.  GitHub is hosting a range of collaborative COVID‑19 projects, including software, hardware designs and datasets that are made available as open source by governments, researchers and the global developer communities of interest.

When Microsoft launched the AI for Health initiative in January, the vision was to make AI resources, data scientists and shared data available to medical researchers and organizations to accelerate research that improves the health of people and communities around the world.  An example project is the data discovery initiative, which aims to accelerate cancer research by developing a robust regional data sharing ecosystem.  The challenges to overcome, as shown in the poll question, include regulatory, social, technical and licensing barriers that limit the cross‑institutional sharing of resources, data, research and technology.  The aim of the initiative is to share Best Practices and a data governance framework for cross‑institutional data governance.  Since its launch, AI for Health has been mobilized to help fight COVID‑19, focusing on areas where data analytics has the greatest impact.  For example, the Institute for Health Metrics and Evaluation releases a set of COVID‑19 data visualizations and forecasts that are used by U.S. governors and administrators to mobilize resources.  This dataset contains highly granular data for each patient, such as date of symptom onset, laboratory confirmation and more.  Another project is the Washington State Department of Health's effort to develop a dashboard for timely, accurate reporting to the public, so the notion of public‑private partnership is incredibly important.  The foundation for any data sharing is trust, and we believe that privacy is a fundamental human right.  We're also developing privacy preserving technologies, including an open toolkit for differential privacy, a technology which enables insights to be extracted from personal datasets while still protecting individual privacy.  We released principles for governments and other organizations to consider when collecting and using data to address COVID‑19.
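
[Editor's note: the differential privacy technique mentioned here can be illustrated with a minimal sketch, namely calibrated Laplace noise added to an aggregate count so the released figure stays useful while no single record can be inferred from it.  This is a simplified illustration, not the open toolkit itself; the dataset and the epsilon value are invented for the example.]

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform in [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(records, predicate, epsilon: float) -> float:
    """Count matching records with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon is enough.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Invented example: a privatized count of positive cases.
cases = [{"positive": True}] * 130 + [{"positive": False}] * 870
noisy = dp_count(cases, lambda r: r["positive"], epsilon=0.5)
# noisy is close to 130, but no individual's record can be inferred from it.
```

Smaller epsilon means more noise and stronger privacy; the released count becomes less precise but leaks less about any one person.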

In June we launched a global skills initiative to bring digital skills to 25 million people worldwide.  This was launched based on data from LinkedIn's Economic Graph, which is used to identify the top jobs in demand, the skills needed to fill them, and the learning paths.  We're pledging to make this data available to governments so that they can better assess local economic needs.

Another gap that's been amplified is connectivity.  In 2018 our analysis of data in the U.S. showed that the number of people who have access to broadband speeds is much lower than that reflected in official statistics.  This data has now enabled governments to better identify communities that are even more challenged by COVID‑19 and to provide appropriate incentives and relief.  While data sharing is critical, we have found that a data sharing culture is lacking in most organizations.  Tangible short‑term challenges can help to make people more comfortable with data sharing and to put in place the data practices and investments necessary for readying data for sharing.  Organizations also need a better understanding of policy issues, as well as instruments to address them, as was shown by the poll response.  This includes technical resources such as differential privacy, and other instruments such as data sharing principles and data use agreements.  To address this, we launched our open data campaign to build learnings, and in October launched a call for participation in a new peer learning network for data collaboration in conjunction with the Open Data Institute.  This is to encourage the sharing of Best Practices and trusted collaboration.

COVID‑19 has clearly demonstrated the need for a global digital infrastructure developed through collaboration between governments and organizations from the public, private and non‑profit sectors from around the world.  We can discuss later the necessary elements of such an infrastructure: the ability to share personal and non‑personal data from the public, private and non‑profit sectors in a trusted manner across national borders, deployed on appropriate computing infrastructure.  We hope that the learnings from COVID‑19 can provide the impetus and collaborative environment that leads to the development of such an infrastructure, so that we can all be ready for the next global crisis.

Thank you very much.

>> MILA ROMANOFF: Thank you.  Thank you very much.

As you were speaking, one of the questions came in, and I'm just going to ask it as we go: given the overall lack of data literacy, and the fact that data breaches are increasingly a cross‑border problem, would you agree that one of the most effective ways to enhance trust would be to increase the role of national data protection authorities when it comes to cross‑border data transfers and processors, in order to provide additional verification of data protection?

>> CAROLYN NGUYEN: This is a complex issue that I don't think I can answer in 2 minutes.

I think what it is, is that we really need to take a look at a risk‑based approach in terms of how the data should be used and what kind of data it is.  We often talk about data as monolithic, but there are really different types of data, and we really need to preserve and protect privacy but also consider national security considerations.  In 2019 the Government of Japan put forth the Data Free Flow with Trust initiative, and I believe that really is a global dialogue where we need all stakeholders to be at the table, to discuss and share evidence.  Some of the work that we're doing, we hope, can bring some evidence in terms of what the appropriate contexts for sharing data are and what needs to be in place; we talked about differential privacy, data governance, et cetera.

I look really forward to working with everyone to try to address this.  Thank you.

>> MILA ROMANOFF: Thank you.  Thank you for the questions.

I will then go on to our next speaker, we'll try.

>> RUDOLF GRIDL: Once again.

It looks like you're muted on the phone.

>> RUDOLF GRIDL: Can you hear me now?

>> MILA ROMANOFF: Perfect.

>> RUDOLF GRIDL: Thank you.

I'm sorry for this.

I wanted to say one thing: as the government in Germany, and also in Europe, for a very long time data was not a major issue in our thinking.  We were very much concentrated on privacy, on personal data, on privacy protection.  But data is a holistic concept, economic and social, with opportunities and also risks.  That understanding has developed in the last two years or so, and it started already before the COVID crisis.

For instance, Germany adopted its data strategy in November last year, and the European Commission adopted a data strategy in February this year.  So when the COVID crisis struck, the mindset was ready, if not yet all of the actions.  I think the guiding principles in these strategies were a major element in how governments reacted, and I just want to show you one example.  In Germany, during the COVID crisis, we started to develop this coronavirus warning app that you can download on your smartphone; if you have downloaded it and registered, you will be warned if you have been in contact with a person who has tested positive.  This sounds simple, but it took us a long time to develop this app because data protection issues were at the forefront.  And there comes the issue that's been called the trust issue: there were many concerns among the population in Germany; people said, I don't want the state to know where I am, whom I'm in contact with, and who has COVID and who has not.

That was a basic, normal reaction.  So we developed a system where all of the data is anonymized, the identifiers change, you don't have profiles and all of these things, and the app has now been downloaded 18 million times, with a population of 80 million.  Is it a success or not a success?  I would say we're trying to improve the system and to reprogram it.

It is very clear that if everybody in Germany had the app and used it, it would be beneficial for the whole pandemic situation, but the trust is not there, even though we have introduced all of these measures.  So we're trying to build on one of the elements in our strategy: data literacy.  Data literacy, data autonomy; people have to be aware of what is possible and what they are able to agree to.  Because up until now, you have the people who say, I don't care, I give everything away to the big platforms, or you have people saying, I give nothing, nobody has a right to know anything about my personal data.  There is a middle ground between those, and that's something that we're trying to achieve with information campaigns and so forth.  That's one thing.  The other thing we learned in the crisis, and already before, though it has accelerated in the crisis, is the issue not of private data, but of machine data, data that's not personal.  We have so many companies in Germany producing machinery of all kinds, and all of these processes generate data.  That's something companies are often not very aware of, so now we're trying to raise the awareness and to tell them: even as a small company, you have to work together.

Perhaps it would be best to work together on a European basis, and that's why we initiated, already before the crisis, a project that has now been accelerated as a priority: the idea of a European network of clouds and of data, where everyone can feed in data, every organization, company, research center, and then obtain the rights to use the data that others have also put in.  That's the basic idea of it.

And, of course, this would be located in Europe, it would operate according to the data protection regulations and so forth, so there is a certain certainty for people, legal and moral security for companies and for those using it, and we're hoping, we're confident actually, that we will manage to reach this critical mass.

The last point I wanted to mention is the state itself, the state as a data actor.  There is so much data held by municipalities, the federal government, regional governments, the parliaments; data of a personal nature, but not only of a personal nature.  This is also in our strategy, and I think it is even more crucial now: the state must be at the forefront of the open data debate.  It must make this data available to everybody, so that new business models are created, so that the data gets used, so that researchers have access to it.  We hope and think that data sharing by the state will generate some kind of chain reaction with other actors, who will be able to put aside the hesitation, all of the negative thinking, and also take part in the data endeavors that we're working on.  These are the three points I wanted to mention.

>> MILA ROMANOFF: Thank you for your remarks.  I would like to ask you a question: how do governments work together with other stakeholders on these types of topics?

>> RUDOLF GRIDL: I think a very good example is that we're working very, very intensively together with the private sector, of course, and also with the research and scientific community.  That truly is an example where we as a state are very much aware of the fact that we have not the knowledge, not the human resources, not the insights that the scientific communities and the private actors have.  So from the outset, the whole project was, if you wish, a multistakeholder project: everyone was on board, not only in the thinking, but in the design, in the economic implications, in the social implications.  That's something which we normally have in the German system anyway, but this was really an outstanding example.

Also, we think it is very important to work with Civil Society and with consumer protection and data protection organizations, not only authorities, but also organizations.  In Germany, data protection is very high in the mindset of people, so if you have them on board, it is guaranteed that people will be more trustful of the product you produce.  I hope that answers the question to an extent.

>> MILA ROMANOFF: Thank you.  As you were responding to this question, another question came in.  It is an interesting one; then we'll go to the next speaker.

What were the anonymization methods used in the app you mentioned, the tracing app I presume, considering global practices?  You said that you were able to reach 18 million downloads, if I understood correctly, correct me if not, but what were the methods used to guarantee that privacy?

>> RUDOLF GRIDL: To protect the privacy?

>> MILA ROMANOFF: Through the anonymization methods.

>> RUDOLF GRIDL: I cannot go into all the technical details here, but roughly: the app creates a new pseudonymous identifier every few minutes.  These identifiers go into the pool, but the pool cannot retrace, after 5 minutes, the person that goes with the identifier.  You get on your phone the information that you were in contact with someone who was COVID positive; you get the information, but the pool does not learn who you are.  I could not explain to you in more detail how it is programmed.  I'm a humble lawyer.
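
[Editor's note: the rotating-identifier scheme described here can be sketched roughly as follows.  This is a simplified illustration of decentralized exposure notification in general, not the German app's actual protocol; the key size, interval numbers and HMAC-based derivation are assumptions made for the example.]

```python
import hashlib
import hmac
import os

def daily_key() -> bytes:
    """Each phone generates a fresh random key per day."""
    return os.urandom(16)

def rolling_id(key: bytes, interval: int) -> bytes:
    """Derive a short-lived identifier for one ~10-minute interval.

    The IDs are broadcast over Bluetooth; without the key, successive
    IDs cannot be linked back to the same phone.
    """
    return hmac.new(key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

# Phone A broadcasts rotating IDs; phone B stores the IDs it hears.
key_a = daily_key()
heard_by_b = {rolling_id(key_a, i) for i in range(100, 110)}

# If A later tests positive, A publishes only its daily key.  B
# re-derives the IDs locally and checks for a match; no central
# server ever learns who met whom.
match = any(rolling_id(key_a, i) in heard_by_b for i in range(0, 144))
```

The point of the design is that the published material (a daily key) links to nothing unless you were physically near the person; everyone else's broadcasts remain unlinkable noise.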

>> MILA ROMANOFF: Thank you very much.  That was great.  I think that was covered in the response.

With this I would like to introduce our next speaker.  Christoph Steck is not with us today; instead we have Gonzalo Lopez‑Barajas from Telefonica, and he'll talk about data sharing frameworks in the private sector.  On to you.

>> GONZALO LOPEZ‑BARAJAS: I'll do my best.

First of all, I would like to say hello to the community.  Good morning, good evening, good night, depending on where you are.  I'm very happy to be here with you.

So just to start, I will basically explain who we are: we're a telecommunications provider, we have more than 340 million clients, and we provide services across a number of countries.  That gives you the scope of the company that I'm working for.

Going to the specific cases: during the pandemic, we have seen governments forced to impose strict lockdown measures to tackle it, and this has impacted people's mobility, the economy and society.  In this context, the availability of tools to effectively model, monitor and quantify this has been key for public institutions to decide which policies to implement and for how long to keep the confinement measures.

One thing that has been helping them with this analysis is the use of aggregated data from mobile networks to characterize mobility and the spread of the virus.  This is not the first time such data has been used; it was used, for example, during the H1N1 epidemic in Mexico.

Basically, we have helped governments with this analysis, and our main focus in these agreements with governments has always been the protection of privacy.  Our Chairman has recently presented the company's proposal for a Digital Deal for a sustainable recovery, to rebuild societies and economies, and creating trust and confidence in the use of digital data is one of the main pillars of that Digital Deal.  To create confidence and trust in the use of data, privacy is key.  Basically, all of the agreements had to comply with the following guidelines.  First, we used anonymized data.  Second, we used aggregated data, so that instead of having data referring to specific individuals, we had data referring to groups of individuals, and in each location there had to be a certain minimum number of cases; for example, if there were fewer than 10 individuals in a region, we would not release the data, because it would be easy to identify those individuals.  In addition, we have not shared the data itself; we have shared insights on the data.  Maybe the best way to a better understanding is to give you an example of one of the use cases.  In Argentina, we started a public‑private collaboration, including the National University of San Martín, and we signed an agreement with the national government and others to use the tool that we implemented.

Basically, the objective was to measure the mobility of citizens across the Argentine territory, to supervise compliance with the isolation measures and to be able to monitor the reopenings.

For this, we implemented a mobility index that indicated, from 0 to 1, the level of mobility in each location of the country relative to pre‑pandemic mobility levels.  The information we shared was mobility compared to mobility before the pandemic; that's basically the insight that we share.  We do not share the specific data itself.  We just share what the insight is.
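
[Editor's note: a mobility index of this kind can be sketched as a simple ratio against a pre‑pandemic baseline.  This is an illustrative reconstruction, not Telefonica's actual formula; the trip counts below are invented.]

```python
def mobility_index(trips_today: float, baseline_trips: float) -> float:
    """Mobility relative to a pre-pandemic baseline, scaled to [0, 1].

    1.0 means mobility at (or above) the baseline; 0.0 means no
    mobility.  The baseline could be, for example, the average daily
    trip count for the same weekday before the pandemic.
    """
    if baseline_trips <= 0:
        raise ValueError("baseline must be positive")
    return min(trips_today / baseline_trips, 1.0)

# A region whose aggregated trip count fell from 120,000 to 54,000:
index = mobility_index(54_000, 120_000)  # 0.45, i.e. 45% of normal mobility
```

Note that only the index, an aggregate insight, would be shared, never the underlying trip records.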

Basically, all the use cases that have been implemented may differ a little bit from country to country, but the basic principles are the same: we have respected and protected privacy, complying with all privacy policies and legislation in our countries; we have anonymized data, with measures to avoid re‑identification of personal data; and we have not shared the data itself, but the insights, as I just commented.
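
[Editor's note: the minimum‑group‑size rule described earlier, releasing no figures for regions with fewer than 10 individuals, can be sketched as a simple suppression filter.  A minimal illustration with invented figures, not the production pipeline.]

```python
K_MIN = 10  # minimum group size before a region's figure is released

def suppress_small_groups(region_counts: dict) -> dict:
    """Drop any region with fewer than K_MIN individuals.

    Releasing counts for very small groups would make the people
    behind them easy to re-identify, so those cells are withheld.
    """
    return {region: n for region, n in region_counts.items() if n >= K_MIN}

counts = {"district_a": 4821, "district_b": 37, "district_c": 6}
released = suppress_small_groups(counts)
# district_c, with only 6 individuals, is withheld from the release.
```

This suppression step complements anonymization and aggregation: even aggregated figures can identify people when the group is small enough.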

Now to the learnings.  Basically, the learning we have gone through after implementing these projects is that in some cases there is a lack of trust in the use of mobility data; we have seen opposition from parts of political opinion to using this kind of data.  What we have seen is that one of the main problems is the lack of understanding of the data that we're using, and the first approach in this case is to be open and transparent, to share publicly how we are doing things, to inform people, and to run awareness campaigns on how we're contributing to help governments with this kind of data.

An additional learning is that collaboration between public and private partners is complex, as every partnership follows a different approach; in each case the government stakeholders, the country and the particular data area have completely different structures.  In an emergency situation, when there is an urgent need, it takes a lot of time to define the legal structures, compliance and all of the related issues.  A potential solution that could be interesting would be to investigate predefined partnership templates, in order to allow more accelerated development of such partnerships.

Another relevant barrier that we found is data readiness: the readiness of governments and public administrations varies significantly from country to country.  I think it would be good to make a significant effort to fully prepare administrations, both on the infrastructure side and on the skills side, for the data‑driven society.  Another difficulty that we have found is the sustainability of the business model.  All of these agreements with governments, due to the urgency and the overriding public interest context, were made on a pro bono basis; we don't receive any payment for them.  Though I think this approach is the way to go in a crisis moment, looking at the longer term a different approach would be needed; we have to look at the costs and investment required, and it is difficult to sustain this service in the medium to long term.  It is essential that governments and companies design business models that guarantee the immediate and sustainable availability of these tools in anticipation of future crises.

I think those are the main learnings that we have.  I think I would leave it here.  Thank you.

>> MILA ROMANOFF: Thank you very much.

Just a follow‑up question.  I know Telefonica is active in the debates on data strategy and data sharing policies; Rudolf brought up, prior to your presentation, the European data strategy announced in February 2020, and you touched on the pro bono relationships in times of crisis and on thinking of more sustainable ways forward.  The European Commission has had an expert group looking at how data could be shared more sustainably, not only for humanitarian response, although that is the most pressing topic right now, and one of the findings in its policy recommendations on business‑to‑government data sharing was the lack of technological solutions at the government level to protect data and to ensure trust.

In your opinion, coming from a private sector perspective, not just from Telefonica, and as a question that has come in also asks: do you think that third‑party intermediaries, or concepts like those promoted by MIT such as data unions, data cooperatives and data trusts, companies that do analysis on behalf of telecommunication companies or of other companies, would be one of the potential solutions to make data sharing work better, rather than just relying on regulation and sharing on a bilateral basis?  Do you think that could enhance the process of data sharing?

>> GONZALO LOPEZ‑BARAJAS: Having trusted parties could be an interesting approach.  It would depend on the use cases.  As we have seen, there is a lack of trust in how this cooperation works and how the data is used.  If you have a third party that's trusted by all and has all of the legal acceptance, I think that would be good.  We would have to do a deep dive in each case to decide what the data is going to be used for and how long they will be keeping it, but it could be a good way forward to accelerate data sharing.

>> MILA ROMANOFF: Thank you very much.

I would like to go to our next speaker, Nnenna Nwakanma, the Chief Web Advocate at the World Wide Web Foundation, and she will share insights on responsibly harnessing data to bridge digital divides, especially the digital gender divide.

On to you.

>> NNENNA NWAKANMA: I hope you can hear me very well.  Greetings from West Africa!

So I have listened very closely to my colleagues from government and from industry.  It was quite interesting to hear the word trust, trust, trust, trust and trust again.  I do acknowledge that there are a number of initiatives coming from most of us across our areas of engagement.

I would like to speak to you briefly, also keeping an eye on the time, about three principles of the Contract for the Web.  I have heard about government responsibility, but the question is always: what is the responsibility of users?  So I'm going to briefly speak on principles 3, 5 and 8 of the Contract for the Web, and then I'll add the global cooperation and gender perspectives.  Principle 3 asks governments to establish and enforce comprehensive data protection laws and frameworks.  This is really, very, very important, because if government is speaking about trust, we need to understand what the rules of trust are, and they need to be defined within a specific law; and then monitoring is also very important.  On the part of government, it is really about clear regulation, so that industry can understand it.

On the part of the industry, that is principle 5: what the Contract for the Web puts across is giving people control of their data.  That is the main barrier to trust with users.  It also means supporting accountability, reporting on accountability, and making the rules understandable and equally available to everyone.

I think that is the responsibility of industry partners.  Everyone talks about trust.  Yes.  Trust may mean different things to different people.  I will come back to that.

On the part of the citizens, under principle 8 of the Contract for the Web, people like myself and the Web Foundation commit to raising data for those whose voices have not been heard, because while we are looking at security and trust, we are also looking at those whose voices are not heard every day.  This is the IGF; let us not forget we are all within the framework of the Sustainable Development Goals, saying leave no one behind.  When talking about data sharing, data trust and data security, this cannot be only about people who are online; only half of the world is online.  It also falls to citizens like myself to help connect data from unconnected people to the mainstream data, so that all of it can inform our decisions.

People like myself, users, contributors, co-creators, should all also take a step to do our part: protect ourselves, read through the end user licenses and make wise choices.  Overall, what the Web Foundation has put on the table does show, in 9 principles and their sub-principles, how government can be responsible, how industry can be responsible and how users can be responsible.  It has been exactly two years since the principles were launched in Lisbon, and exactly a year ago we launched the Contract at the IGF in Berlin.  Coming to the gender part, we recently launched our Women's Rights Online report, an in-depth study of countries across Africa, Asia and Europe, and I would just like to share a key piece of data with you: there is less trust in companies online.  The women that we studied were skeptical about tech companies using their data responsibly; 54% of our female respondents said that they would not allow companies to use any of their data at all.  Of course we ran the poll earlier on.  You saw how 63% were like, yeah, maybe, I can give my data if everything is fine, and for the rest trust is the issue.  Clearly trust is the issue.  For government, for Civil Society, for industry, trust is the issue, and women are less online even within the half of the world that is connected; our study shows that men are 21% more likely to be online than women.

The issue of security, the issue of trust, it is one that is everybody's issue.

As the World Wide Web Foundation, we have been collaborating with the Office of the Secretary‑General of the United Nations to implement the Roadmap for Digital Cooperation, and that is where I would like to end.

Digital cooperation is the framework that has been built through multistakeholder cooperation, and now we have a roadmap.  One of the very key things in the roadmap is trust and security, because everyone knows now that this is our collective problem.  You recall when we were starting the IGF, we said that we would not take any decisions; we were just going to come here, talk, exchange, and go home.  We have done that for 15 good years.  Someone who was born when we had IGF 1 is 15 years old today, and we still are not making global decisions on issues that concern us.  Half of the world population is online, and we do not have a platform for all of us to sit down and say: this is a common understanding of trust and security.

This Roadmap for Digital Cooperation gives us that opportunity, and the United Nations Secretary‑General has engaged to do it.  Now, under action point 6, we should have a global statement on trust and security.  Why haven't we started it?  We wrote a letter during the UN 75th Anniversary, the last General Assembly; we sent this letter to the incoming President of the UN General Assembly, and it was copied to the Secretary‑General, because for us it is critical that we sit and speak and agree on the global statement.  We are not calling it a treaty, we are not calling it any other thing, but a global statement of understanding, because you have to understand it from the rights perspective, from the security perspective, and from the regulation perspective, and we all know in the IGF that no one person can solve the problem.  So the Web Foundation, on behalf of others, the inventor of the World Wide Web and the billions online, I am here today to make a call: we need to act as a global community.  What has COVID taught us?  There is only one global community.  There are not two of us.  So the digital community, the data community, is one, and we owe it to ourselves to sit and agree on trust and security within the framework of digital cooperation.

Boys and girls, ladies and gentlemen, cats and dogs, everyone watching!  Thank you!

>> MILA ROMANOFF: Thank you very much!  Thank you for bringing up the work of the Secretary‑General, for your contribution to the panel, and for your points on the roadmap's further implementation.

Overall, as a follow‑up to your presentation: you concentrated a lot on trust, and I think it is a key component of the policy discussion.  We received a few comments in the chat box, and one of them said that trust is not the only component.  Do you believe that trust is an enabler of all other components, or do you think there are other key elements that would promote and enable better data sharing and better access to data, to ensure that we can better address crises like COVID in the future, knowing it is not the last one?

>> NNENNA NWAKANMA: It is not the only one.  You do recall during COVID ‑‑ I live in West Africa ‑‑ there were actually Africans who believed that it was a lie, a hoax, until the data started to come in.  I think it was Carolyn who talked about the dashboard; the daily maintenance of the dashboards helped us to build the trust that we needed, it helped us bring people to order, it helped us to convince people.

As much as one talks about data ‑‑ open data, available, immediate data; you know us at the Foundation, you know we know data ‑‑ and as much as one talks about putting data in good formats and all of that, trust is still the bedrock.  Citizens need to trust the companies, the companies need to trust the government, the government needs to trust citizens, and citizens need to trust government not to shut down the internet at this time.

I mean, COVID has shown us that it is either we trust each other or we die with each other.

>> MILA ROMANOFF: Thank you.  A very strong note to end your intervention.

Given that we have only 5 minutes left, I would ask one general question to all of the panelists.  If possible, please keep your responses to the key points, maybe 30 to 45 seconds, touching on this question: are there discussions occurring within government bodies and tech companies considering whether the data gathered by COVID‑19 tracing apps could be repurposed for purposes other than COVID?  This is a key component of trust: we do not have the assurance that data collected for COVID‑19 will not be repurposed for other uses without our knowledge.  Maybe in your closing remarks, each one of you, again in 30 seconds, could reflect on this question and also make your final point, and then I will pass on to you.  Thank you.

>> RUDOLF GRIDL: Can you hear me?

In 30 seconds, it is not that easy.  I will try.  COVID‑19 and the data collected by the app and other mechanisms have opened a door ‑‑ opened it technically, but also in the mindset of people ‑‑ and I think that in the future, not only in crisis situations but also in other situations, we will have a much more open and positive attitude in the population when it comes to the collection and processing of data for the common good.  That is what I wanted to say.

>> CAROLYN NGUYEN: Thank you so much.

Quickly: trust is contextual, and as everybody said, COVID provided us an opportunity to start working together on very concrete problems.  It is about how much trust state parties and others can build together; we need that evidence to understand what trust is.  Thank you.

>> MILA ROMANOFF: Thank you very much.

>> GONZALO LOPEZ‑BARAJAS: On the first question you asked, purpose limitation is a basic principle already included in the GDPR in Europe.  Even though there are some exceptions under which data can be processed, for example in the case of the coronavirus crisis, this limitation has to be abided by.

 
