
IGF 2020 - Day 6 - Main Session DATA: Data governance and practices lessons during COVID 19 pandemic

The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 





>> MODERATOR: Can everybody hear me?  Brilliant.  I'm sorry, it is not a very glamorous location in the back.  I'm sure we'll all survive.  I'm interested to see that I'm a minute behind schedule; I'm not sure how that bodes for us.

I welcome everyone today.  Thank you for joining us.  Our session is Main Session DATA: Data governance and practices lessons during COVID‑19 pandemic.  My name is Gabriella Razzano.  I'm a senior research fellow with Research ICT Africa, based in Cape Town, South Africa, and an Atlantic Fellow for Social and Economic Equity, and I work with others in that network.  I'll be moderating.  Timekeeping is not a strong point personally, but maybe I'll be better at drilling it into other people.

Just to outline how we're going to run the session today: the session is split into two parts, with three speakers in each section who will each have 5 minutes to give their interventions.  We'll have an opportunity to do crossover questions at the end of each part, followed by a 7‑minute audience Q&A to end each part.  The first part is dedicated to sharing experiences from different stakeholders in developing data‑driven technologies during the pandemic, as well as the Human Rights implications of these kinds of technologies.

The second part of the session will be a discussion on how the challenges and opportunities presented by data‑driven policies have impacted the future -- or whether they have or haven't.

I'll be introducing each speaker just before they talk and trying to keep the session to time.  I thought I would take the opportunity to set the scene a little in 5 minutes.

I think there have obviously been variances across the world in how technology and/or data has been incorporated into the fight against COVID‑19, with various levels of efficacy for these interventions, and various degrees to which they work together with offline strategies, or don't.  Arising from these kinds of activities, you have questions of legitimacy, effectiveness and scale, with these initiatives ideally being centered on social and political context more than on the technologies themselves.

One thing that distinctly emerges from the technologies we're starting to incorporate is an expanded use of data.  There are a lot more data‑driven technologies, and this has direct implications for data governance and other principles around data.  I think COVID‑19 has essentially been an accelerant -- an accelerating moment for a lot of trends that we were already starting to gain insight into.  Change was happening rapidly, but it has happened even more rapidly.

So just from South Africa, from my own context: I think we have seen particular attempts to engage in both technologies and data‑driven initiatives in response to COVID, but we have failed to see an ability to do bottom‑up solutions, particularly for the national initiatives.  That means our solutions aren't properly considering both our significant digital divide and other forms of digital inequality, or how the disease manifests in a country with existing social inequalities.

I think what's been interesting is that COVID is not just exposing that we have insufficient data frameworks, and that our digital deficiencies are actually linked more to the third industrial revolution than to the fourth; we're also seeing how some of these challenges implicate innovation policy and how, moving forward, we're going to engage with and enable innovation locally.  That's been an interesting dive into some of the things that have happened.

Leading us from those contextual considerations, we'll be going into part one.  Underscoring our session for part one is the question: what useful experiences can you share on ensuring privacy and other Human Rights while leveraging the use of data for tackling the pandemic?

Our first speaker will be Robyn Greene, a Privacy and Public Policy Manager on Facebook's Privacy Policy team in the USA, where for nearly two years she has led on surveillance and cross‑border data flows and security.  Before joining Facebook she was at the Open Technology Institute and worked at the American Civil Liberties Union, where she specialized in privacy issues in security policy.

I'll turn it over to you.  To flag: at 4 minutes I'll raise my hand, if I can find the raise‑my‑hand button, or I'll give a note in the chat, so you know we're running out of time.

Thank you.

>> ROBYN GREENE: Thank you.  Thank you for inviting me to speak today and participate in this panel.  COVID has shown us that it is more important than ever to safeguard an open, free, unfragmented internet, and the responsibility lies with all of us.  We at Facebook are committed to preserving an open, secure internet that allows information to flow freely and enables the environment necessary for sustainable innovation, economic development and freedom of expression, while creating a more connected, cooperative world.  Conversations like the one we're having today are critical to realizing that commitment.  I'm really pleased to be here.

Specifically with regard to COVID, at Facebook we believe we have a really important role to play in the response.  We have focused on the things we're best positioned to do to help get people through this difficult time.  This is by doing things like really prioritizing how we help connect people with their friends and loved ones, but also things like fighting misinformation online and, critically, supporting public health authority efforts to contain COVID‑19.  In only a short period of time, we built new features to highlight information about COVID‑19 from health agencies and other trusted sources.  We have also provided millions of dollars in free ad credits to governments and small businesses, and supported researchers' efforts to forecast the spread of the disease through recently launched Data for Good surveys and maps, which we'll talk about more later.

You know, as the world fights COVID‑19 and countries reopen societies, it is critical that we have a clear understanding of the movement of people and how the disease is spreading.  One of the things we found is that better data has been really helpful for governments as they're determining where to send resources like ventilators and personal protective equipment, and which areas are safe to start opening again.  Getting that kind of accurate data is really challenging, and obtaining such data from across the world is even harder.  Because we have a community of billions of people globally, we have been able to help researchers and health authorities get the kind of information they need to respond to this outbreak and plan for their recovery.

Our support for these efforts -- and I want to be really clear about this and really emphasize it -- need not and should not involve compromising people's privacy.  Our approach has been very privacy‑first.  While contact tracing apps have for many months dominated the public debate, our focus has been on identifying privacy‑protective ways we can use Facebook's reach and resources to help researchers and health authorities respond to this crisis.  We provide tools and data that advance public health responses while maintaining our strong commitment to people's privacy and the security of their data.

So sharing our users' personal and identifiable information has not been a part of our response at all, and we'll continue to require a warrant if a government seeks our users' location information, just as we did pre‑COVID.  We have focused on how we use aggregated, deidentified data to help with the response through the Data for Good work, as opposed to using people's identifiable, personal data, and we adopted a framework for reviewing proposals and requests to help us ensure that we actually fulfill our commitment to do everything we can for the pandemic response without jeopardizing people's privacy.

First, our views are guided by experts.  When seeking to undertake these new efforts for pandemic response, we have sought active and meaningful participation from relevant stakeholders -- in particular, experts in the public health sector as well as Human Rights experts and privacy experts.

Additionally, we put efficacy first -- the stakes are too high.  To that end, our COVID‑19 initiatives have been developed either in partnership with, or deemed effective by, public health authorities such as the WHO, and we have sought to contribute in ways that integrate coherently with countries' public health strategies and systems.

The third element of our framework is voluntary consent.  Our COVID initiatives have adhered to consent‑based, voluntary data collection practices.

Fourth, transparency.  It is an absolutely necessary requirement for work related to COVID‑19.  It is important to be transparent about our role and the measures we're taking to help address COVID, including clearly disclosing how we collect, use and share a person's data.

The fifth element of our framework is data minimization.  We collect, use, share and store the minimum amount of information needed to support identified COVID‑19 initiatives.

Finally, last but certainly not least, we have prioritized Human Rights and conducted due diligence to ensure that we effectively identify risks related to our COVID response, and we only proceed if we determine they can be adequately mitigated.

I think by putting things like efficacy, consultation with experts, privacy and Human Rights first, we have done a really good job of being able to respond to the pandemic -- using data, surfacing information for people, and helping people use our services and connect in a way that makes this time easier.  That gets us to an effective response without having to sacrifice people's privacy, and we're really proud of the work that we have done.

Thank you.

>> MODERATOR: Thank you so much.

I'll turn now to Gonzalo Sosa Barreto.  He has recently been designated Secretary‑General of AGESIC, the Uruguayan Electronic Government and Information and Knowledge Society Agency.  He carries out teaching activities at the University of the Republic and the Catholic University of Uruguay; previously he practiced his profession in the public sphere, in the tax office and in the planning and budget office, as well as in private practice.  Thank you so much.

>> GONZALO SOSA BARRETO: Thank you.  I'm happy to be here.  I'll try to be brief.

Since the declaration of the emergency in March 2020, it became clear that responsible use of data required clear rules and a determination of the responsibilities of everybody involved.  When it comes to weighing health against other Human Rights, it is important to consider the opinion of the Inter‑American Court of Human Rights: early on, the court issued a statement stressing that problems and challenges derived from the pandemic must be addressed from a Human Rights perspective and with respect for international obligations.  The Inter‑American Commission on Human Rights required temporary restrictions on Human Rights due to the pandemic to be subject to strict observance of the public health objective, to be temporally limited, and to have defined objectives proportional to the objectives pursued.

The Government of Uruguay set up a health program with the participation of different public organizations, including the agency for digital development, AGESIC, which has worked with private developers since the start of the pandemic.  Uruguay was one of the first countries to design an app with resources from various sectors: the public sector, the Ministry of Public Health and the agency for digital government.  The result was an application widely rolled out to the public under "responsible freedom" concepts.  Its use was and remains voluntary, and many parties worked along with the digital authority and national groups to ensure that the data protection policy and security standards were met.  The application is in line with the resolution of March 2020 by the data protection authority, which establishes that the processing of sensitive data in the context of the sanitary emergency must be carried out by the Ministry of Public Health within the framework of certain legitimate bases -- principally protecting the vital interests of the population -- considering proportionality and other data protection provisions.  Another example of collaboration between government and the private sector has come from the creation of a scientific group that has been advising the government since the onset of the pandemic.

To perform its duties, the group receives information from private and public entities, and the information‑sharing framework requires a justification of legitimate interest from the sender and receiver of the information, consent of the data subject, or an exemption to such consent -- for example, a legal mandate or the sanitary emergency.

There is a legal framework for information sharing between public entities, establishing principles and safeguards in addition to the data protection regulation, which enables the agency for digital government to deliver a safe environment where private and public entities can perform analysis for the Ministry of Public Health and deliver the reports required by the scientific group.  This refers both to personal and non‑personal information.  It is important to work on ethical frameworks that complement existing norms and guide the way of processing information; such frameworks could provide needed transparency and guidance for all of the participants.  A good example of such a framework is the declaration that sets principles that could easily be extended to other areas of data innovation, such as accessibility, Human Rights, ethics, accountability, purpose, data flow, an open‑by‑default approach, privacy and data security.  There are, of course, common standards in all of these analyses.

When dealing with personal data, there is a need to perform a data protection impact assessment prior to the collection of data, to provide clear information for data subjects so as to ensure transparency, a precise definition of the roles of all participants involved, the specific security measures, et cetera.

Of course, to summarize, accountability is key.  The data protection framework imposes obligations on internal processes when they have an impact on data protection, and in several cases requires appointing a data protection officer.  Governments and companies will define and bring changes to the way people's personal data is protected, and trust is built on respect in the data processing carried out by governments.

I look forward to the Q&As.

>> MODERATOR: Thank you so much.  I will turn to our final panelist for part one, Carmela Troncoso.  She heads the SPRING lab, focused on security and privacy engineering, and her work focuses on understanding and mitigating the impact of technology on society.  She is part of the DP‑3T exposure tracing protocol developer team in Switzerland; the team has been involved in proximity tracing, and their protocol guided and was adopted by Google and Apple, making her a leader in this technology.  Thank you.

>> CARMELA TRONCOSO: Thank you very much for the nice introduction.  Thank you for the invitation.

First of all, let me say that I'm here in a personal capacity: by no means do I represent the Swiss government, and much less Google and Apple.

These applications came as a response to a need raised by COVID‑19.  We know that contact tracing is very important to stop epidemics, but with the volume of COVID‑19 cases, manual contact tracers were overwhelmed, and people decided that technology could help.  People turned to mobile technology because you need a sensor, and the mobile phone is the sensor most people have in their pockets -- I'm happy to discuss the technology divide and what this has caused in the Q&A.

Initial options for these kinds of app‑based technologies were data hungry, and that was very worrisome.  This is not only a question of privacy: when we have data‑hungry technology, what we have is a huge potential for abuse.  Privacy is not an end; it is the means to protect citizens, to protect users and to protect our democracy.  Together with a group of scientists, we joined forces to try to provide the most privacy‑preserving way of building contact tracing apps.  Again, let me insist: the goal was not only to protect the privacy of the user; it was to protect the world from this information that had never existed before, information gathered from the phones of everyone -- we were going to ask every citizen in Switzerland, in Europe and in the world to put a sensor in their pocket.  The protocol we came up with, which in the end was the basis for Google and Apple's, is extremely privacy preserving.  It applies not only data minimization but something even more important, which I hope becomes a concept in the future: purpose limitation, a basic principle of data protection.  The protocol we built ensures that the information that flies around is actually useless for anything but this particular purpose.  This is opposed to many other options that came bundled with automation, extra applications and many other things; we wanted to build in this protection but not bring in another problem.  When we introduce technology, we cannot solve a problem by causing a worse one.
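For readers unfamiliar with how a decentralized protocol can enforce purpose limitation by construction, here is a minimal sketch in the spirit of DP‑3T.  It is heavily simplified: the names `daily_seed` and `ephemeral_ids`, the epoch count, and the key schedule are illustrative assumptions, not the real specification (which also shuffles broadcast order and uses a dedicated key derivation).  The point it demonstrates is the one made above: phones broadcast short‑lived random identifiers, only the seeds of diagnosed users are ever published, and matching happens entirely on the device, so the data that "flies around" is useless for anything else.

```python
import hashlib
import hmac
import os

EPOCHS_PER_DAY = 96  # illustrative: one broadcast ID per 15-minute epoch

def daily_seed() -> bytes:
    """Each phone draws a fresh random seed per day; it never leaves
    the device unless the user is diagnosed and chooses to upload it."""
    return os.urandom(32)

def ephemeral_ids(seed: bytes) -> list:
    """Derive the day's rotating broadcast IDs from the seed
    (simplified key schedule for illustration)."""
    return [hmac.new(seed, b"epoch-%d" % i, hashlib.sha256).digest()[:16]
            for i in range(EPOCHS_PER_DAY)]

# Alice's phone broadcasts her ephemeral IDs; Bob's phone stores what it hears.
alice_seed = daily_seed()
heard_by_bob = {ephemeral_ids(alice_seed)[42]}  # Bob was near Alice in epoch 42

# If Alice tests positive, only her seed is published.  Bob re-derives her IDs
# locally and checks for an intersection; the server never learns who met whom.
published_seeds = [alice_seed]
exposed = any(heard_by_bob & set(ephemeral_ids(s)) for s in published_seeds)
```

Because the published seed reveals nothing about where or with whom contacts happened, the only computation it supports is this local exposure check -- purpose limitation enforced by the data itself rather than by policy.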

As for current applications, we're seeing early evidence that they help.  Maybe the effects have so far been limited because of adoption, and because of some limitations the apps have helped us to see: it is not really the technology that is not working, it is that the health system is still not prepared.  In most countries that I know of, many problems come from the fact that the health system is not ready to give support to the application.  For me, this is even more important from the data governance perspective: before we go to countries, or to anyone, to give technology, we need to help them build the infrastructure that makes the technology successful.  Otherwise, again, we'll introduce infrastructure, collecting new data and creating new problems, without actually solving the underlying issues.

The last thing, because it is relevant here: when we decided to build an app -- not we in particular, me and my group; it was a requirement -- we decided that we had to use the mobile system, so we would build on the Google and Apple ecosystem.  As such, we now rely on Google and Apple to maintain this, creating a new governance issue.  Governance may be about data, but it is also about infrastructure and the power that the private sector has.

I'll be happy to answer any more questions during the Q&A.

>> MODERATOR: Thank you so much.

Now, just to round up part one, we have set up some crossover questions.  I'm actually going to call on the participants themselves to ask them, and then each panelist will have a minute to answer the direct question.

The first question was to Robyn Greene.

Would you mind posing it?  Or would you prefer me to read it?

>> RASHA ABDULLA: Whatever you prefer.  Fine with me.

>> RASHA ABDULLA: My question has to do with the Data for Good program or initiative that Facebook has been deploying.  I'm wondering if you can give us a little bit more information about that, the concerns that the program may be collecting more data than absolutely necessary for fighting the pandemic, and particularly the concerns that users cannot edit or delete information posted about them by other users.  Can you tell us a little bit about that, and also what happens with this data later on?  When does this program end?  What happens to the data afterwards?  Thank you.

>> ROBYN GREENE: Those are good, important questions.  Data for Good is not a COVID‑specific response.  I know that some other technology providers hadn't previously been looking at aggregated and deidentified data to see how to use it for social good, and so they have deployed it really for the first time for the pandemic.  Data for Good has actually been around for several years and has been a critical part of doing disaster response and doing research that's very privacy protected but also enables people to understand how small businesses are succeeding and failing in different places, and things like that.  In addition to doing hurricane response, and in fact responding to the cholera outbreak several years ago in Africa, we were able to use this kind of data to also very quickly respond to the COVID‑19 pandemic.  So part of our Data for Good program offers these maps that researchers and non‑profits are using to understand the coronavirus crisis and the effectiveness of non‑pharmaceutical interventions, and this relies on using aggregated data to protect people's privacy.

By using aggregated data in this way, we're able to help health authorities predict disease spread, allocate resources and measure the efficacy of mitigations, and ultimately researchers and non‑profits have used these datasets to inform their decisions all over the world.  In fact, researchers in Taiwan were even able to identify the cities with the highest chance of infection, researchers in Italy analyzed lockdown measures and their impact on income inequality, and public health officials in California and New York in the United States have reviewed county‑level data to steer public health messaging to make sure that the responses are as effective as possible.

In terms of privacy, one of the things we are really proud of is that our disease prevention maps from the Data for Good program show in aggregate where people are traveling between regions.  These include co‑location maps that reveal the probability that people in one area will come in contact with people in another, illuminating where COVID cases may appear next.  We have the social connectedness index showing friendships across states and countries, which can help epidemiologists forecast the likelihood of disease spread, as well as where the areas hit hardest by COVID‑19 may seek support.  And we have movement trends, like mobility maps, that show at a regional level the rates at which people are reducing mobility or remaining in the same place, which can provide insights into whether preventive measures are headed in the right direction.

A really important thing to keep in mind is that Data for Good shares protected datasets with our network of trusted research partners who are enrolled in the Data for Good program, but even those research partners only have access to aggregated information from Facebook; we don't share any individual information.  So, on the questions about someone being able to delete or correct their information: their information is not used in an identifiable way in the Data for Good research, whether it's the research that we make available publicly or the data that people access through the portal as one of our trusted partners and research experts.  The key is that these datasets include location information that's aggregated in a way that protects the privacy of individuals.  We use techniques like spatial smoothing to create weighted averages, and we avoid map tiles where very few people live; basically, the datasets are broken down in a way where they really can't be disaggregated to figure out who's in which dataset.
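As a toy illustration of that kind of aggregation -- the function name, tile scheme and minimum‑count threshold below are hypothetical, not Facebook's actual pipeline -- per‑tile counts can be released only when enough distinct people are present, so sparse tiles never leave the system:

```python
from collections import Counter

MIN_COUNT = 10  # hypothetical suppression threshold; real values aren't public

def aggregate(pings):
    """Turn raw (user_id, tile) location pings into per-tile counts,
    counting each user at most once per tile and suppressing tiles
    with too few people to be safely released."""
    counts = Counter(tile for _user, tile in set(pings))
    return {tile: n for tile, n in counts.items() if n >= MIN_COUNT}

released = aggregate(
    [("user%d" % i, "tile_A") for i in range(50)]   # busy tile: released
    + [("user1", "tile_B"), ("user2", "tile_B")]    # sparse tile: suppressed
)
```

Only `tile_A` appears in the output; the two people in `tile_B` are dropped entirely rather than exposed as a small, re‑identifiable count.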

For things like the public datasets on mobility, we use our differential privacy framework, which takes into account the sensitivity of the aggregated dataset and adds a random number of additional observations to each map tile to ensure no one can identify individual users.

The last thing I want to note, which I think is extremely important: as I mentioned before, we consider voluntary consent to be really critical to the COVID response we're undertaking.  Facebook users can decide if they want to share their location data.  The location data used by the Data for Good maps is the same data that allows us to show people locally relevant content on Facebook, and users can choose whether they want to share that information through the Location History setting.  So the Data for Good program is only getting that location data if the user has already opted in to location data sharing.  The program then aggregates and deidentifies it, and researchers can only access and use it once it's been aggregated and deidentified in a way where we're confident that users' privacy and identity are protected.  We put privacy first when it comes to our Data for Good program.

>> MODERATOR: Thank you so much.

The next crossover question is from Clayton Hamilton, of the eHealth and Innovation portfolio of the WHO Regional Office for Europe, to Carmela Troncoso.

>> CLAYTON HAMILTON:  Thank you. 

Carmela, there are many discussions in Europe on the potential impact on privacy and Human Rights that digital contact tracing and other digital technologies can have.  Going from the general to the specific: there are major concerns people have about these technologies and how they can be addressed as part of implementation.  In your intervention you mentioned a huge potential for misuse.  Do you consider that the use of digital contact tracing solutions as they are now poses a direct or indirect threat to an individual's personal privacy?  Thank you.

>> CARMELA TRONCOSO: Thank you, Clayton.

What are the concerns?  The kinds of concerns we have heard a lot about relate to surveillance -- in particular, that we would now collect data we never had.  Contact tracing, at the end of the day, is collecting co‑locations with extreme precision, much more than we had ever seen before and at an unprecedented scale.  That data could be used for almost anything: for building the same kinds of analyses we do for Data for Good, but also for building databases of who meets whom and how often.  You can imagine how that could go down the wrong road when this data falls into the wrong hands, and the wrong hands can range from malicious hackers to governments that may have the ability to use this data for population control.

When we designed this application, we had this in mind; that's why we took an extremely conservative approach in which we do not even create the data.  Our goal was to make sure that there was no powerful actor who would decide who would have access to the data and why: the moment someone holds the data and can make a decision about it, they can make any decision, and we have heard before that in different legal frameworks there are exceptions under which data can be used for something it was not collected for.  It was very important for us to not even create the data, to not create the mediator, and to try to find a way to build the system that avoids this.  To me, this was the key, and I hope the message carries beyond these apps: we can build these kinds of things; there are plenty of technologies and plenty of people able to think in this way.  It is a hard way -- we need to change the way we design, and change the way we train engineers, to think about solving problems driven by the problem instead of driven by the data, where data is something you may or may not need.  This is the way we can eliminate the threat to individual privacy.

The bigger threat, I'll say again, is not the individual's privacy but societal privacy: societal behavior, societal movements that, if we learn them, can be used to try to influence us back.  For instance, look at Cambridge Analytica: when the influence came back, it went to individuals, but the information gathered was information about the global sentiment of the population.  It is very important that we not only think about privacy as individuals, but think about what we're collecting, how that can be used to influence, and how to eliminate that to the best that we can while we build technologies.

>> MODERATOR: Thank you so much.

I'm also going to abuse my position as moderator to flag that issue -- the communal nature of digital harms -- for when we talk about this later.

The final crossover question is from Amber Sinha to Gonzalo Sosa Barreto.

>> AMBER SINHA: My question concerns the adoption of the exposure alert protocol: how have data protection law, and state law more generally, managed the public‑private collaboration created in the development of these apps?

Thank you.

>> GONZALO SOSA BARRETO: Thank you for the question, Amber.

The adoption of the protocol is another example that collaboration between the public and private sectors can provide people with useful tools to face situations like the one we're experiencing.  It is important to highlight that the government was especially concerned with transparency: to present the facts clearly to the population and to document in detail the responsibilities, obligations and rights of the parties in order to preserve privacy when developing the applications.

Regarding data protection, the data protection authority was involved at an early stage, before the functionality was incorporated into the application.  The data protection unit carried out an examination of the different exposure alert systems and raised the necessary requirements for adopting this kind of regime: transparency, a decentralized approach, appointment of the Ministry of Public Health as the sole data controller, and a data‑protection‑by‑design approach considering principles like the ones Carmela mentioned -- purpose limitation, and the deletion of information when the pandemic ends, because the information has only one reason to be there, which is to address the pandemic.  These were among the recommendations that were followed when developing the app, and they were key factors.

Regarding accountability, the information in the application is available through a procedure and for specific purposes, but all the information and measures adopted by the government, including the reports made by the scientific honorary group on which the government based its decisions, are available and published on the various websites.  This is also a key issue for accountability, transparency and informing the public.

>> MODERATOR: Thank you, everyone.

I see there are three questions in the Q&A.  What I'll do, just to keep us on time, is address those three questions to the panelists, and they can answer in the order in which they spoke, if they have anything to add.  I actually think the first two questions kind of go together.

The first question was addressed to Carmela, asking: if your approach to the tracing app is privacy by design, would you go beyond that?  You said some things in there about it being about purpose limitation by design, which is an underlying idea.  The second question asks: during the pandemic, a lot of people were unhappy to share and save COVID‑19 results in eGovernance systems; how do you motivate people to trust such systems with their privacy in lower income countries?  Both of these questions link to trust -- one possible aspect is privacy by design, but how else do you foster trust in these systems, particularly in different kinds of contexts?  That's a difficult question to answer, and we may also engage on it more in the second session.

There was one more question on how one ensures the accuracy and completeness of self‑reported data related to COVID.  In addition to non‑reporting, there are issues of lack of access to technology for reporting infections, lack of testing and false test results, and policy is made on this data; how can it be reliable?

That question speaks a little to some of the mechanics of creating good data, what good data means, and what it means for our reliance, or over-reliance, on it in policy.

I will turn to you first, Robyn.

>> ROBYN GREENE: In terms of all of the questions around non‑reporting, issues of lack of access to technology in reporting, and misinformation: this is why so much of our work has focused on prioritizing getting good information to people about the disease and about their governments' response.  This is necessary not only to ensure people know how best to protect themselves with the most up-to-date, you know, medical information and research that's available, but also because it is absolutely essential to instill trust in governments and their response all over the world.

You know, the trust element is another critical reason why we're prioritizing privacy in our COVID response.  We're only using data in ways to respond to the pandemic, with data for good, in the form of aggregate and de-identified datasets, making sure that people don't have to worry about privacy or what will be done with the data or even, you know, other kinds of issues.  To that end, we're making sure that we're actually only using that location data once consent has been given to collect and use it, which you can do in your privacy settings in your app.  I think without those kinds of solutions to make sure that people can consent, that people have transparency on what's going on and that people can actually trust the information that they're getting, the response would be much harder, and those are things that we're really proud to be well positioned to help with.
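The kind of aggregate, de-identified release described above can be sketched in a few lines.  This is a minimal illustration with hypothetical data and a hypothetical suppression threshold, not any organization's actual pipeline: individual records never leave the function, and any group too small to publish safely is dropped.

```python
from collections import Counter

# Hypothetical raw records: (region, has_symptoms) pairs from consenting users.
records = [
    ("north", True), ("north", True), ("north", False),
    ("south", True), ("east", False), ("north", False),
    ("north", True), ("north", False),
]

MIN_COUNT = 5  # suppress any aggregate smaller than this threshold


def aggregate(records, min_count=MIN_COUNT):
    """Return per-region counts, dropping regions too small to publish.

    Individual identifiers never enter the output: only region-level
    totals at or above the suppression threshold are released.
    """
    totals = Counter(region for region, _ in records)
    return {region: n for region, n in totals.items() if n >= min_count}


print(aggregate(records))  # prints {'north': 6}; "south" and "east" are suppressed
```

Real deployments layer on stronger techniques (noise injection, differential privacy), but the basic shape is the same: release only thresholded aggregates, never the rows.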

>> MODERATOR: Thank you so much.

We'll turn to Gonzalo Sosa Barreto, please.

>> GONZALO SOSA BARRETO: Regarding unreported data, in our experience the Ministry of Public Health has played an important role, with a robust health system that gives clear information, and the epidemiology team at the Ministry of Health has also had an important role, because they follow each lead and confirm the information before making the aggregated data public.

Also, regarding trust, as I said, trust must be built on respecting the obligations by governments and companies, and accountability measures, of course, are a very important part of such trust.  Other principles have to be put in place as well, always adopting a Human Rights approach to the issue.

>> MODERATOR: Thank you.

>> CARMELA TRONCOSO: The first question asked: did we go beyond privacy by design?  I would answer that the notion of privacy by design is very often understood as data protection by design.  In that sense, we go well beyond that; we really implement privacy by design as privacy is understood by the academic community, understanding that your privacy adversary could be anyone, as we say, and that no power imbalance should be created by the system itself.

Regarding trust, one thing that helped us a lot with acceptance was openness: all of the design, the development, everything is available.  I have been in panels and interviews explaining again and again what we're doing and how we're doing it, exposing ourselves (I'm not very used to these things), getting so many questions, trying to explain again and again what our rationale is and making sure that people and experts can check it.  I also have to say, again, that when we move to other countries, it is very important that besides looking at how we're going to deploy the technology and whether the technology will work, we take care that we also help them improve their health systems and infrastructures; otherwise there is no technology that is going to help.  Technology is only a complement, and if we're trying to deploy this solution, that's important.
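The decentralized, privacy-by-design approach described here can be sketched very roughly as follows.  This is a toy illustration of the general idea (rotating ephemeral identifiers derived from a locally held key, with on-device matching), not the actual protocol; all names and parameters are invented for the example, and real systems add key rotation schedules, time windows and more.

```python
import hashlib
import secrets

# Each phone keeps a private daily seed and broadcasts short-lived
# ephemeral IDs derived from it; the seed is only uploaded (with
# consent) after a positive test, and matching happens on-device.


def ephemeral_ids(seed: bytes, n: int = 4):
    """Derive n rotating broadcast IDs from a daily seed (simplified)."""
    return [hashlib.sha256(seed + i.to_bytes(2, "big")).hexdigest()[:16]
            for i in range(n)]


# Alice's phone generates its secret seed locally.
alice_seed = secrets.token_bytes(16)
alice_broadcasts = ephemeral_ids(alice_seed)

# Bob's phone records the anonymous IDs it hears over Bluetooth.
bob_heard = set(alice_broadcasts[:2])

# Alice tests positive and uploads only her seed; Bob's phone
# re-derives her ephemeral IDs locally and checks for overlap.
published_seeds = [alice_seed]
exposed = any(set(ephemeral_ids(s)) & bob_heard for s in published_seeds)
print(exposed)  # prints True: Bob's phone detects the exposure locally
```

The point of the design is that the server only ever sees seeds of consenting positive cases, never a social graph or location trail, which is what "the adversary could be anyone" protects against.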

>> MODERATOR: That's a good way to conclude that section, which largely looked at actual experiences and Human Rights, and to note how central the question of trust and legitimacy of these kinds of initiatives is.  It has to do not with competing issues of privacy and transparency, or competing issues of privacy and access to information, but rather with corresponding principles of access to information and privacy: how well those two things partner, and how well much of Human Rights discourse allows us to understand those two issues as partners.

We'll be moving on to part two now, in part two the direct question is how are the challenges and opportunities presented for data‑driven technologies impacting the perceptions of data governance and protection for the future.

Our first speaker is Clayton Hamilton, who leads the eHealth and Innovation portfolio of the WHO Regional Office for Europe, Division of Country Health Policies and Systems, providing guidance on all aspects of digital health and innovation to support strengthening and reforming health systems in the region's 53 Member States and working to accelerate the uptake of safe, inclusive digital health services for all.  He leads the development of dialogue and regional guidance on digital health systems and public health and provides thought leadership on the use of frontier digital technologies to improve access to healthcare and empower individuals to make choices for better health and well‑being.

Thank you.

>> CLAYTON HAMILTON: What I want to do today is offer a brief reflection on the challenges and opportunities presented for data‑driven technologies.  In particular, I want to reflect on some lessons that we have learned within the 53 countries of the European region and see how they apply directly to data‑driven technologies.

Firstly, a key lesson we have learned from the COVID‑19 pandemic is that even in the most well-prepared health systems there has been wide‑scale recognition of the inability of systems to access data in realtime or near realtime.  This is despite the fact that countries have been working on developing their health information systems literally for decades; countries were simply caught unaware.  This lack of access has made response efforts less effective and critical public health and social measures less well applied.

The second lesson: even when we do have access to data, we are often unable to effectively analyze and act upon the knowledge which that data brings.  Again, some of the types of analysis that Robyn presented are really still very much a pipe dream for many national health authorities.  We understand that data quality is an issue to be tackled, we don't always have a standardized approach to certify the quality of data, and we have not yet within the health system developed a level of data literacy or a data science culture able to help feed the knowledge gleaned from data into policy decisions or logistics management.  Again, we saw many examples of that arise during the COVID‑19 pandemic.  In terms of perception of practices, it is fair to say that when translating lessons learned from COVID‑19 to artificial intelligence and machine learning, there have been renewed efforts from privacy advocates and international organizations to further emphasize the critical nature of strong data governance and accountability frameworks which take ethics considerations into full account.

We really need to understand what it means to implement this in a very practical sense at a national level in data ecosystems.  It is one thing to say that it is important to have data governance and a completely different thing, often misunderstood, to actually apply it practically and pragmatically at country level.  The importance of addressing issues such as data privacy and privacy by design, raised by others, and also data bias and completeness, managing public-private partnerships and remuneration models of data use, has been galvanized by the COVID‑19 pandemic.  What it has done is given us insight into how to ensure that data and digital solutions are not repurposed or misused, intentionally or unintentionally, and that Human Rights and civil liberties are not jeopardized in seeking to implement public health measures; very much has been learned there.

We understand also that on a more granular level, the importance of legislation and informed consent as part of data governance and data guardianship measures has come to the fore now, and we have seen several cases where there has been ambiguity in how data legislation at the national level is being interpreted.  This, again, feeds back into further emphasizing the importance, more than ever, of having this type of legislation applied and interpreted consistently within and across national borders.

The importance of international work has also come up, specifically in terms of artificial intelligence and machine learning: some issues cannot be solved by national measures alone, and international standardization and alignment here is critical not only to ensure we can effectively exchange data across borders, but also so that we can contribute to the collective pool of learning and source larger datasets when and if they're required to solve certain problems.  We're a long way from reaching that goal.

There is also the perspective of the relevance of solutions.  We have seen very much that solutions, no matter how well developed and how safe or secure they may be, are simply not relevant in every situation, and we have seen some cases where the application of digital technologies has in fact confused or perhaps detracted from response efforts.  Again, we firmly believe that the right tool should be applied to the right job at the right time, and data governance plays a role in determining which situations are the right situations to apply them in.

Lastly, the perspective of equity, something that was mentioned at the beginning: the risk of perpetuating inequities that may exist in society, and how they're likely to be impacted by data‑driven technologies, has been a key issue.  This is not a new issue and not unique to data‑driven technology; it ties closely to how ethics are applied to data ecosystems and the implementation of digital systems.  In short, we don't want a future where health systems are not affordable or accessible to all members of the population.  Digital inequities can arise basically in three ways: by preventing access to health services provided digitally, either through the inability to identify an individual or because an individual is shut out for lack of a mobile phone or other digital means; through individuals lacking the digital literacy needed to navigate those systems; and finally, through cost barriers that exclude an individual.  We have to work to ensure that those particular situations don't arise with the data‑driven health solutions of the future.  To govern data effectively, we have to ensure that the value is flowing back to the public system and that solutions built on such data are accessible by all.

I'll leave it there and look forward to more questions and perspectives in the questions and answers later.

Thank you.

>> MODERATOR: Thank you so much.

I'll turn now to Dr. Rasha Abdula, Professor of Journalism and Mass Communication at the American University in Cairo (AUC), Egypt.  She is also a former member of the UN Multistakeholder Advisory Group of the IGF, representing Civil Society in Africa.  She has also written books and numerous articles and book chapters and is the recipient of several research awards for her work on social media and the political ecosystem, and she is an advocate for civil rights online and tweets regularly.  Thank you.

>> RASHA ABDULA: Thank you very much for the introduction.  Thank you for having me here.  It is a pleasure to be here.

This is obviously an unprecedented situation we find ourselves in, one in which we're finding out more and more every day just how imperative and extremely important data and online communications are, and how the means by which these data are governed, shared and collected affect our whole lives.  It is a difficult situation.  On one hand, literally, data and information are key; they're actually the key out of the situation that we're in.  Transparency and accountability are also key, because if we don't know how the data are being collected or used, that becomes a big problem for individuals.

Unfortunately, in my part of the world, we tend to have very little information, and usually from very few sources, from official sources, especially at a time like this.  Basically it is the Ministry of Health that gives you numbers, data, counts, information about what to do if you or a loved one gets sick, and there is usually no way to double‑check or verify the information, so whatever you get from the official sources is basically, you know, the way to go in situations like this.  Many times, unfortunately, the citizens are not getting enough information, and that's really where the danger is.

We have so many questions, and we don't have many answers.  We don't know, you know, what data the governments are gathering in the name of COVID‑19, or what data they are gathering now that they have not previously been gathering, because we don't know what they're gathering on a day‑to‑day basis.  What's different now?  Of all the data that they're gathering, is it necessary to combat the situation that we're in?  Are the data being protected, encrypted, and how?  What happens if that data is compromised one way or the other?  Are the data being shared with law enforcement agents, and again on what basis?  How are the data stored?  How are they used?  Then, of course, there is the sunset clause: when will the programs end?  Hopefully there is an end to the pandemic; what are we going to do with that data afterwards?  Where does that data go?  How long does it remain after the pandemic ends?  What kind of accountability exists to make sure that governments do not retain such data when it is no longer needed, or do not use it in ways in which it was not supposed to be used?

In North Africa there have been several attempts at contact tracing and data gathering apps.  Egypt has one, Egypt's Health, an official app by the Ministry of Health; supposedly it raises awareness of the pandemic, and it has over a million downloads, which, you need to know, given that Egypt has a relatively large population of 100 million people, is about 1% of the population.  The download is voluntary; however, there are many reviews on the app saying that even after entering your personal information the app doesn't really work very effectively.  You have to enter a copy of your national ID or passport, you have to enter your phone number, and it automatically gets your location information at all times, even if you're not using the app.  Tunisia has one such app, whose name translates to "protect"; again, the collection of GPS location and phone number is required for registration, and again there is no sunset clause.  Morocco has one, Wiqaytna ("our protection"), developed by the Ministry of the Interior, and it had over a million downloads in a week; Morocco has a much smaller population, about 40 million, smaller than Egypt anyway.

Usually the privacy policies are generic and not very clear.  I tried to look for privacy terms on the Egyptian app, and I couldn't find any.  Again, you have to give a copy of your ID, your phone number, your location, your contacts, your media files; it has access to a lot of data that makes you wonder, and people have actually written comments like that in the reviews: why do you want my media files?  What will you do with this?  I'm sure there are answers, but, you know, you kind of have to wonder, isn't this a little bit too much information?

South Africa, for example, has a better app: the identity is anonymous, the user consents to reporting, and if someone tests positive that is reported, but no personal information is reported; a better model.  Given that much of the effort to combat the pandemic depends on delivering accurate information in a timely manner, the question is what happens to people on the low end of the digital divide, and we have quite a few of those in Africa in general and the Arab world, of course.  Egypt has an internet penetration rate of 60%, which is not bad, and a phone penetration of about 95 to 100%, depending on the population figure taken into account for the calculation.  However, that doesn't mean that everybody has a mobile phone, because many people have two or three.  We still have quite a few people who are off that radar; we actually have about a third of the population who still don't have basic literacy skills, who can't read or write.  How will these citizens get access to the basic, critical health information that is needed to survive a pandemic like this?  I believe governments have a duty to provide this basic information to everybody, to all citizens, regardless of their access to technology and access to information, because, you know, this is a matter of life and death, obviously.  There are many questions to be asked; there are unfortunately more questions than answers coming out of our part of the world.  Once again, I will just stress the very key elements that I believe everything should be centered on: basically data and information, because, you know, that's how you survive the pandemic, but also transparency and accountability.

We cannot let our guard down.  We have to be very vigilant and very careful about the information that's being gathered, who has access to it and how this information is being treated, basically.

I'll stop here.  Thank you.

>> MODERATOR: Thank you so much.

Finally, on the panel, I'll turn to Amber Sinha of the Center for Internet and Society, India, where he leads projects on privacy, digital identity, artificial intelligence and misinformation; his research has been cited with appreciation by the Supreme Court of India.  He is a member of the Steering Committee of ABOUT ML, an initiative to bring diverse perspectives to develop, test and implement machine learning documentation practices.  His first book was released in 2019, and he studied law and humanities at the National Law School of India.

Thank you so much.

>> AMBER SINHA: Thank you.

Thanks to the previous panelists as well.

What you see is that the pandemic has led us to techno-solutionist responses in large parts of the world.  You have contact tracing, and often this has been seen as fairly complicated; there are solutions that are meant to produce the data which is seen as critical to the response to the pandemic.  Along with this, there has been negative impact, particularly on members of marginalized communities.  Every humanitarian crisis needs expedient responses, and often in those responses we leave behind fundamental Human Rights.  It is easy to be swayed by the enormity of the crisis, particularly a crisis of an unprecedented scale such as COVID‑19.

(Poor audio quality).

There is contact tracing working in tandem with digital solutions, and the digital solutions often depend on analogue solutions, so in a country like ours, with limited internet penetration, the question must be asked to what extent contact tracing can work; and secondly, if you don't have a sufficient amount of testing within a particular geography, how useful is it to rely on contact tracing and other digital solutions as valid responses to the pandemic?  What is most disappointing in a lot of countries is that in a techno-solutionist ecosystem, the app can always seem to be the answer, but it is even more troubling when the solution doesn't use the fruits of the technology at its disposal.  Some examples that we discussed in the first part of this discussion, particularly those highlighted by others, give an overview of the kinds of privacy-preserving technologies that can be deployed, yet if we look at the technologies deployed in various parts of the world, we often don't see those privacy-preserving aspects included in the technological design.  For instance, with regard to contact tracing, (poor audio quality).

Over the last year and a half, we created an evaluation framework for digital identity systems.  In the last six months we have tried to adapt it to the use of these solutions in response to COVID‑19.  It essentially consists of three kinds of tests.  The first deals with the mandate of the government: the system must be enacted by legislation, which must not be excessive, and which must be clear, accessible to the public and precise in its scope.  The second category we have articulated applies to the entire technology framework, including the architecture, users and actors, weighed against the rights that it potentially violates; we should be able to ask whether the violations are necessary and proportionate to the benefit that is protected in that specific instance.  Third, what we have felt is necessary is a risk‑based assessment: where technology systems will have an impact on Human Rights, they must be designed based on an analysis of the risks the system produces.  Those risks could take the form of a centralized data source, or could be addressed by restricting the usage of the technology to limited actors.  The risk assessment also requires being responsive to the community, with a communication strategy.

We used this framework to look at the digital contact tracing solution in India, and our partners elsewhere also conducted studies based on the framework, looking into Colombia, Ecuador, Berlin and Brussels.  Largely what we found was that there was limited compliance with the principles that I articulated: there are very few mitigation measures placed in the systems, in a lot of cases there is no framework that governs the application and the solution even where risks are known, and in some cases it is left to the executive, which can change it at its discretion.

Largely, what we felt was that while digital and inclusive technological systems exist that could lead to more rights-preserving responses, the frameworks which govern them, and the knowledge and capacity, must also be there to respond to the pandemic.

Thank you.

>> MODERATOR: Thank you so much.

Just as at the end of part one, at the end of part two, before going to the general questions, we'll have crossover questions.  The first is from Gonzalo Sosa Barreto to Clayton Hamilton: can you share Best Practices on health data governance in digital solutions?

>> CLAYTON HAMILTON: There was a question in the chat where I shared the details.  There are many Best Practices we can draw upon, as highlighted by many today: looking at the technical architecture and design in terms of how individuals gain access and the privacy surrounding it; transparency about who is accessing the data; having expert accountability and oversight and legal safeguards; and, importantly, including the public, Civil Society and the scientific community in the development of systems, which is often an area people neglect.

Addressing aspects of data sensitivity, quality and minimization is also very important, as are data security measures; it sounds very banal, but we have seen data security lapses with consequences which have been widespread.

I think also looking at the trade‑offs between sharing data and not sharing data, looking at that from an ethical and privacy-preserving perspective and seeing whether the risk is commensurate with the potential gain; it is not a thought process many countries go through.  You yourself had mentioned data protection impact assessments; that's one tool to help guide countries through that measure.

Other Best Practice perspectives are perhaps a bit more on the practical side.  For example, integrating the many siloed data sources around the health information system together into one portal for easy access by citizens.  That's something that's quite undervalued, but when implemented correctly it really has an amazing impact in providing transparency and accessibility to data.

Having meaningful consent built in is important, not only in those types of systems but also when individuals are accessing other points of care or public health within the national health information system or national health system.  Having policies that stipulate that information only needs to be provided by an individual once may sound simplistic, but such policies, though difficult to reflect across very disparate digital information systems, eliminate people inputting the same information again and again to authorities, which we have experienced as a great burden and a source of misinformation or information that gets out of sync.

Having logging of access and sunsetting of data is something that's been mentioned, and I think logging is one of those Best Practices which has now become very mainstreamed within the European region.  People now have an inbuilt expectation that whenever a record is accessed, there is a clear log of who accessed the data and for what purpose; it is a very important element.
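The access-logging practice described here can be sketched as a toy pattern: every read of a record appends an entry saying who accessed it and why.  The class, record and accessor names below are hypothetical, not from any real health information system.

```python
import datetime


class HealthRecordStore:
    """Toy record store where every read is logged with who and why."""

    def __init__(self, records):
        self._records = records
        self.access_log = []  # append-only audit trail patients can review

    def read(self, record_id, accessor, purpose):
        # Log before returning the data, so no access goes unrecorded.
        self.access_log.append({
            "record": record_id,
            "accessor": accessor,
            "purpose": purpose,
            "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })
        return self._records[record_id]


# Hypothetical usage: a clinician reads a patient record.
store = HealthRecordStore({"p42": {"status": "recovered"}})
store.read("p42", accessor="dr_jones", purpose="follow-up consultation")
print(len(store.access_log))  # prints 1: one logged access
```

In a production system the log would be tamper-evident and stored separately from the records, but the principle is the same: access and purpose are recorded as a side effect of every read.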

Finally, building digital health literacy in populations: I can't really overemphasize how important this is, including strengthening the health workforce in its ability to use data.  These are the types of Best Practices that we have seen and want to see echoed more and more in the development of national health information systems, to protect privacy to the greatest extent that we can.

Thank you.

>> MODERATOR: Thank you so much, Clayton.

That also deals with one of the questions in the Q&A, two for one!

The next question was from Robyn Greene to Rasha Abdula.

>> ROBYN GREENE: Thank you.  So, Rasha, I really appreciated your comments in your initial remarks on how important it is to have access to good information, and so I was hoping you could talk more about what effect COVID has had on governments' behavior when it comes to fundamental rights like freedom of expression, press freedom and access to information.  You know, is anyone leading by example, and what steps are they taking that others should look to as guidance?  Conversely, are there more draconian measures that you're seeing put in place, and what are the harms that you see coming from those measures?  What can we, as a community of individuals committed to the governance of an open, free, secure internet, do to respond to the harmful measures?

>> RASHA ABDULA: Thank you, Robyn.  An important question.

Honestly, I don't think I have seen much difference.  You know, we do have a problem, of course, with freedom of expression in our part of the world, and it has been there since before COVID, so I don't really think it has been much impacted by that.  The danger is more a potential danger of gathering too much information without knowing, you know, where the information is going or what's happening to it.  I cannot think of any examples of something too drastic that happened because of the pandemic in particular.  I think it is just the normal handling of data, which is usually not the best, let's put it this way.  I can't think of a particular difference because of the pandemic, actually; I think it is the overall potential for abuse that's really threatening and that we should be very, very careful about, because the problem is, when you have a situation where there is a real danger, and obviously this is a really dangerous situation, you have more potential for approval by citizens to willingly give up more of their freedoms, to willingly give more information, to willingly consent, you know, to being monitored, if somebody says that this is necessary to keep you safe.  I think it is that potential abuse that we need to be very careful about.

>> MODERATOR: Thank you.

The final question is from Carmela to Amber.

>> CARMELA TRONCOSO: Thank you for the intervention.

My question is about the use of data‑driven technologies.  We have recently seen that some of them have created problems, like facial recognition and automated grading; we have seen the downside of these.  I'll link this a bit to a question in the Q&A, a two for one, which says that data in COVID‑19 is essentially not normal data; we don't even know how machine learning can react to it or what harm it can cause.  My question is: in your experience, what's the best way of trying to avoid these problems?  Is it a question of design, so that we need to go back to the drawing board and redesign some of the technologies, or is it really something that we can solve on the ground?

>> AMBER SINHA: Thank you.

There are definitely steps that can be taken; some examples I was talking about in the framework include creating risk assessments and having redress mechanisms, and those can definitely be developed.  But particularly for the kinds of technologies that you mentioned, things like facial recognition technology, you have to look at the design of the technology itself, and at what we are essentially doing: using the technology for public purposes, for protecting and preserving economic and social rights as well.

So to what extent are these harms inherent, given the datasets that we're looking at, leading to discriminatory or exclusionary impacts?  I do think that the overall design of the technology must be examined.  If you look at the principle of proportionality, it requires that when we deploy a solution, there must not exist a less rights-intrusive but equally effective alternative; and something like facial recognition is essentially a remote, covert system in some sense.

The question must be asked, and there must be limited circumstances in which you don't have a more rights-preserving alternative; where one exists, you must choose it.

I think both at the level of the design of the technologies and at the deployment stage, we need to recognize the risks and take steps to mitigate them.

We need to look at the issues.

>> MODERATOR: Thank you so much, Amber.

We'll turn now to the remaining questions from the Q&A.  I'll read through them quickly and give each panelist, in the same order, a minute or two to try to address them, if they can.

The first is a question on data literacy: as data breaches become more of a cross‑border problem, with different levels of data literacy, do panelists agree that a way to provide trust is to increase the role of data protection authorities when it comes to cross‑border transfers and to data practices in general?

The next question (it disappeared) is whether it is possible to discuss how we can address data retention regimes across the world, noting there is a CLOUD Act agreement being drafted between Australia and the U.S. giving more access to data between the two states, which I suppose also links somewhat to the DPA question, along with any other suggestions people may have on cross‑border data transfer and regulation, as complicated as that question may be.

The next question: since many countries don't have data protection laws yet, as we see, how can data collection be accounted for during and after the pandemic?  Related to that is part of the difficulty we see in advancing principle‑based regulation and Human Rights principles, which are lovely, but in our attempts to get normative standards that are broadly, you know, agreeable enough, how do we whittle those down in a particular context in a way that's useful?  Then there's another question: at the IGF we devote a lot of time and attention to Best Practices; what can we learn from worst practices?  Which of course is always a great question.

I'll give each panelist an opportunity to answer those questions and then we'll see how we go given that we're running out of time.

We'll start with Clayton.

>> CLAYTON HAMILTON: Thank you.  I wasn't quite sure.

Let me take on what I think was the first one, on data governance: how to ensure it in the absence of data regulation already existing in the country.  The simple answer is: you can't.  And again, this is a foundational aspect that needs to be implemented in countries to ensure that all of the elements of transparency, including data governance, are actually valid.

What would likely happen is that, under the auspices of COVID‑19, certain data will be gathered, and then there will be some uncertainty about what that data is actually being used for beyond the basic elements of public health surveillance and the implementation of social measures.

Really, what I think is important then is that we have an exchange of Best Practices from other countries, shared widely and globally, so that we can see some of the really good examples that have come out of the European Union and other countries around the world, where data protection regulation has actually had a very positive benefit in terms of not only controlling the data flow but raising awareness of the potential risks that we have discussed in the session today.  I know that's probably not the answer that you were hoping for.  I think really we have to be frank: without the basic elements in place, there isn't a foundation for the house to be stable.  Let's be honest here and say that we need to work harder to make sure that governments are held accountable for the data that they hold about the public.

>> MODERATOR: Thank you so much, Clayton.

Dr. Rasha.

>> RASHA ABDULA: Thank you.

Unfortunately, I agree with everything that Clayton just said.  You know, in my part of the world, there aren't even data protection laws, for the most part.  There's nothing you can rely on.  Even if there were, honestly, it is still not a guarantee that the data is not going to be misused or abused in any way. 

I think the best thing we can do right now, as an international community, as civil society, and as international organizations, from the United Nations to the WHO to, you know, whoever can speak up, is to try to apply pressure so that the data collected are as minimal and as strictly necessary as possible, and to make sure that we keep asking where the data is going, how long it will be retained for, and how it will be used.  We may not get any answers, but we have to keep asking anyway.

>> MODERATOR: Thank you.

Finally, Amber.

>> AMBER SINHA: Thank you. 

I'll try to quickly answer the questions.  On the question about the fact that there are no data protection laws yet: one of the things to potentially look at, for those countries that don't have a data protection law, such as India, which doesn't have one, is whether there is a recognized fundamental right to privacy.  Often that right to privacy in itself carries a positive obligation to create rules under which the personal data of citizens must be protected, even from private actors.  In that sense, even in the absence of a proper framework, there may be other things that we can do to protect data and create rules, particularly if the rules need to be created urgently during the pandemic because technical systems are being deployed.

Michael's question on worst practices is a very interesting one.  That also relates to the point about risk assessments: a risk assessment looks at likelihood, and in certain cases a risk may seem remote, but those risks also need to be addressed, because even if there is a law, in the event that a remote risk materializes, its impact could be severe.  The other question, about how to ensure that no entity can use (poor audio quality).  That's a difficult one.  More and more, what we see is a narrative about data as an asset; various countries are interested in creating silos in which they want to provide some protections around the use of data.  First, I think you'll have to get to some degree of convergence around principles for data sharing and the other interests that lie with the data.  When the principles are there, we can then think about further governance.

Thank you.

>> MODERATOR: Thank you so much, Amber.

Dr. Rasha, controversial.  Maybe what I'll allow you to do is just give me the word of the day, a quicker way of wrapping up our session!

So I'll give Robyn a chance, and then everyone a chance for the word of the day, before I wrap up!  Maybe 10 seconds!

>> ROBYN GREENE: Thank you.  I'll keep this brief.

I just wanted to say thank you again for the opportunity to speak on this panel and I have learned so much just from listening to all of the other panelists.

I think, you know, the biggest takeaway we have seen from the COVID‑19 pandemic, and governments' response and the private sector's response, is the importance of establishing privacy‑protective responses that enable getting good and actionable information to citizens and individuals living in places that have, you know, significant infection rates, and those that don't for that matter, and making sure also that people can trust their government's response to be effective and responsible, and that data is only being used in a way that is expected and that will contribute to the pandemic response.
