IGF 2022 Day 3 WS #416 Human Rights Centered Technology in Emergency Responses

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> MODERATOR:  I think we can slowly start.  All good?  Thank you very much.  Good morning, everyone.  It's a real pleasure to be in the room with you today, especially with people we have been working with on this for so long; now we are finally able to share the room.  My name is Ilia Siatitsa, and I am a member of Privacy International, an organization that challenges government and corporate exploitation of data and technology.

Today we're here to discuss human rights-centered technology in emergency responses.  At the height of the pandemic, more than half of the world's countries had active emergency measures in place in response to COVID-19.  These measures included the rapid introduction and repurposing of surveillance measures and technologies to fight the pandemic, with downstream effects on human rights, fundamental freedoms, and the rule of law.  Nearly three years after the start of the pandemic, now is the time to take stock, assess the legality, necessity, and proportionality of the surveillance measures and technologies introduced to fight the pandemic, and determine what lessons have been learned, so that governments, Civil Society, and the entire world are better prepared for the next global emergency.

We also need to make sure that these measures and practices are not normalized and do not remain with us, considering the effect they have had, and continue to have, on our freedoms.  In a joint initiative led by ECNL, with the support of partner organizations, including members who are here with us in the room and with whom we will talk more during the discussion, we have been conducting in-depth research on these surveillance measures and have identified globally observed trends about the harms of COVID-19 surveillance measures on Civil Society and beyond, which we plan to present in an upcoming report that we hope to release sometime in mid-December.

Today, we would like to share some of these findings with you, and we're hoping to make today's workshop a dialogue with all of you that will inform the upcoming report.  In the next few minutes, my fellow panelists will provide an overview of our findings and recommendations, and then we would like to open the floor to hear your thoughts, opinions, experiences, and any gaps you see, and more generally what should happen in the future.

Let me briefly present Olga Cronin, Project Manager in the area of surveillance and human rights at INCLO, the 15-member International Network of Civil Liberties Organizations.  She's based in Dublin, Ireland, where she is also a Policy Officer in the same area at the Irish Council for Civil Liberties.

Karolina Iwanska works at ECNL on the impacts of technology on civic space and civil freedoms, coordinating advocacy at the European level, notably on the EU AI Act.  Olga, can you give an overview of the findings?

>> OLGA CRONIN:  Thank you, Ilia.  Thank you to everyone who made it here this morning.  We've been looking forward to having this conversation and getting any feedback or insight you might want to share on our work.  Our researchers are here with us today, as Ilia said, and they'll get to introduce themselves.

Basically, Ilia has given an overview of why we've done this project, so what I'll do is give an overview of the how and the what.  As explained, in collaboration with Stanford University we did desktop research to see what kinds of surveillance measures were taken in response to COVID-19.  After that, INCLO, as mentioned, a 15-member network of civil liberties organizations around the world, surveyed its members about the COVID measures, rules, and laws that were put in place.  Out of those surveys, we felt there were several jurisdictions that needed deeper examination.  So, we asked our colleagues in Colombia, Indonesia, Kenya, and the Legal Resources Centre in South Africa to carry out that deeper work, and we further asked our research partner in France and an independent researcher in India to do the same in those jurisdictions, which brought up the number of jurisdictional case studies that we had.

Then we used those case studies as the basis for our report.  When we looked further into all of the case studies, we identified five key trends, which brings us to the what, the findings.  The five key trends were: first, the repurposing of existing security measures; the silencing of Civil Society; the risk of abuse of personal data; the influential role of private companies; and the normalization of surveillance beyond the pandemic.

In terms of the first trend, the repurposing of existing security measures: in response to COVID-19, some governments took advantage of existing frameworks and resources that had originally been introduced for counter-terrorism purposes.  These moves included drawing upon existing legislation, the deployment of military technologies, and the use of national intelligence services.  We saw in Israel the use of the Shin Bet intelligence service to retrace the movements of people who had tested positive for COVID so as to identify the people they had been in contact with, and this was done by tapping into a previously undisclosed trove of personal cell phone data that Shin Bet had gathered to counter terrorism.  We also saw in Kenya, which hopefully Martin will be able to speak to in a few minutes, the Computer Misuse and Cybercrimes Act 2018 being used by the state to punish bloggers and voices of dissent for allegedly publishing misleading information.

We also saw similar actions in Indonesia through the use of Indonesia's Law on Information and Electronic Transactions and a virtual police unit formed to preempt and prevent potential cybercrime.  So, in a nutshell: for more than a decade, human rights defenders have been documenting how counter-terrorism laws operate with little transparency and accountability to quash dissent and silence critics, and the same concerns apply when counter-terrorism measures are used in a pandemic response.

In terms of the second trend, the silencing of Civil Society, and similar to the repurposing of cybercrime laws, some countries, such as the Philippines, Russia, and South Africa, introduced legislation to criminalize pandemic-related misinformation.  Combined with disproportionate penalties, up to six years in jail in the case of Argentina, and unclear criteria defining what qualifies as misinformation, these measures contributed to a climate of fear and intimidation.

Couple that with states deploying or introducing technologies such as drones, robots, and facial recognition under the justification of enforcing mandatory lockdowns, thereby increasing the surveillance of public spaces, and this posed a threat to the freedom of assembly.

And bear in mind, these technologies were used to surveil public spaces at a time when protests were clearly being closely monitored, if not outlawed, and when people were being forcibly dispersed, in some cases violently.

In the case of the third trend, the risk of abuse of personal data: governments were introducing, as you know, tools designed to trace the spread of the virus, but this involved the collection of massive amounts of personal data, health data, very sensitive data, and in some jurisdictions these tools were created with little public consultation or oversight.  In our research, we determined that many of the contact-tracing apps were rolled out with little public consultation, as I said, and that they did not meet the fundamental principles of data protection: legality, proportionality, and data minimization.  For example, the Constitutional Court of Colombia, which Daniel will hopefully also speak to, determined that the data collection connected to the mandatory use of the app there was unlawful.

Going on then to the fourth trend, the influential role of private companies: we saw private companies playing a significant role in the pandemic, either by cooperating with governments to develop apps and tools or by engaging in data-sharing arrangements.  In countries like Colombia and the UK, governments entered into very opaque public/private partnerships, and in some cases the entire scope of these agreements was not revealed until activists demanded transparency.

This lack of transparency makes it very difficult for Civil Society to understand the extent of data sharing between governments and private companies, where it stands now, and where it's going in the future.  For example, we saw in South Africa the use of WhatsApp, which Sherylle Dass from the Legal Resources Centre will hopefully speak to.  And probably one of the most impactful interventions from the private sector was the Google/Apple Exposure Notification application programming interface, the Bluetooth-based GAEN API, which was used in nearly 40 countries and served as the basic building block for governments to build their own contact-tracing apps.  Countries that sought to build their own apps outside of it were stymied by the fact that Google and Apple restricted background Bluetooth broadcasting, which lowered the efficacy of those apps, and its prevalence raised significant concerns about the power of private corporations to determine emergency responses.
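For context, here is a minimal sketch of the decentralized pattern the GAEN design follows, as described above: phones broadcast short-lived rotating identifiers derived from a key held on the device, and exposure matching happens on the phone itself rather than on a central server.  The function names and the hash-based key derivation below are simplified assumptions of ours for illustration only, not the actual GAEN API, which derives identifiers with HKDF/AES and rotates them roughly every 10 to 20 minutes.

```python
import hashlib
import os


def daily_key() -> bytes:
    """A random per-day key that never leaves the user's device (illustrative)."""
    return os.urandom(16)


def rolling_ids(key: bytes, intervals: int = 144) -> list[bytes]:
    """Derive the short-lived pseudonyms broadcast over Bluetooth.

    Nearby observers only ever see these rotating identifiers,
    never a stable identity.
    """
    return [hashlib.sha256(key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]


# Simulation: Bob's phone overhears a few of Alice's broadcasts.
alice_key = daily_key()
bob_heard = set(rolling_ids(alice_key)[:3])

# If Alice tests positive, only her daily key is published by the health
# authority.  Bob's phone downloads published keys and matches locally,
# so no server ever learns Bob's contact history.
published_keys = [alice_key]
exposed = any(rid in bob_heard
              for key in published_keys
              for rid in rolling_ids(key))
print("Possible exposure:", exposed)  # prints: Possible exposure: True
```

This on-device matching is what made the design comparatively privacy-preserving; the concerns raised in the session were less about the protocol itself than about the gatekeeping power it gave two companies.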

And then the fifth one, and I'm sorry, I'm probably going on too long, but the fifth is the normalization of surveillance beyond the pandemic.  From our research, we're concerned that the pandemic has provided an entry point for invasive government surveillance to become normalized, even after the threat of the virus has receded.

So, we observed the continuation and normalization of state surveillance in some jurisdictions.  As previously stated, our report describes how states repurposed counter-terrorism laws and technologies and applied them to civilians in the name of fighting COVID-19, but we must also be very wary of the opposite phenomenon: the normalization and repurposing of pandemic measures and tools.  We have reason to fear the possibility of mission creep, because we've already seen governments announce their intention to use data collected during the pandemic for secondary purposes, such as the development of national health programs or platforms in Colombia, in India, which hopefully Amber will speak to, and in South Africa.

On the surface, this shift from COVID-19 to general public health may appear unproblematic, but the use of data originally collected in exceptional circumstances for a specific purpose violates the principle of purpose limitation and contributes to a surveillance state that accumulates large amounts of data about citizens in ways that could be disproportionate and intrusive.  After that overview, and I hope I wasn't too long, I'll pass it over to Karolina for the recommendations that follow.

>> KAROLINA IWANSKA:  Based on that, we came up with recommendations for three groups of actors that we see as relevant in this space, so that we have more human rights-centered technology in the next emergencies to come.  The three actors are, obviously, state actors, the private sector, and Civil Society, and these bubbles get increasingly smaller, because we see the most responsibility, obviously, lying on the shoulders of states.

And so let me start with states, with just a couple of key points; we have very detailed recommendations in the report, which we also hope to discuss with you.  Essentially, the biggest problem, as Olga mentioned, was a general lack of reflection on and assessment of the human rights impact of these technologies prior to their deployment.  That can obviously be explained by the emergency that was happening and the little time available, but technologies were indeed deployed in a hasty and opaque way, and we see an important need to take stock three years on: to review the surveillance measures and technologies that were deployed during the pandemic and assess them in the context of human rights compliance and their impact on human rights, as well as their efficacy, in order to check which technologies worked and which should be ceased because they are not compliant with human rights or are no longer necessary.

Similarly, any data that was collected but is no longer necessary should be deleted, and any apps that were implemented but are no longer used or necessary should be discontinued.

Another thing we saw, also from our researchers on the ground, is that it was very difficult to gather information about the status of these tools, about how they operate, and even about the legal basis on which they rely.  We see a need for improvement in terms of transparency and providing public information, already now, about the tools deployed during the pandemic: what their status is, whether they are still in use, and the results of those impact assessments.  I see I'm ‑‑ I'm sorry, I'm constantly disconnecting from Zoom because the Internet is a bit shaky, so we have to reshare.  I'm sorry that you keep losing the presentation.  Am I connected again?  Yeah.  I'm sorry.  Yeah.  Okay.  I'm sorry for the little technical break.  We need to sort things out.

All right, this should work now; I'll continue in the meantime.  As I said, we see the need to share far more information than was shared before on the status of these technologies and their human rights assessment, and the same should actually be done with legal frameworks, with the laws and policies that were applied during the pandemic and the states of emergency that were introduced and maybe never revoked.  So we see the need to revise those as well.

And now in terms of future emergencies: those were the recommendations for dealing with the COVID-19 pandemic itself, to reassess the response that states made.  For the future, we will inevitably have further health or other emergencies, so in order for states to be prepared, they should revise or develop legal frameworks in line with international human rights standards, based on the lessons learned and the problems identified.  This legislation should have key elements such as sunset clauses for the technologies used, so that they are not repurposed for other reasons, and data protection rules, if those are not in place already.

And also, meaningful public participation: in many cases we saw that Civil Society, or people in general, were not really involved in the creation of these tools or in the discussions around them.  We did identify some good practices, and we highlight them in the report, but in general, governments did not ensure meaningful public participation, which we see as crucial, actually, to assessing the potential impacts of these technologies.

And then we would also recommend that governments, and later we can talk about how to achieve this, not use surveillance measures as a pretext to stifle dissent, or repurpose them for other reasons after the emergency.  Yes.  Previous one.  Yes.

And also, as Olga mentioned, we saw the increased role of private companies, and the use of data that was collected for the purposes of the pandemic for commercial purposes, so we see the need to ensure that doesn't happen in the future.  Finally, a more detailed recommendation is to introduce a prohibition on indiscriminate biometric surveillance, because we also saw these very intrusive technologies being used during the pandemic to stifle dissent under the banner of fighting COVID.

In terms of companies: we called this, in the report, the first pandemic, or emergency, of the smartphone era, and obviously companies had a huge role in responding to it.  So we also see the need for improvement in terms of transparency and in terms of holding companies to account for the sorts of responses they facilitate.  We also see the need for internal assessment of human rights compliance, in line with the UN Guiding Principles on Business and Human Rights as well as with the laws existing in specific jurisdictions.

And, again, companies should cease any practices or technologies that are not compliant with those standards.  Internally, companies should also have human rights policies in place for the future, for when they introduce surveillance technologies or help governments respond to emergencies, including, very importantly, procedures for assessing government requests for access to data.  We saw a lot of data-sharing agreements between governments and private companies, and these policies should, to the extent possible, make sure that such exchanges happen in line with human rights standards.

Of course, data protection, in terms of how people's personal data is processed by these technologies, is an important matter as well, as is giving the public more information about these data-sharing agreements and ensuring access to remedies when company actions cause or contribute to adverse impacts.

Finally, last but not least, we see a very important role for Civil Society.  We highlight in our report a couple of successful advocacy or litigation actions that Civil Society took to challenge surveillance measures that were not compliant with human rights, such as the case against drones in France, and we will hear from our colleague from Colombia about another interesting case.  We see a role for Civil Society to act as a watchdog in these circumstances, and of course it's not easy, because it requires time and effort; anybody here from Civil Society knows that we had to drop everything and move to watching, essentially, what the state does.

But it is really important to monitor and investigate these measures and pursue every avenue, be it public pressure or legal avenues, to challenge measures which violate human rights, as well as to urge governments not to repurpose surveillance measures after the pandemic is finished.  Now is also a crucial moment for Civil Society to step in and demand transparency from state agencies on the tools used, their purposes, and public/private partnerships and agreements, as well as on the laws that should be developed.  Civil Society should demand a seat at the table, because its view is really important in this context, and of course it should also apply pressure on companies.  These are very high-level recommendations, and we hope we get to discuss them a bit more.  Thank you.

>> ILIA SIATITSA:  Thank you very much.  We heard from Olga about the very concerning trends that have been identified, and these are only a few of those observed in the course of the pandemic, and from Karolina about the recommendations we have for states, companies, and Civil Society.  There is a lot to be done and a lot that continues to be done, and we all know the world doesn't stop moving, but we need to remain vigilant to ensure that the abuses that happened in the past three years do not occur again with the next global emergency.

Our aim for the rest of the session today is to have a conversation with you.  Rather than taking questions, we have questions for you: we would like to pick your brains about these findings and recommendations, to ensure that our report is robust and as useful as possible.  So, we welcome feedback, questions, insights, anything you can bring to the table.

So, we thought I would open the conversation to you.  I don't know if you already have some thoughts on what has been presented, or want to add to it?  I know it's early in the morning.  Is there anyone online?

>> KAROLINA IWANSKA:  There is no Internet connection, so I don't know.

>> ILIA SIATITSA:  Let me tease the room a bit.  One recommendation focuses on the need for Civil Society organizations to monitor, investigate, and intervene against state and corporate responses; the research details litigation in France against drone use, and in Israel and Colombia against unnecessary (?) collection.  Are there any other similar examples one could think of, successful advocacy or litigation efforts, good practices from Civil Society that we could use now and in the future to challenge some of these practices?

>> AUDIENCE MEMBER:  Hi.  Thank you.  My name is Daniel.  I am a researcher from a Civil Society organization in Colombia.  I'll develop Ilia's question a bit in the Colombian case.  As our panelists were saying, in Colombia we had a very interesting strategic litigation that Civil Society undertook with the help of some journalists, because in Colombia, as in many other countries, the government deployed a contact-tracing app, called CoronApp, which received a lot of criticism because of its lack of transparency and, primarily, accountability.  They said the app was voluntary, but in practice it became almost quasi-mandatory.  As with many other contact-tracing apps, it asked you to report information on your location, on the symptoms you had, on who you live with, who your family and friends are, even where you work.

When this app became quasi-mandatory, Civil Society was very concerned, especially because the airport authorities required the app to be downloaded and its survey completed in order to enter the airport.  This litigation started because there is a very famous journalist in Colombia who has been persecuted by the state, and she wanted to travel from Bogotá to another city in Colombia to talk with one of her sources.  She refused to download the app and complete the survey that the app requested, was denied entrance to the airport, and could not fly and work with her source.

So, she started this litigation with help from Civil Society organizations and other citizens, including one Congresswoman, asking the judges to protect the fundamental rights to privacy, data protection, and freedom of movement that were clearly violated by the airport.  It was a very interesting case, because at first the judges didn't understand it; they just dismissed the case.  Finally, the Constitutional Court of Colombia reviewed it.  However, this review came almost a year and a half later, when the mandatory use of the app was no longer in force.  So, the Court's first ruling was that there was no longer a danger, and therefore it had to dismiss the case.  But although that was the general ruling, it contained very interesting considerations on data protection: even though the mandate was no longer in force by the time the ruling came, the Court held that the government was obliged to respect privacy and data protection in states of emergency, and it deemed what the government and the airport had done unconstitutional and unlawful, violating the fundamental right to privacy not only of the journalist and the Congresswoman who presented the case, but of everyone who had been obligated to download the app to enter the airport.

As a result, the Constitutional Court ordered the Ministry of Health and the National Digital Agency, which were the entities in charge, to erase the plaintiffs' data, and it also ordered them to create a special mechanism so that everyone who wants his or her data erased can access it.  We were expecting the Constitutional Court to order the erasure of all the data, but the creation of this mechanism is still very interesting, and we believe it is a success story; even with its downsides, it is a successful litigation that expands the right to privacy in Colombia.  Thank you.

>> ILIA SIATITSA:  Thank you.  Thank you.  It's not only a success story; you also reflected on the lessons learned and on what could be done beyond the ruling, moving forward.  How is the situation with the app now in Colombia?  Is it still operational?  Do you have any information from the government with regard to the use of the data that was collected through it?

>> DANIEL OSPINA:  The app has now disappeared as CoronApp, but it was repurposed into a general app from the Ministry of Health, called (?), which in English would be something like Main Health Digital.

The app was not only repurposed; there was also a lack of information on how it became repurposed, and I think that in many other countries this might be the same situation.

>> ILIA SIATITSA:  It was a bit of a trick question, I admit, hinting at exactly the other trend that Olga mentioned: the increasing collaboration with, partnership with, and dependency on private companies to deliver some of the measures that were introduced, and the frequent lack of transparency with regard to the conditions of these collaborations, from what access the companies had to the data, to how the tools would be used or repurposed, to the lack of any information around their use.  In that regard, Colombia is not the only case where we have seen this; we have also seen it in many other countries across the globe.  With that, maybe I could invite Sherylle to give her thoughts on what has happened in that regard in South Africa.  Again, please also introduce yourself.

>> SHERYLLE DASS:  Thank you, Ilia.  I'm Sherylle Dass, an attorney from the Legal Resources Centre in South Africa.  We participated in researching the South African case study.  The primary technologies that the South African government used to control the spread of the COVID-19 virus were two apps: the COVID Connect app and the COVID Alert app.  At the very beginning, COVID Connect began as a WhatsApp platform, and it later expanded to become more of a service providing health care information, screening, and contact tracing.

The COVID Alert app was launched alongside the COVID Connect app and, as Olga mentioned, it was built on the GAEN API and works via Bluetooth.

A lot of private companies were involved in instances of data collection and processing.  Discovery, a major medical aid provider in South Africa, developed the software, supposedly on behalf of the National Department of Health, and it continues to provide technical services to the department.

Besides Discovery, the largest telecommunications company was also involved in providing technical expertise to the South African government.  The most concerning aspect of the COVID-19 apps is the reliance on WhatsApp as a communication platform.  Independent technical reviews explained that the use of the WhatsApp API to notify COVID Alert users of test results raised privacy concerns, regardless of how reliable or convenient that system might be.

This approach would allow third parties with commercial interests to identify which users had been diagnosed as COVID-19 positive, and it wasn't a requirement of the GAEN framework, so the technical advisors believed this was simply a preference or choice made by the developers of the app.  Although the content of the messages is encrypted, there was serious concern about how the information would be processed when a person engages with the app about their COVID-19 status.

The technical reviews also raised two concerns which I think are important in the South African case study.  The first is the app source code, which was not made publicly available.  This really goes to transparency: private companies are not providing information to the public, and that frustrates efforts to understand the scope of the apps' processing activities.

This technical impediment also complicates independent technical reviews of the app and touches on the transparency criteria.  One review detected a suspicious URL which potentially sends sensitive information, dates of birth, names and surnames, to backend servers, contrary to the privacy claims reflected in the app's Google Play description.

So those are the two major concerns raised in South Africa from our research.  Thank you.

>> ILIA SIATITSA:  Thank you.  Thank you very much.  In that regard, I would like to invite anybody else who has inputs or questions in relation to what was shared just now or earlier.  Peter, please introduce yourself as well.

>> AUDIENCE MEMBER:  Thanks.  Peter Micek from Access Now, and I think this is a super important topic.  Access Now is a global digital rights organization, and we have found that what starts during crises can often continue for a very long time.  These crises don't usually end with a clear bang; they continue long beyond when the headlines and the media are reporting on them.  With things like Internet shutdowns lasting more than two years in Ethiopia, and a long time in Myanmar and other countries, what seemed like emergency measures can really become permanent, so that's why it's so important to put human rights at the center.

For our part, we just launched here at IGF on Tuesday a declaration of principles for social media companies on how to better govern content in a human rights-respecting way before, during, and after crisis situations.  I think the role of those big platforms is one piece of that technology puzzle, and it's important to draw it out, because there is content that can incite violence, and campaigns of disinformation that exacerbate crises, and those social media companies do have responsibilities.

We're also looking at the role played by the UN and other big governments and multilateral organizations that procure a lot of these tools from third-party vendors and private-sector companies, relying on the private sector to digitally transform their services.  There have already been a number of examples where big organizations like UNHCR, the UN High Commissioner for Refugees, contract with small companies that sell biometric surveillance tools like iris scanning, and these iris scanners are actually required if you, as a refugee, want to procure food and basic essential goods and services.  So refugees' sensitive data is being processed without any consent, without any meaningful legal basis, and this shows, I think, the important role for human rights screening processes in the procurement of big clients like that, who I hope can use their buying power to raise standards and reflect what the human rights system tells us about the ways these technologies can impact human rights.

>> ILIA SIATITSA:  Thank you.  Thank you very much.  These are absolutely crucial points, and it is very important to hear about this declaration calling specifically on social media companies as key actors in how we address future crises.  We have heard a lot about companies, but let's bring it back to the states, the primary bearers of human rights obligations.  In that regard, maybe it would be interesting to hear about lessons learned from India with regard to the measures that were taken there.  Amber?

>> AMBER SINHA:  Thanks, Ilia.  In India, there have been a variety of measures in response to the pandemic.  Apart from measures such as lockdowns and curfews, which were largely short-lived in most parts of the country, a lot of the COVID response came down to data and tech measures.  To begin with, the most downloaded contact-tracing app in the world is India's contact-tracing app, Aarogya Setu, and from the very beginning the whole thing was mired in a lot of mystery.  The National Informatics Centre (NIC) in India supposedly developed the app, but over the first few weeks there was a lack of clarity about who actually built it.  The NIC is registered as the developer of the app if you go on the Play Store, but there were also news reports stating that there was some form of volunteerism, a set of industry individuals coming together and building the app.  In fact, when RTI applications were sent to the government to inquire about the exact process followed to commission the building of the app, there was no clear response; at one point, the response said that the government did not know who had built the app, which is quite bizarre, to say the least.

There was immediate pushback from Civil Society on some features of the app, starting with the kinds of data points it collected: like some other contact-tracing apps, it uses both location and Bluetooth, but aside from that it collects your phone number, age, name, gender, and profession, and even some of your health information, such as whether you're a smoker, and your travel history, most of which has very little or absolutely nothing to do with how contact tracing is supposed to work.  There were fears from the very beginning, and we've spoken a few times through the course of this conversation about how such apps become normalized; some of us who had been studying health tech governance in India were fearful from the start that this would eventually become part of the larger governance design around health care and welfare in India.  About two years later, exactly that is what happened.  To begin with, there was a privacy policy, but it was completely inadequate.  After pushback from Civil Society, the government set up a committee which created a kind of protocol.  Even after the protocol came out, a lot of inadequacies in terms of privacy and discrimination concerns were raised with it, which never really got addressed, and then, about two years later, the protocol was suddenly discontinued and it was announced that the app had now turned into a kind of national health app.
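To make the data-minimization point concrete, here is a small illustrative contrast, our own construction rather than the actual schema of any app mentioned here, between what Bluetooth proximity tracing strictly needs and the kind of registration profile described above.

```python
from dataclasses import dataclass


@dataclass
class MinimalContactRecord:
    """All that decentralized proximity tracing strictly requires (illustrative)."""
    rolling_id: bytes   # rotating pseudonym overheard nearby
    timestamp: int      # when it was overheard
    rssi: int           # Bluetooth signal strength, to estimate distance


@dataclass
class OverCollectedProfile:
    """Registration fields of the kind described above; none are needed to
    answer the only question contact tracing asks: was this device near an
    infected device?"""
    name: str
    phone_number: str
    age: int
    gender: str
    profession: str
    smoker: bool
    travel_history: str
    gps_trail: list     # location tracking on top of Bluetooth proximity
```

Under the necessity and proportionality standards raised later in the session, each field in the second class would need its own justification; the first class shows how little the stated purpose actually demands.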

So there remains, again, a lack of clarity about what happened to the data that was collected.  Even before this happened, people had noticed that for several people who registered with the app, health IDs in their names were created without their knowledge or consent.  Even at this point we don't know; there have been RTI applications and news reports, and we've tried to glean what we can by closely studying the policy and what it says about data retention, and it all largely contradicts itself.  We don't really know what happened to the data collected over the course of two years before this turned from a COVID-19 contact-tracing app into a national health app.

Even apart from that, there have been a lot of issues around some of the other measures, for instance the platform that was set up for the vaccination drive.  It was fairly peculiar that a country like India, which has had a fairly robust physical health care vaccination infrastructure for some decades, decided to move to what was essentially a digital-first and, in some cases, digital-only vaccination drive.  It was only after the government was taken to court over this that some of the policies around the vaccination drive changed and allowed other, physical means of registering for vaccination.  So, there have been concerns.

What we see emerging now is a kind of broader health care governance technology infrastructure, and there remains a lot of ambiguity around what happens to the data that was collected, in what form it will be used, and who gets access to it.  Again, most of this has been developed on the public/private partnership model, and there has also been state endorsement of certain private-sector measures around health care, so there remain a lot of questions about who has access to that data, to what extent private actors are also getting access to it, and in what ways they are allowed to use it.

So, yeah, going back to repeat the earlier point, which I think perhaps needs to be repeated over and over again: when you respond with emergency measures, it remains even more critical to continue to apply the standards of necessity, proportionality, and data minimization, because those effectively have built-in protections whereby you can allow a certain degree of restriction of the right to privacy in order to respond, as long as it is necessary and it can be demonstrated to be proportionate.

The urge to do away with those protections in response to any kind of emergency eventually comes back to bite us.  And beyond the data and tech measures we have actually looked at, we've seen untested technologies such as thermal scanners being deployed across the world, and now they're being used in schools and malls in various countries.  So it kind of becomes a gateway drug into the use of new technologies, and we don't really see any clear intent to scale them back.  I'll pause there.

>> ILIA SIATITSA:  Thank you very much.  A lot of very important issues were raised there, including the anecdotal answer by the Indian government that it didn't know who had developed the app; if it had been forced to issue a transparency report, it probably would have had to confront that question earlier on.  But then there is a key question around regulation and what is going to happen now.  You mentioned how a whole health governance ecosystem has developed, and we've also seen how such ecosystems then transfer beyond health data, to border controls and to different spaces.

Would you like to come in?  Yes, please.  Please introduce yourself as well.

>> AUDIENCE MEMBER:  Hi, everybody.  My name is Juan, from Colombia as well, from another Civil Society organization, called Karisma.  We have worked together at times, but I'm only meeting some of you for the first time.  Thinking about this last part, the common trend that I am starting to see here, in line with the work Access Now has been developing and with our own findings regarding another pandemic app, specifically for the city of Medellín, is that the structural differences that are already there in communities, the inequalities that are present, were exacerbated, as happened with so many other things during the pandemic.

I was thinking about this specifically because street vendors were affected by that app in a very particular way.  What happened there is that you needed a permit from the local government to move around, and that had to be obtained through the app.  The permit had to be signed by your employer, which, for street vendors, of course could not happen.  So, it impacted them in a very specific and different way.  I think it's the same with migrants, perhaps not in the pandemic data-collection scene, but in general, as we were talking about yesterday: the fact that they're not able to say no to data-collection measures highlights those structural differences across the whole use of technology.  And I think that's quite an interesting leverage point to make people conscious of why this is serious as a broader topic, because these things tend to be very hard for people to grasp in general, and this shows it in a very concrete way, I think.  Yes.  Thank you.

>> ILIA SIATITSA:  No, thank you very much.  That's exactly why we're here, and it's important to hear.  Martin?

>> MARTIN MAVENJINA:  Thanks, Ilia.  I work with the human rights program, and I just want to jump into the conversation with Kenya and point out our key findings.  Kenya, like other states around the world, employed contact tracing to deal with the pandemic.  From the research, there were two key findings.  One was an absence of clear regulatory frameworks to guide how companies, or even the state, could actually use the contact-tracing apps.  More importantly, there was an absence of an oversight mechanism for the number of apps developed by the government and even by private actors.  When it comes to oversight of how these apps are used, just as some of the speakers before me have said, if you don't have oversight of the specific use of a specific application, then it becomes difficult, first and foremost, to hold the makers of these applications to account, and even to hold the state to account when it uses the data it collects for the wrong reasons.  That's what I wanted to point out.  Thank you.

>> ILIA SIATITSA:  Thank you very much, Martin.  This confirms and adds to what we were just discussing.  Since I have you with us, and as we move towards the close of this session, I wanted to hear your thoughts.  We've identified the harms, we've identified the gaps and what states and governments should be doing, but what could be some of the mechanisms Civil Society could use to actually put pressure on states to take the necessary measures to ensure we are protected now and in the future?

>> MARTIN MAVENJINA:  Thanks, Ilia.  I think the first measure that Civil Society organizations can employ to ensure states actually implement some of the recommendations that organizations have come up with in their reports is to sustain public conversations.  When you have informed citizens, an informed public, that alone puts pressure on the state.  That's a principle that can apply in all states around the world, because if you have an informed citizenry and they know that specific applications developed by their governments are actually harmful, then they will pile pressure on states.  So the first thing I encourage is not just to publish the reports with the recommendations we have come up with but, as Access Now does, to talk about the reports, to ensure that the information is out there for citizens in very simple language, because at times Civil Society organizations write reports in a way that makes it difficult for the ordinary person to understand.  If you have an informed citizenry, it will pile pressure on the government.

The second thing that Civil Society organizations can employ is strategic public interest litigation.  You have all heard from the different speakers here about the serious human rights breaches and the serious privacy concerns, the violations that arose from some of these apps developed by states.  What I encourage Civil Society organizations to do is to institute public interest litigation cases that will hopefully lead to orders compelling the state to come up with regulatory frameworks, or orders revoking some of the repressive pieces of legislation that allow states to conduct serious surveillance of citizens or commit data privacy breaches.

The third avenue that Civil Society organizations can employ is peer review mechanisms, and there are quite a number.  Civil Society organizations in Africa can always engage with the African Commission on Human and Peoples' Rights, which is a peer review mechanism for all African Member States, and use that space to raise their concerns, in the hope that the African Commission will come up with resolutions that it transmits to the different states, requesting them to implement some of the recommendations.

The other peer review mechanism, which is actually essential for all countries around the globe, is the universal periodic review system.  The universal periodic review started way back in 2010, so to date we've had three cycles for countries around the world.  The reason the universal periodic review mechanism is very important is that it is a peer-to-peer review mechanism: different states review each other, and at the end of each cycle recommendations are made to the states.  States are then mandated to implement some of these recommendations, which could mean coming up with policy or regulatory frameworks that ensure states adhere to internationally accepted principles.

I will give the example of Kenya.  Kenya has been reviewed under the universal periodic review mechanism, and from 2010 to date, I can say with confidence that since Kenya started the review process, we've seen positive developments in terms of legislation, because way back in 2010 recommendations were made to the state of Kenya, and we were told to come up with new pieces of legislation that would conform to the new constitution.

Now, fast-forward ten years down the road, we have comprehensive pieces of legislation, and I think it's a good space that we can all benefit from.  What we need to do is make sure we make good use of this space.  Thank you.  Thank you very much.

>> ILIA SIATITSA:  Thank you for that.  It's actually a very nice segue to finish this discussion in a positive way.  Well, positive because there are a lot of avenues we can use to work together and press for change.  And, well, not negative, but there is a lot (Laughing), there is a lot of work still to be done.

So, thank you very much, everyone, for being here today.  As we mentioned at the beginning, we are planning to publish the report probably by mid-December, and if there is any further input or any suggestions or thoughts you would like to share with us, we really, really welcome it; please don't hesitate to reach out.  Stay tuned for the report's publication.  Thank you very much.