The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
>> We all live in a digital world. We all need it to be open and safe. We all want to trust.
>> And to be trusted.
>> We all despise control.
>> And desire freedom.
>> We are all united.
>> Hello, everyone. Sorry for the technical problems a moment ago. I'm here together with our online Moderator, Jenna. Jenna, could you briefly welcome everyone and introduce the workshop? I will start sharing the presentation.
>> JENNA MAN HAU FUNG: All right. I can start any time. Sure. Let me know when the PowerPoint slide is ready on your side.
>> EMILIA ZALEWSKA: Okay. I just need co‑host rights so that I can show my screen. Okay.
I see that you can see the screen now and the presentation. So hello to everyone on site and online, and welcome to the session, the workshop on the Paradox of Virus Contact Tracing Apps.
And let me just quickly introduce you to the agenda of today's workshop. So firstly we will start with the ten‑minute introduction of the thematic background and also of our speakers.
Then the speakers, whom I will introduce soon, will share their presentations with us. After that we will have an opportunity to listen to a panel discussion between the speakers, and then we will start the interactive part of the panel: we will divide into two breakout groups for discussion.
And then after that discussion, there will be time for some remarks and summary. And that's basically the whole session. And with that I would like to pass the floor to Jenna who will introduce us to the background of this session. Jenna, the floor is yours.
>> JENNA MAN HAU FUNG: Thank you, Emilia Zalewska. I hope that all of you can hear me. I am today's online Moderator, Jenna Man Hau Fung from NetMission.Asia. I'm going to give a little bit of background. I came up with the idea for this topic at the beginning of the pandemic, because a year ago we depended heavily on contact tracing to control the spread of the virus, before we had vaccinations.
So that's how we came up with this interesting topic. I guess at the start of the century, the term track and trace was used purely in respect of (inaudible), but now it applies to humans. To tackle the situation and live through the pandemic we have become more dependent on the use of the Internet and technology. Whether a centralized or a more decentralized model is adopted in a given place, such technological methods are supposed to help address the pandemic more effectively and help us get back to living our normal lives sooner.
While naturally public health comes first, there is a growing concern about data privacy and the surveillance of citizens brought by tracing apps, which can never be neglected. While data is undoubtedly important in solving the problems of the pandemic, we must always be cautious about how much of our privacy we expose.
So is it truly inevitable that we must either sacrifice our privacy for public health or put public health at risk to preserve Human Rights? Or, without such technology, would our economic activity and movement continue to be restricted, which is itself another form of Human Rights violation? What's the balance? Or will it be a never‑ending paradox of control and freedom?
So today we have a few speakers to share different views and case studies on the use of virus contact tracing apps in different countries, including in Latin America and Asia‑Pacific, and we also have an audience from different parts of the world. I hope we can exchange ideas on practice, and that by the end of this workshop we will know more about different cases around the world, so we can develop new conceptual frameworks or feed case studies into policy making, particularly in digital policy development communities. That way we can encourage a sustainable mechanism for governing the data privacy, protection and surveillance issues brought by the use of technologies like these apps, because perhaps we are now even more dependent on technology.
That is the background for this workshop, and I won't take too much time on it. I would like to drop a link to our workshop details in the chat; those in the room and on Zoom can access it and refer to it, and that will be helpful for our discussion later. Back to you, Emilia.
>> EMILIA ZALEWSKA: Thank you very much. Just let me display the next slide. So as I mentioned before now it is time to reveal the speakers of our session. So today, we have with us Prateek Waghre who is a Research Analyst with the Technology and Policy Programme at The Takshashila Institution. Prateek's research interests include the impact of technology in democratic networked societies, internet shutdowns, information disorder and major issues affecting the internet policy space in India. He has published editorials. Prateek also writes a newsletter on Information Disorder from an Indian perspective. So Prateek is with us online today.
Also with us we have another speaker, Janaina Costa. She is a senior researcher at the Institute for Technology and Society. She has a degree in social and economic development and a postgraduate degree in digital law. Her fields of interest are public policies, technology and Human Rights.
And the third speaker at our session is Elliott Mann. He is a recent graduate of Swinburne Law School in Melbourne, Australia, where he majored in law and cybersecurity. He has been involved with both the Government and the private sector in developing cybersecurity and legal projects, and he is leveraging his experience in law and consulting firms to build greater awareness of Internet issues. Elliott has been involved in projects including the ISOC Youth Ambassador programme and regional initiatives.
So these are our speakers for today's event. And with that, I would like to give the floor to our first speaker, Prateek, I'm very sorry if I'm reading it in the wrong way, Prateek Waghre.
>> PRATEEK WAGHRE: That's fine. I am going to quickly share some slides. I think you have to stop sharing yours. Let me see if I can ‑‑ good. Can you see my slides?
>> EMILIA ZALEWSKA: Yeah, we can.
>> PRATEEK WAGHRE: Great. To give us a good amount of background, I want to start with this visualization from a paper, linked below, which was published in Internet Policy Review earlier this year. It maps the landscape of COVID‑19 response apps across countries, and it gives you a sense of the number of different types of approaches that we saw. I have about seven to eight minutes.
I want to cover three broad points. I'm going to keep it wider and we can deep dive at the discussion. I want to focus on technology theater, a framework called the Viability Rating Framework that my colleagues and I tried to develop early on in May last year. And there is some general considerations in the scenario.
So let me start. What is technology theater? This idea comes from an essay by Sean McDonald some time last year, in which he proposed the notion of technology theater: when your public policy response is focused more on the details of the technology than on addressing the actual problem, the core issue.
This is not unique to COVID. It has happened before and it will continue happening. But it is important to look at what the implications of something like this are. When it happens, the outcome is two broad things. First, technology plays a much larger role in the way that citizens and people interact with the Government.
And second, it also plays an outsized role in how Governments and states respond to problems and how they solve them. As a result, we have a situation where there is essentially a shift in power: issues that used to be addressed through public consultation, et cetera, now get moved into procurement processes and rules, and the decisions are made outside of the public domain in many cases.
Nuanced conversations about how the technology works, how to go about implementing it, et cetera, all become sensationalized. And there are three broad downstream effects that we can think of. The first is the opportunity cost: you are diverting resources to focusing on the technology, sometimes at the expense of the core problem. Second, the lens of analysis that we apply to these policies changes, because we are not necessarily looking at the outcome of the policy; we are looking at the technology or the technical process. One way to think about this in public policy is that efficiency gets a much higher weightage than effectiveness, when in reality you need to find a balance. This is not to say that experts have no role to play, but the point that Sean is making is that they come to have an outsized role, and certain experts have interests at stake.
I now want to switch quickly to the framework called the Viability Rating Framework that we tried to define last year. The idea was to say: okay, we have a number of interventions being tried; is there a way to assess their impact? Because our end goal is to ensure that they complement the pandemic response. So we looked at three criteria. One was population penetration, in terms of what percentage of the population a certain technology could benefit or affect.
Second was privacy; I don't think I need to explain that, it's self‑explanatory. Third was effectiveness: what is the efficacy. Population penetration is some sort of proxy for equity. I want to call out that there are a lot of other considerations that go into whether something is absolutely equitable, but sitting in May 2020 I think this was a reasonable proxy. And we tried to rate each criterion as high, medium or low.
On privacy we looked at certain things: whether there was a defined purpose limitation, what types of permissions the various apps asked for, what sort of data retention policies they had, and what sort of oversight mechanisms were built in. We designed this to be forward looking: if you want to extend the framework further, you can also look at this through privacy audits down the road and essentially tweak the criteria.
And the last was effectiveness, which is, like I said, the efficacy of the technology. At the time we were doing this things were different; right now everyone thinks of contact tracing as Bluetooth Low Energy and Apple and Google's exposure notification system.
In May of last year there were lots of experiments being conducted; there were 60 different applications in India alone, trying different kinds of things, some looking at GPS. We tried to assess all of that.
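The three criteria above can be pictured as a simple high/medium/low scoring rule. This is an illustrative sketch only, not the actual Takshashila tooling; the ratings below are hypothetical examples, and the "weakest criterion wins" combination rule is an assumption for the sake of the demo.

```python
# Illustrative sketch of a high/medium/low viability rating, loosely
# following the three criteria described above. Ratings shown here are
# invented examples, not real assessments of any app.

SCORES = {"high": 3, "medium": 2, "low": 1}

def viability(ratings):
    """Combine penetration, privacy and effectiveness ratings.

    `ratings` maps each criterion to "high", "medium" or "low"; the
    overall rating here is simply the weakest criterion, on the view
    that an intervention is only as viable as its weakest dimension.
    """
    return min(ratings.values(), key=lambda r: SCORES[r])

example = {
    "population_penetration": "medium",  # share of people the tech can reach
    "privacy": "low",                    # purpose limitation, retention, oversight
    "effectiveness": "medium",           # does the underlying tech work?
}
print(viability(example))  # -> low
```

A real assessment would weight the criteria and feed in audit results, but even this toy version makes the point that a strong technology with weak privacy safeguards still rates poorly overall.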
Now, just to wrap up, because I know I'm running out of time, certain considerations became obvious to us as we were doing this. In a situation like this, how do we balance the equity that we need against the speed that you have to have in a pandemic response? This framework can help guide whether something needs to be voluntary or mandated: if it is not addressing a large portion of the population, you want to keep it as voluntary as possible. Should the response be at the union Government level, as we say in India rather than federal, versus the different state levels, with states doing it on their own? And what role do algorithms play in the determination of risk and immunity?
And, you know, platform power was not really apparent back in May when we had to devise this framework, but we have seen over the course of the last year how many countries' responses were shaped by the decisions that Apple and Google took with their exposure notification framework.
And just very quickly to close, I want to highlight some points from the technology theater essay. When we are looking at these scenarios, the tech needs to be effective. Public health monitoring is high stakes, so we need to justify the measures that we take. Corner cases and edge cases need to be identified and built in proactively, rather than dealt with down the road. And all of these interventions should be judged as part of the system, not in isolation: you can have an app that scales to millions of users, but if you don't have the public health system to go along with it, it may not be as effective. With that, I'm done on time. I will stop sharing.
>> EMILIA ZALEWSKA: Thank you very much. So now I would like to give the floor to our second speaker, Elliott Mann. The floor is yours.
>> ELLIOTT MANN: Right. Thank you. I don't have any slides. I'm going to focus on the situation here in Australia, because it is quite interesting how the contact tracing apps have developed. There are two phases to how they developed. First of all, nationally, the Government commissioned a Government digital agency to develop a proximity tracing app based off the app in Singapore. So COVIDSafe was the first app developed in Australia, and it came out in the middle of last year.
It was a Bluetooth tracing app that you kept on the entire time. There were issues initially about how well it worked, but it was a Bluetooth tracing app, and they passed amendments to our privacy laws here in Australia to make sure that the information was stored safely. You could not be forced to use the app: it was illegal for private entities or Government agencies to require the contact tracing app as a condition of entry to places. They made the source code public. That was fantastic, except that by the middle of last year COVID was not a problem in Australia anymore. Case numbers never got high enough for the contact tracing in the app to actually be needed; all throughout last year, every single case in Australia was able to be manually contact traced by an individual.
Fast forward to this year and, of course, with the rise of Delta and everything, we have had many more cases in Australia, such that not all the cases can be contact traced manually. You would think this is where the app would come into play, since everyone has gotten it by now.
The second phase of contact tracing apps in Australia is the QR tracing apps, and this is interesting. Australia being a federal system of states and territories, the proximity tracing app is a federal one and has privacy protection and everything. But each state and territory, around the end of last year, also rolled out contact tracing apps based on scanning QR codes. You would be familiar with the idea: when you enter a venue, you scan a QR code and it marks that you were there. And if it turns out that you were at the venue at the same time as a COVID case, you will be pinged, as it were, within the next few days, and you have to get tested.
It has happened to me at least twice, and it is understandably frustrating having to go get tested. But that's how the QR code tracing app works. And the issue is, of course, that while the federal app came with all these legislative privacy protections, there is nothing like that for the state‑based ones. In fact, there have been cases in Australia of police accessing the QR code check‑in data for police investigations, which would be impossible under the other app, but is definitely available under these ones.
So we are in this situation where the app that has the best protections and uses proximity tracking is one nobody uses and which has not been proven to work, whereas the apps that do work have no privacy protections at all and are being accessed by other entities.
And we are seeing some changes in Victoria, where I am. They recently legislated a new pandemic law, and as part of that they have added privacy protections for the QR code data. But in the rest of the country it is still quite unclear.
So that's the main concern that I have at the moment: people look back and say we got all those great protections, but it was for an app that didn't work and nobody uses. And the apps that do work now, that people are using all the time and that by law you have to use, have no privacy protections at all. It is an interesting situation that we find ourselves in.
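The venue check‑in matching Elliott describes can be sketched as a minimal concept demo. This is purely illustrative and is not any state's actual system; the names, venues and the two‑hour overlap window are invented assumptions.

```python
# Minimal sketch of QR check-in exposure matching as described above:
# each scan records (person, venue, time); anyone whose check-in falls
# within a time window of a confirmed case's visit to the same venue
# is flagged to get tested. Concept demo only, with invented data.

from datetime import datetime, timedelta

def find_exposed(checkins, case_checkins, window=timedelta(hours=2)):
    """Return people who were at the same venue within `window` of a case."""
    exposed = set()
    for person, venue, t in checkins:
        for _, case_venue, case_t in case_checkins:
            if venue == case_venue and abs(t - case_t) <= window:
                exposed.add(person)
    return exposed

checkins = [
    ("alice", "cafe", datetime(2021, 8, 1, 12, 0)),
    ("bob", "gym", datetime(2021, 8, 1, 18, 0)),
]
cases = [("case1", "cafe", datetime(2021, 8, 1, 13, 0))]
print(find_exposed(checkins, cases))  # -> {'alice'}
```

Note that this model requires a central register linking identities to venues and times, which is exactly why access controls on the check‑in database matter so much more here than in a decentralized Bluetooth design.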
>> EMILIA ZALEWSKA: Thank you very much, Elliott. And as our final speaker, we have here today with us Janaina Costa.
>> JANAINA COSTA: Thank you so much. I have some slides to share, just so I don't lose my place while speaking. Jenna, are you going to share them for us?
>> JENNA MAN HAU FUNG: Sure. Give me a second.
>> JANAINA COSTA: Sure. Okay. Great. So thank you, everyone. My name is Janaina Costa, and thank you so much, Jenna, for organizing this panel and this workshop. I hope I bring some more questions than answers to our discussion.
So what I wanted to share with you, and you can show the next slide, Jenna, please, is that here in Brazil the Federal Government essentially gave up on using contact tracing applications, precisely because of the controversies that surrounded them and the lack of coordination between federal and state initiatives. Instead we relied more on heat maps generated from telephone operators' data, which in theory use the location of mobile devices to measure social isolation rates. Just a second, I think everyone still sees the first slide. I think you can go to the second or third now, Jenna, if you can. Thank you.
>> JENNA MAN HAU FUNG: Is it not moving?
>> JANAINA COSTA: Not for me.
>> JENNA MAN HAU FUNG: Okay.
>> ELLIOTT MANN: It is not.
>> PRATEEK WAGHRE: Not for me either.
>> JENNA MAN HAU FUNG: Is it moving?
>> ELLIOTT MANN: Yeah, it is working now.
>> JENNA MAN HAU FUNG: I will show it this way then.
>> JANAINA COSTA: Great. Thank you. So I'm just going to show you examples of what was used. Not in Brazil, because there we went with the heat maps instead, but we applied a rule of law and rights‑based test to evaluate three contact tracing apps in Latin America. We actually applied a test framework to these applications that was developed to understand the governance of digital identity systems. This framework was developed by the Centre for Internet and Society in India, the CIS, and the test examines, through case studies, characteristics that have also been applied to other cases of digital identity in other regions.
These were contact tracing apps in Mexico and Peru, and the CoronApp in Colombia. None of the three was mandatory, and they ended up being used mainly as a means of informing the population; their effectiveness for actual contact tracing was limited precisely because they are not mandatory.
However, there was no provision for the destruction or anonymization of the data, nor for when access to the data would be discontinued. This was a concern throughout the evaluations of all three applications.
I think you can go on, please. Yes. I'm going to share with you the links for the whole evaluation, and you can see the framework of the test. But I would like to show you how it was done in Brazil.
So this is a picture of the heat maps that were created. They take aggregated data, for example how many people are in some area based on the cell phones connected in that area, and create a heat map to indicate places of concentration. This can give a more or less comprehensive view of these places and patterns, and of whether people are living through the quarantine or breaking quarantine.
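The aggregation behind such heat maps can be sketched very simply. This is an illustrative concept demo, not the operators' actual pipeline; the area names and the minimum‑count threshold of five devices are invented assumptions (a suppression threshold is one common way to avoid publishing counts small enough to single people out).

```python
# Minimal sketch of the aggregated heat-map idea described above:
# count connected devices per area and publish only the counts,
# suppressing areas below a minimum count so no small group stands
# out. Area names and the threshold are invented for illustration.

from collections import Counter

def heat_map(device_areas, min_count=5):
    """Aggregate device locations into per-area counts, dropping any
    area with fewer than `min_count` devices."""
    counts = Counter(device_areas)
    return {area: n for area, n in counts.items() if n >= min_count}

devices = ["centro"] * 7 + ["lapa"] * 2  # 7 devices downtown, 2 elsewhere
print(heat_map(devices))  # -> {'centro': 7}
```

The threshold is exactly where the "macro versus granular" worry below bites: if the raw per‑device records are retained upstream, the published aggregate is only as protective as the access controls on that raw data.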
However, there are still some doubts about how this data is treated and whether one can go from the macro view to a granular view of each connected individual. We arrive at this crossroads: how to create an anonymized database that strikes a balance between usefulness and not going around revealing everyone's identity. And how can we know whether the data has really been anonymized? For Brazil, we have to stress that anonymized and anonymous data are different terms.
In Brazil one speaks of anonymized data, not anonymous data: data that was once personal data but went through a procedure so that the links with its owners were completely erased, making it anonymized. Our general data protection law states that such anonymized data will not be considered personal data for the purposes of the law, except where the anonymization process is reversed, or can be reversed with reasonable efforts; and pay attention to "reasonable efforts". For the data to be considered anonymized we need to look at at least two factors. As an objective factor, within the concept of reasonable efforts the law itself mentions the cost and time needed to reverse the anonymization process according to the available technologies.
On the other hand, as a subjective factor, we have to look at who carried out the anonymization process and who tries to break it. This counts when measuring whether the data has really been anonymized, or whether the anonymization can be reversed back to the data owner, making it personal data again.
Often the process merely hides some information here and there, which allows people with enough time on their hands to access and reveal the identity of the data holders. You can pass again, Jenna. We had a notorious case in Brazil that illustrates how data that is said to be anonymized can sometimes be read without any such reasonable efforts and easily traced back to its holders.
In that notorious case it was possible to reveal the identity of people whose data had been "anonymized" in a database made available to many third parties on the Internet, completely public, by the largest Brazilian cell phone operator. They shared this database saying it was anonymized, just aggregate information. But a journalist, using just cross‑references to social media and other websites, could identify many, many of the subjects displayed in this database. This was a scandal, and it proves how much this anonymization process needs a better framework and closer scrutiny.
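The kind of linkage attack behind that case can be sketched in a few lines. This is a concept demo with entirely invented records and names, not the actual dataset or method; it shows how quasi‑identifiers (neighbourhood, age, gender) left in an "anonymized" table can be joined against public profiles to recover identities.

```python
# Illustrative sketch of a linkage (re-identification) attack: an
# "anonymized" dataset that keeps quasi-identifiers can be matched
# against public profiles. All records here are invented.

def reidentify(anon_records, public_profiles):
    """Link anonymized rows to public profiles via shared quasi-identifiers."""
    matches = {}
    for rec_id, quasi in anon_records.items():
        candidates = [name for name, attrs in public_profiles.items()
                      if attrs == quasi]
        if len(candidates) == 1:   # a unique match means re-identification
            matches[rec_id] = candidates[0]
    return matches

anon_records = {
    "row-1": ("centro", 34, "F"),
    "row-2": ("lapa", 29, "M"),
}
public_profiles = {
    "Ana":   ("centro", 34, "F"),
    "Bruno": ("lapa", 29, "M"),
    "Carla": ("lapa", 29, "M"),   # two matches, so row-2 stays ambiguous
}
print(reidentify(anon_records, public_profiles))  # -> {'row-1': 'Ana'}
```

This is why the "reasonable efforts" test matters: here re‑identification costs almost nothing, so the data was never truly anonymized in the legal sense described above.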
So, just to wrap up my presentation, and I will be happy to answer any questions (you can put them in the chat and we will address them there too): in Brazil it is legal to process data for the protection of life or the physical safety of the data holder or a third party, and for the protection of health, as in other countries. The public administration can also process and share that data for the execution of public policies and for combatting the pandemic. What I want to stress, and it is worth mentioning before I finish, is that this permission given by the LGPD to use personal data even without the consent of its holders, in order to protect public health or the health of third parties, is not a blank check. It is not unrestricted.
Data processed for the generation of public policies or to fight the pandemic should only be used for that specific purpose. If it is used for another purpose, such as sending advertising or electronic messages afterwards, that use is completely illegal and could lead to accountability of those involved under the Brazilian data protection law. I'm going to share the studies applying the framework in Mexico, Peru and Colombia in the chat; I think you will find them interesting, too. Thank you so much.
>> EMILIA ZALEWSKA: Thank you. Let me share the presentation again. It is not moving. And now it is moving. Let me introduce the next part of today's workshop, which is the panel discussion, because all three of our speakers made some pretty interesting remarks. So I would just like to ask you, our speakers, whether you would like to comment on each other's presentations.
If any of you would like to start, just please take the floor. Okay. Maybe just to facilitate.
>> PRATEEK WAGHRE: I can go very quickly. I just want to pick up the point that Elliott made about the two sets of applications: the one that had all sorts of privacy protections turned out not to be used much, while the QR code one, which carries few privacy protections, is the one being extensively used now. I think that's extremely interesting, and depending on how that use case spreads, it is something we need to be watchful for, right? Because the way these interventions compound, they are always shifting, and it is important to keep an eye across all of them. That's an important part.
>> JANAINA COSTA: I would make a provocation to my fellow panel speakers. Here it is forbidden to make an app mandatory, and where the app is not mandatory we don't have full adoption; under 60% adoption these contact tracing apps achieve nearly nothing. How is the situation in India, for example? Is the app mandatory, and how is the balance between privacy and public health?
>> PRATEEK WAGHRE: Yeah. So as things stand right now, the contact tracing application is not really in the public conversation. But going back to the middle of last year, it was certainly being discussed. As part of the paper that I shared a link to, we tried to estimate its reach, because this relied on Bluetooth; it relied on everyone having Smartphones, and certain types of Smartphones, because the market is very spread out: you have a lot of low‑end phones, mid‑range phones and a few high‑end phones. Based on the public estimates that existed, it turned out that a lot of people would not really benefit; they would not be able to use a Smartphone application anyway. Given the way mobile connectivity was distributed, maybe there was some potential for it to be used in cities versus other parts of the country, but even within cities it is quite a big leap to say that everyone has a Smartphone and everyone is able to use it. So we were advocating for this not to be mandatory in any shape or form.
Even so, you had cases where it was made mandatory: a fellow organization in India called the Internet Democracy Project maintained a tracker of the different private sector companies or State Departments that went out and made the contact tracing app mandatory, and there was pushback from Civil Society. So it was a situation where it was hard to say how many Smartphone users would actually have been able to benefit from it. And as it turned out, with the way the pandemic spread, Bluetooth Low Energy was really not that reliable an indicator anyway.
And that's why from the start we were advocating that it should not be made mandatory, and that it should not be a way to deny people access or rights in any shape or form.
>> JANAINA COSTA: Totally agree. And I think that as the conversation shifts away from contact tracing apps, the same questions will come up again: should it be mandatory, and if it is not mandatory, will it work less well? Public health and privacy on the balance, all together, all over again. It repeats itself so quickly.
>> ELLIOTT MANN: I agree. One of the interesting things that has come out of my research is that the federal COVIDSafe app might have failed precisely because the privacy laws and protections around it are so strong. The health and contact tracing is being done at a state level, and to use the data from the federal contact tracing app you had to give it to the states; the privacy protections and the processes around that were so strict that it was actually very difficult to hand that data over. So I was wondering, maybe, your thoughts on whether you can go too far in the other direction: can you make the privacy laws too strong and kind of defeat the purpose from that end?
>> JANAINA COSTA: I think it would be very hard for those words to come out of my mouth, privacy protections being too strong. I'm really a privacy person. But I understand your point, and yes, I think we have to have a balance there, for sure.
So very good point.
>> PRATEEK WAGHRE: While I agree with that, I would say that we are at least a little far away from privacy laws maybe going too far. Especially where I live, we are arguing for a stronger law compared to the draft that we are seeing right now. It is very easy for the pendulum to swing the other way, but I think we are not there yet.
>> ELLIOTT MANN: I would say that this is certainly not a normal case. And I think the approach where I am in Victoria of carving out and creating specific sort of roles and processes around that, around in the privacy law, to recognize that this is a special situation, I think is a decent approach to take.
>> JANAINA COSTA: But that risk is recognized. As I was saying, in the Brazilian data protection law we have specific, highlighted use cases where you can share, use and process personal data even without the holder's consent, if it is for a public health purpose like fighting a pandemic, or even for a third party's health situation. But the important thing is to have clear lines about what that means. In Colombia, for example, because there is no general data protection law of that kind, the lines are very blurred: how far you can go, what you can do with this data, what is going to be done after the emergency situation is gone. And that's a real concern.
"This data will be deleted", "it will be anonymized": these kinds of basic standards should be very clear. Once the emergency is over, are they going to erase all this data, since it was only used for that specific purpose? I think that's a very important point. Maybe our audience has some views on that they would like to share with us, on site or in the chat.
>> JENNA MAN HAU FUNG: I believe earlier, when Elliott was sharing, Allen was typing something in the chat; that was quite a while ago. I guess we will have the breakout groups very soon, but I will look to Emilia in terms of the time, whether we are moving to the breakout groups now. After the breakout groups we will definitely have a roundtable and discuss with everyone.
>> EMILIA ZALEWSKA: Yes, I guess if all our speakers are fine with that, we will move to the breakout rooms part. Jenna, could you introduce our attendees to the format of the breakout rooms?
>> JENNA MAN HAU FUNG: Sure. Will you be able to move to the next slide? Yeah. We have a slightly new arrangement: for now, people in Katowice will gather in one group, joining Emilia and Pedro to discuss some policy questions, and Emilia will continue to share the policy questions on the screen. Those joining us online on Zoom will be split into two virtual groups: one will join Emilia and the onsite participants, and the rest will stay in group 2 with me. Then we will continue the discussion on the policy questions, echoing the sharing from our guest speakers. So that's basically it.
And after the breakout group discussion, we would like our participants to give a really brief and concise summary in the Round Table, so we can further discuss it later on when we go back into the main room. And yeah, we are trying to get attendees more involved, instead of having our speakers talking most of the time in this one‑and‑a‑half‑hour session.
So I wonder if anyone in the room or in the Zoom room has any questions regarding the breakout group discussion? And, so sorry, I missed Bea's comment in the chat. I would like to read it out. Bea mentioned that here in the Philippines the app is not mandatory, but, similar to Hong Kong, the app is starting to become more used. Some establishments let people entering write on paper instead, since some people do have Smartphones but no Internet connection or mobile data to be able to access the QR code of the app.
So I guess Bea was also echoing some comments I made regarding the situation in Hong Kong. She also points out the problem of people with no Internet access, for whatever reason, and I guess some people from the older generations have a problem using the app also. That's what we can explore when we have the breakout group discussion. If everyone is ready, I guess we will break into the rooms. Technical support on Zoom, please help us open the breakout rooms. And I hope everyone can move smoothly into the rooms for the discussion in group No. 1.
And also, Emilia, please help scroll to the next slide so that everyone can see the policy questions in your room. I will copy these questions into the chat so everyone can refer to them. Technical support definitely needs some more time to handle the breakout rooms on Zoom, so I guess those in Katowice can start moving a little to get closer, so we can start the discussion soon.
And, of course, I think I just forgot to mention that our speakers will be joining the groups. Prateek will be joining group No. 2, and Janaina and Elliott will be in group 1. I'm not sure if you can jump between two rooms. Let me check. Yes. Everyone in the room can be in one group.
Or, if we have a big group of participants at the moment, we can keep it as one group, because from what I have learned earlier, that is a good size for one discussion group.
Just a few comments on the policy questions. These questions are meant to address the thematic focus designed for IGF 2021. So I guess that's how we can spend these 20 minutes, relating back to the main theme of this year's IGF. We can contribute our examples and cases to these questions, and so contribute to the output of IGF 2021 later on in our report.
>> EMILIA ZALEWSKA: So let me just check with the technical staff, when we could have those breakout rooms, how many times ‑‑ how much more time do we need? Okay. Five minutes. Okay. I think five seconds. Thank you. Waiting for people who are online to join us. Okay. Just one more minute because we still have some unassigned participants.
So we are all here. Let me share my screen with guiding questions. So I guess we can start with Prateek giving us some remarks on the first policy question. And then everyone who would like to take the floor will have the opportunity to do so.
Prateek, you can go ahead with the first policy question. Can you see my screen with the policy questions? I'm not sure, because I started screen sharing. But I'm not sure if you in a breakout room can see my screen.
If somebody who is in the breakout room could just give me a comment on that. Can you see my screen? Hi. Can you hear me? Prateek, if you are saying something I can't hear you. Are you saying something? Please write in chat.
Okay. I think we have some technical issues.
>> JENNA MAN HAU FUNG: Would anyone else in this breakout room like to talk a little, as we start to explore the policy questions? Who wants to share their thoughts on your situation and try to relate it to the policy questions? I will be sharing later.
>> JANAINA COSTA: Sure. I think when you see one of these questions ‑‑ what values and norms should we develop in the use of technology ‑‑ we have to think about the golden rules, the principles when talking about Human Rights: the necessity, proportionality and adequacy of these measures. That's a great start for measures that can impact Human Rights such as privacy, the right to Assembly and so on, as contact tracing and other isolation measures do.
Also, I think we definitely should think about digital inclusion, so that we don't deny access to a service, a place or a right based on digital access. As you see, Bea just dropped from the Internet; she was being denied access to our wonderful discussion, and that's not right. That's my two cents of contribution for now.
>> BEA GUEVARRA: Thank you for using me as an example, LOL.
>> ELLIOTT MANN: There is the whole privacy by design part as well. It is a difficult balance, because, of course, you have to balance the perfect public health solution against the perfect privacy solution, and you have to find somewhere in the middle. But I think it is not too extreme to say that you should definitely have privacy by design at the core of your principles when developing any technology solution. I think that's a pretty reasonable course to take. Otherwise, the other norm to consider is really data minimization. That's one good thing I picked out of the Apple/Google sort of exposure notification: it was really focused on minimizing the amount of personal identification that could be traced back to a single person ‑‑ minimizing the amount of data that you are collecting and making sure it is for the purpose, while still maximizing the public health benefit.
Given the gravity of what's going on and the issues that we face, I would really appreciate the people developing these solutions thinking about those really closely, in the same way that they would think really closely about the public health issues.
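[Editor's note: the data‑minimization approach Elliott describes can be illustrated with a small sketch. Decentralized exposure notification designs of the Apple/Google kind broadcast only short‑lived identifiers derived from a random on‑device key, so nothing tied to a person ever leaves the phone, and matching happens locally. The Python below is a deliberately simplified, hypothetical model ‑‑ the real protocol uses different key derivation and scheduling ‑‑ and all names in it, such as `daily_exposure_key`, `rolling_id` and `match`, are illustrative, not part of any actual API.]

```python
import hashlib
import hmac
import secrets

ROTATION_MINUTES = 15  # broadcast identifiers rotate roughly every 15 minutes

def daily_exposure_key() -> bytes:
    """A random key generated on-device each day; never tied to identity."""
    return secrets.token_bytes(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast identifier from the day key.

    Only these derived identifiers are exchanged over Bluetooth, so an
    observer cannot link broadcasts back to a person or device.
    """
    return hmac.new(day_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

def match(published_day_keys, heard_ids) -> bool:
    """Local matching: re-derive identifiers from the day keys published by
    confirmed cases and compare them against identifiers heard nearby."""
    heard = set(heard_ids)
    for key in published_day_keys:
        for interval in range(24 * 60 // ROTATION_MINUTES):  # 96 intervals/day
            if rolling_id(key, interval) in heard:
                return True
    return False

# One device broadcasts rotating IDs; a nearby contact stores what it heard.
key = daily_exposure_key()
heard = [rolling_id(key, 3), rolling_id(key, 4)]
# If that key is later published (its owner tested positive), the contact
# detects the exposure entirely on their own device.
assert match([key], heard)
assert not match([daily_exposure_key()], heard)
```

The design choice this sketch captures is the one Elliott highlights: the server only ever sees the day keys of people who voluntarily report a positive test, never location data or identities, which minimizes what is collected while preserving the public health function.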
>> JANAINA COSTA: If I may, just to make a point on Elliott's point: it was great that you mentioned data minimization and all this paradox ‑‑ the more data points we have, the more useful and valuable the aggregated data is. So what is the perfect line of data minimization in this kind of context? That's a very interesting topic to develop. Thank you so much for that.
>> JENNA MAN HAU FUNG: That's a really interesting point, and it inspired me to think about the situation in Hong Kong. I think our case is slightly different, because, given the situation in 2019 and this pandemic, younger generations in Hong Kong don't quite trust the Government. That's one thing, and when the pandemic happened it made it even harder to implement the use of technology, even though the intention of using the contact tracing app to control the spread of the virus was actually good at first. But it also makes me think ‑‑ I'm not sure if it is a comment or a question ‑‑ that at this moment there are two sides: some countries still stick to thinking they have to stay case free, and some have actually decided to live with the virus.
So this has led me to think about whether this position actually makes a country ‑‑ how should I say it ‑‑ decide to be more dependent on a contact tracing app or a vaccine passport kind of thing. If some countries tend to believe in the case‑free approach, I think they spend more effort in stopping people from entering the country and being very strict in quarantine; in Hong Kong now we have a 21‑day quarantine in a hotel. This makes me think that it really depends on the country's or city's policy. So if one country does not really believe in living with the virus, why should they continue to be so dependent on a contact tracing app? That's one thing.
Because I believe when we use technology nowadays, there are many issues: people are not able to use it, or people do not have access to the Internet, or do not have access to the device itself. It even happens in Hong Kong. So I think that would be an even bigger problem in some other countries, because not many people have access to mobile data.
And so I think that is something we should also consider in terms of the use of technology in this way nowadays. I know it is the Digital Age nowadays, but I think when a Government makes policy, it should consider all of these things, because that's the reality at this moment, I think.
I think Bea also mentioned the situation in the Philippines, where people do not have Internet.
>> BEA GUEVARRA: I mentioned that there are establishments that use paper and a pen or pencil, which is quite funny, because again we are trying to stay away from COVID, and yet they are sharing a pen or pencil with everyone going into an establishment. So that's also quite a paradox in itself: we are trying again to avoid spreading the COVID virus, and yet that's one of the easiest ways to contact trace the people who are entering these kinds of establishments.
So that's what's happening now here in the Philippines. We do have an application as well, which is the staysafe.ph app. But as I mentioned, there is still this thing where people can't afford the mobile data, or have no access to the Internet, and no public Internet or public WiFi is available for it.
So it's quite rough over here.
>> ELLIOTT MANN: And I think that's one underappreciated thing: when you start talking about writing down where you have been on pieces of paper, that's almost more permanent than keeping it on your ‑‑
(Switching back to onsite room).
>> PRATEEK WAGHRE: Always at the receiving end of any harmful effects first. There is no single easy answer to this. It is a difficult problem that we need to work on together. We need to make sure our process is a process for public participation, one that can incorporate a number of people and a number of views, take them onboard and then move on. Yeah, there is no magic, easy answer to this; we need to do the hard work. That's my perspective. I am happy to hear the others.
>> EMILIA ZALEWSKA: Okay. Do we have any comments? I also think it is a very interesting question. So I would love to hear your remarks. I think somebody wrote on the chat. Francis, are you in our breakout room? Yes, you are in ours. So I saw that you commented on the chat. Maybe you would like to elaborate on it a little bit more.
>> Francis: Hi, everyone. So I saw the comment in the main room, and Bea mentioned that in the Philippines, in the context of contact tracing, we do not have tracing apps everywhere. Since there is no Internet in many of the establishments, what we do is manual logging with pen and paper. In a recent case, there have been issues with phishing and smishing. What you put on the manual log is your name and phone number, and the issue is that many bad actors or scammers are using these contact numbers to do bad things to the users. So I would like to echo what Prateek mentioned: it is a long way to go.
I feel that Governments need to work with businesses and technical app developers in order to address these issues.
>> PRATEEK WAGHRE: Thanks. That's an interesting point.
>> EMILIA ZALEWSKA: Yeah. Thank you very much, Francis. And I don't know how much time we have left. I think like two minutes. So maybe if we could have a quick comment from you, Prateek, or from somebody else from the participants on the fourth question.
>> PRATEEK WAGHRE: Yes. On values and norms, I would like to hear from the participants before I go.
>> It is Phyo again, to reflect on the second question ‑‑ my answer to the first and second questions. One of the values we have to have is transparency. That is one of the main norms for policy, whenever we do things for the community. And another thing is responsibility and accountability, because whenever a Government implements a project and tries to lead on policy, it has to take responsibility and be accountable for the consequences of that policy, and amend that policy to make it better. So those are the main values; that's my opinion on the norms to develop and enable the technology. Thank you.
>> PRATEEK WAGHRE: I will just add very quickly. I agree with what Phyo said. We need inclusivity, and we shouldn't be forcing these things on people by making them mandatory; we need to win their trust in the process. I think it is easier said than done, but that's the way forward if we want to avoid the type of politicization that we have seen during the pandemic over the last two years. We need to figure out how to make things voluntary and make people trust the system, and then work from there.
>> EMILIA ZALEWSKA: Okay. Thank you very much for all your comments. I think we have less than one minute to go. So thank you very much for this discussion, and let's see you again in the main room.
>> JENNA MAN HAU FUNG: I guess we are back to the part about the Round Table now, right?
>> EMILIA ZALEWSKA: Yeah. Exactly. So I don't know if your breakout room would like to start to just quickly summarize the discussion you had.
>> JENNA MAN HAU FUNG: Sure. Bea, would you like to summarize the last part that you mentioned? And then I can briefly add to those points that you mentioned out at the beginning.
>> BEA GUEVARRA: I can start on the first question: what should be the responsibilities of the different sectors that are involved, and what is needed for them to fulfill these in an efficient and effective manner? I did mention that raising public awareness is necessary in order for themes such as the ones we are speaking on ‑‑ digital inclusion and Human Rights ‑‑ to be heard. I feel like it is very common to mention this across all the different issues that we talk about, but again, raising awareness and sharing issues like this with the public will be part of the baby steps toward reaching the goals that we want to achieve.
And one of the questions did ask how we can encourage others to talk about digital inclusion and respect for Human Rights if they don't even understand what it is all about.
And again, I do think that all of these different stakeholders should cooperate with one another, because they are all interconnected, and the responsibility is to educate one another. They each have a vital role in allowing people to speak out and in supporting people like us who are advancing this kind of initiative. And, like I mentioned a while ago, being in hybrid mode is really awesome, especially in Forums like the IGF, because people like me who are not able to be there are able to express their perspective on their country, and you are able to see what it is like here in the Philippines. Again, it is a game‑changer.
And we get to listen and educate each other about it. So yeah.
>> JENNA MAN HAU FUNG: Thank you. I would like to add a few points that other speakers and attendees mentioned in our breakout group discussion. Inku from Taiwan mentioned that transparency is important, because it shows the model they are using in the apps and the intention of the use of the data. That's really important, because we are giving out so much information these days; in some countries we use the QR code and give out the duration we are staying in one place, and our personal data also.
Elliott also mentioned the point about data minimization, meaning we should minimize data while maximizing the public health benefit at the same time, which is something the Apple/Google model is doing. And in terms of the values and norms when developing policy on the use of such technology, we have to find a balance. Especially when Governments are considering the use of such a technology, they also have to take into consideration that some people might not have access to devices or the Internet. I guess that's something we have to consider; Governments also have a responsibility to encourage digital inclusion when we are trying to solve such big crises these days.
And I guess there are two main approaches among Governments these days: some are staying case free and some are living with the virus. They have to find a balance in terms of how heavily they use this kind of data‑collecting technology these days.
I don't want to take up too much time, because I want to pass the floor to group 1 to share what you have been discussing. We can talk a little further later.
>> EMILIA ZALEWSKA: Thank you very much. Because we don't have much time, you can sum up the discussion and add your closing remarks in that part.
>> JENNA MAN HAU FUNG: Prateek, you are on mute.
>> PRATEEK WAGHRE: It is never a Zoom call until someone does that; had to be me. We had a really short time, but I will group the four questions together. Essentially, the views were that businesses need to be more transparent about their involvement in developing things, and Governments have to be the ones that respect Human Rights. I think Phyo made a great point. Marie brought up the point that, especially at certain times, businesses should not be putting profits over people; they need to think beyond profits, and she gave the example of the Facebook Files. The consensus was that these sorts of processes need to be consultative and need to bring different stakeholders together ‑‑ not just within countries, but in regional settings as well ‑‑ bringing together Civil Society, academia and businesses. And, as Phyo highlighted, there needs to be transparency and accountability all through.
I think that mainly sums it up. What I would add is that we also need to ensure that, through these processes, we are keeping inclusivity and equity in mind, making sure that the vulnerable parts of the population are also accounted for and taken care of. This goes back to the report that I had in my slides as well: we need to account for errors and for things that can go wrong, so we can protect the people who are affected. I will close with that.
>> EMILIA ZALEWSKA: Thank you very much, Prateek, for your points and for the closing remarks.
And Jenna, shall I move to our next speaker and their remarks?
>> JENNA MAN HAU FUNG: Definitely. And if anyone who is attending this meeting has any comments or questions, feel free to drop it in the chat also. And we will try our best to address it before we end this meeting I guess.
>> EMILIA ZALEWSKA: Yeah. Definitely. So Elliott, it is your turn for your closing point.
>> ELLIOTT MANN: Sure. Thank you. I think the one benefit I have got today is really broadening my view of what the contact tracing apps look like. I came into it thinking about privacy, the collection of data and things around that, but now I'm wondering what the off ramp is: how do we end the collection of data here? Are there issues around the design? And particularly, in developing and emerging economies, how do we engage all of the people without Smartphones and everything? So I think the take‑away here is that the issue is very large. Maybe the point I would end on is that this is all incredibly new as well. 18 months ago, contact tracing apps and all these sorts of technologies being used in this way were incredibly novel. If we are seeing these advancements in the last 18 to 20 months, I will be excited to see what happens in the next period.
>> EMILIA ZALEWSKA: Thank you very much. And Janaina, the floor is yours.
>> JANAINA COSTA: Thank you so much. I think it was quite a rich conversation, with very useful inputs from the speakers and the participants. As my further comment, I would say that, as I said before, maybe as we shift this conversation to vaccine passports, we should consider these same privacy issues. Once Governments and private companies have a taste of so much power ‑‑ you have to have a QR code, or they can track where you are going all the time ‑‑ and we have been using this for 18 months already, how much longer will it be? We know that, historically, it is really hard for private companies and Governments to give up these newly acquired powers. So I think we should continue to be aware and to ask those questions all the time.
>> EMILIA ZALEWSKA: Yes. Thank you very much for these inputs. So as we are heading to the end of the workshop, I just would like to ask the reporter of our session to do a quick sum up of the main points that were discussed today.
>> Okay. Sure. I think a good way to sum everything up in a few minutes is the importance of a few principles that should be highlighted every time we talk about these apps: responsibility, transparency, accountability, and inclusivity. Since we cannot have easy answers on how to achieve a good balance between how these apps can help prevent the health issues related to COVID‑19 and the data privacy issues, we need to start from those principles to build solutions that are adequate to each place, each country, each region.
So I think that may sum most of the discussion. I can't talk about all that was talked about in just a few minutes. But I believe this is enough.
>> EMILIA ZALEWSKA: Thank you very much, Pedro. Jenna, any final remarks from your side?
>> JENNA MAN HAU FUNG: I don't have much. I would like to say thank you to our onsite Moderators and also our Rapporteur for making our session happen. I'm sure we will publish a report with detailed information and the discussion points that we made; I know Pedro was doing great work on our document, and I can see all the points, all the discussions that went on in the breakout rooms, and the sharing from our speakers captured there. We are definitely not missing anyone's point in our report, so everyone in this community, and other stakeholder groups, can know what happened in this room today, where we spent one and a half hours discussing this topic.
And so I guess that's it from me. And I would like to thank our onsite participants and online participants for joining our session also. And I guess also a special thanks to one of our speakers, Elliott from Australia. It is a late hour. Thank you so much for staying up. I guess it is over midnight now. Thank you for sharing such amazing points with our panel today. And I hope that you guys enjoyed the session also. Back to you.
>> EMILIA ZALEWSKA: Thank you very much. We have eight seconds. So I will make it quick. Thank you very much, Jenna, for organizing this session and thank you to all our speakers and attendees.
>> Thank you.
>> JANAINA COSTA: Thank you.
>> JENNA MAN HAU FUNG: Hope to see you another year in person.