IGF 2020 – Day 12 – WS340 Checks and balances of data privacy within mass surveillance

The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

 

>> VINICIUS SANTOS: Hi, Ian, welcome, do you copy us?

>> IAN BROWN: Hi, everyone.

>> VINICIUS SANTOS: Testing the microphone.  Carlos?

>> CARLOS ALBERTO AFONSO: It's working?

>> VINICIUS SANTOS: Yes, it's working.

>> CARLOS ALBERTO AFONSO: It should because it uses batteries and sometimes they die.

>> VINICIUS SANTOS: For your information, we have two Carlos Afonsos here: one is a speaker and one is an organizer.  The session is organized by the Brazilian Internet Steering Committee, and Carlos Afonso is the executive director.

>> At any point the two Carloses can swap roles.  It's a little game we play.  It can happen at any time.

>> CARLOS ALBERTO AFONSO: If we make a mistake, you can choose to blame either of us.

>> FLAVIA LAFEVRE GUIMARAES: Let's go.  I will begin now.  Okay.  So good morning, good afternoon, or good evening to everyone.  I'm Flavia Lafevre, in Brazil.  This is the workshop entitled Checks and Balances of Data Privacy within Mass Surveillance, organized by the Brazilian Internet Steering Committee.  Thank you for joining our session.

Giving an overview of our session: it aims to discuss the checks and balances of privacy protection related to the worldwide use of personal data for mass surveillance purposes.  The discussion will address the conditions in which huge amounts of personal data have been collected and used, along with the potential risks and effects of these measures.  Additionally, we will also discuss the different ways our societies have been dealing with this debate and how the multistakeholder Internet Governance ecosystem is framing these issues.  Reflecting on data protection from this perspective ‑‑ in the context of the pandemic, of the concentration of services in a few companies, and of setbacks to fundamental rights and political feuds in many countries ‑‑ is essential now.

So our session will count on the participation of the following speakers, and each will have four minutes.  They are Chenai Chair, gender and digital rights research manager at the World Wide Web Foundation; Ellen Strickland, chief advisor, international, at InternetNZ ‑‑ (choppy audio) ‑‑ privacy manager; Carlos Afonso de Souza, director of the Institute of Technology and Society (ITS) in Rio de Janeiro.  Sorry.

>> TALAR KALAYCIYAN: You said it correctly.

>> FLAVIA LAFEVRE GUIMARAES: Okay, thank you very much for accepting our invitation to be here with us today.  Please feel free to add any information about yourselves.  Also, I would like to thank the IGF Secretariat for making this a success under such challenging circumstances.  As we only have 90 minutes to address such important and complex topics, I would like to kick off our discussion with the first policy question.

What are the demands, conditions, tools, solutions, outcomes and potential effects posed by the use of personal data, in order to best utilize data without harming fundamental rights, such as the right to privacy?  So, Carlos Afonso, the floor is yours.  You have four minutes, and we will follow the sequence after.  Thank you.

>> CARLOS de SOUZA: Thanks, Flavia.  Thinking about the balance between data protection and making sure that both the private sector and the public sector are aligned with the demands of data protection, I think it's important for us to consider some tools and outcomes that might go through some scrutiny to make sure we have this desired balance.

Talking about one example from Brazil: Brazil now has its own general data protection law, which has recently been put into force.  It's a law that was approved in 2018 and has only now been brought into force.  And one thing that comes out of this moment, in which we have a new law on data protection, is to make sure that this law ends up applying both to the private sector and to the public sector.  One thing I believe is of key importance in this debate is to understand especially how the public sector will adapt to the terms of a new general data protection law.

That will demand a change of culture, and this is something quite important: understanding that law enforcement agents are now data processors.  Law enforcement agents will handle, process and treat data, and at the end of the day they will have to comply with the law.  This is quite important because in Brazil we're going through a debate on broad judicial decisions that have ended up handing over to law enforcement agents a huge amount of data that can be processed and scrutinized in order to find the person who has committed a crime, the person who is being investigated.  But we have to think about the amount of data that is being collected as well, and about how this data will be handled and how this data will be erased.  This is a clear concern we have at this point.

A second concern that I would like to point out at the beginning of the debate is a certain politicization of the data protection debate, especially in times of COVID‑19.  We can have, at this point in time, a situation in which governments might want to attach themselves to data protection discourse in order not to deploy certain activities that they feel could hurt their interests, and at the same time that might lead the population into a situation of extreme vulnerability.

This is a situation to which we might want to pay special attention.  Just to conclude, we should take a lot of caution on how access to information and data protection end up intertwining, because we can also see governments using data protection as a way to shield themselves from access to information demands.  This is something we might want to take a look at.  A general data protection law should not be an excuse for governments not to be open and not to comply with access to information demands; it might be invoked for artificial reasons.  Those are my first few remarks; I'll hand back to you so we can continue the debate.

>> FLAVIA LAFEVRE GUIMARAES: Thank you, Carlos, now Chenai Chair, you have the floor.

>> CHENAI CHAIR: Thank you, everyone.  And hi, everyone, thank you for having me on the panel.  I'm dialing in from Johannesburg, South Africa ‑‑ I think it's important to locate where we are in all of these conferences.

My intervention, when I was thinking about the policy question that was posed, was in terms of the process of data collection, the processing, the implementation and how the data is used.  The first point, when we're talking about the conversation around the pandemic, was that the data was going to be collected anyway.  So the point of conversation was: how do we ensure the necessary safeguards for the collection and the processing, based in the particular contexts of states that have data protection laws and strong privacy regulators, or states that don't, as was the case in most African countries ‑‑ I think close to about 25 or 28 countries had data protection laws at various levels of implementation.  The question is always around how we ensure that this is done in a meaningful way, that it doesn't result in increased surveillance ‑‑ especially in contexts where any access to data results in increased surveillance ‑‑ and that it does not disregard privacy rights or the issues of bias and discrimination.

So the Web Foundation, as part of its policy series on COVID, released a policy brief on COVID and data and put together some of the conditions that are necessary to ensure data privacy and data protection during this pandemic.  And it actually focused on open thinking: the arguments often juxtapose public health data on one side and ensuring the right to privacy on the other, but these are not separate camps; there was actually a need to approach these issues bearing both in mind.

There were also ways to think about whether privacy laws that already existed could simply be applied now that we're in a health situation, or whether they needed to be adjusted to be responsive to context.  For countries that already have these systems in place, it was a matter of looking at how to apply them in this particular context.  In cases of countries without data protection laws fully in place ‑‑ for example in South Africa, where the Protection of Personal Information Act was only implemented on July 1 but had been in development since 2013 ‑‑ what happened was that the data protection authority, the Information Regulator, actually stepped in to work in partnership with the Ministry of Health on how the data would be collected and processed.

That just goes to show that in such instances there are mechanisms that need to be put in place.  I think one of the big concerns, in terms of the conditions necessary, is that oversight mechanisms are needed ‑‑ whether at a public level, an interregional level, a national level or a subregional level, or even from civil society ‑‑ as new solutions are implemented with regard to data collection.

I think what's also most important is public trust.  As long as all of these data solutions are being put forward in a context where people don't trust governments with how the information is going to be used, any solution that's going to be implemented needs to be aware of this context of public trust.  It needs to be able to explain the process of the data being collected, the transparency involved and what will be done with the data, and to respond to any inquiries the public poses with regard to how the technology is being used.  For example, with contact tracing solutions that may be privacy by design: if the public doesn't trust them, that voluntary uptake is not going to happen, and that has an impact on the actual response to the pandemic.

One thing that I personally found to be missing in all of these conversations is, at a community level, the impact on marginalized community members in our society.  Oftentimes the solutions or conversations around privacy and data protection assume a similar experience of the issues, similar access to technology, a similar understanding of what is happening in this context.  And we actually found ‑‑ my work has focused on the gender and data protection aspect ‑‑ that women and gender diverse people are more concerned about their privacy.

When you look at the solutions that were put forward, whether by governments that didn't have privacy systems in place or by governments rolling out privacy solutions, that seemed to be a conversation that was overlooked.  It fell more into social safety nets than into actually trying to guarantee people that their identities would be protected and that this information would not fall into the wrong hands, resulting in their being exploited.  Those are some of my interventions in thinking about the necessary conditions needed to ensure that, when we put forward data driven solutions in response to the pandemic, these are the things we take into account for the solutions to be successful.

>> FLAVIA LAFEVRE GUIMARAES: Thank you, Chenai.  Ellen, can you ‑‑ the floor is yours now.  Thank you.

>> ELLEN STRICKLAND: Thank you very much, Flavia, and thank you to the organizers and to the other panelists for the chance to talk about this very important topic.  I'm very grateful to be here to share some of New Zealand's experience on this.

The relationship between data and the internet makes trust really a key challenge, I think, in ensuring ongoing trust in and utilization of the internet.  We focus our work on internet for all and internet for good, and data being utilized without harm to people and their human rights is crucial to people choosing to use the internet for good.  We see in our research that issues of data and trust can be a real inhibitor to people's engagement with and use of the internet for benefit, because of their concerns.

We do a range of work on these issues.  We have some principles we apply to situations, around transparency, accountability and community collaboration in figuring out what happens.  Also, we have a commitment to the indigenous people of New Zealand, as a bicultural country; I think that's quite important to think about from that perspective and to include it.

As was pointed out, these are issues that involve the private sector and also government; both sectors, sometimes together, play pivotal roles in how personal data is used for different purposes.  So I'm interested in this group and the discussion going forward about commercial purposes, but also government purposes, and how you manage those, too.

For us, one concept that we have used recently is the concept of social license: working through community collaboration on why the data is being collected, and finding, through collaboration, agreement on how it will be collected and managed to meet that why ‑‑ so, designing methods that do it well.  It's particularly something we advocate for in government circumstances, but it could certainly work for commercial purposes.

A couple of examples of things happening here related to that: we have new privacy legislation coming into effect, and we have an independent privacy commissioner, attached to the government, who plays a role in government as well as a watchdog role with businesses.  It's been a long time since our privacy legislation was updated ‑‑ 1993 ‑‑ so we've been involved in the process of discussions and getting community engagement there, because a lot has changed about the internet since then.  And I think the privacy commissioner's role was quite challenging under the old legislation.

The new Privacy Act 2020 introduces what are intended to be greater protections for individuals, but also new obligations for businesses and organizations.  It includes an important requirement to report data breaches to the commissioner, providing oversight, as well as to affected people, so people understand when there have been breaches involving their data.  The new act helps people access their own information and requires businesses and organizations to comply with the law around people having access to their data, which is important; and the commissioner, as well as providing oversight, helps provide guidance and support.  And one sort of controversial thing that has gone through is that we have now increased fines.  Previously our privacy commissioner could call things out and could support people, but didn't really have teeth, so to speak.  So now there are fines for not complying, and there are rules about data being sent overseas ‑‑ thinking about it from a New Zealand perspective.

That's been an important process for us, updating our privacy act and legislation.  And I think, you know, having an active commissioner in that area has provided important oversight but also support for people; it takes on a kind of public role of informing and educating.

The other example I would just mention ‑‑ and thank you to Chenai ‑‑ is around COVID and contact tracing.  Obviously, that's been an example worldwide, so it's something we can all think about.  We were very involved, using that principle of community collaboration, in trying to get people together to discuss and develop the design of contact tracing in New Zealand, with an understanding from initial research that there were concerns about data ‑‑ how do you do this in a way that people will use it, given that they're worried?

We've worked to keep the pressure on to talk to people and to understand, and to have a privacy by design approach.  For us in New Zealand, that meant that rather than centralized data we've had a QR code system: it's kind of a log on your phone, you hold the data, the notifications get sent to your phone, and the data will only be accessed if you need to go through the contact tracing system ‑‑ with our prime minister and government being very vocal about privacy being a primary consideration, so no automatic tracing.

But having said that, we understand that there is a likelihood of Bluetooth automatic contact tracing being introduced.  There hasn't been much meaningful conversation about that since those initial conversations.  It's an ongoing space, I think, for everyone to watch and see what's happening.  One of the points we make is that the consulting, the working with the community ‑‑

>> FLAVIA LAFEVRE GUIMARAES: Ellen ‑‑

>> ELLEN STRICKLAND: I'm sorry, I'll wrap up there.  It needs to be an ongoing process, not just at the beginning.

>> FLAVIA LAFEVRE GUIMARAES: Okay, thank you very much.  And now Nneka, please, the floor is yours.

>> NNEKA EKECHUKWU-SOYINKA: Thank you.  I'm calling in from the United States, and I'm a privacy practitioner, so my role is to operationalize a lot of these, you know, privacy by design principles, for example.

When we first talk about the demands posed by the issues going on, I'd first like to mention that sometimes we see a tendency to solve with technology without first understanding the problem.  And if we see how many contact tracing apps ‑‑ to speak specifically to that example, which has been used ‑‑ were rolled out so early in the pandemic, perhaps that supports that statement.

But we want to first understand the actual problem we want to solve, and then, once we know the problem and work towards a solution, we can determine the data needed and consider whether that data is even proportionate to the benefit that the technology would be providing.  To provide a simple example from one country's app earlier this year: the purpose was a mobile alert system.  It was just supposed to provide emergency notifications and essential information to those within the country.

However, in at least the Android version, it was collecting precise location data and using that information even after the app was closed.  It was forcing access to the microphone, and it was forcing the collection of an individual's e‑mail or phone number.  So with all these different digital solutions, there is a difference between an app whose purpose is contact tracing, versus one that perhaps wants to measure social distancing efforts, versus another, like my example, whose purpose is only to provide essential information.  The data collection, therefore, has to differ depending on the solution and be proportionate to what is happening.

Now, if we think about the conditions and the tools that are available, I'd like to echo Ellen's and Chenai's comments around privacy principles, the idea of privacy by design, and needing user adoption, user buy-in.  Ultimately the efficacy of any of these solutions is going to come from user adoption.  Similar to, you know, a vaccine in the future, there's a threshold of opt-ins necessary for any solution to even be beneficial and for its effectiveness to be maximized.  And so, in order to increase that user adoption, we as users, as those assessing the solutions, are going to need to be comfortable with what is there.

So I echo their comments around transparency and data minimization; I'll add security and other privacy principles.  And, you know, Mozilla being a company that lives and breathes open source, I'd like to highlight open sourcing your technology to ensure that you have an unbiased third party ‑‑ anyone, really ‑‑ able to have eyes on the technology, to ensure that it's doing what it says it's doing and protecting in the way that it says it's protecting.  One of our privacy principles is "no surprises": that speaks to transparency over all aspects, from the point of collection to the ultimate disposal of that data.

From a solutions perspective, I think it's important to remember that technology can play a role in the solution, but it's not going to be the entire solution.  A colleague recently explained to me the stack of Swiss cheese analogy.  If you think about a slice, it has holes in it, and any of these solutions, technical or nontechnical, has holes.  But as we layer the different measures, the holes get covered: masks alone aren't the solution, but if you wear a mask, plus social distance, plus contact trace or make any other digital effort, then we'll be able to have a full solution to actually resolve this pandemic.

Lastly, to speak briefly about the potential effects: as Chenai said, it's important to think about vulnerable populations, marginalized communities.  What's the effect for those who do not have access to a smartphone or the internet, if that's what the solution is based on?  Or if it's relying on social distancing, but people are unable to social distance within their households?  Or accessibility, as a colleague mentioned in the chat here.  So I'll conclude with that: regardless of the solution, let's definitely make sure that it's done with everyone in mind.  Thank you.

>> FLAVIA LAFEVRE GUIMARAES: Thank you, now Ian Brown, you have the floor.

>> IAN BROWN: Thank you, everyone.  I'm in Spain right now, although I spend a lot of my time in the UK and also in Rio.  I'm going to talk very briefly about two examples of mass surveillance that we've discussed a lot globally in the last few years.  One is in relation to COVID, which has clearly been a huge issue over the last year; but also, going back a little bit further, I'm sure you all remember the Edward Snowden disclosures about mass surveillance by intelligence agencies, particularly of the U.S. and its allies, although of course other countries are undertaking those kinds of activities as well.

In terms of a policy framework: I saw a tweet today, and I believe it's true, that the world's first data protection law was passed in the German state of Hesse 50 years ago ‑‑ I'm not sure if it's today exactly.  The E.U. has a law, Brazil has passed a similar law, and this year many other countries have as well.  Graham Greenleaf from Australia, who writes a lot about global data protection laws, keeps count, and I think his latest count is that 144 countries globally have GDPR‑like laws, although not always as broad as the GDPR.  The U.S. is notably one of the remainder that do not, although perhaps that will change given the new administration.

Very briefly, I think the GDPR and the related instrument in the context of law enforcement ‑‑ not COVID and public health; there's a separate Law Enforcement Directive in Europe from the same time as the GDPR ‑‑ have some, by now, very well thought through principles that Europe has a lot of experience with.  I'm sure you're familiar with them, because they're in many countries' data protection laws; I'm just looking at Article 4 of the Law Enforcement Directive, which is where they are.  I won't read them all out, but they're quite straightforward.  They say things like: personal data should be processed lawfully and fairly ‑‑ that means there should be laws setting out the conditions under which the data can be gathered and processed.  It should be collected for specified, explicit and legitimate purposes ‑‑ don't just collect all the data you can and then think about what you can do with it.  That's, you know, guaranteed to fail from a privacy perspective.

And there are related principles: that data should not be excessive ‑‑ you only collect the data you need; that it should be accurate and kept up to date; and that it should be kept in an identifiable form for no longer than is necessary.  That relates to the Apple and Google exposure notification API that many countries have built their contact tracing systems around.  That's exactly what the Apple and Google system does.  It tries to keep data anonymous, as Ellen was just saying.  It tries not to link data back to an individual person via a big, centralized database held by a government; it's stored on a user's smartphone.  Users will receive notifications if they've been in contact, in the last 14 days, with someone who has been diagnosed as COVID positive.  But that data does not go back to the government.
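To make the decentralized design concrete, here is a rough Python sketch of the matching idea Ian describes.  It is a simplified model, not the actual protocol: the published Apple and Google Exposure Notification specification uses Bluetooth broadcasts and proper cryptographic key derivation (HKDF/AES) rather than the plain hashing shown here, and all names below are illustrative.

```python
import hashlib
import secrets

def rolling_ids(daily_key: bytes, intervals: int = 144) -> set:
    """Derive a day's worth of short-lived rolling identifiers from a
    daily key (illustrative; the real spec uses HKDF/AES, not SHA-256)."""
    return {
        hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
        for i in range(intervals)
    }

# Each phone keeps its daily keys secret, broadcasts the derived rolling
# IDs over Bluetooth, and records the rolling IDs it hears nearby.
my_daily_key = secrets.token_bytes(16)
observed_ids = set()  # filled from nearby broadcasts; never uploaded

def check_exposure(diagnosis_keys, observed) -> bool:
    """A user who tests positive uploads only their daily ("diagnosis")
    keys.  Everyone else downloads those keys and matches locally, so no
    central database learns who met whom; a match only triggers an
    on-device notification."""
    return any(rolling_ids(key) & observed for key in diagnosis_keys)
```

The key design point is that matching happens on the phone: the server only ever sees the keys of users who chose to report a positive test, never the contact graph.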

That has actually proven to be very important for public trust, as several of the speakers have already said.  In the UK, the government has given the police access to some contact tracing data the government holds, for the purposes of enforcing self‑isolation rules.  That's been very controversial.  But it has not covered the contact tracing app data, and that's important, because it might really have put people off downloading the contact tracing app if that had been the case.

Let me, very briefly, alongside that, talk about mass surveillance by intelligence agencies, because that particularly has been in the news again this year.  You might have heard of the case at the E.U. Court of Justice that said that American companies cannot use the agreement negotiated between the U.S. government and the European Union, the so‑called Privacy Shield, to export data about European residents to the U.S.  This is because U.S. intelligence agencies have broad powers to access data about non‑Americans and to use it for intelligence purposes, and the E.U. Court of Justice said that is not compatible with the E.U.'s Charter of Fundamental Rights.  As well as the GDPR itself, the E.U. has a broad human rights instrument, the Charter of Fundamental Rights, which is proving increasingly important.  And this will be a debate around the world for the medium term, because this judgment does not apply only to the United States; it applies to all non‑European countries.  So this is something that governments around the world will need to think about.  I'll pause there for now.

>> FLAVIA LAFEVRE GUIMARAES: Thank you, Ian.  Talar, the floor is yours.

>> TALAR KALAYCIYAN: I'll first introduce myself.  I've been working in Amsterdam for three years now ‑‑ so, I'm from Amsterdam.  I work as the official secretary within the chief information office.  And, well, for us it's very important ‑‑ can you guys hear me?

>> FLAVIA LAFEVRE GUIMARAES: Yes.

>> TALAR KALAYCIYAN: Okay.  Well, Amsterdam as a city believes in a people centered approach to technology.  We have a policy framework that's meant to provide democratic safeguards for citizens' lives.  That being said, we have different departments within the municipality of Amsterdam.  We have an officer who ensures that data protection impact assessments are correctly filled in and that we have a correct legal basis for processing the data.  It's important to ensure public trust, and it's also important that we ‑‑ how do you say that ‑‑ that we use minimal data.  Since 2018, since the GDPR, everything is also recorded in the GDPR register, and citizens are able to consult the register, with regard to transparency.  We use the rule that we are transparent unless there are compelling reasons for not being transparent ‑‑ an example can be security or the public order.  So, yeah, the rule is that we are transparent unless safety is at risk, for example.

Next to the privacy office, we also have the Amsterdam personal data commission, where I work.  This commission is a steering commission that gives the municipality advice in the field of personal data.  It also brings attention to issues across the organization, so we have an overview of what is going on in the municipality, and if ‑‑ yeah, if the advice is negative, then we can quickly help the department with that.

Next to that, we also have the algorithm register, which we developed together with the city of Helsinki.  We're the first to have an algorithm register.  We now work together with many cities from all over the world; since we're in 2020 and we have to be responsible with AI, we try to develop public trust through this register.

So, for example, to make sure that possible fraud is detected: the municipality gets a notification from, say, a neighbor, and an algorithm checks whether the notification is credible, by checking previous validations for instance, and then it goes to the office ‑‑ so we have to check whether the algorithm is correct or not.

And ‑‑ well, the register consists of three layers.  The first layer is the general information available: what's on the register, what the algorithms are, and why Amsterdam thinks it's important to be transparent about them.  If you click, for example, on an algorithm, or a process that uses an algorithm, you get the second layer of information, which is general information for people without a technical background, explaining the impact on you as a citizen, but also the risks that are involved and how we mitigate those risks.

In the third layer, if you want, you can dive deeper into the data sets and the source code, and also into the personal data used, available under the GDPR.  Those functions are aimed at journalists, academics, civil society organizations and politicians ‑‑ people who really want to know what municipalities are up to.  So, yeah, that is used to ensure public trust, transparency and also compliance with GDPR rules.
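As an illustration of how such a three-layer register entry might be structured, here is a minimal sketch in Python.  The field names are assumptions made for the sake of the example, not Amsterdam's actual schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class RegisterEntry:
    # Layer 1: general information shown in the register listing.
    name: str
    department: str
    purpose: str
    # Layer 2: plain-language explanation for non-technical readers.
    citizen_impact: str
    risks: List[str] = field(default_factory=list)
    mitigations: List[str] = field(default_factory=list)
    # Layer 3: technical detail for journalists, academics, auditors.
    source_code_url: Optional[str] = None
    datasets: List[str] = field(default_factory=list)
    uses_personal_data: bool = False  # if True, link GDPR documentation

# A hypothetical entry, modeled loosely on the fraud-signal example above.
entry = RegisterEntry(
    name="Fraud signal triage",
    department="Enforcement",
    purpose="Rank incoming fraud notifications by credibility for review",
    citizen_impact="A report about you may be prioritized for a manual check",
    risks=["False positives", "Bias against particular neighborhoods"],
    mitigations=["Human review of every flagged case", "Periodic bias audits"],
    uses_personal_data=True,
)
```

Separating the layers this way lets one record serve casual readers, affected citizens, and technical auditors at the same time.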

>> FLAVIA LAFEVRE GUIMARAES: Thank you, Talar.  Let's move on to the next question.  Given the complexity of the relationships that are established on the internet, as well as the wide range of activities affecting sensitive rights in very specific sectors ‑‑ such as medicine, public health, agriculture and education, among all the others ‑‑ the importance of multistakeholder governance has been agreed in several editions of the IGF.  With the development of AI, in addition to multistakeholder governance, a multidisciplinary approach has stood out.  Therefore, it's necessary for the mechanisms and spaces for multistakeholder governance to evolve and strengthen, especially with regard to the protection of personal data and privacy, to avoid the negative effects of surveillance by authoritarian governments ‑‑ as we are living through in Brazil, with the disputes involving technology that we are facing today.

So our next round is to reflect on how to leverage multistakeholder dialogue in order to reach possible solutions and consensus on these issues.  Now we will reverse the order of the speeches, and I will begin with Talar.  Please, the floor is yours for four minutes.  Thank you.

>> TALAR KALAYCIYAN: Thank you.  I would say, from my experience with the other municipalities we work with, that it's important to agree on the ultimate goal; otherwise you might as well not start.  I think it's important not to fall into endless discussions about what a solution could be ‑‑ just create one.  Start very small and show that it's possible, as we did with the algorithm register, for example.  That was a nice opening for discussions with other stakeholders.  That is how it started with us.

After that, it is important that you're willing to abandon your first idea.  It should serve as a kind of proof that it is possible to create conditions, tools and solutions.  What I'm trying to say is: if you develop, or think of, a solution, as we developed the algorithm register, you have to accept that others may have different ideas about it, and the product can end up looking completely different from what you had in mind.

For example, we think the website we now have for the algorithm register is perfectly fine, but that's also just how it's arranged now; if we continue to develop the register with the other municipalities, it may become something completely different.  Then, in addition to a shared goal, shared urgency is also needed.  This is often difficult, because every organization has its own priorities, of course.  But in our situation, for example, the innovation budget helped with this: it suddenly gave us a deadline and a time frame.  It was now or never; we had to create this.  Also, we have a trainee who has written her thesis about this, and she gave us ten conditions for successful intergovernmental cooperation.  I think we can pick up a lot from it.  The points are: there needs to be agreement on the form of network governance; there needs to be balance in the dependency relationships; and you have to have shared goals, a strong foundation of trust, sufficient personnel capacity, a clear vision of the possible effects of upscaling, sufficient representation of the partners, flexible attitudes from the partners, and also reflection on the upscaling process.

>> EVERTON RODRIGUEZ: Flavia, your mic is muted.

>> FLAVIA LAFEVRE GUIMARAES: Ian, the floor is yours.

>> IAN BROWN: Just briefly, I think government transparency is absolutely critical.  Those examples of Amsterdam's practice we were just hearing from Talar I think are world leading; having algorithm registers is something all city administrations and national governments should be thinking about doing.  I think that was one of the problems with the UK government's approach to contact tracing early on: it was not nearly as transparent and open to input from civil society and others as many people wanted.

Something I would add: COVID has obviously been catastrophic in its impact this year, but I do think one small silver lining is what we're actually doing right now.  The use of video conferencing tools opens up meetings like the IGF to many, many more people than could ever have physically traveled to the IGF.  I think when you're thinking about multistakeholderism you have to think about it in a nuanced way.  I'm not going to hold up a flag and say it's wonderful and the answer to everything, and that's all we need to do to improve the situation.

The IGF is a very good example of multistakeholderism in its theoretical openness to input and comment from civil society groups and members of the public.  But before this year, how many people in practice could have afforded to travel to an IGF meeting, to stay in an expensive hotel near the conference center, to have the time to get the background knowledge and experience to be able to fully participate in the meeting of the IGF?  So I hope this is something that will persist long after, fingers crossed, COVID fades from many people's memories.  Obviously, it won't go away magically next year, even if these very promising‑seeming vaccines are rolled out to many people around the globe.

But linked to that, this highlights an important remaining issue, which of course the IGF has discussed every year since it began: digital divides.  Those are critical for participating in these kinds of meetings.  As I said, a lot more people can participate in video conferencing than in physical meetings; that's true.  But it's still not the majority of the planet: they don't have the bandwidth, they don't have the equipment to do this.  I think governments need to think a lot more creatively about how many more people can be enabled to participate in these meetings.

I'm grateful to all of you who, unlike me, aren't native English speakers, but we're talking in English.  That rules out the vast majority of the ‑‑ I apologize that I can't speak other languages well enough to talk to you in them.  There will be no magical tech solution to that; perhaps automatic translation in five or ten years will make this a lot easier.  But these are already issues.  A final point relating to that: I was just reading a good article this morning about Singapore's reaction to contact tracing.  Singapore is a very rich country in comparison to the rest of the world, and a large percentage of Singaporeans have smartphones and are connected.

Even then, many people in Singapore were not able to make use of the app the government has provided.  So ‑‑ Ellen, forgive me, I'm not sure if it's New Zealand that considered this earlier this year as well, and I think some other countries have ‑‑ the Singaporean government has used hardware tokens for Singaporeans who aren't comfortable using the smartphone app or their identity card, which some of them were quite understandably worried about losing if they were frequently scanning it every time they went into a supermarket or an enclosed space.  So the Singaporean government has done more than many other governments in thinking about that population, which is often digitally excluded.

>> FLAVIA LAFEVRE GUIMARAES: Thank you, Ian.  Now, Nneka, the floor is yours.

>> NNEKA EKECHUKWU-SOYINKA: Thank you.  I'd like to first say that it's important to realize, when we think about multistakeholder dialogues, that the conversations can still be swift, relatively speaking.  Sometimes ‑‑ and I see it in my privacy career ‑‑ people tend to be concerned that the more voices you involve, the more things will slow down: you know, "I may never go live with my product because I have to get too many sign‑offs," that type of thing.

Especially at a time like now, when these technologies are related to, you know, resolving this pandemic, I definitely recognize the importance of moving fast.  But in an effort to move fast, one shouldn't use that as a reason to skip critical discussions that are imperative to have at the beginning, because when you skip those necessary conversations, you might ultimately end up with more risks in the app because of how it was designed.  And if we go back to the idea of trust: if people don't trust it, then you're going to have low adoption, and so the technology isn't going to be effective anyway.

So these early discussions do drive the solutions.  And as part of the multistakeholder dialogue approach, it's of course important to bring in the right stakeholders ‑‑ a cross‑functional, diverse group of people, which is the same way we would approach it internally at a company if one were developing something.  You have your policy and government individuals; your privacy practitioners, who can discuss the operational side; the technical experts, who know the technology; and the public health experts, who can weigh in on the various solutions and help the non‑medical, non‑public‑health folks understand whether a solution is going to have the positive impact we're hoping for.

We're in a unique position now, with any future solutions, to bring in data experts who can help assess the effectiveness of past solutions, read the data, and help us understand how we can leverage that to improve future solutions.  Security experts, of course, and communications experts, too ‑‑ because if we're talking about transparency and user adoption, then we're going to need someone with the right background to ensure people truly understand what's happening with their data, their rights, any risks, et cetera.

But also consider the secondary beneficiaries, who may still need to have a voice or at the very least be informed.  For example, if we're talking about a contact tracing app again, and this tool is going to be notifying people that they've come into contact with someone who has tested positive, is something like that going to drastically increase emergency room visits and potentially overload our hospitals, depending on the country or region?  Likely, yes.  Even if you're not necessarily having a conversation with emergency room physicians, they should at least be informed in some capacity, so that they can prepare for the potential effects of whatever the technology solution will do.

Lastly, the idea of consensus: what is consensus in a situation like this?  I think it's important to recognize that, as stakeholders, we all have different roles to play.  It's important to get that diverse perspective, but it's not equal slices of the pie, if you will, as the risks will vary.  A security risk is different from a communications risk, where, you know, perhaps it will lead to low adoption but may not actually interfere with individuals' data.  So definitely take that risk‑based approach, but ultimately recognize that there's only one group that is accountable, and that's whoever is deploying the solution ‑‑ let's say the government in most of these cases.  They're ultimately accountable for ensuring the solutions are safe and for seeking the recommendations in the first place.  Thank you.

>> FLAVIA LAFEVRE GUIMARAES: Thank you.  Ellen, you have the floor.  Is Ellen here?  No?

>> ELLEN STRICKLAND: Apologies, I am here.  I just had a 3‑year‑old wake up crying, so I'm having to do a little bit of cuddling while I talk to you all, but I can do that.

Thank you so much for the interventions.  I would probably echo a lot of what other panelists said and go back to the principles I mentioned at the beginning: transparency, accountability, collaboration ‑‑ you know, agreeing on the why and on shared goals.

I would add the point I was making at the end of the last session: as well as setting up that why and the shared goals, meaningful ongoing communication ‑‑ including on operational and procurement matters ‑‑ with the right people, you know, the right people who need to have that discussion, is really important too, so that it's not a kind of do‑it‑once‑and‑then‑move‑on.

I'd echo that a multistakeholder, multidisciplinary kind of approach is important.  It's not a panacea, but that engagement of diverse perspectives ‑‑ those impacted, those involved, sharing expertise, knowledge, et cetera ‑‑ is really important.  I would add that I think it's important that there are both informal and formal collaboration engagements, to build the kind of trust and capacity that enable people to engage.  I really appreciated Ian's point about language and timing, and having those informal and formal ways to engage can be really important to build those things and to enable people.

Yeah, and I would conclude by echoing what I thought was an excellent point that was shared, about flexible attitudes: being willing to change your mind, and accepting that things may need to look very different from what you had in mind initially, or from what you can do easily ‑‑ what you think is easy to do.  Actually working with people together can help work through those things.

>> FLAVIA LAFEVRE GUIMARAES: Okay, Ellen.  Now Chenai, the floor is yours.

>> CHENAI CHAIR: Thanks, Flavia.  Just to echo and add onto what all the previous panelists have pointed out: for me, when I think about the multistakeholderism model, I want to complicate it, because we often don't place it in the context of the political society that we exist in.  Oftentimes multistakeholderism means everyone coming to the table, in their different positionalities, as equals; but we live in a context where governments want to be seen as leading the conversation at the moment, and if you think about multistakeholderism in that light, it's not going to lead to an equal conversation.

Does the political environment or the social context allow for a multistakeholder model?  That's the first thing I consider when I think about it.  And then, also, in terms of thinking about participation: oftentimes multistakeholder models look at numbers and gender diversity.  The question of the quality of the conversation speaks to what the other participants were saying about having the right people in the room, but also being able to work together on determining what good quality is when it comes to developing some of these interventions.

Then we move away from the numbers game and look at the quality of the conversations.  And at that point, in the multistakeholder spaces we participate in, it's important to engage people who traditionally do not inhabit these spaces ‑‑ because they may not be aware it's an issue, or they are aware and have not been invited because we're not intentional.  Or, when they are invited, they get lost in the technicality, the technology, and, as Ian pointed out, the language.  We also have to think about designing these multistakeholder spaces for inclusivity, building up awareness, and taking an intersectional approach that recognizes that people are experiencing these issues across very different inequalities.

Politicians, or regulators who work on data issues, for example, may not have the skills to deal with this.  And if the person with technical capacity coming in to talk about regulating data protection is someone who doesn't want as much regulation, then governments are probably going to limit the conversation ‑‑ it seems like they would listen more to tech companies than to civil society, with whom in some cases there's an antagonistic relationship.

I think what's also important is thinking about resources.  All of this is not possible if someone is not funding it; we couldn't be on this web stream if someone hadn't said the IGF will fund this nice and secure platform people can engage with.  At the same time, it's important that whoever is resourcing these spaces does not end up directing the narrative.  Because I think the most problematic issue in thinking about data governance and solutions around data is that, a lot of the time, whoever has the money to set up the space and the conversations determines who is in the room, determines how the conversation will go, and determines what is actually going to be acted upon.

Civil society will sometimes have someone who is dedicated to sitting in this room and having this conversation.  That's when we have to think about whether the power dynamics that exist today allow for the people who are going to be affected by the solutions ‑‑ does it allow us to work with the people who are going to be affected by the solutions?  I really liked the idea of the algorithm register; it would be so interesting to see how communities would document such an algorithm register ‑‑ what would they think is important to include within this space?  Those are some of the things we have to think about if we're going to take on a multistakeholder model.  Context, power, resourcing: those are three key issues we have to keep in mind in this process.

>> FLAVIA LAFEVRE GUIMARAES: Thank you, Chenai.  Now Carlos, the floor is yours, thank you.

>> CARLOS de SOUZA: Thanks.  It's great to hear the participants in this workshop bringing up some great examples and really powerful ideas on how we can strengthen a dialogue on data protection, especially in times of COVID‑19.

One thing we should always keep in mind is that data protection ‑‑ sorry, data itself ‑‑ is no longer seen as a by‑product or side product of one specific activity.  So as data becomes central to governments and companies and to how they handle their activities, we need to take extra care over the fact that people might not understand what data protection means.  There are a lot of capacity‑building duties that end up falling on the different stakeholders in the room, and awareness building across different stakeholders.

Just to look a little differently at the civil society sector as a whole: I think we're still lagging behind on the duty of bringing traditional rights protection organizations into the data protection debates, because sometimes they feel that data protection is a more digitally oriented debate ‑‑ something that's connected mostly to digital rights.  So I think there is a duty on us, in the following years, to make sure that we create the bridges so that organizations working on the protection of more, I would say, traditional human rights end up getting inserted into the debates on data protection.

Because at the end of the day, data protection is about speech, about inclusion, about how you access governmental services.  So I think there is a lot of space here for action.  And addressing multistakeholderism, we have a recent example in Brazil that shows how governments can, in a more heavy‑handed approach, try to get access to data because of the fight against COVID‑19.  I think it's important to stress that the fight against COVID‑19 is not an excuse to scrap data protection entirely, and to scrap entirely the efforts of building up stakeholder dialogue.

That was a case in which the Brazilian geography and statistics institute got a governmental measure to force telecom companies to provide it with data concerning the names, addresses and telephone numbers of all Brazilians, for the institute's use in coming up with a census‑like statistical study in the COVID‑19 fight in Brazil.  Of course, that's a noble cause, but it should not be done the way it was done, which was simply to force the companies to hand over the data without any data protection guarantees and without security measures in place ‑‑ and through something we have in Brazil that could be translated as an executive order of some sort.

It's something to keep in mind that good stakeholder debate must be kept in check, kept in place, even when we set the whole discussion in the context of the fight against the COVID‑19 pandemic.  Flavia, that's it for now.

>> FLAVIA LAFEVRE GUIMARAES: Okay, thank you.  Now we'll read the attendees' questions.  If any attendee wants to ask their question live, let us know here in the chat, please, and the organizer can open your mic.  Louisa, can you read the attendees' questions, please?

>> LOUISA: One attendee, speaking about an example of a COVID contact tracing app, says the source code has not been made available to the general public, despite the government's prevailing policy on open source software.  This means the method and standards of encryption are unknown.  Open source code enhances transparency and allows for a community audit of the code, leading to greater security; there have already been instances where the application has compromised users' data due to a bug in the software.  Further, in the absence of the source code and an explicit ‑‑ there's no way to ensure that the data remains anonymous and is not reverse engineered by the government later for other purposes.  Another attendee says she was disconnected and can't follow your words; can you put your full names into the chat box, please?

Another comment says that national security is an excuse to harm or restrict human rights, with or without laws; the only safeguard is the state's own willingness to follow a respectful approach towards human rights.  And then the questions.  The first question is for Carlos Afonso: even if we have data protection laws, oftentimes governments act on behalf of their political gains, so some kind of whistle‑blowing action from civil society makes clear if our data is used for fraudulent aims.  How can we address and call for the need for whistle‑blowing?  Do you think we should justify it for our data‑driven concerns among human rights?  Because oftentimes it turns out to be legitimate at the end of the trials.

And the second question: how would you define national security?  These two words are used by all governments to crush dissent by human rights activists.

>> FLAVIA LAFEVRE GUIMARAES: Thank you.  If any attendee wants to ask a question or make an intervention live, on the mic, we can open the mic.  Okay.  So I will open the floor to our speakers to comment on the questions and comments that Louisa has just read.  Each speaker has two minutes to talk about the questions.  I will begin with Carlos.

>> CARLOS de SOUZA: Thanks for that, and thanks for the question on whistle‑blowers.  I think that's a powerful example of how the discussion about data protection ends up going way beyond the traditional boundaries we can imagine.  At the end of the day, discussions on speech ‑‑ on how free someone is to reveal something he or she may have come into contact with ‑‑ are a key feature for us to make sure that we have a bigger picture, a larger idea, of how data is being handled and processed by governments as a whole.  Those revelations are always something that comes to mind, and that's an example of something Ian has mentioned in this discussion as well.  I think that's a really good example, and one that ended up strengthening the bonds between data protection and speech and how they might end up intertwining.

One specific issue related to Brazil in this discussion is that Brazil has a provision in our constitution on anonymous discourse: freedom of speech is protected, but anonymity is prohibited.  And even with that very direct language forbidding anonymity, the courts have been interpreting this provision to say that the prohibition of anonymity doesn't mean that every time you say something you have to put your face and your identification side by side with the thing you're going to mention and express.  The prohibition of anonymity only means that we need to have a way to identify someone if the speech ends up being harmful.

It's not the ideal way to handle this discussion, but it's just to make this connection between anonymity and whistle‑blowing.  At the end of the day, this is really something that is key to understanding how our country will end up handling the protection of freedom of speech, of freedom of expression.  That's it.

>> FLAVIA LAFEVRE GUIMARAES: Chenai, can you comment on the attendees' questions and comments, please?

>> CHENAI CHAIR: On the comment around the transparency of applications and governments making them open source: this is one of the conversations that came up when I was doing an assessment of the contact tracing apps that had been introduced.  What is interesting, for example, is a case in Morocco, where the information regulator actually stepped in to make sure that the solution being implemented was available for public audit.  This was part of its own responsibility in ensuring transparency, because in that context there was a concern that, as the app had been implemented by the health ministry, people might be reluctant to make use of it because of concerns about increased surveillance ‑‑ rightly so in that particular context.  But making the code open source allowed for that assessment.

In other contexts, again, you then find challenges ‑‑ say, for example, with what was launched together with the ITU, the Africa‑wide communications and information platform, where up until this point it is not clear which countries have taken up the solution or how it has changed.  And, also, there isn't any clarity on the type of solutions that are involved when the applications are launched in countries where there isn't a data protection officer or authority, and in contexts where we know there's increased surveillance targeted towards political actors and whistle‑blowers in the African context.

I think my closing point is that it's very important, for all the solutions that are data driven, to have some form of transparency in terms of the ability for people to actually review the coding systems that are introduced.  It should not be left only to governments or other designated authorities to assess the quality of these solutions, because we do have people among the public who have the capacity, and this is a way civil society can organize with people to audit these systems.

One last point is that data relating to public health information has been treated as a national security issue.  So in that context, the conversation about transparency also needs to take into account governments saying "this is a national security issue, so we're not going to allow the systems to be audited" ‑‑ which actually presents greater security risks from hackers.  That's about accountability, and about how the data is ultimately used for further surveillance of society.

>> FLAVIA LAFEVRE GUIMARAES: Thank you, Chenai.  Now, Ellen, the floor is yours for your comments, please.  I think she is not here.  I will pass to Nneka; can you make your comments, please?

>> NNEKA EKECHUKWU-SOYINKA: Sure.  And this is in response to the questions or closing?

>> FLAVIA LAFEVRE GUIMARAES: In response to the questions, and then we will have one minute to end the session.

>> NNEKA EKECHUKWU-SOYINKA: Excellent, thank you.  With regard to the whistle‑blower portion, one thing that came to mind for me was also looking at what enforcement actually looks like, and how we are able to hold those who misuse data accountable and ensure that enforcement actually occurs.

From a privacy perspective, the GDPR, California's privacy law, and now Brazil's privacy law are some of the privacy laws we hear about most these days, but many countries throughout the world already have some level of privacy law on their books, if you will.  If we look at it from an enforcement perspective, a law without any actual enforcement against those mishandling data is a law in name only.

And so I echo the comments on the importance of, you know, allowing for proper whistle‑blowing actions, and just say that we should think through what the process actually looks like and how we can actually hold those who need to be held accountable, so that we avoid having a process that is a process in name only and may not fully meet our needs because we didn't think things through fully.

Then my last point, in response to the idea of national security: I'd like to bring back the importance of something many of us have mentioned, data retention, and even sunset clauses, to take it a step further.  Once one has the data, it can be very enticing to find new uses for it.  We hear of governments who have already rolled out solutions: they originally collected the data for one, let's say, legitimate purpose related to the pandemic, and now they're seeing, oh, I can start to use this for security purposes.

There was even a fear in the U.S. during the Black Lives Matter protests that data could be used to track protesters.  When we limit how long the data is kept, when we have sunset clauses with a legitimate timeframe, things like that can be mitigated.

My last point will be that some of these solutions do have retention policies, but they say, we're going to hold onto the data for as long as the pandemic is in effect.  I think they should go a step further and actually think about what the legitimate timeframe necessary is.  In a couple of months this pandemic will have lasted a year, and without knowing how long it is going to last, they risk holding the data longer than they really need to.  Thank you.
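To make the retention and sunset‑clause point concrete, here is a minimal sketch of what an automated purge job could look like.  It is purely illustrative: the table name, column names, database file, and the 30‑day window are assumptions for the example, not a description of any actual contact‑tracing system.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# A retention window fixed up front (a 30-day "sunset"), rather than
# the open-ended "for as long as the pandemic lasts".
RETENTION_DAYS = 30

def purge_expired_records(db_path: str) -> int:
    """Delete records older than the retention window; return rows removed."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    with sqlite3.connect(db_path) as conn:
        # Hypothetical schema: one row per collected contact event,
        # timestamped with an ISO-8601 'collected_at' column.
        conn.execute(
            "CREATE TABLE IF NOT EXISTS contact_events "
            "(pseudonym TEXT, collected_at TEXT)"
        )
        cur = conn.execute(
            "DELETE FROM contact_events WHERE collected_at < ?",
            (cutoff.isoformat(),),
        )
        return cur.rowcount

if __name__ == "__main__":
    removed = purge_expired_records("tracing.db")
    print(f"Purged {removed} expired records")
```

Run on a schedule, for example as a daily cron job, a hard limit like this removes the temptation Nneka describes: data that no longer exists cannot quietly acquire new purposes.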

>> FLAVIA LAFEVRE GUIMARAES: Thank you, Nneka.  Ian?  Your time, please.

>> IAN BROWN: Yes.  That point is really, really critical.  It's an absolutely fundamental part of the GDPR and many other countries' privacy laws, for very good reason: data minimization.  Only collect the data you need, only store it for the minimum period you need, and don't link it to individuals any more than you have to.  That stops hackers getting access, but it also stops the government or private sector actors who originally gathered it for a good reason from keeping it to use for other purposes.
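As a rough illustration of what data minimization can look like in practice, the sketch below keeps only the fields needed for a stated purpose and replaces the direct identifier with a keyed, one‑way pseudonym.  The field names, the example record, and the HMAC approach are illustrative assumptions, not a description of any particular system.

```python
import hmac
import hashlib

# Secret pseudonymization key; in a real deployment it would be stored
# and rotated separately from the data (an assumption of this sketch).
PSEUDONYM_KEY = b"replace-with-a-real-secret"

# Only the fields actually needed for the stated purpose are retained.
NEEDED_FIELDS = {"postcode_prefix", "test_result", "test_date"}

def minimize(record: dict) -> dict:
    """Drop unneeded fields and swap the direct identifier for a
    keyed, one-way pseudonym (HMAC-SHA256)."""
    pseudonym = hmac.new(
        PSEUDONYM_KEY, record["phone_number"].encode(), hashlib.sha256
    ).hexdigest()
    minimized = {k: v for k, v in record.items() if k in NEEDED_FIELDS}
    minimized["pseudonym"] = pseudonym
    return minimized

raw = {
    "phone_number": "+5521999990000",  # direct identifier: never stored
    "full_name": "Jane Doe",           # not needed for the purpose: dropped
    "postcode_prefix": "2203",
    "test_result": "negative",
    "test_date": "2020-11-12",
}
print(minimize(raw))
```

Because the pseudonym is keyed, destroying the key later effectively severs the remaining link back to individuals, which complements the retention limits discussed above.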

Actually, going back to Snowden, that was one of the many slideshows from the U.S. intelligence agencies that I think worried many people.  It was from the director of the CIA at the time, I think, or perhaps the chief technical officer.  He said their ambition was to gather everything, keep it forever, and link all the dots.  And that is the definition of an Orwellian, dystopian, you know, omnipresent government that I don't think most people at the IGF would want to see in any of our countries.

>> FLAVIA LAFEVRE GUIMARAES: Thank you, Ian.  Now, Talar, your time, please.

>> TALAR KALAYCIYAN: I saw one question about my work name and I answered it in the chat.  Are we doing a final minute?

>> FLAVIA LAFEVRE GUIMARAES: No, you can talk now if you want.

>> TALAR KALAYCIYAN: It's fine, I'll wait.

>> FLAVIA LAFEVRE GUIMARAES: Carlos wants to make a comment?

>> CARLOS ALBERTO AFONSO: Which Carlos?

>> FLAVIA LAFEVRE GUIMARAES: De Souza.

>> CARLOS de SOUZA: These are our final comments, Flavia?  Super quickly, just to say that this is a very timely debate, because we're discussing it in the context of the COVID‑19 pandemic and how people might think about technological measures as some form of salvation for problems that are far more complex, because they involve politics, law, economics, the culture of each country, how people understand data protection, and, finally, the technological tools that we have.

My final comment will just mention that when we address issues in the context of the pandemic, we need to take special care with at least four aspects: the legal one, the economic one, the technological tools, and, finally, the cultural aspects.  It's important for us to understand how people react to and understand the use of data to fight the pandemic.  Thanks, Flavia.  And it's always a pleasure to take part in the conversations that CGI.br ends up offering at IGFs.

>> FLAVIA LAFEVRE GUIMARAES: Thanks.  Carlos, please, your time.  The mic is open to you.

>> CARLOS ALBERTO AFONSO: The old one?  Okay.

>> FLAVIA LAFEVRE GUIMARAES: The old one.

>> CARLOS ALBERTO AFONSO: It's so good to speak at the very end, because everybody has already said the things that you wanted to say, and they are very well said.

So I want to just ‑‑ besides thanking everyone as a co‑organizer of this event ‑‑ mention that the regulations we have achieved, in Europe for instance, are not the end of it all.  With that regulation in place, we are still learning to deal with the regulation, to use the regulation.  And there are loopholes and a lot of special situations that we are unable to overcome by simply citing articles and paragraphs.

That's the case in Europe right now, where the GDPR is under threat because there is a proposal to simply end encryption.  And this is amazing, because it defeats much of the purpose of the GDPR.  In Brazil, there is a political fight within the government ‑‑ the right‑wing government ‑‑ to control the ANPD, the national data protection agency.  So nothing is perfect once you have the regulation in place.  This is what I wanted to stress: regulation is fine, it's wonderful, we need it.  But remember, the application is a harsh reality that we have to deal with.  Thank you all for participating in this event.  It's great.

>> FLAVIA LAFEVRE GUIMARAES: Thank you.  Now Chenai for your final comments, please.

>> CHENAI CHAIR: Thank you so much for having me as part of this conversation.  It was quite insightful.  I do hope that, as we move forward, we come up with some action items that replicate work that's been done, for example with the algorithm register tool that was developed, and that we think generally about how we make these spaces where we come together for conversations, and how we have similar conversations at our national and local levels with people who might not completely understand these issues because of the language we use, so that we're able to come up with solutions and use innovative means of communication for people across a range of ages, diversities, and backgrounds to see how data issues are relevant to them.  Thank you so much to the organizers and fellow panelists.

>> FLAVIA LAFEVRE GUIMARAES: Ellen, please, one minute for you.

>> ELLEN STRICKLAND: Thank you so much to all the panelists for this important discussion, and for asking me to be a part of it.  It reminds me that the amount of data being amassed, and the potential for collecting it, continue to grow, and that this challenge continues into the future.  It's a growing challenge that requires conversations like this one, and requires us to ask the questions the panelists have asked about what data is necessary and how it should be managed.  So I look forward to continuing this work as we go forward.  Thank you so much, everyone.

>> FLAVIA LAFEVRE GUIMARAES: Thank you, Ellen.  Now Nneka.

>> NNEKA EKECHUKWU-SOYINKA: I echo everyone's thanks for this opportunity.  I really appreciate the invitation to participate.  It's been my first IGF, actually.  It's been a great conversation, and the last thing I'd like to mention is that, all things considered, once a decision is made to infringe on the fundamental rights of your population, it's important to make sure that it's only done to the extent absolutely necessary.  History has shown us that once rights are taken away, it's likely impossible for individuals to get those rights back.  I think it's been a very important dialogue and a great conversation.  Thank you again.

>> FLAVIA LAFEVRE GUIMARAES: Thank you, Nneka.  Ian?  Your minute, please.

>> IAN BROWN: It's been a pleasure, thank you all.  I hope perhaps next year I get to visit Rio and maybe Sao Paulo.  Sometimes as a European you feel lonely having privacy laws, but this is no longer true.  Brazil has one, and so do many other countries.  I know Chenai is from South Africa, and South Africa brought its law into force this year.  Many countries are passing laws, and New Zealand is upgrading its enforcement.  Absolutely, effective enforcement is really important.  That's been a criticism in many European countries: that the independent data protection regulators are not doing enough to enforce the GDPR.

Actually, more broadly speaking, as we saw in the chat and heard from other panelists as well, your privacy law, your data protection law on its own, even if it's enforced, is not enough.  You need strong human rights laws.  You need an independent judiciary who will push back against governments if they misuse terms like national security, or misuse data from public health, perhaps to spy on political opponents, for example.  You need to protect whistle‑blowers.  You need to give them the ability to speak out if they see things going wrong inside their government departments.  So all of these issues, holistically, are very important.  Thank you.

>> FLAVIA LAFEVRE GUIMARAES: Thank you, Ian.  Talar, your minute, please.

>> TALAR KALAYCIYAN: Thank you for making me part of this, first of all.  I think it's important to keep trying a people‑centered approach to technology.  Like Ian said, we have the GDPR, but it's very important to try to use as little data as possible, to be as transparent as possible, and to work together.  That's also what I want to say: it's important to work together.  Be open to different approaches, be open to different solutions.  I think that will help us all a lot.  So thank you, everybody.  I also shared the link to the algorithm register in the chat.  We've just created it, so it's still developing, but I think it's great to have a look at.  Thank you.

>> FLAVIA LAFEVRE GUIMARAES: Thank you.  First of all, please open your cameras and smile; we will take a photo of everybody together.  Thank you very much for the excellent speeches.  I believe the presentations allow us to conclude that we need to evolve.