IGF 2023 - Day 3 - WS #564 Beneath the Shadows: Private Surveillance in Public Spaces - RAW

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> MODERATOR:  Hello, everyone.  Good morning.  Hello to everyone who is here, and thank you for being present.  Hello to everyone following us online.

    Welcome to our session which is titled Beneath the Shadows:  Private Surveillance in Public Spaces.

    The general idea of the session for us is to discuss a little bit the role of the private sector in surveillance solutions and public security solutions.

    So we are here trying to cover, in general, how the private sector has been present in public security and surveillance solutions, the risks and implications of that, and which safeguards are important.

    For today's panel we will have three online speakers.  Unfortunately our on-site speaker, Estela Aranha, was not able to be present today.  But we will have three online speakers, plus Barbara Simao, the Head of Research in Privacy and Surveillance at InternetLab, who will also be introducing the subject.

    I will briefly introduce Barbara and pass the floor to her so she can give us an overview of the topic, and afterwards we will move to our online speakers.

    So Barbara is, as I mentioned, the Head of Research in Privacy and Surveillance at InternetLab.  InternetLab is a think tank on digital rights and Internet policy based in Brazil.

    She holds a master's degree in law and development from a university in Sao Paulo.  She graduated from the Faculty of Law at the University of Sao Paulo.  She was an exchange student at the Sorbonne in Paris.  She worked on digital rights at the Office of Digital Defense from 2020 to 2022, and served as counselor for data protection at another organisation.

    Barbara, the floor is yours.

    >> BARBARA SIMAO:  Hello, everyone.

    Good afternoon.  Actually, good morning, good afternoon, or good evening, depending on the time zone you're in.

    As Luisa mentioned, I'm Barbara, Head of Privacy and Surveillance at InternetLab.  I would first of all like to thank you for coming here, for being present.

    I will give you just a brief overview of what we are talking about and why we decided this would be an interesting topic for discussion.

    I will just share my screen because I have a few images that I would like to show you.  Let's see if this goes smoothly.

    I think you are able to see it, right?  Yes.

    >> MODERATOR:  Yes, we can see it.

    >> BARBARA SIMAO:  Okay.  Well, the topic of the session is Private Surveillance in Public Spaces, as Luisa mentioned when she introduced a bit of what we are talking about.  I would like to give you an overview of what is happening in Brazil that made us think it was interesting to bring this topic to discussion today.

    So in the past couple of years we have seen the growth of these private companies, called Gabriel, Yellow Cam, and different names for the same kind of business.  These are companies that sell private surveillance solutions: private cameras, totems with cameras, with 24/7 monitoring, that are shared between neighbors.

    So any group of neighbors, any type of local community, can buy a camera or access them for a monthly fee.  They are installed on public streets.

    So they are offering these totems or cameras, easily accessed by anyone.

    I bring here some excerpts from their websites.  They are in Portuguese, but I will translate them for you.  In general they claim that, I will see if I can point so it is clearer.  Okay.

    In general they claim that their modern cameras are solutions for security anywhere.  They claim that their mission is to make streets, neighborhoods, and cities more intelligent, to protect anyone inside and outside the home.

    Yellow Cam, one of these companies, claims that the app that tracks the cameras is 100 percent free.  It can be downloaded by anyone and is available in the Apple store.  In it, it is possible to locate the cameras on a city map, visualize the images 24/7, and search for images that were taken at different times or dates.

    They also claim that the installation of these cameras can make the region safer, and that they can be accessed by the public authorities, including the police.

    And they claim the tendency with the cameras is for criminality rates to decrease in these regions over time.

    So they are basically selling these 24/7 surveillance solutions that can be acquired by local communities, by a group of neighbors.  They can be accessed without any kind of oversight or accountability.  That is what is concerning for us.

    Also the fact that news outlets in Brazil reported that these companies were keeping private channels of communication with police stations.  So the police stations weren't actually demanding a warrant to access the images held by these private companies.  They were accessing them almost in real time because of these private channels of communication that existed between the companies and the public authorities.

    So we think that this is an important topic for us to cover because it can pose a number of risks for privacy and human rights.  It can have impacts on transparency, and on data sharing between public and private bodies.

    Besides that, it can even affect the right to the city, considering the fact that surveillance may affect, to a greater degree, certain groups of people that are already targeted.

    This case is somewhat comparable to what happened in the Clearview AI case.  For those not familiar with it, it was a company that had a database with over 3 billion images.  They scraped images from public web pages and shared these images with police stations around the world for the identification and resolution of criminal cases.

    And these images were collected without any kind of notice, without any kind of oversight.  And there wasn't any kind of accountability regarding the company's practices.

    It was a case that caught the world's attention: Clearview AI was sanctioned in many jurisdictions because of the lack of legal grounds for what it was involved in.

    The idea of this session is to discuss this topic: to discuss the relations between public security, criminal procedure, and these private surveillance solutions that are arising.  Not only in Brazil; many countries have these home security solutions being sold as well.  So I think it is important for us to discuss it through the lens of the impacts it can have on privacy, human rights, and transparency.

    We prepared a few policy questions for you.  In general, we want to understand the broader societal implications of extensive surveillance and their impact on human rights.  How does private surveillance affect historically marginalized groups?  How does the lack of transparency required from private surveillance companies affect human rights?  What are the dangers concerning third-party sharing with other private institutions or public authorities without transparency?  What are the liabilities that insufficient legal protections regarding the shared use of data pose to individuals and groups?  Does the current regulatory landscape for privacy and data protection ... these are a few questions we prepared for today's session, and a lot of questions to discuss in one hour.

    Having given this brief introduction, I would like to pass the floor to Beth Kerley, who will also join us for this panel.  And Luisa, I think you will present her, right?

    >> MODERATOR:  Yes, Beth Kerley, thank you for joining us today.  Beth is a programme officer with the research and conferences section of the National Endowment for Democracy's International Forum for Democratic Studies.  She was previously associate editor of the Journal of Democracy and holds a Ph.D. in history from Harvard University and a Bachelor of Science in Foreign Service from Georgetown University.

    Thank you, Beth, for being here today.  And the floor is yours.

    (Pause.)

    >> MODERATOR:  Beth, can you hear us?

    >> BETH KERLEY:  Hi.  Sorry, I was muted.  Barbara, I think you need to unmute my video as well.

    >> MODERATOR:  Now we can hear you.

    >> BETH KERLEY:  Yes.  I am able to unmute my audio but not my video.  I guess I can start talking and perhaps my face will show up later on in the proceedings.

    So thanks, Barbara.  And thanks, everybody.  I'm sorry I can't be there in person.  Really looking forward to this discussion.

    And so Barbara, that was really -- I haven't seen those slides before, but I think those cases that you shared are really great illustrations of some of the broader points I was hoping to make here.

    And so -- oh, there, I have video too.

    And so I think what I'm going to do in these remarks is first try to situate those examples in the broader social trends we have been tracking, and also highlight how the potential use of emerging technologies like biometric surveillance, in connection with cameras in public spaces, poses additional risks.

    So to frame the comments a little bit: in an essay on what he calls "Subversion Inc.," Ron Deibert at Citizen Lab wrote about the risk from surveillance vendors making capabilities more widely available to government and private clients that would previously have been available to just a few well-resourced states.

    His focus in that article is on the profound challenges to democracy from commercial spyware, which tracks us through the devices we carry with us.

    But I would argue that this question of the growing accessibility, spread and, if you like, quote-unquote "democratisation" of surveillance technologies, and their intertwining with the broader surveillance capitalist ecosystem, very much applies to the devices that other people place in the physical world around us as well.

    In that regard, there are three main points that I would like to cover.  First, networked surveillance of physical spaces is rapidly emerging alongside traditional digital surveillance as a pervasive reality that changes the conditions for engaging in public life and exposes people to targeting by both public and private entities.

    Second, emerging technologies such as biometric surveillance or so-called emotion recognition are enabling the entities that control cameras in public spaces to do new things with them.

    Third, commercial suppliers play a crucial role in the spread of physical surveillance technologies to both public and private sector clients, and their involvement, as Barbara very correctly stated, presents real challenges to enforcing transparency and accountability norms.

    On the growth of surveillance cameras: already in 2019 there was an estimate that by 2021, which is two years ago, the number of surveillance cameras installed globally would exceed 1 billion.  A significant number of those cameras, more than half, are in the People's Republic of China.  But established and emerging democracies are home to staggering numbers of cameras, including smart home cameras as well as cameras installed in commercial settings, which was traditionally an anti-theft measure, but now you see surveillance installed as a consumer convenience, allowing people to skip the cashier line.

    The system Barbara described would be a perfect example.  In India, an app allows people to share their images with the police.  Such partnerships can reflect public concerns about crime, but they raise challenging questions about how privacy and anti-crime safeguards hold up when surveillance is outsourced to private citizens.  Private citizens can digitally stalk strangers or acquaintances, or engage in blackmail.

    The blurry line between public and private surveillance works the other way around as well: the private vendors who supply surveillance tech to public entities play a role that is increasing as the tech gets more complicated.  When companies sell smart city packages, profit logic can play a large role in determining what is included as part of those packages.  Great reporting from Access Now showed companies have incentivized the adoption of surveillance tools in Latin America by offering so-called donations.  Finally, vendors, after the point of adoption, can become closely involved in managing the tools and, of course, the data from public surveillance projects.

    So simple CCTV cameras present plenty of risks.  The new AI tools that researchers like IPVM have identified as the drivers of the video surveillance market are multiplying these risks by letting people make sense of the images that are captured, quickly and at scale.  In a 2019 report for the Forum, Steven Feldstein identified the countries where AI surveillance is in use.  Given all the trends around us, we can say that number has grown.

    Facial recognition technologies are the most prominent quote-unquote "enhancement" to surveillance cameras.  They might be sold as part of the package, together with cameras, for so-called live facial recognition.  But facial recognition can be applied to ordinary camera footage after the fact, using tools like the Clearview AI service mentioned earlier.  The risks of facial recognition have been widely discussed; it is the best-known type of AI surveillance.  To recap: when it doesn't work, facial recognition can lead to false arrests, which has affected Black communities in both North and South America.

    When it works, facial recognition, along with other forms like voice or gait recognition, makes it easier to use cameras to track specific individuals.  This again has potentially legitimate purposes, but it can easily lend itself to political abuses, as with its use to track and identify protesters in Russia and Belarus, as we have seen, and it puts the potential to abuse facial recognition technology within greater reach of private citizens.  Also in Russia, there was a publicized lawsuit in 2020 that started when an activist was able to buy images tracking her own movements on the black market for over $200.

    There are other challenges with so-called emotion recognition tools, which claim to infer people's emotional states from these images.  This has been called pseudoscience.  It is not hard to understand the ways in which it might be abused to ensure conformity with government policies, but there is strong interest in that technology, whether to monitor students, drivers, and criminal suspects in China, or to test and target ads in Brazil and the United States.  Again, we see this kind of technology actually installed in public spaces, so that the billboard is looking back at you, so to speak.

    Finally, AI means that surveillance cameras in public spaces aren't working on their own.  Analytical tools can combine information from biometric tech with information from other sources, like shopping records or government databases, to build profiles of people and groups.

    On an aggregate level, all this information collection can exert a chilling effect and enable abusive behaviors by data holders, both public and private.

    To go through a few of those: first, profiles of people and groups can be the basis for targeted information operations meant to deceive and polarize, something that Samantha has worked on.  Second, profiles can enable discrimination, whether in the form of withholding state resources, targeting advertisements in a way that disadvantages certain people, or through negative treatment by law enforcement.

    Third, digital rights activists worry that the mere presence of biometric cameras, and cameras generally, whether or not they are working, has a chilling effect on people's willingness to join public protests or to serve as sources for journalists.  Fourth, these tools can track people's behavior in minute detail and exercise control through rewards and penalties, in a manner loosely envisioned by China's social credit initiatives.

    Why does it matter that private companies are so deeply involved in surveillance?  I think whether we are talking about genuinely private surveillance or public-private partnerships, there are a few basic challenges.  These include, first, data access.  Vendors who partner with governments on surveillance projects are likely to have a commercial interest in keeping the data that is collected.  And that's all the more true as companies seek to train and refine AI tools that depend on data.

    Democratic governments, on the other hand, have an interest in following principles like data minimization and purpose limitation for data collection.  And in the projects that we worked on together as part of the Forum's smart cities report, Barbara and her partner pointed out that a lot of the ICT contracts they were seeing in Brazil did not have specific provisions on how those public-private partners could use the resulting data.  That is a broader trend, and it raises the risk that vendors may be reusing data: perhaps there was some privacy infringement, but the data was collected for a public purpose important enough to be worth it, and then it gets reused by commercial companies for reasons that would not have justified that infringement.  Or it may get resold through the ecosystem of data brokers, or even shared with foreign governments, if we are talking about foreign companies operating in different countries.

    Second, transparency.  Public institutions in democratic societies are supposed to follow transparency norms.  Private companies are not subject to the same rules and are naturally going to be inclined to protect their intellectual property.

    This can make it difficult for citizens, NGOs, and journalists to find out how the surveillance systems watching them work.  I would argue this is going to become a more important issue as surveillance technologies themselves get more complex and need to be evaluated for issues like encoded bias.

    And finally, when private surveillance feeds into public surveillance, it can be difficult to maintain clear lines of accountability for abuses.

    Again, these challenges are likely to grow as citizens experience infringements, such as unfavorable government decisions that they can't have explained, made by inscrutable technologies based on a mix of public and private data that has been collected about them.

    So private surveillance, especially in an age where the trend is towards cloud-based and AI-enabled surveillance, is deeply entwined in a broader ecosystem that crosses boundaries of sector, of country, and of the physical and the digital world.

    And that ecosystem is enabling new types of infringements on human rights.  We see these being taken to an extreme in authoritarian settings, but they are relevant to all of us as we grapple with the ways in which surveillance is changing the landscape for privacy.  These trends raise urgent questions, especially for multistakeholder efforts, about democratic guardrails in a world where incredibly powerful surveillance tech is available to governments, companies, and even private people.  On the question of solutions, since I am about at my time, I am going to shamelessly turn things over to the next speaker in the hope that they will provide some answers.

    Again, thanks very much and look forward to the discussion.

    >> MODERATOR:  Thank you, Beth.  Thank you so much for the rich contributions to the discussion.

    And now I will pass to Swati Punia.  Swati is a technology policy researcher based in New Delhi, India.  She is a lawyer by training and has earned certificates in digital trade and technology, cyber law, and corporate law.  Currently she works for the Centre for Communication Governance, a centre based at the National Law University of Delhi, on issues at the intersection of technology, law, policy, and society.

    Her focus areas include privacy, data protection, data governance, and emerging technologies.  At present she is examining the non-crypto blockchain ecosystem in India, studying its potential for addressing socioeconomic challenges, creating inclusive governance models, and embedding privacy in the context of developing countries of the Global South.

    Prior to joining CCG, Swati worked with a leading voice in the digital economy.  Swati, thank you for joining us today, and the floor is yours.

    >> SWATI PUNIA:  Thank you so much.  It is so lovely to be on the same panel as all of you.  And thank you to Beth for giving such an apt and elaborate summary of the impacts and implications.  It allows me to dive deep into the question that was asked of me, which is essentially: what are the solutions and discussions regarding these kinds of surveillance acts, and what is Civil Society doing in terms of bridging some of these big gaps and addressing these developments?

    To bridge to what Barbara mentioned happening in Brazil: it is not a standalone thing.  We are seeing it across the world, and India unfortunately is not behind these trends.  We are seeing these trends in terms of automated surveillance.  A number of cities that I know of have been named among the most surveilled in the world, not just in the country.  It seems like every state in India is competing to automate surveillance.  That seems to be the top priority.

    Having said that, the good part is that Civil Society has been an active player and has been studying, researching, and looking at these developments.  It has moved to the courts in the last couple of years, as we have seen, when these kinds of instances come up.

    But I think essentially what I want to highlight is that, given that you have all of these instances happening and these kinds of systems put in place, the most important thing is the public-private partnership aspect.  Often we see these public-private partnerships add efficiency.  But here I think the main question is: to what end and for what purpose?

    Private companies are not just involved in developing the technology for the state and deploying it; they often are also involved in managing it and upgrading the systems.

    Nobody really knows how they are involved with the data management, or whether anybody checks.  Nobody knows, when the police stop a person on the road for random biometrics, random facial recognition, and all those clicks, where the data lands and what purpose it is used for.

    And unfortunately this is despite India having, back in 2017, the landmark judgment on the right to privacy passed by the Supreme Court of India, which gave a spectacular turn to the jurisprudence on fundamental rights.  The Supreme Court tied the right to privacy to the rights to liberty, life, and dignity, reading it as an important facet of ensuring equality and freedom of speech and expression.

    Also, at the same time, it placed people at the heart of new-age policymaking.  But we have not seen enough happening on this.  One thing that is going to be positive is the new data protection act coming into place, and all of that.

    One important thing that the Supreme Court categorically mentioned was that privacy cannot be used to thwart the system and policies.  That means that everyone recognizes the fact that automation is not creating something new.  It is often exacerbating what already exists in society.  And we all know that the kinds of societies we live in are not exactly balanced; we have a range of inequalities already deeply entrenched within our societies.

    I think the main problem then is not, what I mean to say is, not that we shouldn't go to automation, but that we should take a step back and see how we really understand crime and criminality as concepts, and start from there.  If automation is just a tool that exaggerates everything that exists, then should we take a step back and try to see what the misunderstandings and misconceptions are about what a crime really is?  All these CCTVs and gadgets are being brought in to handle petty crimes on the street, in a very set place, where you are putting in so much resource, money, and effort to handle this one type of crime.  But how much does that contribute to the larger criminality in society?  What is the percentage of it?  And where is this kind of behavior of the state, or of the private sector in conjunction with the state, leading us, to create what kind of society?

    So in that sense I think, if we go back to see how every state defines crime and criminality, we can all, I think, come together on this understanding: a lot of the people that we look at as criminals are often people from historically marginalized communities.  People who live below the poverty line, people who have already experienced unequal treatment from the state and from society, indigenous people and sects who already suffer these kinds of discrimination.  That kind of inequality gets highlighted, exaggerated, and entrenched.  The fear is that a lot of these inequalities, through the use of all the automated techniques that Beth talked about, will become regularized and categorized in the way we function going forward.

    Related to that, the main question then is who is going to make that assessment: are the kinds of crimes we are trying to handle the real crimes?  I am not saying that none of this should be done.  But to what end and for what purpose?

    Another thing that we also know is that one reason some of this apparatus is being put together is to check people's behavior.  That sort of understanding seems to have developed and become popular: if you monitor somebody's behavior, good behavior will get internalized if they are constantly being surveilled.  You cannot entirely deny this, because we have seen a lot of studies supporting the idea that constant surveillance makes a dent in behavior and some sort of internalisation can happen.

    But again, I would go back to the same question: how many of the kinds of crimes we are trying to tackle are getting corrected through this behavioral surveillance?  What are these crimes?  Are there bigger crimes, financial crimes and so on, that maybe need more attention?

    Maybe what needs to be looked at is: are we trying to plug small loopholes and small gaps while turning a blind eye to big gaps and holes getting deeper and wider?

    And one important aspect of criminality and crime is this: is crime, generally, as a concept, behavioral or structural?  I think people should go back to thinking about that, because my limited understanding of the whole issue is that crime is generally structural.  It is not behavioral.  There are a lot of studies that I'm aware of in the Indian context, written by people across Civil Society, and one work which I would love to highlight is from a colleague who did an ethnographic study in one of the states, the national capital, Delhi, where she categorically gets into how policing and the construction of the idea of criminality impact society, and why we define and decide to employ certain kinds of measures, and how they do not really work for creating a better society or arranging a better society.  It is actually just the opposite.

    So in that sense, we need to go back to some of these ontological and taxonomical questions and assess where we are moving towards and why we are moving in that direction.

    Civil Society's role is extremely important.  It is, of course, working in its own silos: within Civil Society, academics are working within their closed space, lawyers are with themselves, and the larger NGO ecosystem, you know, works in its own space.  More conversation with each other is important so that we can share work and build up that understanding.  For example, we might be looking at laws that exist even today: despite India having had the landmark judgment on the right to privacy six years back, a lot of how things are defined in India in terms of surveillance is being decided by laws that predate the Puttaswamy judgment, and even by laws made after it.

    We don't see much change sort of happening.

    At the same time, lawyers working on these laws and this understanding should be talking with people like NGOs who are doing ground research, who understand how these marginalized and vulnerable communities get impacted, and who bring out those instances and experiences, in conjunction with their own secondary research and policy framing.  That will help us build a bigger picture and better resources.

    These kinds of issues will, I think, help us move in that direction.  Another thing is, I think, conferences and discussions like the one that we are hosting, which allow people from different geographies across the world and the Global Majority to come together to discuss these issues and figure out what the similarities and the differences are.  My understanding is that there are a lot of similarities, synergies, and shared experiences that we go through, given the familiar social, political, and cultural contexts that we have in this part of the world, in terms of growing together, understanding and learning from each other's experiences, what we can change, and how we can look at the subject.

    I will stop there, and I am happy to comment again later.

    >> MODERATOR:  Thank you, Swati.  Thank you for the rich reflections you have shared.

    Now I will pass the floor to Yasadora Cordova, who is representing the private sector.  Yasadora is the principal privacy researcher at Unico, a data company.  She has worked with various organisations, such as the World Bank, the United Nations, and TikTok, on products related to digital security and civic engagement.

    Thank you so much for joining us today, Yasadora.  And the floor is yours.

    >> YASADORA CORDOVA:  Right.  Thank you so much for the invitation.  It is always a pleasure to be at any event that InternetLab invites me to.

    And I have just a little bit of information to add.  The first thing I would like to feature is that, as we navigate the intricacies of identification technologies, I want to delve into the nuanced distinctions between biometrics and facial recognition.  This is where the question of user control takes centre stage.

    Biometrics is a comprehensive concept: it involves recognizing individuals through unique physiological or behavioral attributes, such as fingerprints or iris scans, for example.

    Crucially, what sets biometrics apart is the insistence on user consent or authorization.  So, for example, in countries where a large number of people have no digital literacy, it is easier for them to use their biometrics to buy things, to access social benefits, or even to complete transactions using their own identity.  They use biometrics rather than keeping passwords, for example, because it is safer.

    So I think when you talk about biometrics, you have to also emphasize the importance of user control over the data collected about them.  The users see that their data is being collected, and they are using these biometrics because they want to open up a set of opportunities that they didn't have before, because they couldn't keep their passwords safe or they couldn't use the system because it was too complicated.

    And in contrast, facial recognition, which is a subset of biometrics, hinges on the analysis of facial features for identification.

    This method can operate without explicit user consent or even awareness.  It raises concerns about privacy and freedom of expression and personal control.

    So here the crucial point emerges: user control is paramount.  The fact that entities like law enforcement can retain and edit videos recorded by body cams, for example, underscores the potential misuse of data.

    So the power to control such sensitive information should ideally rest with a neutral third party, citizens' councils, something like this, at the very least, to preserve users' autonomy over their own identity.  Preserving user control becomes not just a matter of privacy but a safeguard against potential misuse.

    It is not an expensive safeguard.  It highlights the need for robust ethical frameworks and regulations, but it also highlights the need to put the data under the control of those who actually are the origin of the data, if we are talking about biometrics.

    So we could create rules, international rules, or talk about rules that could separate those two different types of identification technologies, so that we could have better frameworks to protect people who are being filmed and having their facial biometrics collected, as, for example, Clearview AI did.

    And we could demand that these companies have a way to inform users that their data is being collected, and offer an option for these users to withdraw their consent, to withdraw these companies' permission to negotiate this data, or to collect it or keep it in their user base and their database.  Instead of just, how can I put this?  Instead of just assuming that this is an impossible question.

    There are uses for biometrics.  Biometrics is already being used to create opportunities in some countries and to make technology better and safer.  But this is not going to happen if the user is not part of the decisions over their own data.

    So I think the crucial conversation should not be around the type of data that is being collected, because it could be biometrics, or it could be another very sensitive type of data that is being collected without your awareness.

    So I think control, the question of who controls this data, is a more important question right now than what type of data is being collected.

    I think that's it.  And this is also a solution that can reach end users, and it can help us build trust and give control back to the users.

    That's what I had to say.  I am happy to take questions or feedback later, if you have any.

    And that's it.

    >> MODERATOR:  Thank you so much, Yasadora.  So now we have around nine minutes.  I will quickly open the floor to those who are here and may have a question.  I will ask you to come close to the table to get a microphone.  And we will do a quick round of interventions for those who are online and have any questions or interventions.

    Please write them in the chat.  We have someone here who will collect them.

    And after that we will do a quick round of wrap-up with our speakers.

    So we do have two questions here.  Please.

    >> AUDIENCE:  Thank you for sharing your very interesting thoughts about data security and who should control the data.

    I would like to hear your opinion on blockchain technology.  How, and whether, do you think blockchain could be a solution specifically for collecting biometric data?

    Do you think it might be a solution to help control access to the data?  The blockchain technology itself?

    >> MODERATOR:  Thank you.  I think we have another question there.

    >> AUDIENCE:  Yeah.  My question pertains to India, essentially.  There is a very recent development: as we heard earlier, it was made known to the public that there is something called realtime surveillance happening.  And this was in a reply to a right to information request, and the reply was from the Internet service providers association.

    So in light of this, and with the act having come into play, which is yet to come into force, my question is: are there any safeguards that the speakers would like to highlight?  I understand one such safeguard was just mentioned.  But in terms of others, for protecting users and giving them certain actionable rights, for instance: even being made aware of all the data that is being processed, and even a notice showing that they are under surveillance in specific public areas.

    So just wanted your thoughts on that.

    >> MODERATOR:  Thank you.  So now I think we have one question on the chat.

    >> We have one question online from Ayella Sabeshi from Australia, from Civil Society.  The question: advanced technologies such as AI, blockchain, and IoT have increased private surveillance in public spaces.

    All these technologies are creating big data and information.

    And these days, data and information are wealth: the wealth of communities in developed nations.  All these technologies perform activities and services via the Internet.  So the question is: what will be the solution for end users?  So far, that is the only question we have in the chat.

    So thank you.

    >> MODERATOR:  So thank you.  I will get back to our speakers now and do a quick round of wrap-up.  I will ask whether you want to add any final considerations, including any considerations you may have on how regulation and policy in general can work to address these concerns.

    And please feel free to pick the question you feel most comfortable answering; you don't need to answer all of them.

    I will start back with Beth.

    >> BETH KERLEY:  Sure.  So, difficult questions there.  But I think on the question of types of safeguards, it definitely does depend on what type of tech we are talking about.  I would distinguish, following up on Yasadora's remarks, not just between facial recognition and other forms of biometrics, but also between biometric identification and biometric surveillance.  What you talked about would fall under biometric identification, where users basically intentionally use a certain physical aspect as the way to access a space, or access their accounts, or what have you, within a particular system.

    And in that context, I think, it is easier to apply the consent framework.  Of course, there are also forms of biometric surveillance besides facial recognition that are very hard to opt into, like voice recognition or gait recognition.  Something like a ..., that's the one I'm willing to use on my phone and computer; it is harder for someone to get from you unawares.  I agree with that distinction.  It is certainly a different question.  So when we are talking about biometric identification, I think there are indeed valid purposes for it, but there is a heightened need to establish appropriate safeguards, because sometimes even if you are giving it over for a legitimate reason right now, it can end up later in the hands of entities who you would prefer not to have it.  Unlike a password, you can't change your fingerprint easily.  That's a fundamental distinction there.  I agree that identification versus surveillance is important.

    And in terms of blockchain, I am less of an expert on blockchain, but instinctively I think putting sensitive data in a system that is designed to be unerasable is a move that we should definitely think twice about.

    But open to arguments on that one.

    And realtime surveillance, finally, I think, is really the hardest thing to put safeguards around.  That is why a lot of European digital rights groups, in the context of the EU AI Act, have been arguing that it is something that should simply be banned: having constant awareness of who is going in and out of public spaces.

    I think at the very least you need to very clearly delete any data that is collected that day.  I definitely agree with the suggestion of making people aware of when they are being surveilled and what information about them the government possesses.  In settings that have very elaborate e-government systems, Estonia for instance, that's part of the safeguards built in to ensure trust, so that could certainly be part of the answer.

    I do not have the comprehensive solution, unfortunately, to the challenge of emerging technologies and surveillance.  Otherwise I could write one report and go home.

    >> MODERATOR:  Thank you, Beth.  I think none of us have the solutions.  So thank you, actually, for your contributions.

    And I will pass now to Swati.

    >> SWATI PUNIA:  Thank you.  It is good that we are coming together to discuss this.  There is a path that we can walk together for a better response in society.  We talked a bit about blockchain; my next panel, right after this one, is about blockchain, so those interested, please join us there.

    I will speak to the point on the consent and notice issue.  Again, maybe this is how my brain is wired these last few days: I want to step back and look at some of the issues or concepts that we are bringing into the digital era of policymaking and regulation.  Notice and consent: how is somebody who is very vulnerable, from marginalized communities, or even us, who call ourselves the educated class of people, supposed to work with them?  A lot of us don't have digital literacy.  I would say for myself, I don't have enough financial literacy despite being educated.  I think that is the main issue that governments are doing badly on, in terms of using the word empowerment.  Of course, the situation is dire across all sorts of regions, but for somebody to use, implement, and understand consent, we need to develop that digital literacy.  People wouldn't even recognize harms, I think, when they happen to them.

    So I kind of feel that a lot of the technology that is being used in the name of trust and everything should be focused on building privacy and security by design, with attention to the public that we actually have; and the need to build digital understanding should be taken more seriously.  That is where the CSOs are playing a massive role.

    To give you an example, at the Centre for Communication Governance we are building a resource tracing privacy jurisprudence across 17 jurisdictions in the world.  We also run a regional high court tracker, where we map what India is looking at in terms of privacy and the expanding rights there, and how courts are tackling it.

    This also goes towards capacity building, not just for students and professionals but also for judges and bureaucrats.  A lot of these people, who will now come into enforcing and implementing the new act and everything, really don't understand the nuts and bolts of how to go about things.

    In the end, a lot of similar countries are jumping directly to a privacy 3.0 rather than living it gradually like some of the other countries did.  We have to be cognizant of that social and cultural environment and think of ways that will fit into our specific pegs, you know, and not just copy-paste.

    >> MODERATOR:  Thank you, Swati.

    Now we will pass first to Yasadora and then to Barbara, so we can close the session.  Due to time constraints we won't be able to take any more questions.  I know there is someone online with their hand up, but we really need to close this session.  But I do encourage you to get in touch both with us and with our other speakers online.

    Please, Yasadora.

    >> YASADORA CORDOVA:  I'll be quick, I promise.

    So I think we find ourselves in an era where data is amassed indiscriminately, not only biometric data.  Both outside and within governments, there is a demand for data.

    In this deluge of information, maintaining and assuring the integrity of personally identifiable information has become increasingly expensive.  It is a daunting task.  The intricacies of cleaning and structuring data, which are integral steps in the machine learning cycle, are a challenge, and this process is undeniably among the most expensive activities in the machine learning pipeline.

    I propose a shift in focus towards user control.  We know that you cannot control what you don't see, and this resonates in the realm of data privacy.  Because if we need permission or consent over data sets, we need to make sure these data sets belong to that person.

    So if we demand this through regulation, we might end up compelling both governments and industries to bring light to their data practices.  This shift is not merely about implementing complex blockchain solutions.  It is a call to collaborate, to build transparent systems, hand-in-hand with regulators and technologists.

    Of course, we will still have lots of work to do even once we conceive such systems that can be transparent about user data.  But it is good to recognize that transparency is the bedrock upon which user control stands.  So it is not just a technological challenge, I know.  It is a societal demand, a societal imperative.  I believe that we have to work cooperatively to shape a future where individuals have a meaningful say in how their data is utilized, and this is for real: systems where these considerations guide technology rather than sitting in a feature backlog, in a sustainable data-driven future, I guess.  That's it.

    >> BARBARA SIMAO:  Well, I think the other speakers already answered a lot of what I wanted to say.  But when we are talking about solutions and regulations, especially in the case of Brazil, I think the appeal of private surveillance solutions to the population in general comes from a place of insecurity and of not trusting public, governmental solutions.  People look to them as a way to overcome their lack of security in general.

    And I think the solution would be societal, as Yasadora mentioned, in the sense that it would also require a high level of trust from people in general towards public institutions.

    And when we are talking about regulation, especially in Brazil, we have a lack of regulation regarding the use of technology and data collection for public security purposes.  Not that these private companies actually do public security; they are private solutions, so they are not exactly providing public security.  But when we ask them what they are doing, they can use the argument of public security.

    So it is a tricky scenario, a tricky regulatory scenario.  In Brazil we still have a lot to develop in that sense.

    I think there is a lot of room for more legal references and guarantees regarding it.

    And I think awareness should also be raised, in the sense that the people who acquire these solutions are informed about the risks, about the grounds on which these companies can share data with public authorities, and about who might have access to it.

    And well, I'm not sure I added much to the discussion, but I would like to thank you all for coming, especially given the time zones, which I know weren't so good for everyone.  Thank you so much.  And that's it.

    >> MODERATOR:  Thank you, everyone.  Thank you to our speakers for all the contributions and for having joined us today.  Thank you to everyone who was here today, both in person and online, and who made excellent contributions.

    And thank you.  I hope you continue to have a great IGF.