IGF 2021 – Day 2 – WS #266 Data justice: What is to be done?

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> ALISON GILLWALD: Good afternoon, everyone. We've got a very short time. I see almost nobody in the room in Katowice, but we do have Shilongo from RIA there. Hopefully, you can help us take the questions from the floor as we go along.

    We have a very exciting panel for us today on Data Justice: What is to be done? A very big question to address in such a short period of time.

    My name is Alison Gillwald, Research ICT Africa/University of Cape Town. I'm in the doctoral programme at the Nelson Mandela School.

    I'm very pleased to have with us today a number of partners also working on the subject of data justice.

    Some practitioners, representatives of Civil Society, advocates for data justice. I think, at the moment, we have Linnet Taylor, Technical Community, Western European and Others Group, who is a professor of data governance at Tilburg University's Institute for Law, Technology, and Society.

    We have Parminder Jeet Singh, Civil Society, Asia‑Pacific Group.

    I'm sorry. We don't have Alex Walden.

    Peter, can you see if she's joining us? I don't think we have much time.

    We have Jamila Venturini. You can probably do a nice, proper Spanish pronunciation yourself.

    And we have Sizwe Snail, from the South African regulator, who has stepped in for us.

    I think if we don't have Alexandria Walden from Google, we'll just have to hope that she joins us.

    And we'll start.

    So this panel really arises out of a project that we have been developing within Research ICT Africa with a grant from the center that's supported our work and the work of a number of people collaborating with us in this area.

    Really, it builds and draws on the work of Linnet Taylor, who has been pioneering work in this area and developing it and supporting us in thinking about these issues, particularly from a Global South perspective and the challenges of implementing them.

    Also, the work of Parminder Jeet Singh and (?), who have provided a nice aspect to this work about realizing data justice beyond the human rights framing of it. That has really distinguished it from the other, quite technical data protection work that has been done around the world, often exemplified in the GDPR, and how that applies to the Global South.

    Part of our work, like that of many people on this panel, is concerned with taking this research through to policy influence, to ensuring just outcomes, social justice, and digital equality, and equality more broadly.

    We do quite a lot of technical assistance. We've recently been involved with the commissioning of the African Union Data Policy Framework. We've helped them develop it. It's just been validated with member states. And we've tried to see how, practically, one can bring these concepts of justice into creating enabling frameworks on the ground, moving them from quite normative or theoretical and principled arguments.

    The Africa Data Policy Framework really goes beyond data protection. I think, in this way, we were able to also move from the quite negative regulation components, the kind of regulatory compliance methods of data protection, you know, preventions, penalties, that kind of thing, for breaches mainly around privacy and individualized notions of privacy, to broader conceptions of data policy and what is required in order to ensure an even distribution of the opportunities and harms that we see under the current sort of default data frameworks.

    In the broader sense, data processing, management, and treatment, especially for artificial intelligence, are increasingly, and largely, self‑regulated, although there are various panels, including a parliamentary channel, aimed at AI regulation.

    I think something is arising there, but, basically, in terms of what we've got up to here, you know, social networks, the use of artificial intelligence ‑‑ although there's been some focus on algorithmic regulation, the regime is largely a self‑regulated one. We have a number of standards, a number of ethics‑by‑design frameworks ‑‑ kind of moral ethics‑by‑design frameworks rather than really hard‑core standards.

    It's equated ‑‑ once you've dealt with ethics by design, you have things sorted out. Jamila will talk about things like that.

    What it really doesn't deal with is some of the issues of economic and social redress. So, really, looking beyond just first‑generation rights of privacy and looking at second‑ and third‑generation social and economic rights and how we realize those as well, which the current data policies and frameworks, in Africa by and large adopted from elsewhere, don't really address.

    So lots and lots to discuss. Let me not take up any more of the time the panelists can give us.

    Linnet, can you start?

   >> LINNET TAYLOR: Do you want me to introduce myself and give a provocation, or get into the content?

   >> ALISON GILLWALD: We've put your short bios in the chat. So anything you need for your prelude.

   >> LINNET TAYLOR: The only prelude is I'm standing on the shoulders of giants here. In 2017, I brought together the points of view of a bunch of people who think about how data affects Civil Society around the world. And I brought it together because I thought that we needed to bring together issues of surveillance, and rights in relation to surveillance, with the rights of development and people's rights in relation to data collection.

    The term data justice was very well formed to describe that. So I've been working since 2017 on that problem: how you bring together people's right to be seen and respected with their right to be left alone when they choose to be, and how their subjective needs can be brought into the picture when we think about the data economy.

    So that's my intro.

   >> ALISON GILLWALD: Sorry. Just trying to deal with the noise in the background.

    Parminder, can you give us your first intervention and provocation? Thank you.

   >> PARMINDER JEET SINGH: So my introduction, I won't go into it. If you call it a provocation: data protection and the GDPR have been made relatively nonpolitical, in the sense that there's a large consensus around them. What is really critical in the southern sense is the economic rights. You wouldn't simply get northern experts to do it for you. I completely understand that in intellectual property and justice, there's a huge amount of western intellectual contribution. Someone has to stand up and do it.

    What I want to introduce and talk about later is that, in India, a government committee, a committee I was part of, put out a report a year ago that completely lays out the conceptual framework of community rights over data. It actually analyzes biodiversity, goes to water resources and collective rights, and then extracts five principles. It applies those five principles to data rights.

    So it does the conceptual work. On the other side, it goes right up to what it means in practice. What does the term community rights mean? How can the community enforce sharing of data? What kind of data sharing can we enforce? What cannot be enforced? What about intellectual property?

    My provocation is that nobody even looks at that report. The second draft is already one year old.

    That's the kind of exclusion from discussion spaces that needs to be fought against.

    You asked for my provocation, and this is it.

   >> ALISON GILLWALD: Thanks, Parminder.

    Peter, we don't have Alexandria onboard?

   >> SHILONGO KRISTOPHINA: I was able to get ahold ‑‑

   >> ALISON GILLWALD: I thought that would be a good one to capture, but I think Jamila can respond to that.

    Thank you.

   >> JAMILA VENTURINI: Yes, I think I'll pick up on what Parminder was sharing: the provocation he made from the Indian context and the document that he was mentioning.

    I guess one thing we have been seeing is that there's this digital agenda being pushed in different contexts without any further consideration of legality and the deployment of specific technologies, and in the complete absence of transparency and participation mechanisms.

    I can mention some examples as we develop the conversation, but what we see is that there is a blind trust in technology. There is a complete disregard for evidence‑based policy and planning. In human rights language, again, it can be translated into these principles. It seems there is a contradiction behind it: the state collects data, but it does not rely on data for its own actions, for instance. No?

    Maybe I can bring a quick example to give more materiality to what I'm trying to say. In the research we did around the use of technology during the pandemic, what we saw is that some countries tried to implement technologies that were considered effective in other countries while there was no underlying infrastructure that would facilitate the same implementation. So the state was basically transferring to citizens the responsibility to produce data that it was not able to produce itself.

    I guess that reflects a bit of the moment we are in, in terms of the development of such a digital agenda, which naturalizes the use of technology as something that by itself will solve several social and economic problems.

    And the final point, to leave another provocation, is that most of the discourse behind the promotion of this digital agenda arriving in Latin America has to do with development, inclusion, and innovation. I guess there is no room for experimentation when it comes to exploiting people's data, particularly in our countries.

    So we are still to see the types of consequences that predictive systems in sectors like health, education, social welfare, et cetera, will have in our societies.

    So I guess this is a point from which we can start this conversation, and I would love to share more examples as we continue.

   >> ALISON GILLWALD: Sure. Thank you so much, Jamila.

    Sizwe, I'm going to sort of ask you for a higher‑level response first. I know I asked you to talk about how this is practically applied, but I think we're still dealing with some of the broader conceptual issues. So I just wondered if you could provide us with an African perspective on some of the issues that have been raised around different notions of privacy, of data, of collective data, not only individualized privacy, and these kinds of legal concepts that you're working with on a daily basis.

   >> SIZWE SNAIL: Yes. Thank you very much. It's quite interesting from an African perspective to see, you know, the uptake of data protection legislation, cybersecurity legislation, legislation related to critical infrastructure, and, you know, it's quite refreshing to see that development without stifling ‑‑ I'm saying without stifling innovation and new technologies such as AI.

    On the other hand, there are the human rights effects and the issues that AI brings. For instance, the automated decision‑making that may result from an AI process, things like being exposed to data breaches, these are new concepts for Africa. You know? It's very refreshing to notice that the formerly "dark" African continent is now very open to privacy and cybersecurity issues as well as human rights matters.

    You know, it's very refreshing to be here and to give you that perspective.

    I just finished my term seven days ago.

    Nevertheless, thank you for having me.

   >> ALISON GILLWALD: Thank you, Sizwe. We can come back to that because you've been there, and the regulator has faced institutional challenges. Though the legislation was really leading at the time it was introduced, it took a long time to get enforced. Some of the challenges: we can get all the concepts right ‑‑ well, maybe we can't. But actually getting these implemented, getting these realized on the ground, is what I really want to come to in the next round.

    Before we do that, you know, I just want to return to ‑‑ I mean, you've spoken very strongly about data justice, you know, being this complement to our other legal human rights frameworks, et cetera. The importance of establishing the concept of data justice as a kind of hard stop on some of these things that come into force as a default, you know, making sure that communities' data is treated justly throughout the entire process and these kinds of things.

    I think COVID and the pandemic have really highlighted, also, the public value that there may be, you know, in collecting data and using data in an informed and fair way, obviously. It's really highlighted some of the public interest component of mass surveillance that I think is quite common in medical research. So, you know, the surveillance of people's condition has created an approach to protect and prevent harm and risk, while the data framework is, in the first instance, a resistance to that surveillance. How do we deal with that in a semi‑pandemic future?

   >> LINNET TAYLOR: This is a great point. I think it's important to note that the epidemiology that's going on gives it scope and power. The power has been fed back to the public around the world in very unequal and uneven ways. Although we have a good medical data infrastructure that's come out of the pandemic ‑‑ and I think this is really important and we need the science ‑‑ there are also issues where communities have had uneven access to that for different reasons, not least because some have not been included in decision‑making as a result of the data.

    We still see these practices reflecting and refracting the problems that we have with data already and the inequalities that data can amplify.

    I wanted to add something to my earlier provocation, if that's okay: that we're using our work to move towards the idea of a normative commitment that comes above law, with the idea that as we move further toward the rule of law with regard to the digital, we need good law. Otherwise, we just get coercion and enforcement.

    In order to do that, we can't just seek that in our AI acts and these things. We need an overall normative framing, which Parminder is talking about, I think. And that normative framing should be that data should follow the interest of the people it represents. It's extraordinarily hard to do because our practices mean data gets aggregated and deidentified and becomes a commodity that can be openly traded on the market.

    I think we need to think about the implications of that and when it's not appropriate. We need to think about a Golden Rule where data follows the interest, and remains true to the interest, of the communities that generate it regardless of what happens to it, which requires a bunch of new infrastructures, a bunch of new modes of governance, and which breaks large chunks of our data market globally as it is. You have to have a way of transmitting the wishes and the needs of the community, which doesn't currently exist.

    I think I'm going along the same lines as Parminder and Jamila here, in saying that connecting the knowledge of uses and harms to the international governance structure is lagging way behind our ability to channel and use data.

   >> ALISON GILLWALD: Absolutely.

    Jamila, your earlier provocations sort of tied into the examples of the work that you've done during COVID. Would you just like to build on those a bit more? You said you would like to provide some examples of kind of what has gone wrong and also tell us how this might be corrected.

    So Linnet spoke about a complementary normative basis to the very strong legal basis that is required. Practically, how does that happen? Is it through advocacy? How does that actually allow us to realize data justice?

   >> JAMILA VENTURINI: Yeah. We saw very precarious infrastructures implemented in the public sector that were not able to provide relevant information for society when it was needed, around the evolution of cases, for instance; right? So the answer that they tried to give, coming from this logic of innovation and tech deployment and individual use or individual collection of data, was to develop apps. They explicitly said that through self‑diagnosis apps, they would be able to obtain information on how the pandemic was evolving, considering that in some countries, they did not have the infrastructure to share diagnosis results in real time with the competent authorities.

    I guess that's very serious and brings us to the discussions of how we develop this digital agenda grounded in the public interest and not in a corporate logic that is based on data exploitation, et cetera, and how we embed justice also in the tech design.

    We saw interesting examples in these types of COVID apps. They included different functionalities, but you could see that in some countries, they were completely individualized. One person could only use one application on one specific device. That's completely away from our reality in terms of connectivity. Other countries considered the need to allow more than one person to use the same application.

    That could be an application for self‑diagnosis. That could be an application that would give people the right to circulate during the quarantine periods that were implemented. It could be access to cash transfers during the pandemic. So we're talking about very basic socioeconomic rights. It proved necessary in this case. We could think of that, but there is a lot that we have to continue to build on.

    So I would say, to your final question on how we could move forward: I guess a serious digital agenda for the Global South, at least, needs to begin by creating data governance structures that allow flows in the public interest and that have the safeguards, including the right of access to information as a basic pillar, which is not present in many countries and several of the cases we saw, and also the protection of privacy in a balanced way.

    Of course, we still have a lot of challenges to come in that sense in Latin America. We have a lack of normative frameworks, even for data protection, and a lack of institutions that are capable of supervising that. I guess we have to shift the idea of the digital agenda to build these basic underlying governance standards from which we can continue to develop other initiatives.

    Just to finalize, I would say it cannot be disconnected from basic connectivity infrastructures, and I imagine you, Alison, would have a lot to comment on that. And, also, policies that foster some type of local technological development. Otherwise, we tend to be dependent on the types of narratives and logics that are embedded in the foreign technology that's being pushed to our countries.

   >> ALISON GILLWALD: Thanks, Jamila. I'm sure Parminder wants to come in on your last point.

    Before he does, I just wanted to go to Sizwe.

    Sizwe, building on the points that were made, particularly this one by Jamila: in South Africa, with the highest rates on the continent, all the technological hype about what this could do to assist us, gather information, and protect us from disease was moot because, in fact, we just don't have enough smart devices to be able to use full contact tracing. We were not even able to mobilize the operators' data to create the dashboards that we needed.

    The thing about the information regulators emerging on the continent ‑‑ it's very exciting to see ‑‑ is that they're primarily occupied with these conventional notions of data protection and privacy in dealing with harms, but not really the, you know, very fundamental harm in the Global South of exclusion, of the invisibility of people from these datasets and, you know, at best, being underrepresented.

    The implications when we start seeing these being used for decision‑making ‑‑ you know, you're the outgoing regulator now, but you served for a period of time in a start‑up regulator. How do we link to those social and economic points that Parminder was making in his first provocation?

   >> SIZWE SNAIL: I like the context in which you ask me that question. I recently wrote a piece on access to information and how the right of access to information is actually paramount in order for individuals to, number one, access the Internet itself. In other words, access to the Internet has now metamorphosed in South Africa, and, on the other hand, one has to look at the data protection effect of that.

    So South Africa has a very weird DPA because the DPA doesn't only deal with data protection but also with access to information.

    So I actually look forward to seeing the intersection of access to information and data protection. For the longest time, as you've correctly pointed out, regulators have spent time, you know, trying to get people to understand data protection, conventional data protection, and trying to get people to understand how data protection looks.

    But I think it's very nice to see that these data protection laws that we have don't just sit there in isolation. In other words, for you to have data protection laws, you need data subjects. Data subjects need access to information. And access to the Internet, like I said, is a metamorphosed form of the legal right of access to information.

    So access to the Internet, the ability to communicate, that in itself is one thing we need to look at when looking at the overall issues pertaining to data justice.

    So you are quite right, you know. I think the DPA has opened its horizons. When it looks at issues, it looks at things pertaining to surveillance, at things pertaining to law and how that intersects with data protection.

    Look at cybersecurity and cyber criminality. I always say cybersecurity and cyber criminality all come from a ‑‑ (distorted) ‑‑ but, like I said, from an African perspective, maybe one can start looking at these specific areas, such as artificial intelligence and the effects thereof, and whether people will be affected by decisions that will be made as a result of datasets that are not ‑‑ (distorted) ‑‑ due to the fact that those datasets are not accessible.

    I hope I have not said too much.

   >> ALISON GILLWALD: No. Thanks very much. You kind of remind us of those fundamental rights. I think what is interesting about the POPI Act are these two sides to it, which are not evident in a lot of other data protection or digital policy on the continent.

    So data protection has been more comfortable terrain for some of the governments or some of the member states in our community that don't have very strong bills of rights or constitutional protections of rights, and they've been able to deal with these things. Very often, they've embraced it because it can be used for public surveillance.

    I think it's very interesting in the context of South Africa, where you have this strong constitutional framework and this protection of citizens, but, in fact, we still have half the population not enjoying the benefits. They don't actually enjoy what you've described as a metamorphosis of the freedom of expression or the right to information, not in the digitalized world that we're largely engaging in, which really takes us back to Parminder's point.

    Firstly, the Global South doesn't want to be just data subjects. We also want to be data producers and developers and sell our stuff, you know, contribute to the prosperity of our nations. And, essentially, the kinds of frameworks we have at the moment, which are, at best, based on first‑generation human rights, don't allow us to address those aspects of our participation.

    That's what the IT for Change work has been exploring, and there's quite a bit of resistance from the self‑regulatory framework we find ourselves in versus what you're calling for, Parminder.

    I'm wondering if you can tell us a little bit more about that approach and why you feel that's necessary.

   >> PARMINDER JEET SINGH: Thanks, Alison.

    There are daunting issues, which Sizwe and Jamila were talking about. So I will use my two or three minutes to discuss the "how" part.

    Jamila was right in what she kept saying about data being left on the table and who decides what happens to the data. There seems to be the feeling that norms and laws are needed and are not being made; but, actually, they are being made. There's a responsibility to visualize that.

    Before we started working on the data thing, which I just told you about in India, we were working for 15 years on global governance mechanisms.

    I can give you an example of AI governance, how it happens today ‑‑ I will relate the whole chain of events. It's the same with data governance and (?). There's a committee called the (?), which is at the center of much of this norm‑making. Somehow everybody calls that system the multistakeholder system. In the UN, the same system somehow becomes different.

    Now, this system adopted ‑‑ I'm just taking a thin streak of the AI governance. They adopted AI norms, AI principles. Remember, it was a legal instrument. It's on their legal instruments page. They adopted AI principles. They decided. They adopted.

    How does it work, then? Five months later, they go to the G20 and tell them to adopt this. And the G20 ‑‑ I'm very happy India was there and did it. They're not just being noted. They're adopted and mentioned in the annex. It's not that they just reuse the language. They're in the (?) and were adopted. That's fine. They want to go beyond and get all the countries.

    Then the partnership on AI. It's only governments. And now, to make it multistakeholder and get all countries in, it's very clear there is a frequently asked question, a continually asked question: Why is this AI partnership's secretariat in the OECD? Once we accept these systems, downstream they will have the effect we're talking about. This is how principles and norms are made.

    Let me tell you the last thing. In 2011, India went to the UN and said, let's make at the UN the same digital norm‑shaping structure as there is in the OECD. I know it because I wrote it. I copied the exact structure, with advisory groups, and then I told them to just expand from 35 countries to 182 countries. Then it becomes multistakeholder.

    We have various ideas about the Internet. In 2021, people know we need to act on it. I think unless we get our seat at the table at the global governance level, we will still be talking of these downstream effects because they are the natural, necessary consequences of that.

    So let's say that the OECD cannot make global digital policy. All countries should have an equal role in making it. As many times as we've said it, you would think we'd set the ball rolling. We're not only working on these procedural issues; we've gone down and made a complete community framework on infrastructures.

    Certain parts of it are not going to help. Right now, the structures of governance matter as much as the outputs and the substantive elements of the justice framework.

    Sorry I took a little longer, but it was a necessity.

   >> ALISON GILLWALD: Not at all. Very interesting.

    In full disclosure, I should indicate that I'm sitting on the data governance working group and managed, actually, to get this data justice workstream into an otherwise very technical ‑‑ not even that ‑‑ economic kind of approach to data governance, very kind of ethics by design and quite technical.

    I suppose the question there for Parminder is, you know, whether you can work within the belly of the beast or whether you legitimize it by participation. The working groups certainly are participatory, but you're right in that the member states to which the research support is provided are, of course, the OECD members.

    So that's why I suppose a number of us have raised a question: To what degree is it a global partnership? There are some people from the Global South, but not necessarily a lot, because this is a research support group, mainly academia.

    Linnet, perhaps that takes us back to how we can get broader participation and agreement on these kinds of normative things where there is clearly not a normative consensus.

    As Jamila was saying, it's fine for us sitting around this table to have normative consensus, but with China, the largest producer of AI ‑‑ challenging America, certainly, in terms of that ‑‑ there's no normative consensus there, as there has been around digital advancements previously, which were dominated by the U.S., not even considering, on some of these issues, the European Union, with far more pushback against the self‑regulatory frameworks that have gone on in the U.S.

    So what is possible with that? Before you come back to us, I just want to urge people to pop their questions in the chat or to put up their hands, please.

   >> LINNET TAYLOR: I really appreciate Parminder's view on what is happening. I want to give a view of what is not happening.

    When I think about history, I think about the Non‑Aligned Movement and how that formed in relation to really a basic threat to the existence of a bunch of countries that were affected by the Cold War; right? I'm wondering how far we have to go with rich‑state governance of AI, basically, before that kind of threat emerges for all the other states in the world and we see some sort of non‑aligned coalition emerging around governance.

    We've seen a non‑aligned coalition around data led out of Latin America at the moment by Ulysses Mahiyas (phonetic). There's an interest around AI that could form itself around the non‑aligned countries. I think that's really worth exploring because, as Parminder says, there are power‑play dynamics at play here that will be long‑lasting and speak to the core interests of the group of the most powerful actors in the world.

    We've seen revolutions in those countries. That has been disruptive internationally. Or there's an underlying movement that is taking effect.

   >> ALISON GILLWALD: Sizwe, what is your response to that? Africa has historically not participated in some of these global governance forums. Of course, South Africa, together with Brazil and India particularly, has often resisted some of these efforts to enforce, you know, global positions on things, whether it's trade, whether it's the WTO, digital things, digital taxation, various things. It's resisted. Often, that resistance has been seen as counterproductive to various efforts to, you know, get improved trade, to get innovation, to get these kinds of things.

    And the arguments have not only come from foreign companies related to data localization and flows and that sort of thing. They've also come from local companies who say, it's no use to sit on this data. We have to be able to realize the value and do things.

    So we don't have consensus on the African continent. How do we respond to those things? Or is it just going to be the traditional countries on the continent that speak on the various issues?

   >> SIZWE SNAIL: I think a lot of the resistance you've mentioned, to conforming to certain international standards or conventions, has primarily been for political reasons. You know, for political reasons, countries align themselves with other countries. In terms of their foreign policy, they would want to be seen to align themselves with certain countries as well.

    But the reality of 2021 is quite simple. During the COVID‑19 pandemic, it's either you comply or you don't compete. By compliance, I mean internationally accepted norms, internationally accepted standards. So this resistance you're talking about, I think it's something that is slowly coming loose.

    I mean, we've seen in Africa, for instance, the right to privacy. It's not in our African Charter on Human and Peoples' Rights. Last year, there was an additional protocol on access to information during COVID. In that, they started speaking about data protection and the realization of the protection of personal information, and we've also now seen recently the development of an African framework dealing with artificial intelligence.

    I think it's more the political issues that make it seem like there is no convention or that Africa is resistant to being internationally aligned, but, at the end of the day, if we want to compete internationally and survive internationally, we need to do what is happening internationally right now. We need to have regulations in place.

    In South Africa, we don't have a lot of AI‑specific regulation, and we've had a most fascinating decision of our patent office recognizing the registration of an AI as a result of (?). So the development is there. It may be very slow. It might not be seen from the outside as happening, but there is action.

   >> ALISON GILLWALD: So the AI patent is very interesting because I think after the initial, oh, great excitement that this has happened, it's actually not as positive and progressive as it seems and probably legally problematic as well.

    But I suppose I wanted to take it back because a lot of the economic logics might be there, although, as Parminder says, they're certainly framed to have different interests and outcomes in their expression. Obviously, at the nub of it are these politics. For example, when I was speaking about the kind of resistance on certain issues that has possibly permeated opportunities for Africa to do other things, it's been suggested that the position in the WTO forum has impacted the pushback or resistance to some of the data policy framework, which is absolutely critical for the agreement and for other countries.

    That has permeated, and it requires solidarity and interoperability and these kinds of things. I think we've seen it around digital markets and the implications of that in this kind of political‑economic bind that we sort of see.

    Jamila, I wonder if you just wanted to share your view on some of these political issues, which I think you alluded to right at the start.

   >> JAMILA VENTURINI: Definitely. We've seen a shift in the type of leadership that we have in Latin America in terms of Internet governance and how we're approaching other types of technology nowadays. For instance, to dialogue a bit with what Parminder was mentioning before: right now, several of our Latin American countries have adopted the OECD principles and have tried to develop their own AI strategies in terms of language on ethics and the use of AI, et cetera.

    Right now, we could have the opportunity in Brazil to develop a more specific type of regulation on AI. In this very moment, this regulation on AI could try to foster and implement some of the principles and some of the ideas that we are sharing and thinking about right now. But it's mostly captured by private interests, also, and risks weakening the data protection framework we approved a few years ago here in Brazil.

    So it's very concerning to see that Brazil, which had the leadership at some point when it came to digital rights, Internet rights, et cetera, and which could have the opportunity, also, to advance thinking about AI from a Latin American perspective, a Global South perspective, bringing contextual issues and problems into question and bringing some limits to the advances that technology is sometimes running into, is now adopting regulation that would basically dispense with safeguards and protections and weaken people's rights.

    At the same time, another point that I wanted to add, using the opportunity to question these global infrastructures that are being deployed ‑‑ something you were touching on, Alison, right now ‑‑ has to do with how countries like Latin American countries are mostly consumers of the foreign technologies and knowledge offered to our countries, and how we continue to increase global inequalities.

    I can talk about, for instance, how global tech companies have gone into the educational system and, through deregulation, facilitated access even to scientific knowledge, et cetera, with no considerations or safeguards.

    So, yes, I would also put that on the table in our conversation about how we shift the spaces for global regulation and global standards on these technologies, and think about how we build public‑good infrastructure, common infrastructure that could be shared among countries, as something we have to continue to develop in order to think about justice and the shrinking of inequalities, not the opposite.

   >> ALISON GILLWALD: Absolutely.

    I'm very keen to see some hands. I don't know if I've missed anybody. Please turn on your mic and speak if you have questions. I don't see anything in the chat.

    Shilongo, on the floor, it doesn't look like there's questions there.

    I can see all these interesting people sitting in the audience. I'm wondering if I will be allowed to ask Yik Chin if she can share some of her insights. She has done very detailed work on China and Internet governance. Particularly in the African context ‑‑ and this is my very limited understanding of Chinese policy in this area, from some engagement with Peking University ‑‑ some of their thinking, you know, actually focuses on those second‑ and third‑generation rights, and once those are taken care of, other rights may follow.

    Of course, there are enormous physical resources for digitization and datafication on the African continent to (?). This is actually a new one. Others are very concerned about China's, you know, human rights record and these kinds of things.

    Yik Chin, do you mind?

   >> YIK CHIN: As far as I understand data flows and AI ‑‑ because this is all related to geopolitics, you know, as we all know ‑‑ basically, we can see the global tendencies. It's really hard to reach any international convention or international law at the moment regarding how to govern data or govern AI. There are initial proposed frameworks. What we're seeing is a tendency: there's a reliance on groups. In America, they have agreements with Mexico, and with the Japanese, these kinds of agreements. China is doing the same thing, because they also proposed a global data security initiative in 2020 through the Ministry of Foreign Affairs.

    Countries have responded to that. So we can see that, before there is an international convention, there are regional multilateral agreements. It's about data now, but in the future, it's probably about artificial intelligence or standards.

    I think the next step is for the UN. The United Nations published a report saying they want to modify their strategy to make the approach more inclusive. So I hope they can play an essential institutional role to at least build up some laws or standards or frameworks to kind of govern the flows.

    Other things about data, I think, are also aligned with the WTO.

    I think the main worry for China is they know all these things are playing with politics and the political struggle between the old powerful countries and China and developing countries such as those in Africa. They will be excluded from all these agreements because of the (?). I think they're trying not to be tied to those groups, but we'll see if they will be allowed to join. If they're not allowed, that will probably build up the division, because one of the groups thinks of itself as the more democratic countries. China is classified as a non‑democratic country, so they're not allowed to join. But whether you're democratic or not democratic, there are many issues related to, you know, developing countries' needs and also infrastructure issues as well.

    So I do not think we should talk about all these issues using just the terms democracy and non‑democracy to divide the camps.

   >> ALISON GILLWALD: I think this is absolutely the question. It kind of takes us full circle to the point that Linnet was making around the Non‑Aligned Movement. I think great parts of the Global South, certainly Africa, are kind of, you know, really just in the fallout of these efforts to try and get kind of normative consensus essentially around the big powers.

    There's some brake put on by India, Africa, all these things. I think the work of the various people on this panel shows that the agreements that exist, on a very high level normative agreements or even agreements on standards and those kinds of things, don't deal with the issues of redress, of justice, of actual equality and equity, not just inclusion.

    It comes from sitting at the table.

    There's obviously an enormous amount for us to explore here from a research point of view and from a policy point of view.

    I think at this moment, we're likely to just disappear into the ether as the IGF removes us from this room.

    I just want to thank everybody very, very much. If anyone has a last burning thing that they want to say, please use the few seconds, but realize that you just may be cut off midstream.

    I saw people wanting to say things, but perhaps there's just not enough time to do so.

   >> YIK CHIN: The concept proposed by the UN is very useful. It's not about equalities, but some big countries, you see, have more responsibility to help. So I think that's a key concept for us to help each other.

   >> ALISON GILLWALD: I think you're right, Yik Chin. I think there's been some skepticism of what some of these declarations of solidarity mean. It would be great if those could be overcome.

    Parminder points to the Digital Economy Report this year, which tries to speak about development and about ensuring that these data developments are actually more equitable, that the benefits are more equitably spread.

    I think the real challenge ‑‑ I think you would agree, Parminder ‑‑ is how do we deal with that? How do we deal with profound structural inequalities? How do we do that?

    I see Parminder has his hand up. Perhaps you can close it out for us.

   >> PARMINDER JEET SINGH: On the point Yik Chin made, all these things operate differently. Philanthropy, goodness, ethics, that's one thing; being at the table is a hard thing of a different kind. I just want to be at the table, even if I made the most screwed‑up law.

    So philanthropy is one sphere. They're all good, after all. We, as an NGO, we try (?) that comes from all developing countries. But who makes those norms, who makes the soft laws, and who sits at the table is very important. It's slightly different. Both layers, I think, are fine.

   >> ALISON GILLWALD: Thank you very much. IGF is urging us to say goodbye. Thank you so much for your time.

    Take care, wherever you are.