IGF 2022 Day 1 WS #269 Data privacy gap: the Global South youth perspective – RAW

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> THEOROSE ELIKPLIM DZINEKU:  Hi, good afternoon, everyone.  Thank you so much for joining and thank you for being patient with us.  We are about to start now.  I would like to officially welcome you to this session on data protection, especially on the role of data protection agencies and what the policy regulations are in that area, approached from the youth perspective.  We hope for a lot of input in those areas, and we will try as much as possible to make it engaging.  We must acknowledge that the issue of data protection is high on the agenda.  We have recorded a lot of data surveillance issues happening, so I must say that this topic is timely.  Now, we have two speakers joining us online and we have three wonderful speakers here.  I'm just going to let them introduce themselves shortly and then we'll move on.  I will start right by my side and give the microphone to Karla to introduce herself.

>> KARLA GIOVANNA BRAGA: Thank you so much, my name is Karla, I'm a student in sanitary engineering.  I'm also a co-founder and director of sustainability and projects at an Amazonian student organization.  And so I pass to Selby to present himself.

>> ABRAHAM SELBY: I'm Selby Abraham.  I do love volunteering, and I work on privacy policy.  I worked with the Ghana Data Protection Commission for a year, and I do data protection awareness and impact assessments.  I help organizations with privacy issues.  I'm also a media speaker on data privacy and digital awareness.

I am on the IS3C Dynamic Coalition under the IGF.  I'm also a volunteer and Internet Governance fellow from the African School of Internet Governance, the Ghana School of Internet Governance and, finally, the West African School of Internet Governance.  Nice meeting you here and I'm happy to be a speaker for this session.

Thank you.

>> MODERATOR: Thank you so much.  I guess that he's an embodiment of Internet Governance.  So before Daniel comes in, I just want to move quickly to those joining via Zoom.  If you can hear me, I would quickly move to Emanuella to introduce herself.

>> EMANUELLA RIBEIRO: Can you hear me?

>> MODERATOR: Yes, we can hear you.

>> EMANUELLA RIBEIRO: Yes, I'm a project manager at SaferNet Brazil.  I have been working on privacy and data protection since 2020, and I was a youth fellow for the Internet Society and for the Internet Steering Committee in 2019 and 2020.  And I would like to give a shout out to all the youth who are participating in this panel, especially the Brazilian youth who are here with us.  We really hope you can engage and participate.  This is really our main objective here.

Thank you.  If Shradha is online, I would like you to introduce yourself.

>> SHRADHA PANDEY: Yes, can you hear me?

>> MODERATOR: Yes, we can hear you.

>> SHRADHA PANDEY: I'm currently joining from a rural location in India.  I thought this place would have good connectivity, but it doesn't.  And this is one of the issues that we might discuss today, regarding access for youth community members.

I am Shradha, and I'm from India, and my core area of work entails policy research.  I was an Internet Society youth ambassador in the year 2020, and after that, I have been associated in one form or the other with the Internet Society and Internet Governance, either as a volunteer or as a mentor.  Currently I'm a board member of the Youth Special Interest Group of ISOC.  So if any of you, after the session, are interested in taking the conversation forward, the Youth Special Interest Group would be very, very delighted to have you, and it would be fantastic if you could join us.

So the Youth Special Interest Group tries to look at policies, and we try to understand where the governments are coming from.  We are working on policies from multiple governments, some of them in Africa and some in other countries across the globe.  We will discuss that.

I'm also the Generation Connect youth envoy for the Asia-Pacific region of the International Telecommunication Union.  I know a lot of youth envoys are attending the IGF.  So if any of you are here and you would love to continue the conversation on the Youth Special Interest Group, you are more than welcome to join.  They will be delighted to have you.

And I'm so so excited to be here today.

>> DANIEL OPIO: I understand that we have four minutes each to share our thoughts, and I will make mine succinct.  I'm Daniel.  I'm a lawyer, a data rights specialist and a cybersecurity enthusiast, and also a co-founder of a cybersecurity fund.

>> MODERATOR: Thank you so much.  So those are our speakers.  Now, quickly moving into today's session, we have tried to break the session into three different areas.  We will be looking at policy, we will be looking at agency, and we will be looking at data protection itself.  Now, on the backbone of the issue discussed, in terms of the authorities, we are really looking at how data protection agencies ensure that all personal data are kept safe.  Now, when you submit information online, right, or you fill in a form online, what happens to that data afterwards?  For those who register physically, what happens to the paper or whatever the registration form is after that?  So we are going to base the conversation in that space.

Now, in terms of stakeholders, I'm sure that we all here belong to one stakeholder group or the other.  What kind of awareness are we creating for individuals in our various institutions or our various stakeholder groups, right?

And in terms of data control, right, how do we actually process data?  So this is just the basis of how we are going to discuss.  Moving quickly, I will start with Karla, probably because she sits next to me.  I would want you to pinpoint some of the challenges that we face when it comes to data protection.  This is not a new topic, right?  And we keep discussing this all the time.  What are some of the challenges we face when we talk about data protection?  Please elaborate on that.

Then we have ‑‑ if you could project that, we have an application called Menti where you can submit your comment or question if you don't want to be identified.  We will read it out for you and we will project it, so the contribution still happens.

>> KARLA GIOVANNA BRAGA: Thank you, Shadrach, for the question.  Hi, everyone, my name is Karla Giovanna, and I come from the state of Pará, which is in the Amazon region.  And in the state of Pará, we have big problems.  Within the various challenges we have in the Amazon region to achieve the Sustainable Development Goals, I will briefly talk about the vulnerability that we and human rights activists have in one of the states that kills the most environmental activists in Brazil.

We are struggling for our territory as a legacy, and threats are constant, as well as the deprivation of our right of access to a quality education that understands that data privacy and protection are essential for our survival.  Especially after COVID‑19, since we often needed to migrate, even the leakage of personal information about our location endangered us, in a context of limited access to connectivity.  Our territory is environmentally very rich, and it is tied to economic agendas that destroy our land and our bodies, while our forests, our land and our water are our life, because Amazonian people don't exist without the earth.  So how can we raise awareness among the most vulnerable populations who have not had access to contextualized education on data privacy and protection?

And how can they report this type of complaint, in our context where violence is accompanied by institutional obstruction, making it difficult not only to report but also to protect activists and other endangered populations?

This was a concern that we, the Amazonian youth, had during the 2022 elections, under a violent government where data on health, education and deforestation were erased, a government that was genocidal for the Amazonian people at the front of the struggle, building a context of violence in the rainforest.  We understood that the way out was to build education and training based on the pillars of fundamental rights, territorial development, media literacy and digital security for activists, aiming at contextualized education using this pillar methodology.

The aim was to guarantee the exercise of citizenship of this youth by taking security measures that ranged from avoiding sharing their location in real time to the formation of a support network for young activists from the state of Pará.  This enables us to organize and carry out our denouncements, not only at the state level but at the international level, demanding major change in the systems that sentence us to death.  Sadly, we still have a lot of challenges to face in the Amazonian territory, especially within communities who live in the environmentally rich territories, because in most cases of activism that happen in these places, digital exclusion makes it a challenge for us to communicate to our network about what has happened before the worst happens.

I conclude by saying that it is urgent to have training programs on data privacy and protection that are contextualized and sensitive to Amazonian realities, so that I, or any other Amazonian on the front line of the struggle, don't have our lives cut off like the several people who have died in the defense of our territory.  Data protection and privacy does not only protect individuals in the abstract; it protects the actual lives of engaged people who materialize our dreams in the fight for human rights.

Thank you.

>> SHADRACH ANKRAH: Thank you so much.  That was an extensive submission, right?  Usually when we are talking about data protection, we don't really look at people who have lost their lives just because of how exposed their data were on the Internet.  But I will pose the same question.  What are some of the challenges that the youth, the younger people, are facing when it comes to data protection?  And, I mean, of course you could talk about West Africa.

>> ABRAHAM SELBY: Thank you very much for elaborating on this section.  Data protection, why is it important?  We are human.  Data is the new gold in this era.  We use data to do analysis in business.  We use data to make decisions and to make provisions for people.  At this 17th IGF, data has played a key role.  Data has helped us to mobilize the sessions, arrange the rooms, everything.  So data is very important.

But when we are taking data, what is the privacy behind it?  That is the most important thing we have to know.  And now let's go back to the roots.

Where I come from, in West Africa, people were taught English and mathematics, and they know how to use the formulas very well.  But there's a serious gap beyond that.  Data protection must be basic in schools, whereby young people understand the concept of protecting their privacy.  They must, one, understand the concept of their privacy.  They must also understand the context of the authorities who are interested in their data.

Most African countries have not passed data protection laws, and that is a serious gap when it comes to Africa.

Now, I'm looking at the perspective of the technical community.  We have young people developing applications and websites in schools and universities, and we must train them about privacy as part of development, about the privacy aspects of the data that they are taking into their various applications.

We also have to include academia when we talk about data protection.  This is very important, because academia trains the technical community that is coming up, and we need to address that.

Now, let me take the last section to address.  In Africa, what we lack is a common data protection regulation that has been passed, the way the GDPR works.  So there is a gap.  Most youth don't see the opportunity in it, because unlike in European and other Western countries, where whenever you do anything related to data there are regulations that must be adhered to, here we must take responsibility as an African Union to equip the youth in terms of resources and other support.

The last part will be academia.  Why academia?  We have schools on Internet Governance, but what we want to ensure is that the academic accreditation boards bring out privacy protection courses in our universities, degrees and other programs.  We should have people pursue them alongside computer science, so that these people can come out as an expert group and we will be able to achieve this privacy awareness we are creating.

This will help us to close the gap in Africa, and we can have supportive collaboration with other countries.

Thank you very much.

>> SHADRACH ANKRAH: Great.  This works.  So along the same line as that last submission, I want to move online to Manu to talk about the language that is used.  Now, when you read most agreements, for example when you are going to sign up for a social media platform, there is "do you agree" and all of that.  How many of you really understand what those things say?  Do you really understand?  Now, you know that when you are going to open a Facebook page, without saying "I agree", you don't get a Facebook account.  Right?  And if you want to be on Facebook, what do you do?  Usually ‑‑ and I don't want to point at anyone ‑‑ I just click "I agree" because I really need to be on that platform.  So I'm moving that to Manu, and there is a great comment on Menti from someone who wanted to know how the language of privacy agreements can be made simple, and the person specifically wants you to speak to the issue of Trinidad and Tobago and all of those issues.  If you can hear me, take the floor on that.

>> EMANUELLA RIBEIRO: Thank you.  Just to understand, can you hear me?

>> SHADRACH ANKRAH: Yes, we can hear you.

>> EMANUELLA RIBEIRO: One comment says that the language is difficult to understand, and we had someone say that there is an issue where not everyone understands how Big Tech companies store or score their data.  First, I will complement Karla's speech about the importance of contextualized education, especially peer-to-peer education, as an awareness strategy, but I will add that education alone is not enough.  We have to have a commitment, because we can't depend only on the person for data privacy and protection.  We can't just burden the individual so that they have to take all the control of their accounts, their checklists, their information, as if this is their problem and their responsibility and so be it.  We need to have a commitment, and we need to have terms that really guarantee effective transparency, and not just something that you click "I agree" on because you want to participate.

So, for my first point, the importance of peer-to-peer education: recent research shows that not only youth, but especially children and adolescents, when they have a problem on the Internet, learn from each other.  So why is data education so important and why does it affect our lives?  Well, SaferNet has a child helpline where we aid children with issues regarding data protection and human rights violations on the Internet, and we see data leaks, hacking, lack of control of personal data, including images.  This is really important because when we see this data pattern, we see that Brazil had some big, big data breaches in the last few years where we had compromised passwords and accounts, and this is exploited in the online environment, and this creates threats.  And this creates really, really big harms.

For instance, emails reached women saying that their photos had been stolen and that they should pay some amount of money or else nude photographs would be disclosed.  When we are talking about data protection, we are talking about the control of our flow of information.  So to enhance awareness, we need to have strategies for this.

One thing that we like to use is humor, engagement and games, and to simplify the language so we can allow people to understand.  More than that, we also like to allow child participation, to understand how children feel about this and how youth feel about this, and to allow them to have the tools to create their own projects for awareness.

Now, my second topic: how the burden of all of this can fall on individuals.  Today, when we go to set our privacy settings, we have checklists and apps to download and we don't understand which ones are safe or not.  We have difficult language, or we have these really long terms saying this and that, and we really don't understand how this works.  And we have something that is really specific to the Global South: some terms of use are more protective for European citizens than for Global South countries.  So even though Brazil has a regulation, we had a recent case about this where there was no such transparency for us, even though we had a regulation, while in European countries the transparency was provided.  And this is a really important thing.

How does being in the Global South affect us in this privacy and data protection regulation?  We have a few needs.  We need to enhance the debate about data privacy by default and by design.  Data privacy is a human right.  It is not negotiable.  I shouldn't have to change all of my settings, and carry the burden of changing all of my settings, because this is a human right.  I should be protected by default and by design.  And this is a commitment that we have to secure from developers, from applications, from the private companies that are providing this service, which almost fulfills a public function, allowing us to speak and communicate and connect.

We need to give more actual control and actual transparency, to understand exactly what is happening, in friendly language, and we need to have control over settings.  More than that, how can we engage more youth in this debate?  We should be part of the design of tech, of the decisions about tech.  We need to think about enhancing civic participation in these decisions, because if technologies are affecting our lives now, they will affect the lives of youth even more in the future.  So we really need to think about youth councils, consultations and research that consider context and differences in access, and we have to have a perception, I think, of critical education.  And not only that: as the Global South, I really believe that we can build new ways to think about technology.

How are we viewing our technologies?  Do we really need to have this feedback loop where we think that the platforms we have, the apps, are everything now, or can we build something new?  Can we have new examples, new applications?  And what is the role of each stakeholder in that?  Those are a few considerations I would like to give.

Thank you.

>> SHADRACH ANKRAH: Yes, so back to the room, I would quickly want to move to Daniel.  Now we have, again, issues of infrastructure and data privacy education; there are a lot of challenges there, and I think Selby has shed some light on that.  What are some of the challenges in data privacy education, I mean, in those areas?

>> DANIEL OPIO: Okay.  Thank you so much, Theorose and panelists.

First, I would like to say that about one-third of the Internet user population in the world is below the age of 18.  So when we are talking about the challenges that young people face, especially in relation to education on some of the data privacy challenges, we have to look at users who probably haven't yet developed the kind of maturity and judgment to appreciate what goes on online, and some of the challenges such as cyberbullying and surveillance normalization.  Because if you are groomed from a particular age, let's say you have access to the Internet from the age of 10, with time you come to accept certain things that take place which affect your privacy without notice, because there's a type of surveillance capitalism that takes place as a result of some of the terms and conditions that you need to accept to be able to access a platform.

And some of these young children who have access to the Internet do not know some of those terms.  They don't understand them.  And in the Global South ‑‑ I mean, some of these laws are so new.  For example, in my country, Uganda, the Data Protection and Privacy Act is from 2019.  So even if we are not looking at the young people and we are looking at those who are older, they also don't properly understand the element of data protection and privacy.  So how will the young ones be able to understand it?

I believe the burden of education should actually, in my opinion, fall on the developers of some of the platforms that draw the attention of young people.  Emanuella has mentioned something about privacy by design, but I think the awareness can take place on some of those platforms.  If a young child is going to join a particular platform, there must be consent from an adult to help educate them on where they should go and what the boundaries are, to avoid them coming across age-inappropriate content.  And some children get addicted to pornography simply because they do not know what to engage with and what not to engage with.

So I think on the element of education and literacy, governments have to put in a lot of effort, probably starting with primary education, secondary education, and even at universities, to have some form of training that takes place to let young people know how to conduct themselves online.

>> MODERATOR: Thank you so much.  And I will move on to our last submission, and then I will open the floor for questions, inputs and contributions.  Now, I'm moving straight to Shradha, who is online.  What are some of the opportunities that are open for youth to advance the agenda of data protection in their areas or stakeholder groups?  Shradha, if you can hear me.

>> SHRADHA PANDEY: Yes, thank you so much.  Theorose, I hope you can hear me too.

>> THEOROSE ELIKPLIM DZINEKU: Yeah, we can.

>> SHRADHA PANDEY: That's a very important and relevant question for youth of our age.  What Emanuella and Daniel have highlighted is all very relevant.  What I would like to discuss are the opportunities in the field of policy, policy decisions and policy making, that the youth can and should influence.  Because right now, what we are seeing is a trend of very old people deciding the future of very young people.  So there is an intergenerational disconnect, which we need to make sure turns into a dialogue between the generations, with our insights being taken on board.  How do we do that?  There are ways you can engage at your local level.  Every grassroots movement starts from your own backyard, and the start of each and every such movement should happen at your local level.  Try to see what the policies are within your governments.  What the Global South is showing is a trend towards more and more regulation: in the past years, since 2019, the governments in the Global South have been moving more towards regulation and less towards online autonomy, which is slightly problematic.

And the way we can influence that is to keep a very keen eye on and be aware of what rules and regulations your governments are bringing in, and how they will potentially impact your freedom, your expression and your privacy in cyberspace.  Because privacy is a human right and a fundamental right in almost all Global South countries, and while they do give lip service to the idea, they do not take it to the extent of actually implementing it at the national level.

So we need to make sure that the governments are bringing in and implementing these policies at the national level.  We also need to consider how we can influence and engage in these policies, and one of those ways is by focusing more on the idea of youth participation.  What we want to see is a government that is trained in taking youth views into account, where youth community views are taken into account through feedback mechanisms and engagements, to make sure that the youth community is participating and that they are ‑‑ (No audio).

The youth voices.  So organizations such as NetMission ‑‑ (Garbled audio).

Can you hear me?

>> THEOROSE ELIKPLIM DZINEKU: Yes, we can hear you now.

>> SHRADHA PANDEY: There was a slight disconnection.

NetMission.Asia is trying to give voices to the youth community in Asia.  The Youth Special Interest Group is trying to do policy analysis.  The ‑‑ (Garbled audio) and all of these are activities and avenues where you can get involved.  (Garbled audio).

Thank you.

>> THEOROSE ELIKPLIM DZINEKU: Thank you so much.  At this point, we will open the floor for questions.  So any question?  Yes, thank you.

Questions?

>> AUDIENCE MEMBER: First, thank you very much for the amazing panel.  My question is about what you think the tech companies should do when it comes to such matters.  Because what we see is that politicians are quite slow when it comes to catching up with technology.  They don't really understand the mechanisms that are behind it.  I come from Turkey, and what we do in our work is try to show that the changes around privacy and child safety can all happen through product changes.  Just even thinking about the terms and conditions ‑‑ they are so small, right?  It could be different.  So I'm really curious about your opinion on what Big Tech should do and how we can hold them accountable on these matters.

Thank you so much again.

>> THEOROSE ELIKPLIM DZINEKU: So she ‑‑ okay.  Do you want to ‑‑ oh, okay.  So we can take two and then we'll answer them.

>> AUDIENCE MEMBER: Okay.  Hi.  I have kind of like three questions.  So my first question is: since this is a youth policy-focused conversation, is there an African youth institution, or just an institution, that is formed to create policies from the ground up rather than adopting policies from elsewhere, one that's solely focused on building policies from the ground up?

And then, two, on the implementation aspect ‑‑ what's it called ‑‑ something which comes up in all these conversations that we have about policies, because we can come up with good policies and all that, but at the end of the day, it's the implementation that matters.  Is there an institution that is focused on policy auditing, or a youth one, considering, like I said, the perspective that we are focusing on, that's solely focused on auditing policies?

And then my last question ‑‑ sorry, you have to answer and then I will say it again.

>> THEOROSE ELIKPLIM DZINEKU: Okay.  Yes, we'll take the last one.  We have a hand raised online as well.  Okay.  So we'll take them in batches so that the speakers will not forget the questions, right?  So after you, and then we'll move on.

>> AUDIENCE MEMBER: Okay.  So good afternoon.  Congrats on this panel.  I'm here representing the Elena Institute, whose mission is to honor children.  Part of our work is related to the digital rights of children and adolescents, especially related to the commercial exploitation of their online vulnerabilities.  I would like to hear more from you on practices like targeted advertisement and so on.

Especially in the Global South, and towards children, who are especially vulnerable.  So what are, for you, the roles of each part of this digital ecosystem in this decision?

>> THEOROSE ELIKPLIM DZINEKU: Okay.  So I will go over the three questions shortly and then I will ask any of the speakers who are ready.  If I can remember, the first one is on the role of Big Tech companies: what exactly are they to do in terms of data protection, how they can keep data safe, right, and how they can educate more people on that, right?

And then the second one is on data protection policy, right?  Can you help me with the data protection policy one?

Okay.  And then the third one is on targeted ads for children.  Selby, you start, and then we will take the rest of the submissions.

>> ABRAHAM SELBY: Thank you for the question.  Data protection regulations start with the government.  The United Nations IGF, as part of its theme, is also preaching about data protection and privacy.  It is mandated that each government, each country, must set up a data protection act.  In Africa, as of June last year, based on the analysis reports we checked, we have 14 countries with a data protection act.

In Ghana, we have the Data Protection Commission.  It acts as a regulator.  The regulator ensures that entities, companies, businesses and individuals who process or collect personal data of any data subjects register with the commission.  It doesn't end there.  After you have registered with the commission, you have to be compliant.  Compliance means that they run audits every two years in Ghana to check: how do you safeguard your data?  Do you seek consent before you share people's data?  What are the third parties you share the data with?  When you fail, when there is a breach, it comes back to the key stakeholders you have defined.  When you register under the data protection act in my country, Ghana, you must provide the key people, so that if there are any irregularities, the key decision makers will be held accountable.

But this is the gap, because we are talking about a gap.  Some individual countries have it.  The gap in Africa is that the African Union should champion one data protection regulation that is going to cover every country, so that we have state institutions who are going to regulate their companies and businesses.  So if I have a business in Rwanda, in Ethiopia and in Ghana, we have one common data protection regulation that is going to regulate us.  And this is how it happens with the GDPR.

If I'm in Belgium or Germany, the GDPR is the regulation that covers that.  So the gap is that all the government institutions must come together and work together with the United Nations to formulate this process.  If we ever do that, we close the gap, and compliance, that is, the auditing section, will be effective.  Thank you very much.  I hope I have answered your question.

>> THEOROSE ELIKPLIM DZINEKU: Okay.  Thank you so much.  So let's quickly move to Manu to answer the first question, on the role of Big Tech.

>> EMANUELLA RIBEIRO: We are going through this in Brazil and a lot of other countries, and, like they said about data protection, this also creates an opportunity for us to think about what questions we want to pose as the Global South, because today we know that companies work a lot to maximize shareholder profitability.  How can we have civic engagement to create pressure, and see the community as a stakeholder, so that pressure from us makes them think a little better about this massive use of data?  I will be really short in this response because it's complicated and also we have four hands raised online.  I would really like for people to contribute: if you raised your hand to ask a question but you want to answer a question instead, you are allowed to.  So these are just preliminary considerations.

>> THEOROSE ELIKPLIM DZINEKU: So before we move online, I will let Daniel answer the last set of questions in the room, on data privacy and children.

>> DANIEL OPIO: Thank you.  That's quite a tough question.  I'm hoping that if there is someone in the room who has something, they can jump in later on.  But with behavioral advertising or profiling, the nature of the exchange for children and young people online is not any different from those who are above that age; usually you have seen things that come up on, for example, Facebook for business, Instagram business accounts and all.  I think there has to be a form of conversation on moderation, how we can, you know, moderate the tech companies, the tech giants and the businesses that are operating online, so that they do not serve the user profiles of children and young people that engage with their platforms.  Case in point: DuckDuckGo, which uses search keywords to allow advertisements to go online.  They don't profile anyone.  They declined to share the personal information on the behavioral patterns of use of their clients with Facebook.

The same with the New York Times; they declined to share that type of information with Facebook.  I think if we can engage more with businesses, we can achieve a certain level of success.  Otherwise, that's a conversation that needs more than just ‑‑ I think a lot of it is in the hands of the developers of this technology.  Just like, you know, YouTube for children was created, and there has to be authorization from parents and guardians for children to access it.

I think the element of design comes in a lot to ensure that we help curb behavioral advertising, which can draw young children to places that they don't want to be.  I think I have tried to respond to that, but if there's anyone in this room who can think of a better way in which we can address that issue, I would be very glad to hear your views on that.  Thank you.

>> THEOROSE ELIKPLIM DZINEKU: We have some hands raised online.  We have four hands online so far.  So we can take ‑‑ Manu, you can help to moderate that.

>> EMANUELLA RIBEIRO: Yes, the first one is Bibek.  You can introduce yourself as well.

>> AUDIENCE MEMBER: Thank you, everyone, this is Bibek, I'm from Nepal.  I want to make a quick comment in the interest of time.  Looking at the Global South, there's an extremely large number of mobile users, and they are using social media.  So when we talk about having frameworks and regulations for governing the data, I think there's a lot of risk with small and medium enterprises because they don't have security compliance, and government regulations cannot actually oversee all of these aspects, so it comes down to the due diligence of the company to look at testing.  So I want to make the comment that we should advocate for better practices on both ends and that even very small start-up companies should work with due diligence.  That is my comment.  Thank you.

>> THEOROSE ELIKPLIM DZINEKU: Thank you.  Can we have the second person?

>> EMANUELLA RIBEIRO: The second one here is Chris.  You have the floor.

>> AUDIENCE MEMBER: Okay.  Can you hear me?

>> Yes, we can.

>> AUDIENCE MEMBER: It's great to see that we have a conversation on digital privacy.  Digital privacy needs to be shared ‑‑ (Distorted audio).

And in some developing nations.  So I would like to know, how do you intend to solve these challenges where we have those ‑‑ in my context, the African region?  (Distorted audio).

>> THEOROSE ELIKPLIM DZINEKU: Okay.  Can we take the third one and then we hand over to Shradha.

>> EMANUELLA RIBEIRO: Yes, this is Nicolas now.

>> AUDIENCE MEMBER: Good afternoon, everybody.  Can you hear me?

>> THEOROSE ELIKPLIM DZINEKU: Yes, we can hear you.

>> AUDIENCE MEMBER: Okay, my name is Nicholas.  Congrats, guys, on an excellent panel.  So my comment has to do with asymmetric treatment, and a practical example.  In the UK, when you go to Google, you have to consent to their cookies before you get access to the platform.  In other areas, that is not there, so you get more data mined from users.  So I think it is important that, as the Global South, we champion discourse to get Big Tech to treat us similarly to how they treat the other countries in terms of data handling.

But more importantly, in relation to the younger generation, I think that it is important to educate them on the value of privacy itself.

Because I fear that our kid brothers, and sometimes ourselves, are so used to being in the social media space that we no longer value what privacy is, in which case even the education on data protection may not be of any value to them.  So we have to teach the need for privacy, so that it becomes meaningful for them to apply it.

Thank you.

>> THEOROSE ELIKPLIM DZINEKU: Thank you.  We have a hand up in the room.

>> AUDIENCE MEMBER: Thank you again.

In terms of the political ‑‑ sorry, the advertising profiling of children, I want to add that what we believe works quite well is looking at the dimensions that make children and adults vulnerable, for example psychological tendencies, and asking whether other sites should be given that type of information.  If we go into the data, categorize it and understand what type of information we are giving consent to, then we know what we are giving consent to and whether it can be used against us.  So, for example, we did some analysis on the political targeting operations in Turkey and we identified that there is political advertising.  And when micro-targeting is used, let's say you have a Turkish flag on your profile picture, Facebook knows this might give some information about your political profile, and that further gives ideas about what can be done with it.

So we are giving way more information than we think we are.  Even if you just take a look at the fact that going into social media is the last thing we do at night, that gives a lot of information about our sleep patterns, which tells a lot about our depression and our personal daily lives.  So if we really understand the depth of the information that lies there, I believe that will help people like you ‑‑ and thank you so much for the panel ‑‑ to really regulate what these companies should be doing and should not be doing.

>> THEOROSE ELIKPLIM DZINEKU: Thank you.  So we are going back online to our last submission, from Lucy.

>> Thank you so much.  Thank you for ‑‑ my initiative is connected to this.  My submission is on children, and I hope we address the protection of children.  One of the main things that we need to understand is the protection of children; that's one of the major things that has to be addressed.  Thank you.

>> Do we have any more comments or questions in the room?  Yes.  Then I will go to the speakers for the last submissions.

>> AUDIENCE MEMBER: I have a question, or rather a contribution, regarding targeting and children online.  I come from Colombia, and we studied not only direct advertisement but how influencers advertise indirectly.  I think this is a much more complex issue, because you cannot fully understand it under data protection regulation ‑‑ yes, it does have a relationship with data protection, but not entirely, because influencers get to the kids through other ways.  So I wanted to add that this is a much more complex issue that deserves more research, and we might talk about this further.  Thanks.

>> THEOROSE ELIKPLIM DZINEKU: Thank you.  I will go back to our speakers, with 30 seconds to one minute each.

>> ABRAHAM SELBY: Thank you.  I will keep it within the minute.  On data protection policy for children, the age varies from country to country.  The GDPR takes 16 years and others take 13 years.  There is content in Article 8 and also some regulations on consent and other things, and also information society services.  Now, my final take is that data protection is key, and I will advocate that, as far as we have courses, people must understand it; we must make it a program in which we can train our youth.  I believe that is the only way we can close the gap, which the youth, I believe, need to understand around them.  They must know their data protection rights.  Why are you taking my data?  What are you using my data for?  These are questions you should ask when someone is processing your data, and privacy policies should be set in all the policies you set up.

>> KARLA GIOVANNA BRAGA: I'm humbled to be here.  I want to say thank you so much for this panel, and I believe we have spoken very well about how the Global South is really vulnerable when it comes to data privacy.  We have a lot of ability to build a community that can support us, because people don't include us in the design of these platforms, so I think that is where we can participate ‑‑ I don't know how to say the word in English.  We can create spaces, we can hold more solidarity and bring more people, like me, like us, to build our resilient space to reach the sustainable goals of the UN.  Thank you so much.  Sorry, I'm a little nervous.

(Laughter)

>> DANIEL OPIO: Karla, you are doing just fine and well.  Thank you for your time and indulgence.  I hope to maybe meet the amazing audience members who were commenting about behavioral profiling.  I think we have a lot to talk about, both the two ladies and the gentleman at the back.

>> THEOROSE ELIKPLIM DZINEKU: I will move to Manu, followed by Shradha.

>> EMANUELLA RIBEIRO: I would like to thank everyone who engaged with us.  We wanted a lot of participation and we had it and I'm really happy for it.

And I would like to also call on you guys: let's keep discussing this as youth from the Global South.  Let's, you know, search for the spaces we have for this; I think it's really important.  I will close with a comment by Gene, who said that the data protection education gap is only the tip of the iceberg; it's a major economic issue, and we must not forget that there are economic interests that should be taken into consideration for these issues.  How do we see ourselves in this, and how do we unite our fight, as the Global South, for better conditions and better technology that really serves us?  Thank you.

>> SHRADHA PANDEY: Thank you.  Thank you so much.  And I want to end by giving all the youth ambassadors and everyone in the audience who is interested in youth participation one key insight: keep the conversation going.  The questions that you are asking right now are really important, and they could each turn into one-hour panel sessions on their own.  Try to participate more in such IG spaces and join the Youth Special Interest Group and other youth organizations that are currently working.

For example, in Africa we have a mapping project.  In Asia we have NetMission, which is trying to make sure that policies and so on are analyzed, and they have a youth wing which works exclusively in this regard.  For youth ambassadors, after your year is over, do not think that it's an end.  It's only the beginning.  Continue working in this field.  Reach out to the previous youth ambassadors.  Some of them are doing exceptional work in the field of policy and in making sure that our world becomes better and that data privacy is protected in their respective countries.  We need to make sure that this work continues and does well.

For that, we need your participation, your support, and most importantly, your interest and passion to make sure that you leave the world a better place.  Thank you.

>> THEOROSE ELIKPLIM DZINEKU: Thank you so much.

I guess we will have to end here, but just a quick summary of what we discussed.  The session was basically on data protection from the perspective of the youth, and we had conversations around Big Tech companies and their role in data protection.  We spoke on legislation.  We spoke on education and more understanding.  We spoke on targeted ads for kids and their understanding of all that.  We must say that this conversation has not ended here.  Let's continue with the conversation in our spaces, and those of us, here or online, that have a greater role to play in making policies work or making policy inputs, whether you belong to civil society, government or anywhere else, do the work.  We are so grateful that you are here.  Thank you for spending your time here.  My name is Theorose, and I wish you a nice day or afternoon, wherever you are.