IGF 2022 Day 3 NRIs Cooperating to Protect Data at Local Levels

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> MODERATOR: My name is Jennifer Chung, and I will be co-moderating this session. Next to me is Mr. Claudio, from Brazil, who will be co-moderating this session with me. Maybe a few words of introduction first?

>> Thank you, Jennifer. Good morning to you all. My name is Claudio, I am currently a Professor at the University and, in this capacity, also a member of the national data protection and privacy Council in Brazil, although I am not speaking on behalf of the Council on this occasion.

>> MODERATOR: We have a full session of one hour.  It will be quite tight.  We'll have two sections.

The first section looks at existing data protection frameworks and whether they actually provide data protection. The world is filled with many data protection frameworks. Europe has the General Data Protection Regulation, the GDPR. Africa developed the Malabo Convention. In the Caribbean, discussions are ongoing on a regional data protection act. Asia has its own frameworks. And out of 193 Member States, 137 have developed legislation to secure the protection of data and privacy.

But as we all know, data breaches still happen around the world, and they are on the rise. You see it in the news: another breach of credit card information, another case of fraud, personal data breached again. Breaches take increasingly varied and sophisticated forms, from the unlawful use of personal information without notice or consent, to fraud, to the inability to correct the data you put there, to illegal data trading. Why do these challenges exist when international, regional, and national protection frameworks are in place?

So now we really want to go around the world, through our NRI network, to look at the legislation and regulations: what is happening on the ground? We want to see the best practices, the reality, and what kind of legislation and protection measures are coming down the pipeline.

I would like first, really, to turn to our first NRI input from Central Asia IGF.  We have Artem.  What is the situation now for your Region? 

>> Artem: Thank you, Jennifer. What I would like to say about our Region, Central Asia: first of all, all of our States have personal data protection laws, even in an area where human rights are often thought to be restricted. Are they compliant with the GDPR? More or less. Do they work? No.

I would also like to mention one of the trends that is highly discussed in the Region: data localization, where the Government requires all companies that work with personal data to store this data only on the territory of the state.

Of course, they do it under the pretext of protecting our data. But the trend came mainly from Russia. In Russia, for example, they imposed penalties on Facebook and LinkedIn just for not complying with the law.

In our countries, Kazakhstan has such requirements, and this is being widely discussed in Kazakhstan and Uzbekistan. The problem here, of course, is that they try to oblige tech companies to store the data on their territory. Tech companies do not want to do it because it is an additional expense, and they do not want to share the data. But actually, I own this data. As a person, I own this data. They do not ask me where I want my data to be stored. That is awkward, because I am the owner, but Governments and tech companies handle my data, not the way I want it to be handled but the way they want to handle it. I would call that a kind of data slavery, because nobody asks me about it. Tech companies treat me as a piece of data, and Governments treat me as a piece of nothing. That is the problem, in broad context.

For example, in Europe, the U.S., Canada, wherever, they talk about human rights. In our countries, we may talk more about humans' rights: it is a group of humans who have rights, and usually they have the right to exploit other humans' rights, including our data. So the problem does not exist in the laws. The problem is in how they are implemented, and it is more about the culture of the rule of law. It is systemic; it is not about data, it is a systemic problem. Until we have the rule of law in everything in our countries, we will not have our data protected. Thank you.

>> Just to complement a little bit here. When you put it like that, it is already complex enough talking about the data that you have, but we all consider it also a human right. It is data that you are, data that we are. It is a very relevant aspect of who we are. It is not only data we have as a possession. Data has an economic dimension, and it is clear to everyone that we have to protect and foster that economic dimension. But it is also data that constitutes who we are.

>> MODERATOR: Thank you, Claudio, for bringing home how important this is: we are our data. And thank you also, Artem, for highlighting that we need the rule of law first, before we even talk about all the data protection frameworks and legislation.

I would like to turn to the home continent here, Africa. From the North African IGF we have Dijani.

>> Thank you. A data protection framework is one of the tools of data protection; there are technical tools and legal tools, and the framework is the bigger tool. But there are a lot of frameworks. (Feedback) The GDPR is the most famous, the one people try to comply with. Why? Because it imposes fines. Canada has a good framework, but there are no fines, and that is why nobody cares. With the GDPR, everyone cares.

So those frameworks are, as I said, one of the tools of data protection; there are also technical tools and legal tools. All these tools, whatever they are, do not give you perfect protection of your data, because any framework can be violated, and because any technical protection such as encryption, however complex the encryption model, can always be decrypted by someone.

You can never rely on the tools alone. There is another aspect nobody speaks about, which is user behavior. We have a big problem because people are not aware of the threats. People are exhibiting their lives, their businesses, everything on the Internet. You are giving people your data, so do not blame them if they take it like that. Under the GDPR, they are not allowed to store your data without your consent, because collecting, storing, processing, or transferring data needs two things: first, the consent of the user, of the data subject; and second, a legitimate purpose for that action.

If either of these two conditions is not met, they are not allowed to take your data.

So, to be brief, in Tunisia we have a very good framework; it is very close to the GDPR. But there are breaches, as you know, despite all of the tools. We have to think more broadly. We have to use those tools, and the most up-to-date of those tools, but we also have to be aware of the threats and have very good awareness of what to do about them. Thank you.

>> MODERATOR: Thank you. I think you brought forward two really important points. The first is implementation. You mentioned that Canada has a very good framework, but the implementation is lacking, so nobody really cares; there are no fines.

The second is user behavior: if you are not aware of your data, how it is protected, and what your rights are with regard to it, you are not utilizing the national protection framework available to you.

Since we have heard about the GDPR from the Central Asia IGF and the North African IGF, let's turn to a European country to hear what the situation is in Italy. Giacomo, what is the situation in Italy?

>> I don't want to disappoint you, but the GDPR is far from perfect. It does, however, fix the principles and fix the fines for violating the principles.

Now, the real impact of the GDPR will be seen when the procedures that have been launched in the courts for violations of the privacy rules arrive at a decision. Because at the moment, as you know, the procedure in Europe is that if there is a violation of the rules, the regulatory authority asks for an explanation and a confrontation with the company, and then -- I can try to be closer to the mic. Yes.

I will try to regroup, naturally. After the regulatory authority decides on the fine, there is an appeal procedure, which goes to the normal tribunals. There are already some important fines that have been given to companies in Europe, and we are now in the tribunal phase. If the tribunals confirm the fines, the companies will understand that they have to behave. As you know, they have tried many times to bring it to the courts; if the courts rule in their favor, then they stop complying.

But the problem is the time between the decision on the fine and the application of the fine. The most important challenge for the GDPR is to be operational immediately.

At the moment, many authorities, including the Italian authorities, are trying to be firm, but on the other side they also try to get the agreement of the platforms. Because if you do not get the agreement of the platform and you go to the tribunal, then for years the situation you identified as dangerous will remain. That is the first thing.

Second, there is a very recent development that entered into force last week: the DSA, the Digital Services Act. The Digital Services Act replaces the former e-Commerce Directive and applies to all online services. It gives power to the authorities not only to intervene and assign fines, but to stop the services. That will be even stronger than before, because before, the procedure was long. We do not know yet how efficient this will be. With the DSA, if there is a serious violation, the authorities can intervene. There is a six-month period in which each country in the European Union will decide which authority will implement this procedure. Once the authority is in place, then -- I will stop here. But there are many other things that will be interesting to discuss later.

>> MODERATOR: Thank you, Giacomo. 

>> It is important to consider that this is a regional framework. That is its strongest point. Because Tunisia against Google is unfair competition; Europe versus Google is something that makes sense.

>> MODERATOR: That was a very important point. I think Europe is leading quite a lot when it comes to pushing to make big tech accountable, especially with the Digital Services Act.

I want to drill down more, pivoting to look at creating a safe space for children online in this context of protecting data.

I know you mentioned there was a recent case in Italy.  Maybe we can hear more about the situation there on this particular case?  I think it was TikTok or social media? 

>> With the time and the audience in mind, yes, there was a case last year; the decision of the authority is dated 22 January 2021, and it is in Italian. What happened is that during the Christmas holiday break, a young girl of 11 years old played a game on TikTok, a challenge to stay as long as possible without breathing. The game went too far; effectively it was suicide. She was alone looking at TikTok and she died.

This created a huge reaction all over the country. The public media were very severe about it. The privacy authority decided, exceptionally, to intervene with this order on 22 January; they stopped TikTok and went beyond their usual prerogatives. They said it was a case of public interest for the whole country. They made this order and stopped it: TikTok's service in Italy was suspended for two weeks.

Then they made the suspension while asking TikTok to react and to try to comply. What was the main problem? Normally, if all the rules had been applied properly, this young girl of 11 years old would not have been able to be on TikTok. The minimum age is 15 according to TikTok, and in Italy it is 14; under the GDPR it is 16, but the age of consent in Italy is 14.

So there is discussion. But in any case, they were acting outside the law, because this girl was 11 years old and nobody verified her age.

The authority said that it would not revoke the suspension of TikTok in Italy unless they started a new verification of age. Even if they accept the self-declaration of the interested person, it is not the best solution.

They asked TikTok to introduce secondary tools to verify even those who declare themselves to be of age. Another important provision they asked for was that TikTok understand the language: not only artificial intelligence, but also real moderators on the platform who can intervene and understand the language. The problem is, if somebody does not understand the language, what are they moderating? This was a second important point.

They then gave the company two weeks to react. The company put a system in place within two weeks, and they are now operational again.

The procedure against them is still going on; we have to see how it will end. But what is important is that there is a possibility of immediate reaction, and the platforms, if they do not want to lose an important market like Italy, have to comply with that. It is interesting to follow, and we are following it carefully, of course.

>> MODERATOR: Thank you for bringing up that responsibility, the discussion about whether or not age verification is a silver bullet, and the current case against TikTok happening in Italy. You mentioned at the end of the previous intervention the regional framework; that is why it is so powerful and why you have this leverage to push big tech to become accountable for what they need to do, and to act.

I would like now to go to Asia-Pacific. Next to me, from the Bangladesh IGF, I have Mr. Mohammad Abdul, Executive Director General of IRBF.

>> Thank you. Honorable speakers and audience in the room and around the world, good morning, good afternoon, good evening, everyone. I'm Mohammad Abdul. This session is named Cooperating to Protect Data at Local Levels, a very interesting subject in terms of global Internet governance politics.

First of all, we support data localization. Data classification and data localization laws require that data about citizens or residents be collected, processed, and stored inside the country, often before being transferred internationally or regionally.

In business, data classification has close ties to data clustering in the country context, where data clustering is descriptive and data classification is predictive. Classification consists of using known values of variables to predict unknown or future values of other variables, in the country context.

Which data should be open for all? Which data should be restricted? We need to ensure data security, data privacy, and data security guidelines. We need to know exactly the challenges, opportunities, progress, and way forward regarding these matters in the country context.

In Bangladesh, we organize the Bangladesh Internet Governance Forum each year, along with kids, youth, and women IGF initiatives and the Bangladesh School of Internet Governance. Through these forums we discuss protecting data at local levels and hold all stakeholders accountable regarding these issues. The BIGF has already organized a lot of policy discussion about the draft data protection act.

Thank you everyone for giving me this opportunity and to deliver my thoughts. 

>> MODERATOR: Thank you for the insights from Bangladesh. It is interesting to hear that there is a draft data protection act in Bangladesh, and about the good work BIGF is doing to convene local stakeholders to discuss what data protection means for Bangladesh.

There are a number of NRI colleagues on the Zoom session as well. Is there someone who would like to speak on the first segment?

>> I don't see them online.

>> MODERATOR: All right.  Well noted.  We will go to the second segment about regulation.  I would like to pass it to my co‑moderator for introduction.

>> Thanks, Jennifer.  We have a slot here assigned to a representative of the youth.  If there is anyone from the youth that we don't know of here, please come forward.  Dijani has been a youth for a long time.  (Laughter)

Jennifer, I would like to start. I don't know how many of you are connected to regional or national Internet governance initiatives. Maybe it is good to give a key word about these structures in the Internet governance system, which have a two-way mission. National and regional initiatives are important in two ways. One way is the legitimate way to reflect locally the best practices and findings we have in global events.

The other way around is where we identify and bring forward local issues, issues that can only be spotted and identified at a local or regional level. Those are the platforms, very important platforms, from which regional and national issues come forward to global tables of discussion like this one. We are now moving forward. Giacomo mentioned the difficulty of dealing with enforcement and the recognition that the path might not be efficient at all times.

That is where we are moving now. I would like to start by giving you an insight from a personal experience, a side issue on which we worked together with a foundation based in Panama. In the middle of the pandemic, we conducted research for them looking at cybersecurity issues in Latin America during the pandemic. One of the figures, one of the metrics, impressed me a lot.

I use it as an illustration every time we talk about the dynamics of governance and innovation. As of February 2020, Chile had 0.5% of its workers performing some kind of remote activity for professional purposes. February 2020: 0.5%.

As of April, a few months later, that figure had risen to 5.6%. By June, with new metrics, it was 18%. There is absolutely no way to prepare -- we are talking about cybersecurity issues related to how you prepare to perform your job from a remote location. There is absolutely no way that regulation will follow that path, that evolution, those dynamics. So, bringing that to our data issue: data protection is not going anywhere. To that extent, the IGF has contributed a lot to raising awareness and to urging solutions to these problems; that is why we are talking now.

I would like to turn to our next speaker. Chang Choi is there now. He can enlighten us on what lies beyond regulation and beyond the rule of law in the evident issues we have to deal with when we talk about data protection. What else could be available? Thanks for joining us.

>> Am I audible there?  Do you hear me.

>> Yes, we can hear you.

>> Yes, yes. 

>> (Audio barely audible via Zoom) (audio skipping).

(Distorted) so many Government data to the whole industries.  To compare with OECD.  (?) two years ago (audio skipping) (distorted).

I meet with Government officials and they are focusing on how to (?) (distorted). 

Open data for the public views.  And the potential use throughout all of the institution mechanism.  This is really, really important. (?) it is public resources for the industry and then some start‑ups and then some Government commission some research companies (?) (distorted).

A public Government or public agency should share its own data with the public, the industry, and the general public.

This is what I want to propose.  Thank you. 

>> MODERATOR: Thank you, Chang Choi, for the context from South Korea. It is quite interesting from the Asian perspective: unlike in Europe and other continents, it is difficult to look at it from a regional standpoint, so national legislation and regulation are the strong point for enforcement and for providing the protection that we really need for personal data.

I would like to turn to the next speaker, who is online. I believe it is Jenna Fung from the Youth IGF, so this is the youth voice Claudio was hoping to hear. Jenna has something to mention about a bill in Canada. Jenna, the floor is yours.

Jenna, you are on mute.  

Can the host please unmute Jenna Fung from the Zoom room? 

>> JENNA FUNG: Can you hear me? Thank you for the time; I am speaking in my personal capacity. This is from my experience living in Canada at the moment, and from digging in and learning from podcasts, knowing they have had a bill in Canada in recent years. This is an act -- the Consumer Privacy Protection Act and the personal information and data protection act in Canada -- which may make consequential amendments to other acts. Right now, it is not completely regulated, I believe; it is more or less governed by the policies of the private corporations that hold our data. I believe regulations on how these companies handle our data are very important.

One recent update, the latest update, is about Bill C-11 in Canada, the Online Streaming Act, which will essentially change Canada's broadcasting policy and give new powers to the country's broadcasting regulator. This may require international companies like Spotify and other companies that provide streaming services to comply. Taking Spotify as an example, right now what we see completely depends on how they put our data into the algorithm and then prioritize and suggest content to users. Essentially, that may marginalize the productions or creations of certain creators. I believe that is what Canada is currently trying to address by discussing and taking action to protect the competitiveness of the creators in their country.

That is something I believe different countries can work on in terms of protecting creators: having this kind of act to deal with how these companies, like streaming companies, take care of the data and prioritize the creations on the streaming platform. At the same time, I think that protects end users, too, because it may naturally change how end users see the content. Taking Canada as an example, it is easy for us to see content from the U.S.

And then maybe some content from Canadians is eliminated because of the algorithm.

So I believe acts like this, like the Online Streaming Act, are steps different countries can take in order to protect the economic value of their own creators as well as the end users in the country. Thank you.

>> MODERATOR: Thanks, Jenna, for giving us an overview of the situation in Canada.

I would like now to pivot back over to the Pacific Region, from the Australian NRI, we have Cheryl joining us online.  If the host can please unmute Cheryl from the Zoom room. 

Host, if you can please unmute Cheryl Orr from the Zoom room and allow her to speak. 

>> Cheryl Orr: All right.  Hopefully you can hear me now.  I see Jennifer nodding.  I will take that in the affirmative.  Thank you for the opportunity.  Our Australian initiative, running in a rebooted form in the last few years, has privacy, protection of privacy as a core subject for all of its events in recent times.  But I just wanted to share with you a couple of things relating to what I have heard today in this meeting. 

We had -- I should say have -- a relatively privileged position of having quite robust privacy principles embedded in our Privacy Act. The act is circa 1988, so it is very much due for a refresh. But even so, the 13 privacy principles enshrined within the act are effective on any instrumentality, Government, or business that has a turnover of more than $3 million. You know, that is a big whack of what happens in a country.

This act has put us in a good state for quite some time, and I would suggest that is because the national privacy principles were developed with input from broader sections of the community. So, for example, Internet Australia, the Internet Society chapter, was one of the big bodies that contributed right back there, pre-'88, on the privacy principles. So we were thinking Internet even before it was the modern thing to do.

However, almost 35 years on, it does need a refresh. We also, however, have been tested recently by some very significant hacking events, which have hit both telcos, a large telco, and indeed Government instrumentalities, unfortunately including an instrumentality looking after Medicare, our health insurance, with the risk of great exposure. And those principles are being used now to ascertain what type of punitive measures as well as remedial actions will be enacted on those companies.

I guess what I was trying to share with you today is that if you do a reasonable job of developing these protections, these personal privacy protections, even if they are in a very general and, in our case, rather ancient form, they still have utility. And that is certainly something being played out in Australia literally today, as the last lot of data from the Medicare issue has been dumped online. Thank you.

>> Thank you, Cheryl, this idea of a multistakeholder construction of the framework is the exact and perfect ending we needed for this round.

Now we have a good chunk of time to hear from you. I will try to wrap up a little bit. We heard alternatives starting from the idea of the rule of law, without which it is difficult to move forward. Dijani mentioned the fines, and Giacomo raised other issues about how Governments and regulators enforce them. The issues that interest us include the effectiveness of the data protection frameworks. Another issue we should leverage is the incredible opportunity of an IGF held on African territory: are we leaving the Global South behind in some way? We are all interested in that.

If there is another alternative that you think can add to the ones we have heard up here, please feel free. Okay.

>> ATTENDEE: Hi, everyone. I am the Vice-Chairman of the MAG, South Africa. Most of the discussion is around the data protection laws only, regardless of other laws that affect them. In the African Region there are laws that affect data protection; they give the authorities powers to breach privacy. In some cases, like Sudan, they have data protection, and Egypt has one that is good. But we have a national security law in Sudan that allows the breach. That is one point.

Also, the last thing: the reason Governments use to breach privacy is national security. What is the meaning of national security? We need to work on this. I think if the IGF adopts something, a recommendation to try to define what national security is and what the criteria are for considering some action harmful to national security, it will be good for moving forward. Thank you.

>> Important topic. We have that in Brazil: we have a recent specific law, but if we go back in time, we can identify over 40 instruments that touched upon these issues in sectoral or specific areas of law. That is something for all of us to take into consideration. Do we have anyone else? Tiago. Thank you.

>> ATTENDEE: Thank you, I am a representative of the data protection authority in Brazil. For sure, I believe multistakeholder cooperation is important for this. But at the national level, maybe even for smaller countries, there is an approach that might work. It is hard to coordinate, but it can work as well; the experience I can share is about the WhatsApp case in Brazil. It happened at the beginning of the data protection authority, when we were a small team with no leverage to grapple with WhatsApp's privacy protection policy. It was an interesting experience.

The competition regulator and the consumer protection regulator acted together with us in cooperation. This was unique: we did not have agreements beforehand; it was very ad hoc. But it was our first experience showing that the challenge of dealing with digital platforms is not only data protection; it spans several fields of law. Having several regulators working together, I know, is difficult, but at the technical level it brings bigger leverage against big tech and other digital platforms. That is something to share as an experience.

>> It is a great example of the intertwining of laws brought up by our friend here. Please.

>> ATTENDEE: Thank you very much to the panelists for very interesting, insightful presentations. My name is Jonathan Andrew, I work with the Danish Institute for Human Rights. I'm here with other colleagues from national human rights institutions. We're working on sensitizing National Human Rights Institutions to the work of data protection authorities. It is interesting to hear from one of the data protection authorities specifically on issues around enforcement. I think another point to consider, particularly listening to other human rights institutions over the last couple of days, is that they have really been asking for information and guidance on how institutions and public sector authorities can better comply with data protection laws and protections of privacy, also at the constitutional level.

I think an important point to consider is how regulatory oversight bodies like data protection authorities can also help institutions, organizations, and businesses with compliance. We have done work with the Information Commissioner's Office in the United Kingdom. On their website, they have a huge array of resources, templates, guidance, and checklists, different ways to help all manner of organizations by informing and aiding them. They have a helpline you can call with questions. I think this is a key point. There are an awful lot of different organizations, private and public sector, that want to know what they have to do, how to inform their employees or colleagues, and how to conduct training. So I think this is a really key point.

Also, sharing best practice resources and thinking about the education aspect. Because really, enforcement should be the last line; we should be encouraging as much protection and compliance as possible. We might forget the many interested parties that want to do more to comply and need help and guidance. Thank you.

>> MODERATOR: Thank you very much for that important insight, Jonathan. When you say enforcement should be the last line, I'm sure that is not to diminish the importance of enforcement, but to push it back toward other ways; we are talking about collaboration and cooperation. So collaboration should be at the base.

>> ATTENDEE: (Off mic). 

>> MODERATOR: You were not misunderstood. I kept that point because we heard it from many of the members of the Board of the data protection authority in Brazil. It is an authority that was set up during the pandemic, at a time when people could not meet. There was every reason for the authority to be far from the public, far from the companies, far from Civil Society. And I'm proud to say they followed exactly the contrary path: they connected and opened all possible points of contact with Civil Society, the Public Sector, companies, and the community. The good thing is that the spirit built during the pandemic is now the spirit of the national data protection authority. There are issues that cannot go off the table. The spirit of collaboration is where we should start from. This is the baseline for our conversation here. Thanks a lot. Someone else?

>> MODERATOR: Any more hands? Keep them up; I can't really see. Okay. One, two, three, four. We will take four more from the floor and we will have to wrap up afterwards. Please.

>> ATTENDEE: Hello, I am from the Dominican Republic, in the Caribbean. In the Dominican Republic, we have a law on data protection that is basically about people, about persons, to protect their data. But it is very focused on the financial sector. So there is a movement, and through the Internet Society chapter there, we are part of a collaborative process to improve the law.

From the point of view of organizations that contract different Cloud services, there is no regulation yet. My particular focus on this issue is that usually the main concern is for people who use application services to have ownership of their data, to move it and use it.

This is just a question: what is your experience with handling data from organizations that contract services in the Cloud? Thank you.

>> ATTENDEE: Again, the European experience: cooperation is the key, again. On this specific point, Europe is preparing regulation about the Cloud. As you probably know, the European Commission was rebuffed by the European Court of Justice because, through the trans-Atlantic agreement, it could not ensure that the data of European citizens is stored and kept on European soil. The European courts asked the Commission to do something. So now they are preparing Cloud rules to require that this data be stored and processed within, and not leak outside of, the European zone, even for the big platforms.

The problem is, if you do this on a national basis, there is no way; you cannot impose that. But if there is data storage in Bulgaria or in Finland, wherever you want within the Region, under the same legislative framework, then it can work. It is working. I'm sure that even big companies will comply with that. Again, it comes down to cooperation and a regional approach, not individual answers.

>> MODERATOR: Thank you, Giacomo.  

>> ATTENDEE: Thank you very much for the input. I will keep it short; my point has been elaborated by others. I am a senior advisor and a MAG member of the international IGF. I want to talk about the cooperation and communication between different regions.

It is good to have good initiatives and good, well-designed regulatory frameworks, but it is more important to share the best practices. That is the first thing. The second is that it is also important to talk about mechanisms to implement them and mechanisms to spread awareness. I am speaking specifically about the African continent. Everyone knows the African continent needs more work in terms of information and awareness at all levels. I want to point out that there is a need for the different regions to work together more to guide, help, or assist Africa, as an example -- I am from Africa -- in implementing a complete action plan towards protecting African data.

>> MODERATOR: Capacity, capacity building: duly noted. We needed that. Coming from the Global South, I know how much a good capacity building program can build on that. Thanks a lot.

Our friend over there? 

>> ATTENDEE: Hi, I'm from Chad. I coordinate the Chad IGF. In terms of protecting end users in rural communities, how can we create awareness about digital literacy in the global context, to make end users actors in their own protection? Thank you.

>> ATTENDEE: Hi, I'm from Togo. My question is that in most of the African countries you have the (?) law. Those laws do not cover new technologies or new challenges. Before the law is passed, another issue comes up, and they have to do another update. This keeps going on. At the end of the day, the laws do not reflect what is actually happening in the digital world. My question is, what do you recommend in that case when putting the laws in place? Thank you.

>> MODERATOR: I propose we put the two questions to our panelists and then move to the end. I don't know if you heard them: Chad and Togo put up two questions. What else can we do to build capacity within national spaces? And how do we deal with the fact that by the time laws arrive, the technology no longer matches? Do our panelists have anything on that?

>> MODERATOR: Dijani? 

>> I wanted to speak on another thing, but I will speak on that. It is a problem of the awareness of decision-makers and organizations: they are not aware of the threats, so they do not think to build capacity for users. They need to raise users' awareness that using the Internet does not mean putting anything on the Internet without caring about what you put there. This is a very important point.

And we, as the North African IGF, organize a School on Internet Governance back-to-back with our IGF to raise awareness. This is the tool to do it. We try to give people a sense of the threats they face if they do not use the Internet as it is meant to be used.

Second, another point about enforcement. Yes, enforcement is the last point, the last step, but enforcement is necessary: if there is no enforcement, a framework is useless. As you see, there are a lot of legal frameworks, but how are they to be enforced in different jurisdictions? This is a big problem. The GDPR is a little bit cross-border, because it punishes even companies that are not located in Europe, since they serve European customers. Other frameworks are local to the country.

I think if we want something to be done, we have to think about a treaty on data protection that every country respects and complies with.

>> MODERATOR: Thank you.  Maybe a 30‑second wrap up?  Maybe 15? 

>> So, from our situation: I represent a local NGO that is actually a local point of advocacy in digital technologies; it has been working for more than 20 years and is well known in the Region. We can work with Government, we can work with the Private Sector. And we organize, probably the same as they said, a digital rights school to gather Government, NGOs, and private experts to talk about different digital issues and technologies and to build a nexus between them to work further.

For example, what we are doing now is writing, probably for the first time in the world, a digital code. There is a criminal code; we are writing a digital code to regulate the digital space. What we want is to not make it dependent on technology, so that it can exist for a long time. It won't depend on 5G, 7G, whatever; it will be in place for many years. Then, of course, it will require some secondary regulation and so on, but it will be there. It is going to be a framework for digital technologies.

>> MODERATOR: Thank you, Artem.  Anu? 

>> ANU: Thank you, Jennifer. A little observation: data protection is a very sensitive issue at the business and security levels. So my observation is that we need more meaningful multistakeholder dialogue and more regional-level dialogue.

>> MODERATOR: Giacomo. 

>> 20 seconds. I'm sorry, there are many points to be developed. First, I just sent Jennifer the guidelines for digital media and literacy documentation. It can be used to develop curricula in schools, to educate children and empower them in the digital dimension. That is important, but it covers only one part.

The second part that is missing is the other dimension: people who have grown up and are no longer going to school, how do you reach them? In Europe, one of the tools we have is public service broadcasting, which exists in several countries. What is happening is that most of their missions have been revised to include the Digital Divide and how to tackle it.

These media can reach the whole population. Those who are not following the traditional media you can reach more through school; the others, through traditional media, radio, and television, which are still powerful.

The third point to mention is the regional dimension. In all of the legislation the European Union has made, there is a regional discourse. If a national authority has to face a big tech company and the case goes beyond its mandate, the authority can ask the other authorities to cooperate; and if it does not ask but the other authorities think the risk concerns the whole region, they can join in with the national authority. The problem is that a small authority with 20 people cannot deal with the hundreds of lawyers of big tech.

But you can if you do it on a regional basis.

The last point to mention -- sorry -- is the recent agreement between the African Union and the European Union. The African Union can ask the European Union for assistance in these processes. Don't be shy; ask for that. For the moment the cooperation is more traditional, but this is the time to do this.

>> MODERATOR: Thank you, Giacomo. I think our greatest takeaway from this session is that we need another hour to talk about all of the issues.

As a first step going forward, I think Giacomo sent a good link to the Zoom room chat. Anya will encapsulate that and send it to those interested, focusing on capacity building and education and the other aspects touched on, including interregional dialogue and cooperation. Maybe that is something to take up in 2023, work to do together. It is important not to lose the momentum and to keep learning from the regional implementations, all of the legislation coming up, and the frameworks that are working right now. Thank you, everybody, online and in person, for a really enriching and wonderful session.