IGF 2021 - Day 1 - WS #170 Child Protection Online - How to legislate?

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> JUTTA CROLL: In two minutes we will begin with the video and then we will go on with the discussion.

>> We all live in a digital world.  We all need it to be open and safe.  We all want to trust.

>> And to be trusted.

>> We all despise control.

>> And desire freedom.

>> We are all united.

>> JUTTA CROLL: Hello, everybody.  Welcome to our workshop at the 16th Internet Governance Forum in Poland and around the world.  My name is Jutta Croll.  I am leading the project Children's Rights and Child Protection in the digital world.  And that is what you see from my background and what we would like to focus on.

The IGF 2021 started today. And when we look back to 2006, regulation, especially government regulation, was not much in the focus of internet governance. The strategy focused on enabling a free market for digital services, and the assumption was that such a flourishing market would regulate itself. Nearly no one expected then a situation like the one we have to face now: a huge amount of illegal content and, in addition, content that is not illegal but still harmful to children and to the community of internet users at large.

Opportunities to communicate and to interact with other users, with people around the world, but also bearing the risk of unwanted grooming and solicitation into risky behavior, be it cyber grooming, sexual molestation or recruitment for terrorist and criminal activities.

We have heard it in the video at the start of this session: we all live in a digital world. And children, young people under the age of 18, do so even more. So, after more than a decade of deregulation and self-regulation, government regulation now features more prominently on the agenda of this year's Internet Governance Forum.

I would like to take this opportunity to refer you also to the main session on regulation that takes place on the first day at 11:15 Central European Time in the plenary room.

In this session we will discuss national and European approaches, including the Digital Services Act, to regulating the protection of children on the internet. I would like to start this session with a quick round of introductions of the honorable speakers we have assembled here today in our Zoom room. And I would like to start with Beeban Kidron. Beeban comes from the UK and she is leading the 5Rights Foundation, which has been working on General Comment No. 25 on children's rights in the digital environment, in close cooperation with the Child Rights Committee of the United Nations. She is an English film director, so she has long-standing experience in the media and in the production of media content. But I have come to know her as a human rights campaigner, and she is also a member of the UK House of Lords.

From Egypt, we welcome engineer Hoda Dahroug. She has been studying at the (?) in Paris and has a Bachelor's in Interconnected Communication from Cairo University. She has assumed several leadership positions, most recently as a member of the Presidential Advisory Council for Community Development, in addition to her position as head of the Central Department of Community Development in the Ministry of Communications and Information Technology in Egypt. Welcome, Hoda, to this session. I am really interested in what you have to tell us from your country.

Then we have Kenneth Adu-Amanfoh representing the African continent, the voice of Civil Society from the African continent. His organization is the Africa Cybersecurity and Digital Rights Organization, ACDRO for short, from Ghana. Welcome, Kenneth, to our panel.

From Germany we have two colleagues from the -- sorry, I am missing my words because I know you so well in the German language -- it's Thomas Salzmann and Michael (?) from the Federal Agency for Child and Youth Protection in the Media, a very recently established organization regulated in the Youth Protection Act that came into force in Germany on May 1st this year. They will probably tell us more about that organization and about the work they are heading.

Then we have from the government -- from the industry sector, David Miles, my good friend, now representing Meta. He is currently Meta's head of safety for Europe, the Middle East and Africa, so covering various regions. And he has more than 20 years of executive management experience within the technology, regulatory and charitable sectors, including IBM, Compaq, the Family Online Safety Institute and the British Board of Film Classification. I'm pretty sure, Beeban and David, you will know each other well.

We had expected to have a speaker from the Polish government. But, unfortunately, Tomasz Kulasa had to withdraw on very short notice because he has an appointment with the Polish Prime Minister and, therefore, he cannot represent the Ministry of Education and Science here in this session. But they will forward us the Polish position for our report of this session.

Last, but not least, I am very glad to welcome Agne Kaarlep. She is a policy officer in the European Commission and works on developing effective policies for countering harms in the digital space and for a safe online space for all, which is the issue of today's session. She is also part of an international team negotiating the Digital Services Act with the Council of the European Union and the European Parliament, and we know that the trilogue is going to start very soon and that there is very much in the Digital Services Act for the protection of children. So welcome to you as well, Agne.

We have on the agenda short introductory notes on the status quo of child online safety regulation at the national and regional level, and after we have heard from Egypt, Ghana, the UK and Germany, and from the European perspective, we will go to the industry perspective from David Miles.

I would like to invite Hoda Dahroug to make a start in this session.  Please go ahead, Hoda.

>> HODA DAHROUG: Hi. Good morning, good afternoon and good evening to everybody in this workshop. I am very happy to be at IGF 2021; I have been here twice before and this is my third presence at this great event. And I have to express my great appreciation to you for organizing such an important session, and to the organizers for making this iconic experience available to all of us.

In this century the internet touches all spheres of life; humanity expresses ideas, (?) and opportunity across the world. In this (?) the Egyptian Ministry of Communications and Information Technology has taken additional steps revolving around availing infrastructure, a legislative framework and, for sure, building the next generation (?)

We have around 100 million citizens in Egypt. This is a big number, but most of them have mobiles and an internet connection. And as we believe that no one shall be left behind, the internet is a basic right. In this, we are implementing a promising approach entitled Haya Karima, Decent Life. It is a presidential project and initiative, all the government shares in it, and it is an Arabic term that means guaranteeing a decent life, for the 4,200 poorest villages in Egypt. MCIT takes its full share by connecting them with fiber optic cables, providing all Egyptians, especially the young generations, with internet access, hence applying the inclusion and equity principles and preparing them for the future.

However, the internet creates opportunities alongside challenges and risks, definitely. It gives rise to cyberattacks and internet crimes like cyberbullying, especially on social media platforms. The impact is profound when experienced by the younger generations. Egypt is party to most international and regional child protection treaties, protocols, agreements and charters, and the government (?) national committees to follow up the execution of these (?). And we at the Egyptian MCIT, through the Central Department of Community Development that I am honored to head, have early on tackled the internet challenges in an effective and responsible way, to ensure that the internet is a safer place for all citizens, with a special focus on children and teens, as well as for digital business.

And this is addressed through three main pillars: supporting regulations and policies; setting out the responsibilities of all industry players toward the users; and, finally, producing age appropriate content and services.

I will stop here and describe them in more detail soon. I now give the floor to the other panelists.

>> JUTTA CROLL: Thank you so much, Hoda Dahroug, for your first intervention.  I would like to mention that all participants in the room are invited to pose a first round of questions after we have had these interventions from the panelists.  And then we go on to discuss the policy questions.

I am here with my colleague Torsten Krause from the German children's charity (?), who has helped to set up the session and the agenda and who will also support with the questions you pose in the chat. And having said that, I would like to go to Kenneth from Ghana, speaking for the African continent as well. You have the floor.

>> BEEBAN KIDRON: Kenneth, you are muted.  You are muted.  You need to --

>> JUTTA CROLL: Now we can hear you fine.

>> KENNETH ADU-AMANFOH: Can you hear me now?

>> JUTTA CROLL: Yes, we can hear you very well.

>> KENNETH ADU-AMANFOH: Okay.  So, can you hear me?

>> JUTTA CROLL: Yes, we can hear you very well.  Just go ahead, please.  Tell us --

>> KENNETH ADU-AMANFOH: Thank you. Exactly, exactly. Thank you so much. And I am glad to be part of, and to be speaking at, this very important global IGF forum. I have spoken here before and I'm glad to be here again.

Just let me begin by quoting from an article that I read, an article by UNICEF on action to end child sexual abuse and exploitation. It is a profound statement: more and more children are connecting for the first time every day, either on personal or shared devices. However, wider and easier access to data and internet technology poses significant challenges to meaningful connectivity and children's rights, including safety. And I found this very, very relevant.

Now, it's important to know that the impact of these threats ranges across the child's personal data and privacy, harassment, cyberbullying, cyberstalking, harmful online content, grooming for sexual purposes, and sexual abuse and exploitation. The global challenge of child online protection, including in Africa, requires a global response, international cooperation and national coordination; and within each country and region there has to be effective coordination among all the relevant stakeholders. And I believe that with more relevant digital technologies, this issue of child protection in line with the UNCRC can move ahead.

There are challenges and (?) due to the nature of the online environment that we all find ourselves in, the (?) environment. The internet is largely unregulated and, as we know, people are trying to regulate it now, and that poses more risks to our children, who often don't even know their rights. And when it comes to advocating for their rights as users of the internet, it's much more difficult educating children to know their rights online.

When we come to children within the African region it's more difficult, because the tools and equipment are lacking and the knowledge base is very low. So even teaching them about their rights under the UN Convention becomes a little bit more difficult.

In Ghana, and also in a couple of other African countries -- Ghana was one of the first countries in Africa, though -- a child online protection policy and strategy has been developed. Ghana partnered with UNICEF to develop this policy and we have already started rolling out its implementation. So sensitization is being done with different methodologies, you know, to reach out to the children in underserved and remote areas, to make sure that there is inclusion across the whole space. And I have seen a number of countries -- Nigeria is one, Kenya is another -- that already have child online protection policies and strategies in place that are being implemented.

I believe that once we have this collaborative effort, bringing all the stakeholders, both national and international, together to work towards a common objective to secure our (?) online environment and protect our children, I'm sure we will have (?). Thank you. I will reserve the rest for when we get to the nitty-gritty of the session. Thank you very much.

>> JUTTA CROLL: Thank you, Kenneth, for giving us an insight into what's going on not only in Ghana, but also in other countries on the African continent.

I think we will get back to the question of how we can achieve a global response when, as you said before, we have the borderless internet and therefore need a global response.

I would now like to turn to Beeban.  Beeban, you have the floor to speak on the situation in your country.

>> BEEBAN KIDRON: Thank you so much.  And thank you, everyone, for being here.  I'd like to talk about one -- two particular things in the UK and then maybe something on the international front.

The first is that, in my capacity as a crossbencher, that means an unaligned peer in the House of Lords, I was able to introduce something called the age appropriate design code into our data bill. And I would like first to just say how important it is that when we think about child safety, we also think about data. Because in a sense, you know, the design of services is really driven by a hunger for data, a hunger to extend our use and a hunger for network and growth. And that desire actually means that we have quite a lot of features that push the behavior of children and impact on the behavior of children in particular ways.

And with the introduction of the age appropriate design code we have seen some really interesting changes to platforms. I can't go through them all here; maybe David will talk about some of the ones on Instagram and so on. But what we saw was things like safe search and autoplay being turned off at YouTube, notifications stopping after 9:00 on TikTok, direct messages being restricted and so on.

So, the first thing I'd like to say is that we have some experience here, and those changes were all global. And I want to say that the code sits very much on the European leadership with GDPR and so on. But I want to make the case here that child safety and children's rights are very well served by data protection. And we must consider it internationally as an absolute tool.

The second thing is that I actually have the privilege of being on what we call here the prelegislative committee on the online safety bill. What that means is that the government has put forward a draft online safety bill with the intention of making the UK the safest place to be online. That's the government's claim. And a number of us from across both Houses and all parties were asked to look at the bill and make some recommendations. That report will be out next week, so I can't do any spoiler alerts. But I think what we are looking at, and what the evidence really drove us to, is not only how do you diminish the impact of bad actors, but how do you make the online services responsible for the amplification, recommendation and spread of those bad actors.

So, rather than seeing it as a 2D world in which there's a bad actor and a victim, there's actually the mechanism of the service itself, and is it serving, you know, is it serving the safety of children.

So, I think that next week we will have a lot of things coming out of that. And I hope that it will be both inspiration and, to a degree, leadership in one particular area.

But I think the last thing I want to say in this very short intervention is that we have to be careful not to keep reinventing the wheel in every nation state and in every place, and imagine that no one has done any work before. You know, I feel very passionately that the code sitting on top of GDPR was a successful thing. And, actually, looking over at what the lawmakers are doing in Europe with the DSA has been very instructive for the online safety bill. It is really important that we try to find responses that we can share in a broader sense, but that also, actually, drive the baseline of the tech sector up.

As Jutta said in her introduction, I think the era of self-regulation and co-regulation has been something of a tragic failure. I regret that, but it has been a tragic failure. So, if this is the era of regulation, then we have to work very smart to make sure that the fight does not bring in its wake a whole load of unintended consequences or, indeed, take out the benefits that the digital world allows for children in particular.

Let me stop there and I am sure we will come back to some of that.  Thank you, Jutta.

>> JUTTA CROLL: Thank you, Beeban. Thank you so much. I think what you said was the perfect segue to Germany, now speaking on the Youth Protection Act that has been amended and came into force on May 1st, as you said. The prelegislative committee wanted to make the UK the safest place; I do think that the legislators in Germany wanted the same for Germany. And now let's hear from Thomas and Michael, please. You have the floor.

>> THOMAS SALZMANN: Absolutely, Jutta. Thank you, and good afternoon to everybody from Michael and me. The usage habits of children and young persons regarding digital services have completely changed: from browsing to social media, from confrontation to interaction. This leads to new opportunities but also to new challenges and risks, which we have already heard about.

Based on a reform of the German Youth Protection Act this year, the Federal Agency for Child and Youth Protection in the Media now pursues various approaches to establish and modernize the protection of minors in the media. For a holistic concept of youth protection, we have to go from protection by shielding to a triad of protection, empowerment and participation. This approach follows the United Nations Convention on the Rights of the Child and the general comment on the rights of children in the digital environment. First of all, shielding minors from harmful content is still an important factor in our youth protection concept. The federal agency (?) harmful content, which makes it illegal to provide it to minors.

What is more, we have a strong legal mandate to inform and give orientation about harmful phenomena. So media literacy became a part of the regulatory concept of youth protection.

Finally, we want to strengthen children's rights of participation in digital media, although social media and games involve a great variety of risks: risks from communication and contact functions, purchase functions, mechanisms that promote excessive media use behavior, and so on.

The new law wants companies to build a structure of protection and support into the digital services that are favored by young persons. These precautionary measures include, for example, child friendly terms and conditions, safe default settings for the use of services that limit the risks for youth depending on age, and child friendly notice and takedown or support opportunities.

Probably later we can give you a short overview of the different approaches to implementing precautionary measures in digital services. These are a joint exercise of responsibility by state authorities, companies and Civil Society, and a dialogic regulation which might end in a law enforcement process. Thank you very much.

>> JUTTA CROLL: Thank you, Thomas, for your insight into the amended German law. I think we will discuss this approach of dialogic regulation further. I think it also builds, somehow, a basis for the Digital Services Act that Agne Kaarlep is now talking about on behalf of the European Commission, DG Connect. Agne, the floor is yours.

>> AGNE KAARLEP: Thanks a lot, Jutta. Good evening, or good morning to others, depending on where you're tuning in from. So the EU is in the process of overhauling the horizontal legislation, the e-commerce directive, which has been the cornerstone of governance of the internet since the year 2000.

And I think we can all safely say that in the past two decades the digital space has fundamentally changed, and has also changed us, the way we live and work and communicate. So, in response to these challenges in the online space, and also looking at regulatory developments in Member States, as Germany has just mentioned and also previously the UK, the commission proposed the Digital Services Act with the aim to ensure the safety of users online, including children and minors, while also providing, of course, strong safeguards to protect fundamental rights.

So, with this horizontal piece of legislation we are responding to some of the key challenges that we have seen in the online space: the spread of illegal content, insufficient protection of user rights or fundamental rights online, and also systemic risks stemming from the way the digital space functions itself, including the way algorithms and content moderation systems work.

So, we have set out a number of obligations, and I will get to these in a bit more detail later in the session. But to say up front that measures to ensure the safety of children and minors certainly have an important place there.

And, of course, this is just a proposal. So, we are still negotiating together with all the Member States and the European Parliament. But we have already seen that the council, which has just recently come to a unified position, has made some amendments to further highlight the importance of tackling issues related to children and minors. So we can be somewhat assured that in the negotiations this ambition will be maintained.

But it is important to note that this is not the only piece of legislation in the EU that targets these issues. Among others, as Beeban was saying, there is GDPR, which has targeted rules related to minors. There is the Audiovisual Media Services Directive, which has rules for audiovisual media services as well as video sharing services. Also, looking into the near future, we are in the process of developing a new strategy for a better internet for children. And we can also expect soon sectoral legislation to combat the challenge of child sexual abuse material specifically.

But I think I will stop here for the moment as an introduction.  And looking forward to the rest of the session.

>> JUTTA CROLL: Thank you, Agne, for giving us a first impression of what we can expect from the European Commission and from harmonization at the European level.

Now we turn to David Miles from industry. We would like to hear the perspective of a platform provider in regard to being regulated, doing self-regulation, and taking their duty of care responsibly. David Miles, the floor is yours.

>> DAVID MILES: Thanks, Jutta. As I was preparing for today I was reflecting on my very first IGF, in 2009. A lot has changed in Egypt and beyond since then. But what was apparent then, and up until a few years ago, was the fact that in terms of children's digital lives there was little or no regulation. What there was concerned the collection of data, with the Children's Online Privacy Protection Act, COPPA, the FTC revision effective in July 2013, and GDPR.

And now, in 2021, at the 16th IGF and more than a decade on, that period of self-regulation is coming to an end. Of the 120 countries in EMEA that I oversee, more than 20 have child privacy and safety regulation on the way or already on the books. The age appropriate design code in the UK and the German federal and interstate youth protection laws are excellent best practice examples of the kind of work that's going on, and they will influence a lot of other regulation around the world.

At Meta we have advocated for democratic governments to set new rules on areas like harmful content, privacy, data and elections, because we believe that businesses like ours should not be making these decisions on our own. While people often disagree about exactly where to draw the line, government regulation can establish standards for all companies, standards that we should be able to meet. Companies should also be judged on how their rules are enforced. Our community standards let our users know what they can and cannot do on Facebook or Instagram. Through our enforcement reports, Meta has published figures on how it deals with harmful content, including how much of it is seen and taken down, for the past three years.

Our next report will be subject to external audit. And along with the Oversight Board, we continue to lead the way in our approach to transparency and accountability.

Of our 60,000 employees here at Meta, 40,000 work in safety and security, and I'm one of those. We have spent more than $5 billion in the last year to ensure our users are kept safe and feel empowered to connect and build community.

We can always do more, but the safety of our users is vitally important, including cross-industry collaboration through organizations like the Tech Coalition, a range of international organizations, and forums like the IGF.

I look forward to today's discussion.  And back to you, Jutta.

>> JUTTA CROLL: Thank you, David. I think we have now got a good overview of what is already regulated, what is in the pipeline, and what is about to come within the next months and probably years, because we don't know how much time it will take to discuss all the amendments to the Digital Services Act that are already there.

And I would like to encourage participants in the room, be it virtually on Zoom or on site: please raise your hands and bring the questions you probably have for the panelists directly to the attention of our session.

I have already seen that a colleague from Mexico is in the room, virtually I think. Mauricio Hernandez, could you give us an impression of how the situation is in Mexico? I do remember that when we had the IGF in Jalisco some years ago, it gave a push to the situation in your country as well. Would you be able to speak to us?

>> MAURICIO HERNANDEZ: Absolutely, Jutta. Hello, everybody. It is a pleasure to join the session from Mexico City at this time. Well, the situation in Mexico is very difficult right now, as in most parts of Latin America; we are (?) problem regarding this illegal movement of (?) from Central America up to Canada and Europe.

I would like to share a better possibility: I think we need to talk about promoting legislation that will shape technology developments with a human rights charisma. And with this I mean that, considering children, illegal activities and crimes, we all need to develop a shared agenda and shared regulation for internet and platform standards. It will not be possible to fight across jurisdictional rules if we do not have standardized rules on how to proceed, from the private (?) and the platforms, in order to provide at least a minimum safe environment for our children.

On the other hand, I believe that what we need is the compromise and the commitment from platforms to develop and to limit this technology respecting human rights. Human rights for children mean (?) over identity, considering some kind of limits for sharing content, and developing by ages some kind of content allowed to be shared and (?) consumed --

(Audio difficulty).

>> MAURICIO HERNANDEZ: We need to consider that minors' data is the most profitable asset in the activity of the internet, because minors will continue developing as regular customers when they come of age. So, if we do not get companies and stakeholders to commit to developing rules that will be flat worldwide, we are not going to be able to give authorities enough abilities to pursue and (?) the illegal activities that are committed, not only in Latin America and Mexico for sure, but also in the rest of the world, by those who are customers of this children (?), let's say. And it's a pleasure, Jutta, and all the rest of my colleagues, to meet this afternoon in Poland for this wonderful, wonderful session. Thank you very much.

>> JUTTA CROLL: Thank you, Mauricio. I really appreciate that you highlighted that -- how did you put it? -- minors' data are the most profitable assets, because they will stay online, of course, for a long time, and that we have to take care of that. We have done so with the European GDPR, and it reaches further beyond Europe as well. I think this is --

>> MAURICIO HERNANDEZ: May I add one more minute from the educational sector. In Latin America, information technologies and computer law are not a formal area of education from the beginning. And I think we are leaning on and relying on (?) law and the authorities, when the education system and the education ministries must begin working on how to promote a responsible and accountable profile while surfing the internet, and that's for all of us.

If I remember correctly, in Jalisco a kid said: what's the matter with my internet? I think we have to change that chip in the minors' minds. Thank you.

>> JUTTA CROLL: Thank you, Mauricio. Maybe we can turn now to the first policy question we wanted to discuss in this session. And that is: how can we ensure that government regulation, self-regulation and co-regulation approaches to content moderation and child online safety are compliant with human rights frameworks, that they are transparent and accountable, and that they enable a safe, united and inclusive internet, which is the overarching theme of this Internet Governance Forum.

When you, Mauricio, mentioned regulation with a human rights charisma, I think that is a fine term to describe what we are looking for. But how could we ensure that? I have Hoda and (?) on my list to try to give an answer to that question. So I would like to take Agne and also Hoda, and other speakers and panelists are also invited to answer the question: how can content moderation and child online safety be balanced with human rights frameworks? Agne?

>> AGNE KAARLEP: Thanks, Jutta. This is something we tackled and analyzed a lot in the preparation of the Digital Services Act, because it's absolutely a central element of our approach. And we paid a lot of attention to the work that has already gone into thinking about how to design legislation which is compliant with human rights frameworks. So, we consolidated the work done at the Council of Europe and also the UN, for example the UN Guiding Principles on Business and Human Rights.

A couple of things to highlight which we baked into the proposal and which respond to some of the key elements we found in these frameworks. One element is proportionality. We have designed the rules to apply to those services which are most affected by the challenges, and the rules are designed to be proportionate to the size of the service and compatible with the type of service. Because, of course, the online space features so many different services; for example, just Meta on the call itself offers a gazillion different services, which also have to comply with different types of rules. This is an important element.

Then, of course, respect for fundamental rights is overarching across all the different rules. And there are maybe two principles there to highlight. Rules which could lead to removal of content are strictly limited to illegal content, and the definitions of what is illegal and what is not are clear.

And there are also extensive rules designed around redress. Users have redress measures for any content moderation decision which affects them, including where the decision is taken because the content is illegal or because the content is against the terms of service as decided by the specific service itself. So, these are some of the important elements there.

And then what we have also done is look at the systems of the services that we are regulating, as opposed to regulating specific types of content or setting out very specific measures; I think this is also the approach that Germany has taken in their legislation.

So, for example, we have designed a risk assessment framework for very large online platforms. They are required to assess the specific risks their services pose to the rights of the child and to take mitigating measures to counter them. This assessment is service specific, and the measures to be taken are open-ended. So, it ensures that the measures can be tailored to the service and that they can also be future proof. Because, of course, we are always evolving, we are always learning more; we are finding new research and new ways to mitigate the harms online. And this is also part of the future proof approach.

And finally, absolutely key in any human rights framework is accountability and transparency. In the DSA we ensure and oversee this in numerous ways. Firstly, service providers will have to set out, in transparency reports, what they have done in their content moderation decisions. They also need to be clear about their terms and conditions, which need to be very clear and transparent. And the council also added specifically that these need to be understandable and clear to minors.

But, of course, this is from the perspective of the service provider. So, we also need independent verification of the actions being taken by the industry. There we have set out an obligation to have independent audits of these types of risks and the mitigating measures, as well as data access for researchers, who can independently analyze whether and how these risks arise and whether the mitigating measures being taken are effective.

And in addition to this, of course, we have regulatory supervision. What's important here, as I think the German colleagues already mentioned in their intervention, is that the result is a combination of efforts from Civil Society, from researchers, from the industry, as well as from regulators. This is all set in, sort of, one rule book, but the implementation is very much a combination of various actors who all have an important role to play.

>> JUTTA CROLL: Thank you, Agne. Your intervention gives me grounds, before we go to Dr. Hoda Dahroug, to refer back to you, with a question that also concerns David. You said that there is this risk assessment framework for the very large platforms, which, if I remember right, have a threshold of 45 million users in Europe.

I assume that this will only apply to platforms like TikTok or Instagram, when I look at the platforms that are used by children or by young people under the age of 18.

In the German law we have a threshold for the duty of care for platforms at 1 million users, and I'm wondering how this fits together: whether the lower threshold will stay in force, or whether the overall 45 million user threshold will apply across Europe and the lower thresholds will be dropped.

Could you give a short answer to that question?

>> AGNE KAARLEP: So, just to say, in terms of the threshold set at 45 million: because the active user definition will be set out in a separate act, according to the commission proposal, we don't exactly know which platforms will be in scope. So that we can't, unfortunately, comment on.

And as to the interaction of the EU level rules versus Member State rules, this is also a case-by-case assessment. Because EU rules -- of course, it's a regulation, so it's directly applicable. So any rules conflicting with it or explicitly regulating the exact same area might need to be amended. But whether this is exactly the case with the Youth Protection Act is something of a case-by-case assessment, and not necessarily for its entire scope, et cetera. So, this will really need to be looked at more carefully by legal experts.

>> JUTTA CROLL: Thank you so much for your explanation. You have also been talking about redress for users, but I think we can turn to that question a bit later, hearing first from Dr. Hoda Dahroug from Egypt. Hoda, the floor is yours.

>> HODA DAHROUG: Okay. Thank you. So, to reply to the question, I have to go into the details of the three main mechanisms that I just mentioned. First, supporting the regulations and policies. The Egyptian government is undertaking serious steps to ensure robust and transparent regulation for child online protection and (?) regulation. For example, Egypt was one of the first 20 countries to recognize the Convention on the Rights of the Child, the CRC. Additionally, Egypt has signed optional protocols to the CRC, which signifies Egypt's commitment to children's rights. As the legislative framework is one of the pillars of building digital Egypt, MCIT launched the National Cybersecurity Strategy 2017-2021, which targets providing a safe and supportive environment that would enable various sectors to deliver integrated e-services.

Also, in July 2020, His Excellency President (?) ratified the Personal Data Protection Law 151 of 2020. The law is part of MCIT's efforts to protect the personal data of citizens and residents of Egypt. Egypt also launched the National Human Rights Strategy 2021, which includes political human rights, women's and children's rights, economic rights, and the rights of youth and people with disabilities.

And this was a great initiative in Egypt and a great step towards addressing a lot of issues and a lot of human rights. The government has established a set of strategies and organizational guides and oversees the implementation of child safeguarding (?) mechanisms, including the National Committee for Child Online Protection, which supports a safe environment for internet usage among children, in addition to assessing the current state of child online protection in Egypt through studies, research, open dialogue sessions and (?)

Additionally, it supports a participatory approach on (?) and the online safety strategy, which is a platform to open a multistakeholder dialogue in order to fully benefit from the synergies and expertise of all stakeholders, including parents, educators and the children themselves, and to consider their perspectives, which ensures consistency. And I believe that after COVID, this digital platform was so, so important to include all the stakeholders and not leave anyone behind, to share all dialogues on how to guarantee the online safety structure.

Going to the second pillar, setting out the responsibilities of all industry players toward the users: the government alone will not be able to ensure a safe and transparent online (?). So MCIT cooperates with several stakeholders to assume this mission and works toward mobilizing private sector stakeholders, especially service providers and industry representatives, to take an active role in the awareness raising efforts, highlighting child safeguarding measures and disseminating key messages on safe online behavior for children, parents and (?)

So, MCIT and the National Committee for Child Online Protection participate in raising awareness of this and in promoting technological solutions. While technology and the internet present limitations for child online safety, they also possess the remedy for online threats, offering new means for documenting and tracking this (?). So MCIT has cooperated with (?) and Microsoft in implementing the Child Exploitation Tracking System, technologies to protect youth and children online.

As part of MCIT's strategy on stakeholder engagement in child online protection, a forum was launched through the digital citizenship platform to enhance stakeholder coordination and to engage decision makers and policymakers, to strengthen the capacity of the national protection system to prevent all forms of online violence against children.

And the last pillar: encouraging the production of age appropriate online content and services. Since 2007, MCIT has focused on affording accessibility and availability of technology, especially to the (?) and marginalized.

However, in a fast-changing world we realize that citizens need to master (?) awareness and sources of knowledge to face the emerging internet. At the same time, today's children avoid traditional knowledge resources and (?); they get their information about the public sphere on social media. And this calls for more attention to the (?) rights, responsibilities and opportunities of living, learning and working. This was a catalyst for launching the digital citizenship initiative, to prepare our younger generations to benefit from the knowledge society and, at the same time, protect themselves.

The digital citizenship initiative is mobilizing multistakeholder collaboration to increase --

>> JUTTA CROLL: You need to come to an end.  We are running out of time.  I'm sorry.

>> HODA DAHROUG: Okay. So this platform increases access to specialized knowledge and valid content, enhances awareness, and supports the activation of policies and regulations that enhance online protection. Now we are working on the national level with related stakeholders to build a one-stop shop platform. I will stop here to leave time for the others. Thank you.

>> JUTTA CROLL: Thank you, Dr. Hoda. You know that it's most important to be as interactive as possible in Internet Governance Forum sessions. So, I really would like to encourage participants in the session to raise their hands, speak up and put their questions to the panelists.

What you said, Hoda, leads me to the second policy question that we wanted to address in this session. I would like to invite Beeban to speak on what needs to be done by the respective stakeholders to ensure that the human rights granted to children in the UNCRC are protected, respected and fulfilled in the digital environment. And that is exactly what General Comment No. 25 is asking for. So could you explain that a little bit more to us?

>> BEEBAN KIDRON: I'm smiling because it's such an enormous question in a very short time. But the first thing that I'd like to say is we have got to stop pretending that values are neutral. Something enters the conversation as if where we are now is neutral, and as if, when we seek to impose something on behalf of children or other users, we are suddenly asserting some new set of values that ends the neutrality of the environment. And what we know is that there's really nothing neutral in the 21st Century experience. In fact, it's highly automated, highly directed, highly targeted around all sorts of different values.

So, I think that if we are to see the general comment manifest itself on behalf of children, we have to first make a societal decision that that is what we wish to do. Now, I definitely am on record saying we wish to do it. Moreover, I would say all 196 countries have signed up to do it. And it's about how we do it.

But I do think that that argument is not 100% won in the political and public arena. So I think those of us who are committed have to acknowledge that. That's number one.

Number two, I would say we have to start embedding it in other things. I mean, I am really delighted that it currently sits in the EU AI act; it's cited there. I think there's a move to possibly put it into the DSA, and certainly a move to put it in the online safety bill here. It's a very, very important factor of the age appropriate design code. And I would say for any policymakers: we didn't have the general comment at the time, but putting the convention on the face of the bill was the only reason that we got the code for children up to the age of 18. Yeah? Because all the lobbyists came in and said: please, government, no, no, no. We like 13, we like 13. This is what we do in America and this is what we want to do around the world. And the government went: nothing we can do. It's on the face of the bill.

So, I think embedding the general comment in these places allows us to make some shifts that are actually really hugely important.

I would also like to really back up what was said earlier about risk assessments and the way they are being looked at in Europe. But I think those risk assessments in relation to children have to have an absolute child rights perspective, and that's a very good place to put the general comment. And in relation to that, I would absolutely love to draw people's attention to the IEEE age appropriate design standard, which was just published last week, which does actually cite the general comment and does take a child rights approach to designing digital services with children in mind. And if I have a colleague on this call, maybe they can put a link in the chat to that.

But the other, more, sort of, absolutely practical thing is that I currently have a bunch of lawyers looking right now at the general comment in relation to UK law and saying what they think needs to be done, by whom, at what point, to embed it. And, obviously, you know, we can't do that for every territory. But I think it's an exercise that I would love to see mirrored elsewhere. And certainly we will make our findings available for other people to look at from the perspective of their jurisdiction. So, in a way, starting the process and letting people make their own version of that process.

Because I think that what is so important about the general comment is the indivisibility, but what is so difficult about the policy area is that you have the Department of Education, you have the (?) here, we have the Home Office, we have justice. You know, it goes right across.

So, actually mapping who must do what and when, even in one country, will probably give a little bit of enlightenment to other places that have slightly different arrangements but very possibly similar problems.

And, again, I'm going to emphasize the participation rights of children. One of our colleagues that worked on the general comment, Sonia Livingstone, has written a report about digital play and what we need to do in order to make children's rights, sort of, a factor in designing digital play spaces online. So, again, I just want to say there are so many different areas. But it doesn't mean we should be intimidated. We have to have a road map.

And then I'm going to say three things in very quick succession. Political will. I think that the tech industry needs to earn our trust back; I think it has been brutally lost and it needs to be found. And a sort of radical transparency: the sorts of transparency that people like Frances Haugen are talking about, the sorts of transparency that mean we do know when things go wrong, and the sort of transparency that means instead of playing hide-and-seek with the big tech companies, we are actually in a space where we see that this is having a bad impact, and what can we do to act and change. And I think that's really important.

And I just want to say one last thing, which is that our experience here and our research here at 5Rights do rather suggest that small is not safe. And I am worried about those in the EU, and here in the UK, and possibly even America, only looking at the large platforms. Now, you know, they spend a lot of money and they spend a lot of attention, in my view not upstream enough and not enough. But I think we must consider that small is not safe, and we must go on a risk assessment basis, not on a reach basis. Thank you.

>> JUTTA CROLL: Thank you so much, Beeban, for putting so many things into your short statement.  I'm really grateful for that.

And with a question that we received in the chat from Kossi Amessinou, I wanted to go to the third question we wanted to answer in this session, because I think the two are very much related. It's a question about educating children, and she is asking -- he is asking: what is the parents' responsibility in child protection online?

And with the third policy question, we want to answer: how do we ensure a safe digital space? How should governments, internet businesses and other stakeholders, and that also means parents, educators and so on, protect children, including vulnerable citizens, against online exploitation and abuse?

And Thomas from the German federal agency would like to speak on that, and Agne also raised her hand in advance. Thomas, please, you are the first to speak.

>> THOMAS SALZMANN: Thank you. The answer to your question, from our point of view, is: together, in a mandatory dialogue and permanent cooperation and, if necessary, law enforcement. The new German Youth Protection Act calls for dialogic regulation. One approach the law gives us, which is usually a step before obligation, is a joint exercise of responsibility by the government, industry and Civil Society.

The idea is the development of an overall strategy for an intelligent management of the chances and risks concerning the use of media by children. To bring this process forward, we started a project line with different events, (?) the so-called "workshop of the future" in English. A council of experts on children's rights, which includes at least two persons under the age of 17, advises this process.

Finally, we have the possibility to oblige relevant service providers, like social media companies, in a law enforcement process to implement precautionary measures for the safety of children.

These measures can be tailored to the specific features of each service in a so-called dialogic regulation and by participating organizations of voluntary self-regulation, which can develop guidelines under the supervision of the federal agency.

Both approaches have in common that companies take a very active part in the process, just like the different media ecology actors, such as parents, youth protectors and children themselves, to work it out together.

>> JUTTA CROLL: Thank you, Thomas. I think you highlighted two things in your statement that I would like to invite David Miles from Meta to respond to, and then also Agne. On the one hand, there is the obligation for companies to implement these precautionary measures, which is laid down in the law; in the British law it's called the duty of care. So I would like to learn from David Miles how you see, from the industry perspective, this implementation of precautionary measures. In the German law it's an open-ended list, so we don't have a list that says exactly: do point one, point two, point three. It's up to the companies to develop their own measures. How do you feel about that?

And then afterwards I would like to ask Agne how this approach of dialogic regulation fits into the DSA strategy. So David, please, first, and then we take Agne.

>> DAVID MILES: I think both the German law and the proposed online safety bill have some really interesting aspects to them, as does the age appropriate design code. If I go back to that as an example, we were given a year to respond to it before it came into force in September. And since that time we have also had guidance from the ICO on things like age assurance and a lot of other things.

It's just no surprise, really, that a lot of platforms have responded to that in terms of default settings around privacy, and are starting to progress product development in line with and complying with those things. And it's really important to give industry time to use what it is good at, which is technology, to implement these things, and to see that as a progressive, ongoing process. We are looking at parental controls now. We are looking at time well spent. Lots and lots of other stuff. To see that as an ongoing process, I think, is really important.

I think the other challenge within the duty of care is that, until now, most harms have been around illegal harmful content, and that's been clearly set out in law -- (?), for example, which we scan for and provide to (?) in the United States before it goes to U.S. law enforcement. A lot of the harms in the duty of care are legal, and therefore we need much clearer definitions around some of those things: self-harm, bullying, different kinds of aspects. And there is not just a rights element to that, in terms of children and parental controls, but also a children's health and well-being element, so we should start to look at these things in more of a public health, sort of, way and go back to the origins of some of these things. What brings about these harms? What's the offline manifestation of those things?

I was really encouraged by the European Commission's proposed plans around (?), where there's a significant offline preventive element as well as an online one. And we would be pushing heavily to make sure that -- detection is all very well, and asking users to report, but what about prevention? What are we doing to prevent harm in the first place? And in addition to that, precautionary measures to make sure that the rest of the industry is acting too. It's no good large players displacing harm elsewhere without trying to make sure that we, for example, open source technologies so that smaller companies with less capability can respond quickly and can use those technologies.

So, I think there are some exciting opportunities for smaller players, which can have a disproportionate impact and often are just not as well resourced to respond.

So I think these are really interesting times. Taking a systemic approach -- in my world, the volumes of what we deal with are huge, so dealing with individual pieces of content is challenging. So, we welcome this concept of a proportional and systemic approach, and we think that's definitely the way forward.

I would make one other final point. What I find exciting when I go to Germany, or when I'm working with our public policy teams in Spain or France, is the degree to which regulators are being beefed up; in other words, they are well funded and lots of expertise is being brought in. It isn't enough just to have a piece of regulation, because it has to evolve; you actually need strong regulators that are well resourced, to be able to partner with a major player on our side. So that collaborative approach, I think, is going to be really important. And my prediction is we will see a shift in the balance of power from, historically, you know, really well-meaning, brilliant safety NGOs being the sole voice, to new regulators who are also able to draw on the expertise of those NGOs, which do a brilliant job, to have a much more rounded platform on which to collaborate going forward.

>> JUTTA CROLL: Thank you so much, David. And thank you also for the kind compliments about beefing up the regulators, which is quite astonishing coming from a company. But I think you have well understood the approach of dialogic regulation on your side at Meta.

This is the thing that I wanted Agne to speak to. And maybe you can also reflect on this term that David used, a new type of regulator: would that also play a role on the European level? And I need to remind you to be brief, because we also have a question from the floor in Katowice that I would like to take after your intervention, Agne.

>> AGNE KAARLEP: Sure, sure. Quick and easy questions as well, so it's easy to be pre -- no. I want to actually highlight perhaps two points on this approach of a broad duty of care with an open-ended list of measures that can be taken. There's a trade-off between a broad duty of care and a specific obligation. And the trade-off is that, in terms of being compliant, there's a lot more uncertainty for companies if there's a broad duty of care.

So, this is just something that I wanted to highlight, because it ties in with the proportionality element.  In the DSA we have different rules for platforms of different sizes, with the impact they have from a societal perspective measured by reach in the region.  We know this is not a perfect system, but it is a system of proportionality, and it is the best we could currently come up with.

And this is also because there are certain obligations that apply even to micro platforms, for whom a broad duty of care would be unmanageable; they require specific rules that they then know how to actually implement.  It is also why the risk assessment and mitigation obligations are really for very large online platforms, where we see these very systemic risks emerging and which actually have the capacity to deal with a relatively burdensome obligation --

(no audio).

>> AGNE KAARLEP: Between one approach and the other and hopefully they can be mixed, but in a proportionately --

(no audio)

(Audio difficulty)

>> AGNE KAARLEP: ... The social science around the issues of online harms is developing very fast, but there is still a lot more to research and a lot more to understand about the dynamics.

So, what we have tried to do in the DSA, and what regulation should ideally do, is to create the systems and the boundaries within which this knowledge can be developed.  This is why we require that platforms be transparent and assess their own risks, but also that auditors come in independently, and that what is being done is made public.

We also need independent research: researchers need to be able to actually look into the data and see what is happening.  This will help everyone.  And of course, with these types of obligations, as David was saying, regulators really need to beef up, because these are complex issues.

And I think this will certainly create a new type of regulator, both at Member State level and at EU level, to oversee these complex systems, because we are definitely seeing more interdisciplinary issues crop up rather than fewer, especially in the online space.

>> JUTTA CROLL: Thank you, Agne.  A very helpful clarification from your side, also reminding us that there might be a trade-off between the different approaches.  And we are very much looking forward to seeing how a mixed solution could be developed.

We are now going to take the question from Katowice.  The person there, are you able to speak directly, or do we need a translation?

>> AUDIENCE: Yes, I can speak directly.  Hi, my name is (?), from Poland, and I am an active internet user.  I would like to ask a question about a very specific problem: advertisement.  Today there is an abundance of ads in social apps and gaming apps, like YouTube, Facebook, Google.  And very often these ads blend well into the UI of the app, so it is hard to tell the difference between what is an ad and what is the content of the app.

And most importantly, very often the ads include inappropriate content, like sexualized characters or false advertising, which lures children into clicking the ads and playing the mobile games or watching some kind of content.  And, of course, it is inherently bad to prey on the curiosity of young children.

But at the same time, ads are the main income for the companies.  So I would like to ask how legislators in the UK, in Germany and in Egypt tackle this problem, and how companies like, for example, Meta see this problem.  Thank you.

>> JUTTA CROLL: Thank you for this very interesting question.  I don't think we have the time to go through the whole panel and get answers from all the panelists.  But we have reserved one minute at the end for each of the speakers, so perhaps you can also address this question if you have an answer with regard to advertisement.  I do think it is covered by the Age Appropriate Design Code, which Beeban could probably speak to, and I am also really interested in the answer from David Miles.

Having said that, I would like to open the final round for the panelists.  We have heard about a lot of different regulatory approaches.  But, as Beeban has also put it before, with different arrangements and different approaches we are trying to address similar problems and issues.

We have heard about the concept of dialogic regulation.  I would be very much interested in hearing from those countries that haven't yet tried that approach whether they see an opportunity for doing so.

And in one final sentence: what would be your recommendations for policymakers with regard to legislation, taking into consideration the rights of the child as laid down in the CRC and in the newly adopted General Comment No. 25?  Who would like to start?  Just go ahead, to save us time.

Thomas, I see you're looking to your colleague Michael.  Would you like to start?

>> THOMAS SALZMANN: We can say something on your last question, about the conclusions.  Yes.  Most importantly, we have to change our perspective on regulation.  Traditionally, legislation was oriented towards the technical features of different media, but we have to think from the children's point of view.  That means fulfilling the (?) of protection, empowerment and participation has to be the main motivation for regulators.  This understanding of child and youth protection leads to high standards and a holistic concept of child and youth protection.  Once more, Article 12 of the United Nations Convention on the Rights of the Child demands that we let children participate in our processes.  The concept of dialogic regulation to implement precautionary measures comes with two main advantages.

On the one hand, it identifies best practices with a high level of flexibility, addressing risks while keeping opportunities open; on the other hand, it achieves a high level of obligation in relation to providers.

>> JUTTA CROLL: Thank you, Thomas.  Who would like to take the floor next?  Is it David who is next in my windows?

>> DAVID MILES: Yeah, okay.  Yes.  I just wanted to come back to the advertising point.  With Instagram and Facebook we have already moved to a situation where advertisers are only able to target young people on the basis of age, gender and location.

What is really interesting is that we have a division called TTC Labs, which co-designs with young people and is very much informed by what young people expect in those kinds of environments.  So I think that is already shifting in terms of advertising in the mix of activities.  I wanted to refer to that because of the question that came earlier on.

And outside of Europe we have Messenger Kids, with millions of households using that technology, and again it is the same process in terms of the nature of advertising.  So I think that is changing considerably.  And when you look across other platforms, like gaming and others as well, I would hope that we will see the same in those kinds of environments over time.

>> JUTTA CROLL: Thank you, David.  Then I would like to turn to Kenneth once again.  Kenneth, what is your approach in this regard, and what recommendations do you have for policymakers with regard to legislation?

>> KENNETH ADU-AMANFOH: Thank you.  I think all policymakers have to pass legislation with regard to child online protection.  This legislation should be inclusive and multifaceted, developed in a multi-stakeholder approach, and not even just a multi-stakeholder approach but a multidisciplinary approach.  In other words, it is not just about bringing different stakeholders onto the team developing these strategies and legislative frameworks, but also about bringing in the experts.  Let's get the lawyers in there.  Let's get academics in, let's get the experts who know, who can, you know, be more informed about it.

Also, this legislation has to be passed down.  Usually we have the strategy at the high level, the policy level, the government level, and it doesn't really go down to the Technical Community or to academia.  In Ghana, for instance, a good example is national security: apart from the national cybersecurity strategy, which has a strong pillar on the protection of children's online activity, national security has developed its own national security strategy.  And this strategy is informed by the human rights conventions on child online protection, and there is a pillar on it in there.

Apart from that, we have seen national security go ahead to create awareness among all the security agencies under the (?), educating them, and you see the minister coming out to speak on human rights, on respecting human rights, respecting all of this.  So I think that is very key.

I also think that the Technical Community, who are the custodians of the digital infrastructure, in implementing tools and protective systems to protect that infrastructure, should always aim at enabling children to be active, safe and secure within it.  Usually the focus is not there, but it should be.

The national telecoms regulators and the ICT regulators, I believe, have to come out with legislation or regulations to enforce acceptable standards for all the industry players in these technical communication infrastructures.  I have seen that being done in Ghana.  We have the national CERT, we have the sectoral CERTs, and there is a sectoral CERT that is controlled by the telecom regulator.  This has oversight over all the industry players, to ensure that --

>> JUTTA CROLL: Kenneth, I need to stop you.  We have only three minutes left and three speakers who have to sum up.  So thank you so much.

>> KENNETH ADU-AMANFOH: Sure.  Thank you.

>> JUTTA CROLL: And thank you for reminding us of the acceptable standards, which is very important and will go into the report.

Now I would like to go to Hoda, with only one sentence; we have only three minutes left.

>> HODA DAHROUG: Okay.  So, my recommendation: as technology is the main enabler and also the main risk, it is the solution as well, through the disruptive technologies emerging now and then.  So I recommend issuing policies and regulations that support the private sector (?) by promoting the development of applications and by using AI and machine learning to surface the indicators and figures that help the government monitor user behavior, so that it can predict what will happen in the future and the threats of the future.  And I have to say that the community now has the authority to give feedback; it has the main impact on the growth or loss of any social media platform and application.  So it is very, very important to raise awareness in the community so that it uses this authority.  And in the end, no laws will guarantee full child online protection, but policies and best practices will help to bridge those gaps.  Thank you very much.

>> JUTTA CROLL: Thank you, Hoda.  That was very helpful, indeed.  Agne, I think it wouldn't make sense to ask you for recommendations for policymakers with regard to legislation on child online safety.

But my question to you would be: summing up the session and what you have heard, how could we ensure that the best interests of the child are given the highest priority when legislation is made?  One short sentence on that question, please.

>> AGNE KAARLEP: I think, at least from the EU perspective, though this is also highlighted globally, the rights of the child are very specific.  In the EU Charter of Fundamental Rights, they are the only rights that apply to all parties, including the private sector.  Most of the other rights are directed towards governments, which are charged with safeguarding them, but these rights really apply to the private sector as well.

So, I think just taking that into account whenever planning for any type of legislation is crucial.

>> JUTTA CROLL: Thank you so much.  Very quick and precise.  Beeban, I think you have the last word of this session.

>> BEEBAN KIDRON: I love having the last word.

>> JUTTA CROLL: I know that.

>> BEEBAN KIDRON: Thank you.  So, I just really want to say there is no silver bullet.  I often think about the industrial revolution here: we had 14 Factory Acts just on workers' rights, including child labor.  But we also, at that time, invented the weekend, we put street lighting in, and we put regulation around sewers, because people came to the city.

So, I think we must see this as a journey and what we have to do is get each piece right instead of making each piece a Christmas tree.  That's my big recommendation to everyone.

I do want to honor the person who asked the question and just say: you are absolutely right to be concerned about advertising, but don't concentrate only on the adverts and their content, although scam adverts are indeed fraudulent.  You have to look at the whole infrastructure around data, targeting and personalization; this is the infrastructure underneath the advertising model.  So you are right to concentrate on it, and normally, if you want to get something right in regulation, it is worth following the money, and that is where the money is.  Thank you very much.

>> JUTTA CROLL: Thank you for the final word, Beeban.  I think this was a good conclusion and also an answer to the question we got from the floor.

I would like to take the opportunity to say a big, big thank you to all of you for joining this session and making it a very fruitful debate.  We are obliged to deliver a report in two hours' time, so I will be a little bit busy after this session because I have taken so many notes.  I would like to thank my colleague Torsten Krause, who helped to organize the session and set the agenda, and who was also helping to moderate the chat.  I also would like to thank my colleague Clemens Gruber from the Digital Opportunities Foundation, who will help with the minutes, and a big thank you to all our speakers and to all the participants in the session.

I am pretty sure we will continue with that debate.  I would like to refer you to the main session that will take place on the first day at 11:15 central European time, which deals with regulation; it will look backwards at how we came to the situation we have now and at what will come in the future.  So, you are all invited to take part in that main session.  Thank you so much for being here, and have a nice evening, morning or afternoon, wherever you are.  I am very much looking forward to a festive holiday in two weeks' time.  Thank you.  And bye-bye.

>> TORSTEN KRAUSE: Thank you for inviting, organizing and moderating this workshop and session.  Thank you very much.

>> We all live in a digital world.  We all need it to be open and safe.  We all want to trust.

>> And to be trusted.

>> We all despise control.

>> And desire freedom.

>> We are all united.

(Session was concluded at 17:34 UTC)