IGF 2022 WS #471 Addressing children’s privacy and edtech apps

Friday, 2nd December, 2022 (08:15 UTC) - Friday, 2nd December, 2022 (09:45 UTC)

Organizer 1: Civil Society, Latin American and Caribbean Group (GRULAC)
Organizer 2: Civil Society, Latin American and Caribbean Group (GRULAC)
Organizer 3: Civil Society, Latin American and Caribbean Group (GRULAC)
Organizer 4: Civil Society, Latin American and Caribbean Group (GRULAC)
Organizer 5: Civil Society, Latin American and Caribbean Group (GRULAC)

Speaker 1: Marina Meira, Civil Society, Latin American and Caribbean Group (GRULAC)
Speaker 2: Michael Canuel, Private Sector, Western European and Others Group (WEOG)
Speaker 3: Nidhi Ramesh, Civil Society, Asia-Pacific Group

Additional Speakers

Hye Jung Han, Researcher and Advocate at Human Rights Watch

Rodolfo Avelino, Insper


Panel - Auditorium - 90 Min

Policy Question(s)

1. What concerns should governments and industry consider when structuring policies around digital education apps and platforms?
2. How can all stakeholders and actors, including children themselves, develop actions and regulation that support better data practices in the edtech sector, combining students' and children's human rights with the digitalization of educational services?
3. What technical alternatives to student-data-commercializing business models can promote the development of the edtech industry while protecting students', and specifically children's, human rights and best interests?

Connection with previous Messages: The proposal is directly linked to the IGF 2021 message "Emerging Regulation: Market Structure, Content, Data and Consumer Rights and Protection", as it proposes a debate on how to address educational technologies that promote education and the right to connectivity while respecting children's rights to privacy and data protection. Drawing on the findings of a recently published Human Rights Watch report, and considering both the asymmetric impact that aggressive data mining practices may have on the Global South and the relevance of educational technologies as powerful tools for social and economic development, the workshop's ultimate goal is to debate, from a multistakeholder perspective, solutions and practices involving all stakeholders for more sustainable and protective virtual educational services.


4. Quality Education
9. Industry, Innovation and Infrastructure
10. Reduced Inequalities
17. Partnerships for the Goals

Targets: The proposal approaches education and technology from a perspective that understands remote learning as a powerful tool for social development, provided that products and services respect children's rights, and is therefore related to SDG 4. It also proposes the discussion of better data practices for the edtech sector, in accordance with SDG 9, and from a multistakeholder perspective, in accordance with SDG 17. Alongside this, the panel will analyze the asymmetric practices adopted by technology companies in the Global South compared with countries where data protection regulation is more developed, which is directly linked to SDG 10. The main goal, therefore, is to ensure quality education for all children in a respectful, inclusive, and empowering way, which can also reduce inequality between countries, especially by combating asymmetric practices in less developed countries. Moreover, by addressing best practices for children's data protection from a multistakeholder perspective, we ensure inclusive and sustainable industrialization and foster innovation while strengthening global partnership.


The predominant business model in digital services has in recent years been based on massive data collection for profiling users and targeting them with personalized ads. A recently launched Human Rights Watch report points out that most edtech apps and platforms are also embedded in this data economy: many of them collect student data and sell it to third parties such as data brokers. The edtech industry had already been growing rapidly, but the Covid-19 pandemic pushed it even further, forcing governments and schools throughout the world to adopt remote learning strategies that were imposed on all students, including children. At the same time, children's rights standards affirm their right to full education and to having their data handled according to their best interests. The treatment of children's student data as an asset, however, is largely unregulated throughout the world, especially in the Global South, where data protection regulations are generally still incipient.

Against this backdrop, the panel will discuss how the current predominant edtech model affects students' privacy. It will also debate, in a multistakeholder format, alternatives for the continuing evolution of this industry, which can play a key role in promoting education and other children's rights while protecting children's best interests. In doing so, the session will propose best practices to be adopted by all actors in the face of the continuing digitalization of education, discussing how the regulation and governance of these services can combine economic development with the protection of children's privacy and human rights.

Expected Outcomes

The session will gather the concerns and proposals that arise from the debate into a short report that will be published by the organizers. The report will address the current threats to children’s rights posed by the edtech industry and propose best practices, actions and regulation to be adopted by all those involved in the sector: governments, industry, schools, the technical community, civil society, students and their families. Also, a dashboard will be created to gather all the materials and information cited by the panelists and audience during the session, in order to promote the exchange of knowledge in the field.

Hybrid Format: The workshop will be conducted in the following format: each panelist will have 5 minutes for opening remarks, introducing themselves and their experience with edtech and students' and children's privacy. Next, the moderator will ask each of them a specific question related to the stakeholder group they represent, in order to address the policy questions that guide the session. Panelists will also have 5 minutes each to answer that question. The last 30 minutes of the workshop will be dedicated to answering questions and debating with the online and onsite audience. From the start of the workshop, the online and onsite moderators will encourage participants to send questions and notes on their views and lived experience of the issue discussed, and may incorporate this online participation into their moderation and policy questions. The questions taken in the last 30 minutes of the session will alternate between the onsite audience and the online audience, with online questions selected by the online moderator.

Online Participation


Usage of IGF Official Tool.


Key Takeaways (* deadline 2 hours after session)

The use of edtech apps by children and adolescents generates various risks, especially with regard to privacy and the protection of their personal data. The large corporations that create and provide these services, some of which are free, can collect massive amounts of data and use it to send personalized advertising and to modulate children's behavior based on their vulnerabilities.

Call to Action (* deadline 2 hours after session)

Governments must put children's best interests at the center of the debate, including by hearing their opinions and experiences. Governments must also pass legislation to protect children's data, and monitor and penalize any violations of children's data, privacy, or rights. The tech industry bears the primary responsibility for protecting children's data.

Session Report (* deadline 26 October) - click on the ? symbol for instructions


  • Millions of students have returned or will return to a new academic year, and they will largely use technology that was adopted during the pandemic. Just a few months ago, Human Rights Watch published a report called "'How Dare They Peep into My Private Life?': Children's Rights Violations by Governments that Endorsed Online Learning during the Covid-19 Pandemic", which investigated the educational technologies endorsed by 49 governments worldwide. The investigation covered the majority of children who had access to the internet and devices.
  • Every government except one authorized the use of at least one online learning product that surveilled children online, outside of school hours and deep into their private lives. This was the first time evidence was collected showing that the majority of online learning products harvested data on who children are, where they are, what they are doing, who their family and friends are, and what kinds of devices their families could afford for them to use.
  • The critical point of the HRW report is that the products did not allow students to decline to be tracked. The monitoring happened secretly, without the child's or family's knowledge or consent. Because these apps were mandatory tools, it was impossible for children to opt out of surveillance without opting out of school and giving up on learning.
  • This pairing of the edtech industry with the attention economy and the targeted advertising industry, as is clear from Han's research, has been promoting a clear violation of students' rights to privacy and to the protection of personal data. On top of that, it promotes children's behavioral manipulation to an extent we are still unaware of, in terms of present, future, individual, and collective impacts.
  • Children are human beings going through a developmental stage. They need to be able to make mistakes and learn from them, as well as to experiment throughout this development in order to understand and mold their own personalities.
  • The need for children to experiment with their personalities is completely undermined by the attention economy and its profiling and aggregation techniques. What we see today is that the content that reaches children online, and therefore influences the shaping of their personalities, is to some extent dictated by private and commercial interests. So besides behavioral manipulation, this aggregation and targeted content delivery can also reinforce discrimination.
  • In order to face the problematic current scenario of the edtech industry, we need to understand that the protection of children's rights will only be achieved once it is shared among all of society. Much is often said about families being responsible for educating children to use digital devices and services. Families should, of course, support children in their use of edtech apps as much as possible, but that cannot be all. How do states choose the edtech tools to be adopted in public education? How do schools themselves choose the tools to be adopted in the private education sector?
  • We need to address the responsibility of the private sector: both the edtech companies themselves and the companies from other sectors that are buying student data from them.
  • When addressing the responsibility of states, schools, and the private sector, we need to bring to the table the concept of the best interests of the child, as established by the UN Convention on the Rights of the Child, the most widely ratified international treaty in the world. All actions that directly or potentially affect children must be undertaken in order to fulfill their best interests.
  • The first and foremost way to protect children online is to be aware of what data they provide and whether the apps they use are putting their data in unwanted hands. Check the company's reputation and reviews, take advice from parents and teachers, and check online if in doubt before using an app. Teachers could also be trained in schools to help students understand how to keep their data safe.
  • The other way is to ensure that best practices are enforced. Companies that offer solutions to children must be mandated to collect only relevant data, and should face severe consequences for violations. This is where the IGF can play a role in convincing governments to enforce these rules universally. Governments should come together and make laws that ensure children stay safe online and that their data is protected. Technology is not going away, and children will increasingly use the internet and online apps for their educational needs and social media. We should work collectively to bring laws across national boundaries, encouraging organizations, government agencies, and international institutions like the United Nations to mandate rules that will help protect us and our privacy online.