IGF 2022 Reports

IGF 2022 Open Forum #4 Digital self-determination: a pillar of digital democracy

Updated: Wed, 01/02/2023 - 14:06
Governing Data and Protecting Privacy
Session Report

The Swiss OFCOM organised an "Open Forum" on the topic of digital self-determination and trustworthy data spaces.

Political authorities at the national and international levels, as well as civil society, are developing new models of data governance. In Switzerland, we published a government report earlier this year on the concepts of "digital self-determination" and "trusted data spaces". Our vision is a data society based on autonomy and the freedom to manage one's own data. We want to restore trust in data technology and empower actors in the digital space. To this end, Switzerland is developing a code of conduct for providers of trusted data spaces.

The proliferation of national and unilateral approaches to data governance has led to an increasing fragmentation of regulation at the international level. This is accentuated by the fact that there is currently no international process for a holistic and cross-cutting discussion of data governance issues. This session aimed to address difficult data governance issues such as: How can we reinvent ourselves in the digital age as our digital footprints expand? What are the key elements needed to implement digital self-determination? How can we ensure that such efforts do not contribute to further regulatory fragmentation? What governance mechanisms are possible and how could they be deployed?

Magdalena Jóźwiak argued that personal data should be protected as a core value. Yet there is a dichotomy between private and public actors: the transparency of private companies rests only on voluntary disclosure. Many studies elevate the importance of data governance to a constitutional level. Including digital self-determination is one way to balance these trends.

Roger Dubach recalled the difficulty of moving from a national to a global discussion. At the national level, Switzerland is developing a voluntary code of conduct, with the objective of building trustworthy and human-centred data spaces. The discussion at national level should help inform the international level and vice versa.

Pari Esfandiari pointed out that data governance is a highly contested issue with geopolitical dimensions. There are various ideological perspectives on data, from the American approach of maximising its use, to the European approach of protecting it, to national approaches of controlling it. Pari reminded the audience that a single, global data governance regime is as essential as it is unlikely to be achieved at the moment.

Marilia Maciel noted that data regulation has moved away from treating data as intellectual property to focus more on how data can be shared and used collaboratively. The Swiss proposal is in line with this. Little attention is paid to the development aspects of data in trade-related discussions. The issue of transparency arises in this context: trade negotiations are very opaque, and we have only a rough idea of what is being discussed, with no clear indication. Moreover, only governments participate, which excludes the idea of multistakeholder participation.

IGF 2022 IRPC Access & participation as enablers of digital human rights

Updated: Tue, 31/01/2023 - 12:03
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Barriers to access and participation relate to infrastructure, policy and legislation (e.g. taxation) and their implementation, the level of digital literacy, and content, especially the lack of language diversity online. The online and offline worlds are interconnected, and marginalised communities are disproportionally affected online. The challenges can go beyond access, since access can also be limited or used as a surveillance tool.


More needs to be done to ensure that online violence is tackled effectively. Content moderation is primarily used as a remedy but platforms need a concerted strategy that promotes transparency, investment, better working conditions, awareness raising on abuse reporting, and more cultural and language diversity for long-term solutions. Effective legislation and its implementation need to be the result of a constructive multistakeholder dialogue.

Calls to Action

A real commitment from all stakeholders to promote access and participation and to ensure digital inclusion: more infrastructure, concerted policies and effective implementation, and digital literacy and content which is accessible to all. More engagement and constructive multistakeholder dialogue is needed to develop regulatory frameworks and effective solutions that promote digital inclusion and respect and uphold human rights online.


Online violence must be tackled effectively. Internet shutdowns are an obstacle to a free and open Internet and hinder the full enjoyment of digital human rights. Content moderation can only be an effective remedy if civil society, the technical community, private sector and governments work together to ensure that any implemented solutions are transparent, sustainable, localised, human-centered and rights-respecting.

Session Report

The session reflected on the importance of meaningful access and participation in the online environment for the full enjoyment of human rights. The panel discussion focused on:

  • Access and participation: major challenges to access and participation online and ways to promote empowerment and inclusion;
  • Online content moderation and the most pressing challenges in Africa;
  • Redress for human rights violations online.

 

The panel included:

  • Catherine Muya, Civil Society (Article 19), African Group - remote
  • Hon. Neema Lugangira MP, Government (Tanzania),  African Group - on-site
  • Roselyn Odoyo, Private Sector (Mozilla) - on-site
  • Victor Ndede, Civil Society (Amnesty International Kenya) - on-site
  • Yohannes Eneyew Ayalew, Technical Community, African Group - remote

with an intervention from IRPC Steering Committee member Santosh Sigdel.

 

The session was moderated on-site by IRPC co-chair Raashi Saxena and online by IRPC co-chair Minda Moreira.

 

Access and participation

Victor Ndede identified four major barriers to access and participation:

  1. Network infrastructure and policy.
  2. Taxation. Ever-increasing governmental taxes on digital devices and services pose affordability challenges; a reduction of mobile taxes would lead to better inclusion.
  3. Digital literacy. It is important to know how to use Internet-connected technologies to harness their full potential.
  4. Content, language barriers and the importance of localised content. The fact that content is mostly in English can limit participation online.

Roselyn Odoyo added that when it comes to access and participation, marginalised communities are disproportionally affected, as the legal and political environment is still hostile to these communities, e.g. refugees and LGBT people. Access is not only difficult but can even be used to surveil them and curtail their rights.

Roselyn pointed out the fact that human rights violations online have repercussions offline and vice-versa and therefore human rights defenders working with marginalised communities and digital rights groups should work together rather than in silos to better address these issues. She also highlighted the importance of including civil society in discussions on accessibility and participation.

Responding to the issue of taxation, Hon. Neema Lugangira pointed out that developing countries are losing out due to international tax regulations, that it is important to ensure that African countries benefit from the taxes generated by sales within their borders, and that it is only fair that tech companies pay taxes on income generated in the country. The discussion developed further with examples from Kenya, Tanzania and Congo, and there was some support, both from the panel and the floor, for the view that increasing taxes on users of digital services and devices could create accessibility challenges and hinder inclusion. Hon. Neema Lugangira highlighted the importance of all stakeholders working together so that these valid arguments can be passed on to legislators to better address these issues.

Santosh Sigdel (Internet Rights and Principles Coalition Steering Committee) highlighted the importance of documents such as the Charter of Human Rights and Principles for the Internet in promoting accessibility and participation. The Nepali translation of the Charter, he added, was a collaborative process. Translating the document into the local language builds capacity and gives communities the chance to work on the issues that affect them directly, and to work on better laws and policies to address those issues. Sigdel also stressed the importance of the right to access for addressing digital inclusion, and pointed out that the online world replicates what is already happening offline and therefore escalates vulnerabilities.

Online content moderation

Reflecting on online content and issues at the intersection of freedom of expression, online abuse, and online hatred or incitement to violence, the panel reiterated the position that human rights should apply online as they do offline and agreed that online violence has repercussions offline.

Hon. Neema Lugangira looked at the fine line between freedom of expression and online abuse. She explained how the latter is used as a tool to silence some groups, especially women, and highlighted the dangers of self-censorship. She called for online discussions that are “focused on the agenda, not the gender” and for wider representation in social media platform teams so that cultural and language diversity are taken into account. Hon. Lugangira stressed the importance of legislation and regulatory frameworks to remove the grey areas from which hate speech and online violence flow. She also pointed out the need for all stakeholders to come together and called for different ways of engagement and cooperation among stakeholders.

Yohannes Eneyew Ayalew reflected on the profound impact of online hate speech in the context of the war in Northern Ethiopia and on the slow response of social media platforms to prevent the escalation of violence.

Content moderation, as Catherine Muya pointed out, should be a remedy for issues such as hate speech; however, there was strong support for the view that more needs to be done to address the challenges of online content moderation to ensure it is a practical and effective remedy. Catherine highlighted the lack of transparency and coordination, which is directly linked to tech companies' failure to truly invest in online content moderation and to develop effective measures: from adequate training and fair remuneration of content moderators and greater awareness of the tools available for abuse reporting, to long-term solutions, accountability for those responsible, and support for victims of online abuse.

Questions and comments from the floor came from different stakeholders, from civil society to government and national human rights institutions. They highlighted the different experiences of taxation of digital devices and services in African countries and its impact on accessibility and participation online, and stressed the crucial need to address online accessibility for all by designing and developing content for people with disabilities.

Participants also highlighted the importance of striking the right balance between business and human rights and the role of governments in upholding human rights online as offline, and pointed out the barriers that need to be overcome to promote accessibility and inclusion: from the physical barriers still preventing access in many communities, to the lack of coordination and effective mechanisms, to the lack of trust among stakeholders.

The lack of access to the Internet and digital services due to national and regional shutdowns, particularly the current situation in Ethiopia's Tigray region, was another issue brought up during the Q&A, with participants and speakers highlighting the importance of a free and open Internet, and members of the IRPC Steering Committee referring to the Coalition’s Charter of Human Rights and Principles and the importance of Article 1, the Right to Access the Internet, as an enabler of all digital human rights.

 

Recommendations

At the end of the session, the panel put forward several recommendations to ensure the full enjoyment of digital human rights:

  • A real commitment from all stakeholders to promote access and participation and to ensure digital inclusion. This includes more infrastructure, concerted policies and effective implementation, ensuring digital literacy and content which is accessible to all.
  • More engagement and constructive multistakeholder dialogue to develop regulatory frameworks and effective solutions to promote digital inclusion, and to respect and uphold human rights online.
  • Online violence must be tackled effectively. Internet shutdowns are an obstacle to a free and open Internet and hinder the full enjoyment of digital human rights. Content moderation can only be an effective remedy if civil society, the technical community, the private sector and governments work together to ensure that any implemented solutions are transparent, sustainable, localised, human-centered and rights-respecting.
IGF 2022 WS #242 Lessons Learned from Capacity Building in the Global South

Updated: Thu, 26/01/2023 - 02:40
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Training programs and capacity building for the deployment of community networks in rural and remote areas have to be aligned with the needs of the territories, based on local knowledge, language and culture, and supported by a multistakeholder approach built on collaboration between different actors to exchange good practices and make them sustainable in the long term.

Calls to Action

To achieve meaningful connectivity in remote places and equal access for women, deeper collaboration and understanding from governments, institutions and social organizations about the communities they work with has to be achieved for the development of community networks, treating these community projects as local, lifelong learning processes.

Session Report

Key Issues Raised

1. When people from rural and remote communities are trained to develop the technical skills that allow them to generate and operate local connectivity solutions, such as community networks, the chances that those projects will be sustainable in the long term increase.

2. The project of the National Schools of Community Networks was launched in 2020 with the main objective of developing training processes to build capacity for the development of community networks in five countries of the Global South: Brazil, Indonesia, Kenya, South Africa and Nigeria.

3. Each of these schools has its own program, different ways of executing it, and different pedagogical methods.

4. The Participatory Action Research (PAR) methodology has been successfully employed in the development of the National Schools of Community Networks projects. Other popular-education tools have also shown good results in bringing technology closer to people with no previous knowledge.

5. To support the implementation of each school, a guide explaining the PAR methodology was launched with the objective of creating contextualized programs: https://www.apc.org/sites/default/files/FINAL_Technological_autonomy_as…

 

6. Training programs and capacity building for the deployment of community networks and other local solutions for meaningful connectivity have to be aligned with the needs of each territory based on local knowledge, ways of learning, language and culture.

7. Efforts and specific strategies need to be made to increase the participation of women in these training programs.

8. The development of these training programs has to be supported by a multistakeholder approach based on collaboration between different actors, in order to exchange good practices and make the programs sustainable in the long term.

9. Inclusive, responsible and sustainable digital transformation needs to be driven by policy strategies and regulatory frameworks, and also by investment in initiatives and projects that build local capacity, such as the National Schools of Community Networks.

 

Presentation summary

1. Carlos Baca Feldman, LocNet project (Mexico): the initial setting for the development of the National Schools of Community Networks was the publication of a guide explaining the Participatory Action Research (PAR) methodology. This methodology was employed in the design of Techio Comunitario, a training program developed in Mexico, which was the origin of the design and implementation of the National Schools of Community Networks project in five countries of the Global South: Brazil, Indonesia, Nigeria, South Africa and Kenya.
One of the goals of this project was the development of an online repository to strengthen and develop community networks through the exchange of materials that help people and organizations develop skills and knowledge.
2. Alessandra Lustrati, FCDO (UK): the UK Digital Access Programme (DAP) is focused on catalyzing inclusive, safe and secure digital transformation in five countries: Brazil, Indonesia, Kenya, South Africa and Nigeria. Supporting digital connectivity alongside skills development in cybersecurity, digital entrepreneurship and innovation is key to the development of alternative models of local solutions. There is real value in following three main principles:

  • First principle - scalability vs replicability: relatively small but well-targeted, context-specific investments fully focused on building capacity can have a very significant impact. Once the model is demonstrated, both the technological and organizational knowledge can be adapted to local conditions and disseminated effectively through the method of the National Schools of Community Networks, seen as a positive proliferation of meaningful connectivity solutions on the ground.
  • Second principle - local ownership: As the model needs to be fit for context and well embedded in the local reality, it needs to take into account local needs and preferences, and what is viable or not in a particular location.
  • Third principle - sustainability: strengthening local capacity is essential for sustainability. Communities become autonomous in the know-how of establishing and managing their own telecom networks and in understanding their interaction with the broader market, or what is considered the connectivity value chain: developing a business and organizational model that enables communities to access and appropriate technological services in an efficient and affordable way.

3. Neo Magoro, Zenzeleni (South Africa): many rural areas do not have access to the Internet, the cost of access in rural areas is higher than in the cities, there are high levels of unemployment and gender-based violence, and Internet data costs in South Africa are the highest in the world. For the development of the school, the organization Zenzeleni created a curriculum made by experts who understood the needs and realities of each community, based on four pillars: personal, social, technical and business development. A Learning Management System (LMS) was created, and all participants in the school were provided with cell phones to access the contents. Because of the diversity of participants, the use of home languages was reinforced and translators were employed. Another challenge was developing female participants' interest in interacting with technological devices, because at the beginning of the training they waited for men to start using the devices. Peer-to-peer learning was therefore encouraged, and working in smaller groups led them to relax and share their own knowledge.

4. Harira Wakili, CITAD (Nigeria): connectivity is a challenge for Nigeria, and lack of access to digital education is a problem in most communities. To bridge the gender gap in access to education, CITAD worked on raising awareness of the importance of being connected and of developing community networks, and on enabling women to participate. To create the school's curriculum, they sought out experts, focusing on technology and sustainability skills development. Volunteer mentors were also invited to support the participants because of their knowledge of the relevant issues in the communities. In the first school, there was not much participation from women, so for the second school a different strategy was taken, focusing on women and elders from the community, which proved successful, with 50% women's participation. To improve participation, women-only groups were created, because women felt intimidated by men; bringing a feminist approach to the Internet encouraged them to speak for themselves.

5. Akinyi Arose, Tunapanda Net (Kenya): connectivity built by and for the community was emphasized. The focal areas of the school's contents were connectivity, a human-centred design approach, and providing meaningful access to the community. The guide provided, together with the PAR methodology at the building stage, helped them delimit training needs as a co-creation process with community network members that really speaks to their needs. For the next stage of development, a series of conversations and a survey were conducted to analyze and curate the training. Mapping out experts was also important for providing information and training services in the key areas that became Communities of Practice: designing and deploying community networks; sustainability; and local content creation. Different stakeholders were invited to participate in the process to understand how to work collaboratively. Peer exchanges, virtual mentorship and bringing the concept of community to the training allowed them to work with community networks at different levels of development, from starting projects to more consolidated ones, and with what they are doing on the ground.
Other challenges the communities face after training relate to access to infrastructure and equipment to deploy the networks, so one strategy was to find ways of mobilizing resources, especially for emerging networks. Open questions remain around volunteering and the sustainability of knowledge: how to maintain the knowledge delivered by the training.

6. Adriane Gama, PSA (Brazil): a co-created curriculum was developed, based on social and digital aspects and on gender and youth concerns, working with methodologies inspired by Paulo Freire's popular education and ludic approaches. The challenges faced were related to the pandemic and the impossibility of travelling, and also to the lack of connectivity in the territories of the Amazon, where PSA bases its work. It was important to look for partners to strengthen the community networks and to access funding, working on the basis of a sustainable economy and local needs, with strengthened participation by local women.

7. Gustaff H. Iskanda, Common Room (Indonesia): the pandemic outbreak revealed a huge gap in different areas, not only in the digital divide but also in development. In Indonesia, the digital divide comes with a number of problems, from huge geographical challenges, as in the Amazon, to the lack of electricity supply and devices. A prototype of the National Schools of Community Networks was developed in indigenous communities, based on Common Room's own experience and following the guide developed with APC. An Advisory Committee was formed and a curriculum deployed focusing on software and hardware, integrating policy and regulation, technical capacity building and meaningful connectivity. A training-of-trainers programme was also launched and a handbook published to make the contents easier and more accessible. Common Room works with an approach called 5L: low tech, low energy, low maintenance, low learning and local support. Community networks are context-specific and can differ from one another, so their implementation needs to be grounded in research and observation of the needs and rights of the community. Meaningful access celebrates multiplicity at the micro scale. Local communities have to find their own way to deploy and learn what a community network relevant to daily life means. A multistakeholder approach was developed, especially for policy and advocacy, addressing specific needs for long-term capacity building, digital literacy, and a special licence for community network deployment, including tax incentives, because most networks are non-profit. A community network's strong foundation is its network of people, open knowledge and technology.

8. Josephine, Kenya ICT Network: there are many similarities in the challenges the schools have faced, such as access to technologies, language, devices, and women's participation, as well as in how the schools have been adaptable and flexible: learning together with the communities, collaborating across territories, and developing open-source technologies.

 

Participant Questions - Additions

Said, Public University of Debre (Ethiopia):
1. What are the target groups for your local capacity building? Are they public schools, private companies or institutions?
2. The Internet is becoming a place of violence and abuse, especially for women and children. What are your efforts to mitigate this problem and safeguard the connected community?
3. To Indonesia: most countries of the Global South have a problem ensuring infrastructure accessibility, especially telecom or Internet infrastructure, because most telecom companies are unwilling to go to rural areas. What are the efforts of your government, or of institutions like yours, to ensure that the Internet is accessible to all communities?
 

Talant Sultanov Internet Society Kyrgyzstan chapter: there are some Community Networks in Kyrgyzstan and we learned that local communities can do capacity building. The first thing that people did when they had the Internet was send messages to the central government to say “We have no roads, no electricity”. The second thing that people started using the Internet for was e-commerce to promote some local products from farmers. Also, local WiFi hotspots became safe areas for girls. The Community Networks Learning Repository mentioned by Carlos would be a really useful instrument.
 

Ashapur Rahman, Bangladesh School of Internet Governance: more cooperation is needed on the continent. But for capacity building, if we cannot connect the whole globe, maybe we cannot achieve our goal.

 

Reflection on Gender Issues

a. The number of participants in your session (or an estimate)
There were approximately 45 participants in Addis Ababa; CR3 was almost full. Even though we proposed a hybrid format, only 4 people joined us on Zoom. The panel itself was gender balanced, with the moderator, the rapporteur and four panelists being women, out of a total of 8 participants.
b. An estimate on the percentage of women and gender-diverse people that
attended the session, to the extent possible

More than 50% of the attendees were women.
c. Whether the session engaged with gender as a topic. If so, please explain in
a short paragraph.

Even though the main topic of the session was not gender equality, many of the contributions mentioned women's participation and the related challenges, because the National Schools of Community Networks project has this issue as one of its main subjects. Also, most of the participants were women.

IGF 2022 WS #517 New data on fairer access to health using the Internet

Updated: Tue, 03/01/2023 - 21:11
Connecting All People and Safeguarding Human Rights
Key Takeaways:

The intersection between health and Internet Governance is underappreciated, and more work needs to be done on bringing these areas closer together.

Calls to Action

Health is the most fundamental aspect of human existence, and Internet Governance actors need to be mindful of their role in enabling people to have access to it.

Session Report

The session focused on the draft report entitled "Health Online Indicators LAC ‘22-‘23: Data on access to health solutions using the Internet", produced by Governance Primer with funding from LACNIC, presenting its findings to a global audience in order to seek further input and evaluate what next steps should be pursued prior to publication of the document.

Mark W. Datysgeld (Governance Primer) and Ron Andruff (ONR Consulting) had an open debate with the audience in relation to the contents of the report, in a session that focused less on exposition and more on exchange of ideas.

The first point that was brought up is how, in spite of the global COVID-19 pandemic, some countries in LAC still lacked proper regulations around telemedicine, a concerning issue that reduces the safety both of practitioners and patients.

It was next noted that the ability to purchase medicines using the Internet is also limited and sometimes outright banned in some LAC countries. This is at times circumvented with the use of delivery apps, which leaves open questions such as liability and medicine safety.

The subject of medicine importation was debated, with an understanding that this practice is not even seen as a matter of discussion in many jurisdictions, when in fact the Internet could be leveraged to increase access and reduce inequalities in relation to medicines.

Finally, a dependency map was presented showing the potential routes of action to generate impact on processes where health and Internet Governance intersect, pointing out that there could be much more action than there currently is.

In discussion with the audience, it was brought up that such a study would be useful for the African region, since, much like LAC, it is constituted of a patchwork of legislation that does not harmonize.

It was then discussed that the IGF itself is a good avenue for such debates, given its open-ended and multistakeholder nature, but different fora need to be sought for the theme to be socialized and to increase awareness.

The final report will be published in 2023, reflecting the discussions from this session.

IGF 2022 Open Forum #84 Digital Education and the Future of Women’s Work

Updated: Sat, 31/12/2022 - 14:51
Connecting All People and Safeguarding Human Rights
Session Report

As part of the 17th UN Internet Governance Forum, held in Addis Ababa from November 28 to December 2, the BMZ Digital Transformation Centers (DTCs), together with the global project “Future of Work”, organized an open forum on Digital Education and the Future of Women’s Work on November 29. The session invited experts from a broad range of countries and backgrounds to a panel discussion on the opportunities and challenges faced by women in the digital economy and the role of digital education in equitable access to IT-enabled jobs in the future.

Sabina Dewan from the JustJobs Network India and Prof. Kutoma Wakunuma, partner of the DTC Kenya, shared findings from their research on women’s experience in the digital economy, as well as learnings on good practices and policy implications for ensuring that new digital opportunities benefit women equally. Their discussion was followed by practical experiences shared by Hannah Adams from Harambee Youth Accelerator (Rwanda), Salih Mahmod from Mosul Space (Iraq) and Yayaha Amsatou from the SmartVillage project (Niger). All panelists highlighted the need to leverage women’s opportunities in the digital economy, while universally agreeing on the many obstacles that female economic empowerment continues to face.

Participants off- and online engaged in the discussion and emphasized the importance for policymakers in the Global South of ensuring requisite levels of basic education as an essential starting point for effective digital skills development and future-oriented competencies. At the same time, participants called for more inclusive policy reforms of education systems to ensure access for people with disabilities and other vulnerable groups. 

In conclusion, the session identified several key takeaways from the panel discussion and subsequent round of questions: While digitalization and new technologies are expected to provide new employment opportunities, their positive impact is often limited to select, highly educated groups with access to the internet and IT infrastructure. As a result, policies and regulations must consider the needs and interests of vulnerable groups such as women to guarantee equitable access to jobs and benefits in the digital economy.  

Yet access to digital technologies is neither a guarantee of women’s economic empowerment nor an end in itself. In fact, the lack of sufficient regulation threatens to exacerbate existing inequalities. There are no one-size-fits-all solutions: policies need to cater to women’s needs and consider their country-specific context to empower women in the labour market as well as in their homes and communities. 

Socio-cultural factors prevent women from realizing their full potential and limit equitable access to jobs in the digital economy, as social requirements and unpaid care work continue to bind women to traditional reproductive roles and routine jobs. Experts from India, Rwanda, Niger, Kenya and Iraq agreed that governments must prioritize job creation and commit to reducing the gender skills gap by creating education systems that promote digital skills for younger generations. For that, it is crucial that training programs are demand-oriented and offer clear pathways to jobs to ensure female participation. As shown in the case of Rwanda, female role models can further promote women’s participation in skills trainings and increase their participation in digital markets. Meanwhile, governments should also incentivize the hiring of women and support female entrepreneurs in the digital economy.  

In addition, government action must include public awareness campaigns to make digitalization more accessible in society and curricula that focus on digital literacy to promote key digital competencies. For that, increased investments in education are needed, and gender-disaggregated data in national statistical systems must inform new regulations as well as the design of systems evaluating women’s reproductive roles. As demonstrated by the example of Niger, in rural areas where digital literacy rates and access to digital financial services are particularly low, more collaboration between the private and public sector is needed, as well as better stakeholder engagement, especially including women, to promote women’s financial and digital literacy.  

IGF 2022 WS #69 Governing Cross-Border Data Flows, Trade Agreements & Limits

Updated: Fri, 30/12/2022 - 02:18
Key Takeaways:

The risk of not developing a minimal common international approach to governing cross-border data flows is high; bilateral and regional trade agreements are not adequate to address cross-border data flows.


The approaches adopted by China, the USA, India, ASEAN, and the EU towards cross-border data flows differ, both in their degree of stringency and in their priorities (e.g. national security, free trade, privacy). While this fragmentation increases barriers to trade, harmonising these approaches would bring risks of not respecting national interests and the different degrees of development.

Calls to Action

Comparability mechanisms could be a possible solution, but different degrees of development and digital capacities need to be brought into the equation.


Bilateral and regional agreements are important but not adequate; minimal global rules are needed to minimise costs to businesses.

Session Report

The workshop had an audience of around 120 onsite and online participants. Six experts from China, Europe, India, Singapore and Africa shared their analyses.

Dr. Chin and Mr. Deng explained that China’s approach to governing cross-border data is based on the values of sovereignty, personal data protection and national security. The Chinese authorities are trying to strike a balance between safeguarding national security and promoting international trade. The Data Security Law and the Personal Information Protection Law govern China’s cross-border data exports. The vast majority of data can be legally exported after meeting requirements such as security assessments, security certifications and standard contracts. However, providing any data stored in China to a foreign authority or foreign judicial authority is prohibited without approval from the Chinese Government; this is a challenge for the Chinese authorities to solve. The new Data Export Security Assessment Measures have provided some legal certainty by defining four categories of personal and important data that require a prior security assessment of the risk. This formulation moderately references the practice of general and national security exception provisions in many regional FTAs. On the other hand, China’s regulation of cross-border data transfers is still evolving rapidly; China focuses much more on outbound data, with few regulations on inbound data transfers. China’s cross-border data rules will inevitably affect the direction of international cross-border data rules, so engaging with China as part of a community crafting shared norms and rules for the future is inevitable.

Prof. Locknie Hsu explained ASEAN’s approach to governing cross-border data, which involves a set of principles, a set of frameworks, and a set of very useful tools to enable data flows to be smoother and facilitated. These include the ASEAN data management framework, the ASEAN cross-border flows mechanism, consisting of the ASEAN certificate for cross-border flows and cross-border contractual clauses, and the ASEAN e-commerce agreement. For ASEAN, there is another layer of data concerns, namely consumer protection, and also cybersecurity.

Prof. Rolf H. Weber pointed out that within the 27 Member States of the European Union, cross-border data flows are not really restricted, or only to a minimal extent, since it is the practice of the European Union to have a free, liberal and open space for businesses and for civil society. There is an important distinction between personal and non-personal data. As far as non-personal data is concerned, there are not many impediments to cross-border data flows. As far as personal data is concerned, cross-border data flows are subject to compliance with relatively strict rules. As far as cybersecurity is concerned, the EU has a regulation on the security of important network infrastructure and a cybersecurity strategy, but to a large extent cybersecurity remains a national domain. Comparing the European approach with the Chinese approach, the rules in the European Union are less strict and more open. Cross-border data flows should be legal, free, and secure, and there should be more rules at the global level.

Dr. Mansi Kedia pointed out that India’s model is going to be somewhere in the middle, using bilateral arrangements, informal or formal arrangements, or a model at the global level that would address its concerns about national security and access to data in instances of cybercrime. India is much more conservative. India has seen a series of data localization norms that started very early on with the public sector and the financial and banking services sectors. This was followed by consultations and deliberations on a privacy bill, whose first and second versions had hard localization policies and requirements. This strong need for sovereign control of data is also reflected in India’s position in several multilateral and bilateral settings. A recent softening of the stance reflects a recognition that cross-border data flows have economic costs, and that cross-border data flows will help trade and digital businesses. But the government has argued in favor of data localization because of the dominance of big data and the monopolization that might take center stage in the days to come. The other economic argument is that data localization will force companies to set up data centers in India, which can lead to a domestically created digital economy, so that India won’t be so dependent on foreign companies for digital services. India wants broader jurisdiction over citizen data, including extraterritorial jurisdiction over that data. 

For the roundtable question:

What are the possibility of cooperation and collaboration and where are the challenges? 

Dr. Yik Chan Chin: First, there is a convergence on privacy protection as the foundation for data export, although how strongly privacy is protected varies between countries. Secondly, many countries put national security as one of the preconditions and as an important exception in FTAs as well. But there is important divergence amongst national governments on how to define national security.

Prof. Locknie Hsu: National laws are fragmented in how they deal with data localization, and data transfer rules also differ. It is important to be mindful of businesses’ needs in complying: understanding where the boundaries are, how to navigate the rules, and what they can and cannot do. The comparability mechanism can help bridge some of the gaps between international, regional and global rules.

Prof. Rolf H. Weber: We do have special instruments that can be applied if an equal level of protection is to be achieved.

Dr. Mansi Kedia:  We shouldn't worry about the fragmentation at this point.  Each country will work out a way to get to this when they're ready. 

Mr. Zhisong Deng: China is the second-largest economy in the world and exports a lot of goods and services; more dialogue among the major jurisdictions is needed to figure out how to promote free trade.

Ms. Linda: Africa has a general framework on trade, the Africa free trade agreement, which came into force in 2021. The power of big tech in Africa is a concern.

For the roundtable question:

What are the risks of not developing a common international approach? Would it lead to a divide, or would bilateral trade agreements be enough to continue to rely on for both trade and digital sovereignty? 

Dr. Yik Chan Chin: There is no global agreement, which is not an ideal situation, but there is the co-existence of different regional agreements and mechanisms. Ideally, we want a digital agreement at the global level. At the moment, there is slow progress in the WTO on e-commerce negotiations; the hope is that a minimal framework develops out of these discussions. 

Dr. Mansi Kedia: Harmonization of cross‑border data flows is important, but  other policies will need to be addressed to address the problem of the digital divide before it reaches the point of harmonization. Otherwise, it will completely breakdown digital economic engagement across different countries.

Prof. Rolf H. Weber: Minimum harmonization would certainly lower the administrative costs of internationally active entities.

Prof. Locknie Hsu: There is a question mark over what the scope of these exceptions should be, and over how to clarify what businesses need to know and what they can and cannot do, as mentioned before. 

Mr. Zhisong Deng: Without a common international approach, I see three risks: trade barriers; localization requirements that will increase the burden on global enterprises; and losing the maturity of the digital economy. We can start by reaching more bilateral or regional agreements. The core issue is that all countries should respect the national security interests of other countries. 

Ms. Linda: In Africa, most data protection laws were enacted post-GDPR and mirror the GDPR; sometimes this didn’t work because of different contexts, different budgets, and different political structures. We need different ways of looking at data governance to see where the public interest lies and where national security lies, and to ask how we achieve data sovereignty without data localization, balance these tools, and agree on that as we proceed with the global conversations. Those differing points of view really have people in their own corners, deciding on their own data governance approaches. 

IGF 2022 DC-CIV DC-CIV: Geopolitical neutrality of the Global Internet

Updated: Thu, 29/12/2022 - 00:40
Avoiding Internet Fragmentation
Key Takeaways:

Maintaining a generally neutral, connected and resilient core Internet infrastructure is of vital importance. Combating bad behaviors sometimes can be locally implemented (dropping DOS traffic, seizing abusive domain names, filtering “bad” content, detecting and filtering malware, spam and phishing).


The major problems today are at the application layer, whether this is misinformation, disinformation, CSAM, phishing, surveillance, etc. Finding ways to create incentives for good practices and to discourage bad ones is a challenge. When incentives don’t work, we need ways to hold badly acting parties accountable. This will require international cooperation in cases where harms are inflicted across jurisdictional boundaries.

Calls to Action

Stakeholders should study the process and suitability by which (a) sanctions are decided, either as a result of UN processes or developed in a multistakeholder way, and whether sanctions are indeed the right response, given that they might damage the Internet; and (b) sanctions are implemented, or are indeed implementable.


Stakeholders should weigh the above against fierce opposition from some countries and stakeholders to any kind of damage to the Internet and its Core Values; any sanctions should thus be limited, proportional, and carry minimal undesirable side effects, as well as being procedurally clear, multistakeholder, and transparent.

Session Report

 

MAIN REPORT

 

The Internet is a Network of Networks, global not only in a geographical sense but in several shades of the term 'global': free of cultural, ideological and political bias, and global in terms of the technologies that converge into it. On many occasions, the decision taken was to separate geopolitics from the Internet, which would otherwise turn the unfathomably valued One Internet into two or more 'Splinternets'.

 

Yet, recently, some leading members of the Internet Community signed a common statement "Towards the Multistakeholder Imposition of Internet Sanctions" - opening the door to the Internet Community having some means to decide on whether sanctions such as disconnection from the Internet would be appropriate. This Statement and background can be found on: https://techpolicy.press/towards-the-multistakeholder-imposition-of-internet-sanctions/

 

This brings forward the question of whether the Internet should be part of a sanctions regime.

 

In the session, panellists were asked the following questions:

 

  • Is the Internet's technical architecture and infrastructure as currently defined able to impose sanctions?
  • Is the Internet's management/administration as currently defined able/willing to impose sanctions?
  • Indeed, should it impose sanctions, knowing these would break Core Internet Values?
  • What path could the Internet’s governance take in the future:
    • Is the future that of assuming a technical mission, simply maintaining the geopolitical neutrality of the Internet, or will it expand to reflect ways of the Internet bridging real-world geopolitical and cultural divisions to make One World, or Two, Three or more?

Panellists answered as follows (some comments submitted in writing by the authors themselves, some paraphrased from their interventions):

 

(Vint Cerf, Google)

  • Maintaining a generally neutral, connected and resilient core Internet infrastructure is of vital importance. Combating bad behaviors can sometimes be locally implemented (dropping DOS traffic, seizing abusive domain names, filtering “bad” content, detecting and filtering malware, spam and phishing). 
  • The major problems today are at the application layer, whether this is misinformation, disinformation, CSAM, phishing, surveillance, etc. Finding ways to create incentives for good practices and to discourage bad ones is a challenge. When incentives don’t work, we need ways to hold badly acting parties accountable. This will require international cooperation in cases where harms are inflicted across jurisdictional boundaries. 
  • It is important that attempts to apply sanctions are proportionate and follow the principle of subsidiarity. It is a mistake to apply sanctions at the wrong layer in the architecture. For example, shutting down the Internet to deal with bad behavior by some parties is an overreach that creates a lot of harm for those innocently relying on the operation of the network. 

 

—-------

(Bill Woodcock, Packet Clearing House)

  • The implementation of sanctions via Internet means currently faces two principal challenges: On the network side, network operators typically under-comply or over-comply, due to difficulties in appropriately scoping enforcement actions. On the governmental side, sanctions regimes are not typically published in a uniform, consistent, or machine-readable format, they’re not published in a single predictable location, and they’re not harmonized with other regimes. 
  • Many very specific implementation issues exist as well, starting with governments’ predilection for transliterating foreign-language or foreign-character-set names of sanctioned entities in diverse and inconsistent ways, rather than using the most-canonical form of each name, in its native language and character set. Network operator implementation has been occurring within the Sanctions.Net community since March of 2022, and governmental harmonization efforts have been occurring principally within the Digital directorate of the OECD.
  • Most conversation about Internet sanctions implementation has been positive and collaborative, since governments wish to see their sanctions regimes respected, and network operators wish to comply with the law and protect their customers. Dissenting voices have questioned the legitimacy of sanctions regimes from both the right and the left, principally fearing governmental overreach.

 

—--------

(Veronika Datzer, Advisor at German Parliament)

  • It is impossible to keep politics out of the internet, because politics is already there; this process cannot be reversed. We therefore need political solutions, because the technical infrastructure of the internet must remain neutral.
  • Solutions for making the internet a peaceful place must not include internet sanctions, as these impact all people and can have dramatic adverse consequences. They must be based on a multistakeholder model that co-creates what it means to establish a peaceful internet; such an understanding should not be imposed.
  • We need close cooperation between the technical community and the political community. 

 

—--------

(Iria Puyosa, Toda Institute)

  • The global multistakeholder governance ecosystem should center the protection of human rights to safeguard internet core values. Sanctions against States that violate international law may be necessary in cases of widespread human rights violations or credible allegations of crimes against humanity enabled by  State agencies' internet usage.  
  • The global internet may need to create a multistakeholder policy advisory body that provides guidelines on targeted sanctions that may be enforced if necessary. Nonetheless, sanctions must be targeted, specific, and proportional. Also, a robust and reliable due process must be established for making these decisions 
  • The establishment of rules and processes to define and enforce sanctions should not be decided by a small number of governments (such as those belonging to the OECD). The policy formulation process should involve countries from different regions of the world. Otherwise, the sanctions regime may be considered unilateral and provide an excuse for the "sovereign internet" model leading to the splinternet.
  • All countries are working out their own model of sovereignty and the Internet; some are taking an approach completely different from the model we're used to, the open, free Internet.

 

—------

(Bastiaan Goslings, RIPE NCC)

  • It is not in the mandate of technical organisations like RIPE, nor within the policies that determine how these organisations are run, to make decisions on sanctions. Policies are set by multistakeholder communities across an entire service region, which includes many jurisdictions. If there are sanctions, they need to be decided following due process and in a democratic fashion, demonstrating that the sanctions are proportionate to the goals to be achieved. Economic sanctions are set by the European Union.
  • RIPE has no authority to actually enforce anything: it operates as a trusted technical organisation, a neutral authoritative entity, but has no enforcement power of any kind. Networks using the RIPE database operate on trust.
  • Anyone can decide they do not trust this system and operate their own registry. From that perspective, it is a vulnerable system.

 

From the Floor

  • The way to protect the Internet is to isolate it from politics completely and emphasize to the geopolitical parties that this is not a geopolitical space; politics has to be kept out of Internet governance in order to protect the Internet. (Sivasubramanian Muthusamy, ISOC Chennai)
  • As a national regulator, when it comes to technical issues or technical harm, it is easy from a technical point of view to spot something going wrong or to spot harm, and to stop it one way or another. But nationalisation of the Internet is a concern: how can we stop that? Secondly, it is already too late to keep politics out of the Internet, as it is used by politicians. (Hadia El Miniawi, Egypt Telecom Regulator)
  • It can't be Germany. It can't be the European Union. And it also can't be Google. We all have to be included in it. That is why the IGF is so important: it is multistakeholder, and its process is completely inclusive of all states and big companies, at least. (Veronika Datzer)
  • There is no such thing as a splintered Internet; once it is splintered, it is not the Internet. We have to convey to governments and leaderships that we can't splinter the Internet. If you start imposing sovereign law, you will start actually removing functions of the Internet, totally or significantly. The Internet Society, myself and others have created frameworks where you can see how different policy proposals and treaties discussed with the Open-Ended Working Group or the Group of Governmental Experts at the UN are commensurate with the Internet or not. With that knowledge, we have to work on a multistakeholder basis first. (Alejandro Pisanty, UNAM)
  • From the Global South perspective, what we need the international community to do is work on the basis of humility, empathy and solidarity rather than a punitive approach, which could be prone to bias or to political and economic agendas. Whenever political and economic agendas are rolling the dice, the reality on the ground is that those who are most vulnerable, least responsible and least involved in great-power competition are the ones who suffer. …we now have the Sustainable Development Goals and these need to be the priority of the international community…. So we very much understand that there are trends related to the use of the Internet for political, international security and geopolitical purposes. But this trend must stop, and we must use the Internet to achieve the Sustainable Development Goals. (Tulio Andrade, Brazilian Ministry of Foreign Affairs)
  • The important issue of the geopolitical neutrality of the global Internet should be reflected through the Global Digital Compact. The first element is the development of an internationally legally binding (?) for cybersecurity based on international law. The suggestion is the establishment of a framework of rules, norms and accountable behavior for digital platforms and service providers in data security, content and law, and defining a common region for the Internet as a peaceful and development-oriented environment for the public good, not as a new battlefield and militarized environment, through the signing of a global Declaration by all members. Lastly, internationalization of the Internet and its public core as a trust-building measure could help the global Internet to be geopolitically neutral. (Amir Mokabberi, Tehran University)
  • We have to have an agenda to continuously help public officials, to give them knowledge and help them determine what, from a public interest perspective, they think they have to do. It is useful to distinguish the core of the Internet (the functionalities and the numbering, naming and routing systems) from everything that happens on top of the Internet. (several participants)
  • What we need is a space where politics can take place forever without destroying the structure, the Internet itself. The Internet is not one monolith, as Bill emphasized; there are lots of networks. There is a great incentive for governments to recover the mantras we have had forever on the Internet side: connectivity is its own reward; you lose more than you gain when you lose connectivity. So let's start pushing more for outcomes at the multilateral level that are compatible with what has been happening on the technical and multistakeholder sides for so many years. (Alejandro Pisanty, UNAM)

Summary by Olivier Crépin-Leblond - 28 December 2022

 

 

IGF 2022 Open Forum #108 Combatting Disinformation without Resorting to Online Censorship

Updated: Thu, 22/12/2022 - 14:48
Enabling Safety, Security and Accountability
Session Report

Combatting Disinformation without Resorting to Online Censorship – Open Forum organised by LATVIA

Date, time, venue:

30 NOV 2022, 10:50 UTC, Caucus Room 11

Moderator:

Viktors Makarovs, Special Envoy on Digital Affairs, Ministry of Foreign Affairs of the Republic of Latvia

Speakers:

Ms Anna Oosterlinck, Head of the UN team, Article 19; Mr Allan Cheboi, Senior Investigations Manager, Code for Africa; Mr Rihards Bambals, Head of Strategic Communications Coordination Department, the State Chancellery of Latvia; Mr Lutz Guellner, Head of Strategic Communication, Task Forces and Information Analysis Division, European External Action Service; Ms Melissa Fleming, Under Secretary General of the United Nations for Global Communications.

Main information about the panel:

Disinformation is a major threat to public safety, security and democratic stability. Governments around the world fight disinformation in different ways. Right now we see a particularly concerning trend for online censorship by some governments as a way to address real or presumed threats posed by disinformation. By applying censorship, governments take away or limit citizens’ freedom of speech and expression. Online censorship also often goes hand in hand with information manipulation.

The main panel discussion focused on identifying ways for governments, organizations and platforms to address disinformation without resorting to online censorship, bans or internet shutdowns. There was broad agreement that disinformation is most effectively addressed by means of holistic and multifaceted policies, and that this approach is also rights-compatible. Successful implementation of this approach must be based on a conceptual framework that identifies and defines the challenges. It starts with a critical examination of the true cause and context of disinformation and of the risks it presents to society at large. Free and independent media and better public communication are the best tools to fight disinformation. A free, open, safe and secure online environment is most resilient to disinformation.

Key points by each speaker:

Allan Cheboi: Misinformation is false information shared without intent to do harm, while disinformation is false information used to harm or influence other people’s decisions or thoughts. Disinformation is also used to gain power. In most cases, misinformation misrepresents facts, while disinformation is centred around a narrative. To address disinformation, we need to customize laws to include information monitoring, specifically for local disinformation attempting, for example, to influence the outcome of elections. We need to make substantive investments to cope with the challenge. Another alarming development is disinformation targeted at United Nations (UN) peacekeepers; that needs to be tackled swiftly and decisively.

Rihards Bambals: Disinformation is false or misleading content disseminated on purpose to mislead and to gain political benefit. Disinformation is a global man-made disaster that is hazardous and influences vulnerable people. Latvia addresses the challenge with centralized information environment monitoring capabilities covering both traditional and social media. Latvia’s strategy is based on three pillars: effective government communication, quality of independent journalism and media, and societal resilience. It is of utmost importance to invest in media and information literacy. Governments need to strengthen citizens’ capacity to think critically and to recognize and report disinformation cases. Some governments invest billions in spreading disinformation. One example is the Russian Federation’s massive disinformation campaign accompanying its military aggression in Ukraine.

Lutz Guellner: First, we need to define the problem and distinguish between misinformation and disinformation. Misinformation is false information with no intent behind it; disinformation is based on a clear intention. Disinformation can also be used to gain economic benefits. There are five characteristic elements to look at when identifying disinformation: it is harmful, illegal, manipulative, intentional, and coordinated. The European Union uses the ABC model, which stands for A – actor, B – behaviour, C – content. This is a technique for distinguishing disinformation and identifying actors that are trying to manipulate information. The approach allows governments to avoid censorship and look at given information in an objective manner.

Anna Oosterlinck: Disinformation must be seen in a wider context, including reduced pluralism and diversity of the information we can access online; challenges connected to the digital transformation of media; and underlying social causes, including economic and social inequalities, leading to mistrust and polarization. All these factors combined create an environment where disinformation can flourish. Disinformation has been addressed in some laws as restrictions on false statements of fact that can cause substantial harm, or in laws on election fraud, misleading advertising, or the sale of certain products.

We need to fight disinformation with a number of positive, holistic measures by a range of actors. To fight disinformation, we need a free and independent media environment and strong protection for journalists and media workers online and offline, and comprehensive right-to-information laws, complied with according to the principle of maximum disclosure and with proactive release of information of public interest. Governments should not spread disinformation themselves; they need to ensure connectivity and access to a free Internet, invest in digital media and information literacy, adopt positive policy measures to combat online hate speech, and work with companies to make sure they respect human rights.

Melissa Fleming: Back in 2018, the UN found that disinformation and hate speech online played a significant role in stoking horrific atrocities against the Rohingya population. They pushed ordinary citizens to commit unspeakable acts. And similar stories have emerged in many other conflict settings. For example, recently in Ethiopia Facebook posts spread hate and inspired attacks. In Ukraine, information is also being used as a weapon. Meanwhile, in Ukraine's neighbouring countries, we're seeing how spreading of lies about refugees brings more suffering for the most vulnerable.

Free speech is not a free pass. In the era of mis- and disinformation, free speech is much more than the right to say whatever you want online. Platforms must face the fact that they are constantly being abused by malign actors; they must live up to their responsibility to protect human rights and save lives. The United Nations is constantly engaging with the platforms, advocating that they do their due diligence on human rights and review their business models against the UN Guiding Principles on Business and Human Rights. The platforms should offer a robust framework to reduce the dissemination of harmful falsehoods, as well as establish remedy mechanisms.

The UN “Verified Initiative” succeeded in getting accurate, lifesaving information to communities around the world during the COVID-19 pandemic. The UN is also working to strengthen the capacity of social media users to identify and avoid falsehoods by promoting media and information literacy and by creating its own teaching tools. The UN has launched two free online digital literacy courses on mis- and disinformation in collaboration with Wikipedia. The courses are offered in multiple languages and are being taken by students all over the world, hopefully improving their ability to identify mis- and disinformation and avoid becoming part of the spreading cycle.

The UN also encourages governments to promote various measures to foster free flow of information, enhance media diversity and support independent public interest media as a means of countering disinformation.

Summary/Conclusions:

  • Disinformation is false information put in circulation to intentionally do harm or to gain political or economic benefits. 
  • Disinformation can and should be addressed without resorting to censorship.
  • We need to keep developing our conceptual framework on disinformation and related phenomena.
  • Policies to address disinformation should focus on fostering a free, open, safe and secure online environment and on strengthening free and independent media.
  • The online platforms need to improve their efforts to better address disinformation.
  • It is important to strengthen citizens’ ability to identify and counter disinformation by investing in digital and media literacy programmes.
IGF 2022 Open Forum #101 Open Forum on Technical Standard-Setting and Human Rights

Updated: Wed, 21/12/2022 - 23:48
Addressing Advanced Technologies, including AI
Key Takeaways:

Effective and inclusive multistakeholder participation in technical standard-setting processes is critical to ensure that adequate human rights considerations are taken into account before, during, and after the development of technical standards, including during the implementation stage after standards have been developed.


In order to ensure the inclusive and sustainable participation of stakeholders, including civil society, the various barriers (financial, cultural, knowledge-related) to accessing and meaningfully participating in technical standard-setting processes should be considered and addressed at standard development organizations, while ensuring this does not slow down the process or pose additional hurdles.

Calls to Action

Standard Development Organizations should consider methods to ensure inclusive, meaningful, and sustainable participation of and access to technical standard-setting processes for stakeholders, particularly civil society organizations and human rights experts that can provide expertise so that adverse human rights risks are mitigated and addressed in standard-setting processes.

IGF 2022 Town Hall #109 Jointly tackling disinformation and promoting human rights

Updated: Wed, 21/12/2022 - 21:46
Key Takeaways:

There is no "silver bullet" to tackle disinformation. Addressing the issue requires multiple complementary measures, such as effective fact-checking, building digital literacy for all (including the most vulnerable), holding those who profit from disinformation accountable, and involving all stakeholders.


Session Report

The session brought together over 60 organisations for an open exchange of ideas, experiences, and lessons on how to address disinformation through a multi-stakeholder and human-centric approach. In particular, the debate focused on how Africa-Europe partnerships can help tackle the issue, in light of the AU-EU D4D Hub’s mandate to foster digital cooperation between both continents.

The panellists explained how fact-checking has grown dramatically in recent years, becoming one of the most common measures to tackle disinformation. Nevertheless, effective fact-checking alone is not enough, they warned. Africa-Europe cooperation should adopt a comprehensive approach integrating multiple complementary measures, such as building digital literacy for all (including the most vulnerable), holding those who profit from disinformation accountable, and involving all stakeholders in devising solutions.

Bringing all actors to the table

Simone Toussi, Project Officer for Francophone Africa at the Collaboration on International ICT Policy for East and Southern Africa (CIPESA), highlighted how “disinformation is a multi-faceted phenomenon that directly threatens democracy and human rights and affects all stakeholders in society”.

“Disinformation manifests in many ways, and can be perpetrated by a diversity of actors,” she added.

As such, she argued that countering fake narratives requires both online and offline efforts, undertaken in a coordinated manner by governments, intergovernmental organisations, civil society, media, academia, and the private sector. “Multi-stakeholder collaboration is crucial to bring together different views and understanding of the roles that each actor plays,” she said.

Toussi presented research findings showing that measures to tackle disinformation can be ineffective or inadequate when they only consider the point of view of a single stakeholder. For example, fact-checking is sometimes hampered by a lack of access to information. Media and civil society participation can help ensure that governments treat information as a public good.

Engaging with the private sector… how?

The debate also touched on the essential role that technology companies play in keeping disinformation from spreading. Ongoing efforts by the private sector include partnering with civil society and fact-checkers — including through multi-stakeholder collaborations such as those proposed by Toussi.

Nevertheless, for Odanga Madung, journalist and Mozilla Foundation fellow, such measures are not enough. He argued that one of the major contributing factors to disinformation is that fake or misleading information is algorithmically amplified by big companies.

“Big companies and social media platforms profit from the spread of disinformation. It is part of their business model, which is a very serious problem,” he said.

For Madung, tackling disinformation requires strong regulations to protect users and their rights, addressing big technology companies’ dominance, encouraging competition, fostering new ideas on different business models, and decentralising the Internet.

Planting the seeds of change

Charlotte Carnehl, Operations Director at Lie Detectors, proposed further investments in training teachers and fostering exchanges between journalists and school-age kids: “Countering the corrosive effect of disinformation and polarisation on democracy requires empowering school kids and their teachers to tell facts from fake online.”

She argued that enabling journalists to visit schools to explain how professional journalism works is a win-win situation. It can help journalists to learn about how the younger generation accesses and consumes information, while teachers and children can gain practical skills in identifying fake or misleading information online.

“Everybody needs the skills to assess and critically think about information,” Carnehl said. “Kids are actually a high-risk group for disinformation because they are targeted on channels that can’t be monitored, and they are largely navigating them by themselves without their teachers or even their parents present.”

When questioned on the short-term impact of such measures by a member of the audience, Carnehl acknowledged that it’s a long-term investment, “like planting the seeds of a tree”. However, she argued that there are also some immediate positive effects for children.

Finally, Carnehl called for special attention to be paid to marginalised groups, such as rural populations. Civil society organisations could help ensure that everyone can access reliable information, she said.

IGF 2022 WS #341 Global youth engagement in IG: successes and opportunities

Updated: Wed, 21/12/2022 - 16:06
Connecting All People and Safeguarding Human Rights
Key Takeaways:

There are existing youth initiatives that put a lot of effort in bringing Internet governance closer to fellow young people. Newcomers to the IGF community could learn and benefit from joining them.


In recent years, youth-led and youth-oriented capacity-building programmes have developed strongly and grown in influence.

Calls to Action

There is a need to build platforms of cooperation between different youth initiatives in order to achieve common goals more efficiently.


There are still some gaps in the inclusiveness of the IGF (for example, language barriers) that young leaders should be aware of.

Session Report

First, Emilia Zalewska opened the session and presented its objective: to present youth initiatives and opportunities existing in the IGF community.

Then, the first speaker, Nicolas Fummareli from the Youth Coalition on Internet Governance (YCIG), talked about the activities of his organisation, for example forming 5 working groups of young people from all over the world that prepared 13 session proposals for the IGF (11 were selected), conducting a series of preparatory webinars before the IGF, and getting involved in regional NRI initiatives.

He also encouraged the participants to take part in upcoming elections to the next YCIG’s Steering Committee.

The second speaker, Veronica Piccollo from the Youth Standing Group, explained the origins of her initiative, which can be traced to Brazil. The objective was to foster the participation of young people from Latin America in Internet governance. The initiative has since been recognised by the Internet Society as a standing group and collaborates closely with the YCIG. One of the current activities of the Youth SG is creating a new edition of the Youth Atlas, which will map the involvement of young people in IG.

The third speaker, Athéna Vassilopoulus, represented Generation Connect Europe (GC), ITU. She said that her group aims to engage young people in the activities of the ITU. In its first year, the GC created a youth declaration. The following year it worked on preparing the Digital Youth Jam event. It also participated in the youth summit in Kigali, Rwanda. GC is currently restructuring and will issue a call for new participants in January.

The fourth speaker, Piotr Słowiński from the NASK National Research Institute, described the institute's role in promoting youth in IG and cybersecurity. This work started in 2020 and is ongoing. Participation in the Global Youth Summit in Katowice, prepared by Youth IGF Poland in cooperation with NASK, was massive and allowed youth and experts to connect.

The fifth and last speaker, Melaku Hailu from the Model African Union, said that their initiative is the second model to have created a Women, Gender and Youth Directorate. They run a simulation of the Peace and Security Council of the African Union. Their model addressed the SDGs through three pillars: a social, economic, and environmental approach.

The speakers’ inputs were followed by a discussion with the participants of the session. They expressed the need to push youth engagement forward and to find more people motivated to engage. It was highlighted that more attention should be paid to advocacy work. There was also a comment that the IGF is conducted in English, which is a barrier to its inclusiveness.

IGF 2022 WS #420 Skills of tomorrow: youth on the cybersecurity job market

Updated: Wed, 21/12/2022 - 15:34
Enabling Safety, Security and Accountability
Key Takeaways:

There are gaps between what the cybersecurity industry expects from graduates and what graduates can offer. These disparities especially affect women and youth.


The cybersecurity job market needs to be more open to newcomers - for example by not requiring much experience for entry-level positions - and to invest in finding and training new talent.

Calls to Action

There is a need to increase collaboration between government, academia, and the private sector to equip young people with the skills necessary for the job market.


Children should be taught early about career opportunities in the cyber industry and have chances to learn some basic skills, for example at bootcamps.

Session Report

The session began with 2 online polls conducted via the online tool Mentimeter. The moderator, Emilia Zalewska, asked the audience for their predictions on the following questions: What are the top two skills that employers in the cybersecurity sector say they look for? Which types of cybersecurity jobs are most difficult to fill? On the first question, the audience mostly answered correctly that the top two skills are "Problem solving and teamwork". However, on the second question, most respondents chose "Cloud backend engineer" as the job most difficult to fill, while the correct answer was "Cybersecurity manager".

After that part, the floor was taken by Teuntje Manders. She presented the audience with the ISC3 research project and its results, on which the 2 poll questions were based. The study involved surveying leaders in the cybersecurity industry to obtain data on what needs exist in the job market in this sector.

This was followed by the next part of the session, a discussion among the speakers. First, Nancy Njoki Wachira conveyed to the international community that she is very happy that these types of projects take place, and that they will help young people build careers in cybersecurity.

Mohammad Ali Januhar commented in turn on the technical layer, related to how the industry works. He then related this to the market and its needs in terms of employing young people.

Samaila Atsen Bako drew attention to universities and schools and whether they adequately prepare people to enter the cyber security profession.

Anna Rywczyńska spoke about the issue of access to development opportunities in this matter and the ease of access to information and training opportunities.

Samaila Atsen Bako also commented on the problem of the experience that business requires from candidates for cyber security positions.

Following these remarks, the floor was once again taken by Anna Rywczyńska. She referred to the insights as well as the issue of women in the industry.

Before the second round of discussion, the moderator invited questions. There were questions from
the audience:

1. What can we do to address the cultural barrier in accessing the cybersecurity job market?
The questioner argued that, from at least an African perspective, parents do not share young people's enthusiasm for working in the cyber industry. The question was addressed by all panellists. It was pointed out that the biggest challenge is reaching young people to promote knowledge of the industry.

2. Another question was asked by an online participant. After also pointing out the cultural aspect and his own family experience, the questioner posed the hypothesis that acquiring the right skills leads to sufficiently high earnings. He then asked the provocative question of whether it is not the young who should support their younger siblings in choosing the right path towards working in the field of new technologies. The question also alluded to the question from the audience that referred to cultural issues.

3. Many people, including those in governing circles, do not understand cybersecurity, so how do we require them to support knowledge in this area?

4. Where are the platforms for sharing knowledge, for acquiring knowledge about cybersecurity?

Some members of the audience presented their statements as input to the discussion:
1. Schools do not teach practical IT issues. We don't learn at school how to work with online
threats, for example.
2. Creating additional communities in schools and universities helps to understand the complexity of cyber security and also to get non-technical people interested in the subject.
3. It is necessary to help educational establishments outline the needs of young people so that they can develop their interest in cyber security.

After a round of question collection, the panellists began to respond to the questions and statements from the audience. Anna Rywczyńska and Mohammad Ali Januhar addressed the issues that were flagged up in the room. Samaila Atsen Bako and Nancy Njoki Wachira complemented the replies with their observations: parents want to protect their children, so they do not see their children's future as technology experts, because they themselves do not know how to guide their children in this world so as not to make them addicted to technology or cause them to use it for bad things. This causes a cultural blockage. There is a need to create an environment that promotes and facilitates access to knowledge on cybersecurity.

IGF 2022 WS #292 Connectivity at the critical time: during and after crises

Updated: Wed, 21/12/2022 - 08:28
Connecting All People and Safeguarding Human Rights
Session Report

 


The session was moderated by Innocent Adriko, who gave background information and an introduction to the session.

Ethan Mudavanhu started by acknowledging that everyone has a part to play when it comes to preparedness before or after a crisis. He said civil society’s role might be to provide strategies on how to minimize damage to critical infrastructure. He also mentioned the need to define the roles of each stakeholder around national emergency telecommunications plans, and to integrate those plans as climate change adaptation policy priorities.

He also mentioned that government has a role in regulating future technologies for emergency preparedness and the resilience of internet connections. He added that this could be achieved by envisioning satellites and the Internet of Things (IoT) as part of the emergency system, which can be used as search-and-rescue alternatives.

He also stressed that partnership between the government, the private sector, and civil society is important to create better solutions and activation protocols, and gave examples of partnerships that helped mitigate damage and ensure connectivity. One example was Australia's STAND program, a government-funded disaster satellite service - the first of its kind to be rolled out - aimed at strengthening telecommunications against natural disasters. He urged Africa to consider a combined effort of terrestrial and satellite connectivity going forward.

Shah Zahidur Rayamajhi noted that connectivity issues might be handled by national and local authorities, as each country has a National Disaster Management Organization. These disaster management organizations include ICT providers, ISPs, humanitarian organizations, and civil society organizations. He also noted the need to evaluate connectivity options relevant to the population affected by a disaster.

He also mentioned internet connectivity solutions for the affected population, and designing solutions depending on each context while considering different needs such as human, socio-cultural, economic, and affordability factors. He added that connectivity solutions bundling internet, voice, SMS, and other data services are understood to be the best option, and noted the need to deploy VSATs and Wi-Fi equipment through local partners or other stakeholders who can help in the deployment and service restoration process.

He highlighted the need to have portable connections and to ensure the privacy and protection of data services provided to the affected populations. He also mentioned that after service has been restored, it has to be maintained and operated for a certain period, because the telecom service provider might itself be affected and unable to provide services to the community. He suggested that community networks can be deployed to support connectivity efforts during and after crises.

Caleb Kwabena Ayitey Kuphe started by defining critical infrastructure as the systems and assets, whether physical or virtual, of our digital world. In terms of crisis, there is a need to understand the three (3) elements of critical infrastructure: the physical, the cyber, and the human. We must also understand the effectiveness of critical infrastructure in ensuring the effective functioning of the economy, as that is an essential factor in determining the location of economic activities and the sectors that can develop in a country.

He highlighted that developing countries need to understand the framework and plan for the infrastructure system and know how to protect the critical infrastructure sector by understanding the risks involved and knowing where the vulnerability lies. He also added that collaboration between stakeholders like governments, the private sector, and civil society is important. He said the government and the private sector can help with some resources like routers, switches, computers, et cetera while civil society, academia, and the technology community can also collaborate to provide training to deploy the telecommunications infrastructure.

He also noted that developing countries must be able to resource personnel who understand emergency response. He referred to the collaboration between the Internet Society Ghana chapter and NetHope, which trained people in disaster management, and stated his belief that when we come together and provide resources for such people, we will be able to manage critical infrastructures. He concluded by stating the need for developing countries to double their current level of investment in emergency response projects.

Eileen Kwiponya started by noting that disasters can strike at any time, giving COVID-19 as an example of an emergency that can happen anywhere and at any time. She said the pandemic brought the world to a standstill, which resulted in the closure of companies, some of which had to let their employees work from home. She highlighted that those who worked from home needed laptops and internet connectivity, and partnerships between stakeholders helped solicit resources to enable the continuity of work and daily life. She added that schools had to continue running, and the government came in and worked with organizations to solicit resources such as laptops and provide internet connectivity in schools to enable students to learn.

She also noted that the ITU plays a critical role in disaster risk reduction and management by supporting its member states through the design of national emergency telecommunication plans, with the goal that by 2023 all countries should have such a plan as part of their national disaster risk reduction strategies. She added that low-income countries are often left out when it comes to having a national emergency telecommunication plan, and it is difficult for a country to manage a disaster without one. She stated that it is important for governments and local stakeholders to come together and develop emergency telecommunication plans so they can respond to disasters without needing to seek help from outside.

She noted that civil society can also contribute by creating awareness and providing capacity building through training communities in emergency response for them to be able to respond to emergencies when they strike. She highlighted that funding is a critical aspect when managing a disaster.

Ernestina Lamiorkor Tawia noted some of the basic things that the telecom sector does during a crisis. She highlighted that during a crisis, the telecom sector helps people stay safe, connected, and informed by notifying them of occurring disasters and where they are striking. This, she said, helps to save lives.

She acknowledged the need for backup of critical infrastructure, as communities should not depend solely on telecom companies for their communications, since nobody knows what will happen to the infrastructure during a disaster. She noted the need for long-lasting backup batteries and generators to power communications infrastructure in case a telecom facility fails due to a power outage.

IGF 2022 DCPR Platform responsibilities in times of conflict

Updated: Tue, 20/12/2022 - 19:35
Avoiding Internet Fragmentation
Key Takeaways:

Although there may be convergences between different national legal regimes regarding platform governance, national regimes will inevitably diverge, fostering Internet fragmentation. Speakers highlighted the necessity of a multistakeholder agenda to enable the development of shared guidelines and procedures towards meaningful and interoperable transparency, for improving platform governance and law enforcement, and also to ensure users' rights.


One of the main points raised by the speakers was the obstacle that fragmentation poses to the regulation of digital spaces, addressed from the infrastructural perspective up to the rhetorical level.

Calls to Action

More transnational and national collaboration among independent regulators and authorities to foster platform observability;


Platforms should cooperate by providing continuous data access and information to researchers and independent investigators, not only by publishing transparency reports.

Session Report

The session aimed at discussing the impacts of platforms' regulation on internet fragmentation and how common standards and interoperability could help to mitigate it. The speakers stressed that to improve platform governance and enforce users' rights – especially regarding content moderation and possible impacts on fundamental rights –, it is necessary to develop a multistakeholder agenda on shared guidelines and procedures towards meaningful and interoperable transparency. 

The panel began with Prof. Luca Belli presenting the session and highlighting the impacts that both private ordering by platforms and national regulation have on internet usage. Prof. Belli also emphasized the great role that platforms and their regulation – or the lack of it – play in respecting human rights. Those remarks were followed by Yasmin Curzi’s statement of the objective of the DCPR session: to explore how platform regulations are affecting internet fragmentation worldwide. She stated that "such regulations may be causing negative externalities for users and for the enforcement". Some examples are data concentration, censorship, and conflicts of jurisdiction, among others. The speakers were encouraged to discuss possible guidelines to orient policymakers in creating a more trustworthy and inclusive digital environment that is able to foster user control and interoperability.

In this regard, one of the main points raised by the speakers was the obstacle that fragmentation poses specifically to the regulation of digital spaces, addressed from the infrastructural perspective up to the rhetorical level. From the first moments of the discussion two points can be highlighted: accessibility of information and the actual ability to use it. Oli Bird, from Ofcom, and Vittorio Bertola, from Open-Xchange, commented on the internet's infrastructural fragmentation, which can be observed from software programming to hardware operation. Bertola highlighted that although states can contribute to fragmentation with shutdowns and similar measures, big internet platforms can also disrupt national regulations on content. Oli Bird, on the other hand, brought examples of user experience on mobile devices to point out the multiple possibilities of the digital world, both in terms of ideation and use. He added that, although there may be convergences between different national legal regimes regarding platform governance, "national regimes will inevitably diverge", fostering internet fragmentation. This point was also highlighted by Yasmin Curzi, who recalled the infrastructural issues of access and their implications for regulation in the unfolding of the possibilities of usage.

On the rhetorical level, some of the concerns pointed out by both Prof. Luca Belli and Prof. Rolf Weber were around determining a "common semantics" and methodologies for a jurisdiction capable of encompassing regulation that serves both local and global levels. On this matter, Prof. Weber also highlighted that open standards are important tools to promote interoperability. Adding to the discussion, Prof. Nicolas Suzor, from the Oversight Board, talked about some of the challenges with the way platforms currently use automatic classification systems: marginalized populations face errors more frequently due to the lack of data regarding them.

In response to Nicolas Suzor, Emma Llansó added that the reluctance of platforms to make information available is part of the bigger picture. The researcher from the Center for Democracy & Technology and the Action Coalition on Meaningful Transparency stressed the importance of independent research on online services to inform public policy-making. She also pointed to the entry into force of the Digital Services Act in the EU as a way for researchers to gain access to data from platforms, which will be useful for policy-making, especially regarding transparency. In his final remarks, Luca Belli noted that the quality and relevance of the information in reporting activities matter more than raw data.

IGF 2022 Open Forum #25 Explore the road of intelligent social governance in advance

Updated: Tue, 20/12/2022 - 18:13
Addressing Advanced Technologies, including AI
Key Takeaways:

AI has a continuous impact on human society, and human beings need to meet the transformation to an intelligent society with governance innovation. Building a humanistic intelligent society requires the full cooperation of government, enterprises, social organizations and academia. The formation of a people-oriented intelligent society governance (ISG) system is an important issue in exploring the intelligent society.


People-oriented ISG has become a new concept and new direction for the international community and the academic community. Research on evidence-based intelligent society governance (ISG) grounded in social experiments can provide a new positivist framework for ISG. China has the advantage of in-depth applications and extensive scenarios of AI, providing practical experience and cutting-edge exploration for global AI development.

Calls to Action

Facing the global opportunities and challenges arising from AI, exploring the road of intelligent society governance (ISG) requires a new cooperation framework and governance model. Panelists proposed that it is necessary to respect the differentiated institutional and cultural backgrounds of various countries and enhance the diversity and inclusiveness of ISG while conducting international comparison and cooperation in ISG research.


Panelists further called on the international community to promote innovation through governance and share the future through cooperation by jointly initiating a global academic alliance, establishing an information sharing mechanism, as well as setting up an international cooperation fund, and strengthening discipline construction and talent cultivation in the field of ISG.

Session Report

Moderator: Dr Zhang Fang
Panelists: Zhang Peng / Prof. Su Jun / Prof. John E. Hopcroft / Dr Zhang Xiao / Prof. Simon Marvin / Prof. Huang Cui / Dr. Wang Yingchun
Rapporteur: Lian Xiangpeng / Tu Shengming / Zhang Yu / Ren Tianpei

This open forum session aimed to discuss research on intelligent society governance and a feasible path to realizing people-oriented intelligent society governance. In today’s era, digital technology represented by artificial intelligence, as the leading force of the global technological and industrial revolution, is increasingly integrated into every field and the whole process of economic and social development, profoundly changing modes of production, lifestyles and social governance. The forum was organized by the Information Development Bureau of the Cyberspace Administration of China, and jointly hosted by the Institute of Intelligent Society Governance of Tsinghua University and the Center for Science, Technology & Education Policy of Tsinghua University. Seven experts from China, the United States and the United Kingdom discussed the opportunities and challenges of intelligent society governance and international cooperation, bringing together research topics such as AI social experiments, algorithmic governance, data governance, and urban governance. They agreed that intelligent society governance needs to adhere to a people-oriented concept, promote evidence-based research on intelligent society governance, strengthen cooperation on a global scale, build an academic community on intelligent society governance with shared resources and information, and respect the diversity of governance cultures.

(1) Adhere to the concept of people-oriented intelligent society governance, and jointly build a humanistic intelligent society
The development of intelligent technology is premised on its embedding in human society. Humans are the initiators of the intelligent technology revolution, the main bearers of the new challenges of the intelligent society, and the explorers of its risks. Whether AI technology can promote people’s well-being and realize technology for good is the key to whether it can serve the development of human society. 
In order to achieve a comprehensive balance of technical rationality and value rationality, intelligent society governance should establish people-oriented values, ensure the central position of people in the development, application and governance of intelligent technology, and make the construction of a humanistic intelligent society a goal of governance. Su Jun believes that a humanistic intelligent society is a people-oriented society with highly developed science and technology, widely used intelligent technology, a comprehensive balance of technical rationality and value rationality, a harmonious coexistence of people, environment and technology, an open, inclusive and harmonious social fabric, and a rich humanism. Humanistic intelligent society governance requires a deep integration of social values and technology.
Wang Yingchun highlighted that social values and rules should be deeply integrated with technology and applications, so that values can be embedded in technology and applications can reflect rules, and the two aspects can boost each other and promote each other. Data governance is one of the core aspects of intelligent society governance. Huang Cui mentioned that there is an urgent need to establish a new global data governance framework, build a more inclusive, people-oriented and warm global data governance system, and effectively protect personal data and data related to intellectual property rights and national security intelligence, in order to promote the global digital economy and social development.
Panelists agreed that the concept of humanistic intelligent society governance is in the fundamental interest of human society and can guide the development of AI technologies in the direction that benefits human society.

(2) Conducting social experiments on AI and promoting evidence-based research on intelligent society governance
Artificial intelligence will be an important part of future human society. Exploring the path of intelligent society governance requires not only a focus on the development of AI technology itself, but also a systematic and comprehensive assessment of the ongoing social, economic, cultural, and political impacts of AI technology. As John E. Hopcroft argued, reducing AI social risks by strengthening AI governance is critical to ensuring the beneficial and safe nature of AI systems. In the face of systemic changes and future scenarios of intelligent society, Su Jun proposed to adopt evidence-based research methods such as “long-period, multidisciplinary, and wide-field AI social experiments” to address the various issues, risks, and problems brought about by AI technologies. This will further promote the application and development of AI technologies and enhance the effectiveness of governance of intelligent society.
Simon Marvin suggested that in the post-smart city era more attention should be paid to the integration of AI technologies in urban infrastructure construction and urban planning, and that regulatory reform and innovation should be promoted to better serve the development of AI technologies and social experiments. In order to give full play to China’s advantages of wide, deep and diverse AI application scenarios, and to deeply explore the impact of AI technology on people, organizations and society, China has actively organized and carried out AI social experiments in multiple fields across the country in recent years, which can provide practical experience and real feedback for the development of global AI technology. Moreover, Zhang Xiao believed that AI social experiments are important in exploring AI new paths in AI governance and promoting reform of the AI global governance system.

(3) Actively promoting international exchange and cooperation in intelligent society governance research, and building a community with a shared future for mankind in the intelligent era
Panelists agreed that it is necessary to explore the path of intelligent society governance under the concept of building a community with a shared future for mankind, strengthen international cooperation in research on intelligent society governance, promote the inclusive sharing of intelligent society, and realize the development of differences and win-win cooperation among countries. As suggested by John E. Hopcroft, it is necessary to strengthen international cooperation among governments, enterprises, industry associations and other diversified subjects around the world, to guarantee the right to speak for all parties involved in AI governance, to continuously improve laws and regulations on AI governance, to accelerate the development of procedures and standards for AI applications, and also to create more valuable activities for the marginalized groups in the AI system to improve their quality of life.
Finally, Panelists unanimously called for a new cooperation framework and governance model to build an international cooperation ecology for intelligent society governance. They advocated:

  • initiate the establishment of a global academic alliance for intelligent society governance;
  • set up an international cooperation fund for intelligent society governance;
  • build a data-sharing platform for AI social experiments;
  • strengthen capacity building for intelligent society governance;
  • and train professionals through trainings, workshops, and academic conferences.

 

IGF 2022 WS #70 Fighting the creators and spreaders of untruths online

Updated: Mon, 19/12/2022 - 15:10
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Pre-emptive actions (e.g. pre-bunking, digital literacy initiatives) are needed to protect people from the risks and harms of false and misleading content online.


There is no silver bullet to stop untruths online. A cocktail of approaches is needed: education, media literacy, resources including technologies like machine learning, collaboration among fact-checkers, and international collaboration.

Calls to Action

Governments and the multistakeholder community need to pool resources (monetary, knowledge) to fight the creators and spreaders of untruths online.

Session Report

This workshop explored the different types of untruths online – disinformation, misinformation, propaganda, contextual deception, and satire – and innovative ways to reduce the negative effects they have on people and society. Molly Lesher opened the workshop and set the scene, focusing on the OECD’s work in this area, including the “Disentangling untruths online” Going Digital Toolkit note. She highlighted that while the dissemination of falsehoods is not new, the Internet has reshaped and amplified the ability to create and perpetuate such content in ways that we are only just beginning to understand.

She recalled why access to accurate information is important, including in the context of the fundamental rights to freedom of speech, to choose leaders in free, fair, and regular elections, and to privacy, all of which are essential for healthy democratic societies. She highlighted that false, inaccurate, and misleading information often assumes different forms depending on context, source, intent, and purpose, and that it is important to distinguish between the various types of untrue information to help policymakers design well-targeted policies and to facilitate measurement efforts that improve the evidence base in this important area.

Sander Van der Linden then discussed his work studying false and misleading content online from a psychological perspective: how misinformation infects our minds, how it spreads across social networks, and the ways in which people can protect themselves and others from its negative effects. He discussed how people can build up “immunity” through “prebunking” – that is, by exposing them to a weakened dose of misinformation to enable them to identify and fend off its manipulative tactics.

Julie Inman Grant discussed the Australian eSafety Commission’s work on addressing online risks and harms facing adults and children from the circulation of untruths online. She highlighted some practical approaches, programmes, and initiatives to address online harms, as well as differences in the impacts on and interventions for children versus adults and men versus women.

Rehobot Ayalew then gave remarks from the perspective of a seasoned fact checker who fights against untruths online daily. She underscored the importance of fact-checking, as well as its modalities and limits (e.g., scalability). She also highlighted the complications that non-anglophone countries face in combatting untruths online, and the mental health burden of having to research some of the more graphic and disturbing false and misleading content.

Pablo Fernández discussed Chequeado’s experience with the Chequeabot AI tool that facilitates factchecking in Spanish. He discussed how to find a balance between human intervention and digital technologies in the fight against untruths online, as well as how Chequeado usefully co-operates with the global fact checking community.

Mark Uhrbach then spoke about Statistics Canada’s efforts to measure misinformation so far, what surveys might help us measure, and why surveys alone cannot measure everything, meaning less traditional methods are needed to fill in the gaps.

The panellists took several rounds of questions from the audience onsite and online. A key point from the discussion is that governments and the multistakeholder community need to pool resources (monetary, knowledge) to fight the creators and spreaders of untruths online. Another important point is that pre-emptive actions (e.g., pre-bunking, digital literacy initiatives) are needed to protect people from the risks and harms of false and misleading content online.

IGF 2022 DCCOS Translating data & laws into action for digital child rights

Updated: Mon, 19/12/2022 - 11:04
Enabling Safety, Security and Accountability
Key Takeaways:

1. The collection of data using tested methodologies that enable comparison is essential to ensure child rights in the digital environment. This can directly influence policy at the national level. 2. Education remains essential as a preventive measure but cannot replace the need for proactive measures by online service providers.

Calls to Action

1. Governments must provide funding for such data collection and analysis, as outlined in existing legal frameworks. 2. Industry must respond in a coordinated way and adopt a safety-by-design approach.

Session Report

SUMMARY REPORT: The round-table discussed how to ensure the safety, security, and privacy of children in digital environments through data, legal, regulatory, policy, and technology analysis in order to build a more robust prevention strategy and enhance a more efficient response mechanism. 

The session explored how evidence-based analysis from the Disrupting Harm project, combined with international instruments such as General Comment #25, has made a positive impact on children's digital lives. Furthermore, participants discussed how and to what extent new regulations from certain countries and regions have informed actions in other parts of the world. The important role of digital service providers in acting upon evidence, promoting regulations, and ensuring safety across all technologies they deploy globally was further explored in the discussion. 

Amy Crocker from ECPAT International opened the discussion and commented that as the digital world develops, expands and diversifies, it is becoming not only necessary but also a legal and moral obligation that we address the diverse positive and negative implications of the digital world for children’s rights.  

Prof. Sonia Livingstone from the London School of Economics and Political Science highlighted the key link between data and legislation, noting that ‘data’ has taken on a new meaning in recent years as attention has turned to the data that is collected. Initiatives such as General Comment #25 (GC 25) to the UN Convention on the Rights of the Child represent a milestone in child protection in the digital environment. In the GC 25, the Committee explains how States parties should implement the Convention in relation to the digital environment and provides guidance on relevant legislative, policy, and other measures to ensure full compliance with their obligations under the Convention and its Optional Protocols. Implementing robust data collection is essential, including research “with and by children”, to inform policymakers and legislators. The principle of "non-discrimination" is also key: the Committee establishes that "all children" must have equal and effective access to the digital environment, and "all their rights" must be respected by policymakers and stakeholders. Privacy and a high standard of protection must be ensured. Gaps remain, such as the difficulty of collecting useful data and evidence from the Global South; the delicate balance between privacy and data collection; the soft-law nature of the Comment; and the difficulty policymakers and legislators face in fully understanding the "best interests of the child". 

Interventions from the floor: 

  • To a question about whether the GC 25 also addresses the need for better data on the phenomenon and on what works to tackle it, Prof. Livingstone asserted that the GC 25 does oblige evidence-based policy and action. 

  • Rodrigo Lopez from the Brazil youth committee asked whether the GC 25 addresses targeted advertising to children, an issue in Brazil, where such advertising has been considered abusive for years but still happens. The response was that it does, and that the DC has increasingly recognised the commercial risks to children and needs to find a way to deliberate on the tipping point between commercial use and commercial exploitation, perhaps by identifying criteria and building on the CRC Committee's assertion that States parties should prohibit by law any targeting based on inferred characteristics.  

Serena Tommasino from End Violence’s Safe Online team affirmed that data and evidence-based information have been a useful tool to advocate for children’s rights. Disrupting Harm (DH), funded by Safe Online, is a large-scale research project producing unique national, regional and global insights into online child sexual exploitation and abuse (OCSEA) in Southern and Eastern Africa and Southeast Asia (Kenya, Uganda, Thailand, Tanzania, Ethiopia, Philippines, Viet Nam, Namibia, Indonesia, Malaysia, Cambodia, Mozambique, and South Africa). They are now investing in implementation in these countries, and research will begin in 10 more. Data scarcity and poor quality have limited our understanding of OCSEA. Generating high-quality evidence is fundamental to developing a coherent methodology that can be replicated in different countries. At a recent event in Brussels, Safe Digital Futures, experts discussed how to improve the data ecosystem and devised a joint roadmap to inform policy and stakeholders. 

Rangsima Deesawade, Regional Coordinator for South East Asia at ECPAT International, highlighted that a reliable, standard data collection methodology at regional and global levels enables comparability. By aggregating data collected in different countries following the same methodology, it becomes possible to make a scientific comparison of information on OCSEA. She pointed out that some of the findings were new even for ECPAT, such as those relating to gender: the data showed similar rates of victimisation between girls and boys, with interesting variances across the countries. This type of information can help change persistent narratives.  

Interventions from the floor:  

  • Dave Miles, Director of Safety Policy for Meta, pointed out that Meta is using clear standards to design online/onsite guidelines and codes of conduct. Standards such as the GC 25 and age-appropriate design code are influencing the way Meta designs solutions. Larger platforms can influence others in the sector. Age assurance, child-friendly products, and empowering the trust and safety community should be the main concerns of the industry sector. 

  • Jennifer Chung, DotKids, raised the point that when we look at enabling activities, harm does not always start with clearly criminal activity, making it hard to know where to intervene and at what stage. Jutta Croll responded that Stiftung Digitale Chancen is advocating for proper age verification so that platforms know who their users are and can adapt support for them.  

  • Jutta Croll from Stiftung Digitale Chancen raised the importance of improving national risk assessment mechanisms and presented a model they have co-developed. Current risk assessment addresses harm after the fact. Pre-emptive, comprehensive, and concurrent assessment of potential threats to children must be implemented across different platforms, since preparatory conduct might subsequently lead to criminal offences. Precautionary measures such as age verification may fit this scope.  

Interventions from the floor: 

  • Jonathan Andrew from the Danish Institute for Human Rights wondered what the response has been from lawmakers to this risk assessment tool. When the draft CSA Regulation in the EU came out, most were still considering what risk assessment really means. He pointed out that education is key because risk starts before a criminal act, but precautionary measures by service providers are also needed. The role and mandate of national human rights instruments is key.  

Conclusions 

  • The GC25 provides an authoritative and comprehensive framework to ensure digital child rights. 

  • Education is key but cannot replace proactive measures by online service providers. 

  • Common data collection methodologies ensure comparability and drive change. 

  • It is key to tackle tensions between children's data privacy and their protection.

  • Governments must provide funding for data collection and analysis. 

  • Industry must adopt a safety-by-design approach. 

IGF 2022 WS #370 Addressing the gap in measuring the harm of cyberattacks

Updated: Sat, 17/12/2022 - 14:03
Enabling Safety, Security and Accountability
Key Takeaways:

Developing both qualitative and quantitative measurements, as well as general and sector-specific indicators, is a key part of advancing the harm methodology. The methodological framework should consider the different kinds of harm inflicted by cyberattacks and include issues of re-victimization and redress. It is recommended to link the discussions on the methodological framework for measuring harm to the accountability framework.

Calls to Action

Addressing the harm stemming from cyberattacks is a collective responsibility. There is a need for multistakeholder initiatives that can break existing silos between different communities and experts to meaningfully advance the harm methodology. Outreach to the wider community is important, especially to cybersecurity experts, policymakers, economists, and mathematicians, who can meaningfully contribute to developing the harm methodology.

Session Report

Recent years have seen a growing number, scale, and impact of cyberattacks. State and non-state actors increasingly exploit vulnerabilities in cyberspace for financial profit or to gain an advantage over their adversaries. The CyberPeace Institute has been recording cases of cyberattacks related to the healthcare sector and in connection with the war in Ukraine. From June 2020 to November 2022, the Institute aggregated a total of 501 incidents affecting 43 countries around the world as part of the Cyber Incident Tracer (CIT) #HEALTH – a platform that records and analyses data on cyberattacks in the healthcare sector and, importantly, their impact. Similarly, to date, the Institute’s Cyber Attacks in Times of Conflict Platform #Ukraine has featured 834 cyberattacks and operations in 35 targeted countries. While this is only a fraction of the full scale of the threat landscape, these platforms attempt to bridge the current gap in the understanding of the harm to people stemming from cyberattacks.

This workshop posed some key questions for developing the harm methodology, including how “harm” should be defined in cyberspace, what categories and indicators can effectively help to measure harm from cyberattacks, and how a methodological framework for harm caused to people can improve policymaking and ensure greater accountability for cyberattacks. While a significant effort has been devoted to documenting cyberattacks and understanding their economic impact, there is a remaining gap in understanding the damage they cause to societies. The harm methodology introduced during the session attempts to close this gap and proposes a novel approach to the selection of indicators of harm and the assessment of harm in cyberspace. This framework aims to contribute to the efforts of comprehensively measuring the harm to people from malicious cyber behavior and thereby advance policymaking and decision-making through an informed and human-centric approach.

The panelists spoke about the lack of data related to cyberattacks and the harm such attacks generate. It was proposed that we currently see the impact on records, facilities, and the economy, among others, but such an assessment is too narrow. There is a need to develop qualitative and quantitative measurements of societal harm, particularly with regard to the impact on vulnerable people and possible re-victimization – both online and offline. Harm originating in cyberspace can manifest in many ways, and it is important to include impact assessment in related policies and legislation. When discussing the state of play, the panelists noted that some attacks are already high on the agenda, including ransomware and spyware, but more efforts are needed to understand different kinds and degrees of harm. Research on this topic remains insufficient, but some important contributions have already been published, including on the taxonomy of cyber harm. Conversations are important for advancing the initial methodological framework, especially concerning the number of indicators, quantifying numerical values, and qualitatively documenting and tracking the results. The CyberPeace Institute welcomes contributions from stakeholders. 

A methodological framework for cyber harm can improve policymaking and ensure greater accountability for cyberattacks with a human-centric approach. Given the complex landscape of cyberspace, policymakers need to understand the impact of cyberattacks in order to base policies, strategies, and legislation on empirical assessments. By extension, it is key to measure not only the economic impact of cyberattacks but also the harm they cause to people. The panelists remarked on the critical need to consider redress for those who were affected. The effects of cyberattacks are often localized, meaning that many people experience harm to some extent, and new safeguards and effective remedies are needed. Furthermore, it is important to link the discussions on the framework for measuring harm to the accountability framework. Silos remain in areas relevant to developing the harm methodology, including cyber insurance, law enforcement, and education, but building this framework should be a collective investment. Entities across the stakeholder communities need to cooperate and test the proposed approaches in different sectors, as the indicators can vary. 

In conclusion, this workshop contributed to raising awareness about the methodology for measuring harm from cyberattacks. Such a framework has the potential to inform policymaking and decision-making and help prioritize resilience efforts based on areas with high levels of societal harm caused by cyberattacks. It was outlined that a follow-up should include outreach to the wider community, specifically to cybersecurity experts, policymakers, economists, and mathematicians. The input from the participants at the workshop has been gathered and analyzed to further inform and improve the methodology.

IGF 2022 Open Forum #71 From regional to global: EU approach to digital transformation

Updated: Sat, 17/12/2022 - 12:35
Enabling Safety, Security and Accountability
Key Takeaways:

When regulating digital platforms, a whole-of-society approach is needed. The public needs to be fully informed about the regulatory process, and governments must put citizens in charge of safeguarding transparency.


There is no single policy, piece of legislation, or actor that can secure an open, free and secure digital future for all; the prerequisite for any massive transformation, like the digital transformation, is structured dialogue among all stakeholders based on shared principles.

Calls to Action

1. Ensure regulations work in practice: aim for effective implementation and clear guidance. 2. Digital developments require agile governance; massive digital transformation needs structured dialogue and shared principles. 3. Policymakers should ensure public information and transparency.

Session Report

The EU is impacting global regulatory trends in the digital sphere. EU regulations are shaping the global marketplace, influencing governments in their policymaking decisions, and to a great extent setting the regulatory standard for global companies. Most regulations also apply to non-EU companies operating in the EU and hence have immediate effect. This is a huge opportunity but also a huge responsibility for the EU to put in place and effectively implement its regulation for real market outcomes. The objective is, and should remain, to safeguard a free, open and interoperable Internet, ensuring a human-centric digital transformation that serves citizens and society. The EU's value-based approach is challenged in the current geopolitical landscape of tech war and digital authoritarianism.

The DSA Package is about greater responsibility (DSA) and market contestability (DMA). It aims to address a global problem and represents a global opportunity. The Package's implementation within the EU will have a global effect thanks to the experience gained in building up auditing and enforcement capabilities. Even in jurisdictions where the Package is not applicable, it would set a threshold for elementary safety provisions, since jurisdictions would not accept lower standards than those set, e.g., in the DSA. The DSA should allow a balance between content takedown and the safeguarding of freedom of expression (freedom to speak, seek and receive information). When regulating digital platforms, a whole-of-society approach is needed.

To create a safer and more open digital space, protect European citizens from illegal and harmful content and conduct online, and create a level playing field for business, it is essential that key elements of the ecosystem are put in place before or together with regulation (free space for journalists, independent auditors, and an effective system of notifiers, among others). This can help ensure the accountability of gatekeeper platforms. The approach may depend on the country or the society. The overall vision of a human-centric approach to digitalization may result in articulated, complex regulation, as in the EU with its many member states and diverse societies, or, as in Japan with its very homogeneous society, in a more agile, multistakeholder governance approach.

The Data Act will impact the design of IoT products, which will have spill-over effects on non-EU markets as well. It is also about ensuring an interoperable and well-functioning cloud market. Fundamental values should be attached to data flows, which affect privacy, national security, and intellectual property. It is important to build interoperability among the various approaches to data governance across jurisdictions; this is one aspect the Data Free Flow with Trust initiative looks into.

The EU’s aim is to have trustworthy and human-centric AI in order to avoid the possible harmful effects of algorithms. AI policies also depend on a given country's democratic values. While other parts of the world may take different approaches to regulation, the effect of and interest in framing digital developments has been increasingly recognised by governments worldwide. AI raises questions on innovation and regulation, and the approach to both may be altered by a possible "Beijing effect".

There is no single policy, piece of legislation, or actor that can secure an open, free and secure digital future for all; the prerequisite for any massive transformation, like the digital transformation, is structured dialogue among all stakeholders based on shared principles. One lesson learned from the several EU regulatory processes (Artificial Intelligence, Data Act, Digital Services Act) is that even if the EU legislative process may be the most transparent in the world, it is still complex for citizens to navigate the extensive information material. There is a need to secure not only citizens' participation in designing regulation but also public supervision of the whole process. The outcome of the legislative process is as important as its quality. For the DSA/DMA, the idea for securing the maximum level of transparency was to inject transparency into every aspect of the regulation – content moderation, journalism, audits, trusted flaggers – and to introduce the cardinal principle that there is no general monitoring.

IGF 2022 WS #440 Declaration for the Future of the Internet

Updated: Sat, 17/12/2022 - 11:47
Connecting All People and Safeguarding Human Rights
Session Report

Internet Governance Forum 2022

[Workshop # 440] Declaration for the Future of the Internet

Addis Ababa, 30 November 2022

Speakers

  • Pearse O'Donohue, Government, Western European and Others Group (WEOG) 
  • Timothy Wu, US Government, Western European and Others Group (WEOG)
  • Marietje Schaake, Technical Community, Western European and Others Group (WEOG)
  • Anriette Esterhuysen, Civil Society, African Group

Moderators

  • Grace Githaiga, Civil Society, African Group
  • Sonia Toro [Online], Private Sector, African Group

 

Organisers

  • Julia VAN BEST, European Commission
  • Esteve Sanz, European Commission
  • Anca Andreescu, Stantec
  • Sonia Toro, Stantec
  • Noha Fathy, Independent

Rapporteur

  • Tom Mackenzie, Stantec / ITEMS International

 

Discussion: The DFI and how to keep the Internet open, trusted, interoperable

This workshop was proposed to discuss the principles of openness, interoperability and resilience of the Internet, and policies that are needed in order to safeguard it.

The two questions on the table were:

  • How to ensure that the Internet remains open, global, and interoperable, in line with universal values and fundamental rights?
  • How can governments, private entities, civil society, and the technical community translate the principles of the DFI into concrete policies and actions and work together to promote this vision globally?

Connection with previous IGF outcomes

The workshop was organised as a follow-up to IGF 2021 and earlier IGF meetings on the ‘Economic and Social Inclusion and Human Rights’, ‘Universal Access and Meaningful Connectivity’, ‘Inclusive Internet governance ecosystems and digital cooperation’ and ‘Trust, Security, and Stability.’

The purpose of the workshop was to have a multi-stakeholder discussion on how to preserve an open, global, interoperable, reliable, and secure Internet, and how this is a key objective in the drive to achieve sustainable development and digital inclusion. This further includes providing meaningful and sustainable Internet access to everyone and safeguarding Internet openness to promote democracy and human rights.

The discussion was also intended to tackle the open internet policies and actions that are needed to promote the trust, security, stability, and interoperability of the Internet including a human-centric approach. This cannot be achieved without considering the sustainability of the internet governance ecosystem that hinges on well-structured coordination and consolidation among the different stakeholders in order to promote a positive vision for the future of the Internet.

While an Internet that imperils fundamental freedoms and human rights online threatens the achievement of almost all the SDGs, there is a direct link between the proposed workshop and the following SDGs:

  • Decent Work and Economic Growth: a free, open, global, interoperable, reliable, and secure Internet creates new working opportunities and contributes to growth. It would also provide career opportunities, support the emergence of new businesses, extend distribution channels to remote areas, increase employment in higher-skill occupations, and create new jobs for less-educated workers.
  • Industry, Innovation, and Infrastructure: a global and open Internet promotes innovation, also contributing to industrial development and infrastructure building/roll-out. It also has a positive impact on economic growth and social well-being which are important for the peace and happiness of individuals and societies at large. It further allows for cultural exchanges and the open exchange of knowledge and creativity which could greatly influence a lasting peace.
  • Reducing Inequality: A global and open Internet contributes to reducing inequalities between those who have studied and those who have not, urban centers and rural areas, developed and less developed regions, men and women, and ultimately between rich and poor.
  • Peace, Justice, and Strong Institutions: a global and open Internet ensures transparency, rule of law, democratic societies and processes, reliable institutions capable of regulation but also respective fundamental rights.
  • Partnerships for the goals: the session would allow structuring the effort of stakeholders around a global, open, and human-centric Internet to build/further partnerships around accelerating the achievements of the SDGs. In this context, the DFI is open to the broadest group of countries from all geographies and development levels, who actively support a similar future of the Internet and want to re-affirm the commitment to protecting and respecting human rights online.

Declaration for the Future of the Internet

Many internet stakeholders grapple with complicated questions that relate to Internet safety and openness. This includes how to expand Internet access while keeping the Internet safe from illegal content and dangerous goods; how to fight against disinformation while protecting fundamental rights, i.e. freedom of expression and freedom of information; and/or how to keep the digital space contestable, open for innovation and inclusive.

At a time when the negative developments of the Internet – including high market concentration and abuses of market power, diminishing pluralism and data privacy, and increasing disinformation, harassment, and censorship – are justifiably and decisively being addressed, we must not forget or give up on the great benefits a well-functioning Internet can bring to our societies and economies.

Against this backdrop, the Declaration for the Future of the Internet (DFI) was published on 28 April 2022, rallying over 60 countries around an affirmative, positive agenda for a free, open, global, interoperable, reliable, and secure Internet. The DFI sets out shared fundamental principles that re-emphasise the great positive potential of the Internet. A well-functioning global Internet will reinforce democracies, promote social cohesion, and protect universal rights while allowing for digitally spurred economic growth and development.

To this end, the workshop discussed how stakeholders can ensure that the Internet remains widely accessible, open, human-centric, and in line with universal values and fundamental rights. In the same vein, discussion focused on how these stakeholders can translate the principles of the DFI into concrete policies and actions and how countries and stakeholders can work together to promote this vision globally.

Reaffirming the principles of Openness and the multi-stakeholder governance model

The Declaration for the Future of the Internet can be seen as an attempt to reaffirm the basic principles of openness, resilience and interoperability on which the Internet was founded. These principles are common currency within the multi‑stakeholder community. However, there is a need to reaffirm them at a time when geopolitical tensions threaten the integrity of the Internet and some analysts have raised the prospect of its splintering.

The objective of the DFI was to produce a declaration with a set of principles that national governments around the world could sign up to. At the same time, the DFI allows countries to state what they can and cannot do to ensure the Internet is and remains an open, interoperable and trusted space which respects individuals, including their integrity, physical and online, their personal data and their identity.

The key issue at stake is trust. All citizens as well as businesses should be able to trust that they are safe when they are online, that their data is secure and that their transactions are confidential. This will lead to a trusted environment and further innovation. It also means that the data economy can thrive and become something in which individuals can place their trust.

Relationship between the DFI and the IGF

Some critics have identified possible tensions between the stated aims and principles of the DFI and the mission of the IGF. However, these are exaggerated. On the contrary, the DFI can be seen as a timely and useful expression of the principles of openness and inclusion that the IGF has championed since its inception. The IGF has shown that it is able to adapt to address the questions of the day, so that in the future not only the multi‑stakeholder community but the Internet Governance Forum itself, in which so much has been invested, will continue to play an increasingly central role in the governance of the Internet.

Key Take Away 1: The DFI states important principles in favour of the Open Internet. But what now?

The DFI can be viewed as a response to what the US and partner states regard as alarming patterns of behaviour by certain nation states with regard to Internet governance and the technical standard-setting processes on which the stability and interoperability of the Internet rely. The stated goal of the DFI is to reaffirm basic norms, and to restate basic principles that have long been taken for granted. These notably concern how nation states are expected to comport themselves when it comes to governance issues and the management of critical Internet resources.

A key aspect of the DFI is respect for the Internet’s multi-stakeholder governance processes, and the notion that one of the Internet’s founding principles was that it couldn’t be controlled by a single country. However, there are growing concerns that certain nation states are seeking to increase their power or increase their leverage at the expense of the multi-stakeholder governance process, particularly affecting the technical sides of it. 

What next? With 70 signatories overall, only a small fraction of them from the Global South, significant work still needs to be done to achieve buy-in for the values, standards and governance principles put forward in the DFI. This should be an objective in the run-up to upcoming multilateral, multi-stakeholder processes, e.g. the UN Global Digital Compact (Sept 2023), which is expected to “outline shared principles for an open, free and secure digital future for all”.

The DFI can serve to smooth the path for the principles of the Open Internet to be encapsulated in whatever comes out of the GDC (involving 170+ countries), but a more inclusive, transparent process may need to be put in place to ensure that all stakeholders have an opportunity to engage.

Key Take Away 2: Risk that the DFI might be perceived as exclusive (either you’re in or you’re out). A roadmap for more inclusive, multi-stakeholder consultation may be needed to ensure wider international buy-in.

The DFI has been promoted as reflecting global aspirations to build an inclusive, rights-respecting Open Internet, and to use the potential of the Open Internet to foster economic and social development.

However, the DFI was ostensibly drafted by actors of the Internet ecosystem in the US and Europe. It appears not to have been the product of a broader consultative process involving ecosystem actors from other regions of the world. This may have led to perceptions around the world that it is a declaration to which only like-minded actors or like-minded states can subscribe. It may also have led to the perception that “either you are in, or you are not”: an unintentional deviation from the path of inclusive dialogue on which the future of the Open Internet undoubtedly relies.

To ensure greater buy-in to the principles and standards of the Open Internet, a wider consultative process which takes into account, more explicitly, the concerns of countries in the Global South could be sought. This might involve fixing what some might see as a procedural weakness by allowing non-state actors to become signatories of the DFI.

Concern was expressed during the workshop about the perception that the Internet may be being used as a political football amid global geopolitical tension and conflict. This will harm the Internet, and its potential as a platform for upholding peace will be compromised. In this sense, the declaration should perhaps be viewed as a starting point rather than an end point. Instead of being perceived as a declaration prepared by well-identified actors in the multi-stakeholder (but not necessarily internationally representative) Internet ecosystem, to which other countries should simply come to the table and “sign on”, it might be better perceived as an invitation to join in the process of building a global consensus.

There are lessons to be drawn from the 2014 NETmundial process in São Paulo, Brazil. It resulted in a powerful and simple document which, however, never quite made it into the multilateral space and therefore never went any further. This may call for a re-appraisal of the multi-stakeholder model of Internet governance: how it is defined, how it works in practice, and how the individual stakeholders (including states) that take part in the system can be held accountable.

Key Take Away 3: Greater support for DFI in Global South will result from enhanced cooperation between development agencies to deploy internet infrastructure, and promote connectivity

Conspicuously, the DFI lacks signatories from the Global South (only two countries in Africa). Before strong buy-in can reasonably be expected for a rights-based Internet built on the principles of openness and multi-stakeholder Internet governance, it is vital to ensure the deployment of Internet infrastructure globally, and the delivery of reliable, affordable, physical access to the Internet for the majority of the world's population. This can be achieved through better coordination between development agencies, and better engagement with countries regarding local needs and practices.

The future of the open, interoperable Internet will be assured insofar as civil society and the non-governmental multi-stakeholder community are empowered to comment on the deployment of Internet infrastructure and Internet-based services, freely and without intimidation.

Multi-stakeholder actors should feel free to positively advise governments regarding the practical implementation of the principles set forth in the DFI. This will also serve to reinforce accountability. 

Video & Session Transcript

Video: https://youtu.be/KY3N-0NaVBw

Transcript: https://www.intgovforum.org/en/content/igf-2022-day-2-ws-440-declaration-for-the-future-of-the-internet-%E2%80%93-raw

IGF 2022 WS #217 Joint efforts to build a responsible & sustainable Metaverse

Updated: Sat, 17/12/2022 - 05:33
Addressing Advanced Technologies, including AI
Key Takeaways:

The Metaverse is a concept around which all parties have yet to unite, because it is a collection of scientific ideas and technologies that may mature over the next few years, such as Web 2.5/3, blockchain, and AI. Its subsequent evolution, application scenarios, and corresponding specifications in various industries and fields are still unclear and need to be built jointly by multi-stakeholder groups.

Calls to Action

A standard development organization can be set up in conjunction with governments, technical communities, private sectors, civil society, and other multi-stakeholder groups to provide a stable environment for its members to discuss, define, and compile metaverse technical specifications and reports.

Session Report

On November 30th, IGF 2022 workshop #217, "Joint efforts to build a responsible & sustainable Metaverse", was held. Five experts from different fields and countries presented their views on Metaverse development, which could serve as an open and fair foundation for the future world. Specifically, they discussed the Metaverse's key challenges, governance issues, and what policy framework may support healthy development.

 

Xiaofeng Tao, professor at Beijing University of Posts and Telecommunications (BUPT), the vice chair of Consultative Committee on UN Information Technology (CCIT), China Association for Science and Technology (CAST), chaired the workshop.

 

Professor Gong Ke, the chair of CCIT/CAST and past president of the World Federation of Engineering Organizations (WFEO), made an opening speech on building an accountable and sustainable Metaverse. He introduced the core values and principles that could and should apply to Metaverse development for the good of humankind and the planet.

 

In this workshop, five speakers presented their views on the topic "Joint efforts to build a responsible & sustainable Metaverse", and the details are below.

 

Horst Kremers, the General Secretary of CODATA-Germany, presented the critical elements of a digital information strategy. He introduced basic management principles and the challenges of managing information on international legal instruments, noting that the demand for coherence and mutual synergy among UN declarations and other UN instrument texts is urgent.

 

Li Yan, the Vice President of the Singapore Blockchain Association, explained what Web 2.5 is and why Web2 and Web3 seem to be converging towards it. In his view, the most significant risks are regulation and the need for more non-technical talent familiar with political economy.

 

Wen Sun, professor at the School of Cybersecurity, Northwestern Polytechnical University, shared her ideas about the challenges of, and potential solutions for, building a trusted Metaverse. One promising solution is blockchain technology, alongside other advanced technologies such as digital twins and federated learning. These emerging technologies can ensure data integrity, privacy, and security, and enable seamless and secure data sharing.

 

Daisy Selematsela from the University of the Witwatersrand and Lazarus Matizirofa from the University of Pretoria highlighted the gap between academic research and the knowledge needed by policymakers, and described how it can be bridged by providing policymakers with access to relevant research.

 

Ricardo Israel Robles Pelayo, professor at Universidad Anahuac online, UNIR México and EBC, discussed legal education and information and communications technologies such as digital books, AI, and big data. He concluded by reminding us of the role of the Metaverse in legal education: the Metaverse is the link between academic theory and professional practice.

 

After all five speakers finished their presentations, the experts started an open discussion. Several key questions were discussed, such as the main challenges, both technical and legal, of building an ecological, responsible, and sustainable Metaverse; cooperation among multi-stakeholders; and how to build a policy framework. The experts noted that the more important thing is to conceive of what the issues in the Metaverse are, rather than how to apply Metaverse-related technologies in education.

 

Finally, the participants agreed that building a legal framework is challenging, because 1) it requires innovation in new technologies, and 2) the potential risks brought by these new technologies need to be identified. Hence, rather than the technologies themselves, ethical principles might be a good starting point for the first step towards a legal framework. We should encourage broader participation and effort, not only from engineers and scientists but also from multi-stakeholder communities, to join Metaverse development.

IGF 2022 WS #491 The future of Interplanetary networks-A talk with Vint Cerf

Updated: Sat, 17/12/2022 - 05:32
Connecting All People and Safeguarding Human Rights
Session Report

Report Session 491: The future of Interplanetary networks

  • There is potential for the interplanetary network (IPN) to solve problems with the diverse Internet infrastructure on the ground. It is not only about how to transport data from Earth to the different planets but also about how to communicate between them. This implies developing new ways of packaging information to ensure it is transmitted quickly, similar to what we do on Earth.
  • The main developments started in 1998, prompted by the Mars Pathfinder mission (1997) and the necessity of creating communication between Earth and Mars.
  • Twenty-five years on (2023), what we need is for the IPN to work as dependably as TCP/IP does on Earth.
  • The solution found lies in the Bundle Protocol, which answers the constantly changing positions of the planets: store data in the network, transmit it to the right planet, and then figure out how to get it there. It is a two-way process (a 28.5 kb/s direct-to-Earth radio link // delayed communication).
  • Some technical aspects to take into account: holds, orbiters, deep space networks, transmission. Low latency → TCP/IP; IBR-DTN // ION, implementations of DTN; and LEO, near-term terrestrial and cloud testing.
  • But it is not all about technical aspects: there will be legal aspects too, some of which are part of the Artemis Accords, on what jurisdiction would look like in an extraplanetary context (the Artemis mission to the Moon).
  • This can be considered a possible beginning of the commercialisation of the Internet beyond LEO, with private operation in space.
  • At this point, Vint asked himself, and invited the public to think about, how we create new institutions in order to cooperate in a way that does not threaten space as we have threatened Earth.
  • An additional registry for the IPN? No; it would be better to have independent Internets on the other planets, with distinct and separate IP addresses for the local networks on the Moon and other planets.
  • A solar relay would transmit no matter where you are in the solar-system Internet. Congestion control: storing models; a store-and-forward network.
  • All standards in the IPN are open, and the community usually rejects things that have patents and restrictions on them. Open source is a way of driving innovation.
  • Veronica raised the problem of building new technology on top of commercial technology: how could this translate into the future of the IPN?
  • Nowadays, research is built on top of previous work, like a pyramid. To scale the pyramid and place a new block on top, you have to get the permission of each person who placed a block below that supports yours.
  • The problem arises when a block is a patent: you have to pay handsomely to lay your block on top of it, and in the ICT industry there is a huge number of patents.
  • Some patents get included in standards (SEP, standard essential patents).
    • Bottleneck, because all the market operators have to pay to implement that standard, and it gives the patent holder the power to foreclose the market and stifle innovation.
  • There are several reasons why a patent becomes a SEP: the process is market-driven, and the standardisation process is not very transparent. When private companies join a standard-setting organisation (SSO), they should be obliged to disclose their patents (to allow the SSO to find alternatives to proprietary technical norms) and, where no alternative can be found, they should commit to granting licences on FRAND terms.
  • Not all SSOs provide for mandatory disclosure, or they provide no penalties for violations.
  • The technical community has to take these elements into consideration if it wants interplanetary networks to be developed on open standards.
  • Greater gender balance in the development of IPN-related technologies could also help. One example cited of the benefits of diversity is federated learning, developed with the leading involvement of a woman, which combines different AI models into one.
  • Leading to the conclusions,  one can be promoting alliances between commercial and others to reduce costs, some way of multistakelholderism with Government and private sector. 
  • How the development of the IPN might also help the search for possible extraterrestrial life.
  • It should serve humanity, not governments.
  • LEO constellations can possibly overcome shutdowns; “possibly”, because the signals can still be detected.
  • Possible takeaway: one concerns the legal and commercial aspects of the development and deployment of technologies related to the Interplanetary Network (IPN), and how regulation can ensure that what we do in space does not become a threat, as has happened and is happening on Earth. It also concerns how these rules can enable easy collaboration between the private sector and interested states to design, develop, and implement the technologies needed to create one or several IPNs.
  • The other takeaway relates to how these technologies can help solve problems of interconnection on Earth and in space: how their combination, implementation, and application could guarantee basic human rights and constant connectivity beyond cultural, political and other sociological divides. This means arriving at technical standards that can work with as little human intervention as possible, while having a humanity-first perspective embedded.
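The store-and-forward bundle model discussed in this session can be illustrated with a short sketch. This is not the actual Bundle Protocol (the node names, classes, and link-scheduling API below are invented for illustration); it only shows the core idea that data is stored at each hop and forwarded whenever an intermittent link, such as one between Earth and a Mars orbiter, becomes available.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Bundle:
    """A self-contained unit of data, held in storage until a link opens."""
    destination: str
    payload: bytes

@dataclass
class Node:
    name: str
    stored: List[Bundle] = field(default_factory=list)

class DTNetwork:
    """Links come and go as planets move; bundles wait in node storage."""
    def __init__(self):
        self.nodes: Dict[str, Node] = {}
        self.active_links = set()  # set of (src, dst) pairs currently up

    def add_node(self, name):
        self.nodes[name] = Node(name)

    def set_link(self, a, b, up):
        (self.active_links.add if up else self.active_links.discard)((a, b))

    def send(self, src, bundle):
        # "Store" half of store-and-forward: keep the bundle locally.
        self.nodes[src].stored.append(bundle)

    def step(self):
        # "Forward" half: push stored bundles over whichever links are up.
        # (Real Bundle Protocol routing is smarter; this floods one hop.)
        for a, b in list(self.active_links):
            moving, self.nodes[a].stored = self.nodes[a].stored, []
            self.nodes[b].stored.extend(moving)

net = DTNetwork()
for n in ("earth", "mars-orbiter", "mars-rover"):
    net.add_node(n)

net.send("earth", Bundle("mars-rover", b"command"))
net.step()                                      # no links up: bundle waits on Earth
net.set_link("earth", "mars-orbiter", up=True)
net.step()                                      # hop 1: bundle now on the orbiter
net.set_link("earth", "mars-orbiter", up=False)
net.set_link("mars-orbiter", "mars-rover", up=True)
net.step()                                      # hop 2: delivered when link opens
print([b.payload for b in net.nodes["mars-rover"].stored])  # [b'command']
```

Unlike TCP/IP, which assumes a continuous end-to-end path, the bundle survives the periods when no link exists, which is exactly the property needed for delayed interplanetary communication.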
IGF 2022 Town Hall #81 Digital Rights Learning Exchange

Updated: Fri, 16/12/2022 - 23:40
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Promoting an open and democratic Internet will take everyone: not just well-established professionals and elites, but also people from different sectors and geographies. Hence, it is crucial to support entry-level digital advocates and provide them with opportunities to make an impact.

Calls to Action

Connect with digital rights advocates from the Digital Rights Learning Exchange program for collaborations on Internet advocacy projects.

Session Report

 

The session aimed at discussing challenges faced by digital rights advocates and spotlighting the Digital Rights Learning Exchange (DRLX) program that Digital Grassroots held in cooperation with the Open Internet for Democracy Initiative. During the opening of the session, the onsite moderator and Digital Grassroots co-founder, Uffa Modey, introduced the objectives of the program, highlighting the importance of providing capacity-building for digital rights advocates coming from underrepresented communities. As part of the program overview, one of the program leads and Digital Grassroots founder, Esther Mwema, presented program core components and highlights from participants' feedback, mentioning that the majority of program alumni found it specifically beneficial to work with other advocates from different regions on developing a campaign and learning from each other over the course.

Sarah Moulton, the deputy director for the National Democratic Institute, discussed how the Digital Rights Learning Exchange came to life as a learning space for emerging digital rights activists starting their advocacy work to acquire basic skills across such areas as stakeholder mapping, communications, digital safety, and security. The panelist also emphasized the lack of foundational programs that can help budding digital advocates start and lead their advocacy projects and the need to support activists at the entry level. 

After the program overview, the word was given to a DRLX alumna Fatou Sarr, who shared her experience of participating in the program. Fatou acknowledged the importance of the participatory and interactive nature of the program, where participants have space to engage with hands-on learning materials and build connections with fellow participants and advocates from different backgrounds.

During the open floor, the program leads were asked about the commonalities of the participants in the program cohorts. It was noted that coming from different countries, the participants were able to find common ground for cross-country collaboration and take a regional perspective on the hypothetical advocacy campaigns that they were working on during the program. Another feature of the program spotlighted by the program leads was participants' interest in particular thematic areas, such as access and affordability, freedom of expression, and internet shutdowns. The panelists also covered the issue of program sustainability, stressing the significance of building networks between the participants and hosting organizations and offering alumni different pathways to engage after the program, i.e. as project mentors and guest speakers.

The session was attended by 12 online participants, who had the opportunity to engage in discussion with onsite panelists. The participants came from different backgrounds, including individuals interested in digital rights training programs, civil society representatives working on digital issues, and IT specialists. Questions were asked on the Zoom platform and on the social media channels of Digital Grassroots by participants following the session live stream on YouTube.

 

IGF 2022 WS #475 Balancing Digital Sovereignty and the Splinternet

Updated: Fri, 16/12/2022 - 22:54
Avoiding Internet Fragmentation
Key Takeaways:

There is a need to define more precisely what Internet fragmentation is, but we are currently far from such a consensus.

Calls to Action

There is a need for more dialogue among stakeholders with diverse positions, especially to define if there is such a thing as fragmentation on higher layers of the Internet

Session Report

There is great difficulty in discussing digital sovereignty and Internet fragmentation today due to the lack of a common grammar and shared concepts among the different actors addressing the issue. This makes debating the relationship between these two concepts even more difficult, which is why it is necessary to maintain a continuous dialogue between the various stakeholders involved in order to reach solutions and cooperation.

There is a perception that people talk about fragmentation at different layers of the Internet architecture. The most worrying kind of fragmentation happens at the transport layer. It is still not an immediate major threat, and previous attempts have failed, but initiatives such as those from China would even re-design core Internet protocols. If these ideas and actions spread, in the medium term they could threaten the Internet as we know it today. Governments should be careful to avoid creating legislation that imposes obligations on Internet actors that may create profound barriers to access, a concern that requires appropriate assessments. Some experts argue that digital sovereignty will inevitably lead to Internet fragmentation, because the traditional sovereignty exerted by states within their borders does not apply well to cyberspace.

However, one should not ignore that states have legitimate regulatory interests that do not interfere with the adequate functioning of the Internet, especially concerns arising from the Global South, or those that give significant consideration to the protection of fundamental human rights, which may conflict with technological and economic development. People should not confuse these core values with features that can be hindered by regulation or that generate costs for companies, since this is just a shift in priorities. Market actors clearly prefer the idea of no regulation, but this behaviour may harm other stakeholders.

There is no clear consensus on whether content issues can be defined as aspects of Internet fragmentation. It should depend on the specifics of what happened, such as acts of censorship. This is particularly important because there is a legitimate debate at the application (or content) layer about whether such measures are problematic, leading to dangers such as the control of communication and discourse by governments, or necessary to protect citizens, national and individual security, or even democracy.

The session stresses the need for continued dialogue between actors not only from different sectors, but also from different regions, to align bit by bit the concepts that are being used in the global debate.

IGF 2022 WS #458 Do Diverging Platform Regulations Risk an Open Internet?

Updated: Fri, 16/12/2022 - 20:21
Avoiding Internet Fragmentation
Key Takeaways:

The elaboration and enforcement of global standards may pave the way for greater alignment in regulating digital platforms. Their feasibility and desirability, however, remain much-contested. The development of standards, and more generally deliberations on platform regulations, must be done against the democratic context of each country and region, in addition to their respective political, socio-cultural, legal and historical backgrounds.


In addition, platform regulations bear great importance in shaping the power balance between governments, ‘big tech’ companies, civil society and everyday users of platforms. Greater resources must be dedicated to promoting and facilitating honest and inclusive multi-stakeholder discussions; protecting digital platforms as an open and neutral civic space; and ultimately fostering a healthy digital ecosystem for all.

Session Report

Background

The last few years have seen a plethora of new laws and proposals which would regulate online platforms and other internet intermediaries. In the absence of any international frameworks or consensus on how to govern intermediaries, many of these laws and proposals are diverging widely. This is leading to potential impacts on intermediaries’ ability to operate globally, barriers to entry for new intermediaries in already concentrated markets, and risks to freedom of expression and other human rights.

Chatham House, along with Global Partners Digital, seeks to better understand the regulatory landscape around digital platform governance, and how this varies between regions. To this end, Chatham House convened a workshop at the 2022 Internet Governance Forum, bringing together experts and practitioners from Latin America, Africa, Europe, South Asia and South-East Asia to discuss and better understand the various regulatory approaches, and to extract common themes between them. The discussion sought to communicate and share this understanding widely and highlight areas which could benefit from further investigation and exploration.

The discussion focused on several questions, including: What forms of online platform regulation are emerging in different parts of the world, and in what ways do they diverge? What is the local democratic context in each region? What does a human rights-based approach to regulation mean? Are these measurable through reviewing legislation? What risks does policy divergence pose to an open and interoperable internet, as well as to human rights? And how can these risks be mitigated? What opportunities are there for encouraging harmonisation and consensus?

Key Regulatory Trends Across Regions

The European region is often perceived as pioneering the regulatory landscape surrounding digital platform governance. Legislation both at the European Union (EU) level (e.g., the Digital Services Act, DSA) and at the national level (e.g., Germany’s Network Enforcement Act) often serves as an example shaping regulations in other countries. Within the context of the EU, the adoption of the DSA is hailed as a particular success; it requires, among other things, platforms to have clear terms of service and redress systems in place for users and to publish transparency reports, and requires each member state to appoint a national independent regulator as Digital Services Coordinator, likely fostering greater collaboration and information sharing across countries. Yet, the success of the DSA will highly depend on implementation and enforcement – particularly in the light of human rights. Furthermore, beyond the EU bloc, concerns arise surrounding ‘outliers’ (in particular Belarus, Russia and Turkey) with regards to their non-alignment with the Act, in addition to vaguely worded restrictions on politicised content types (e.g., those offensive to public morality) and potential criminal sanctions on individual platform employees.

In Latin America, there is no established regulatory body producing regulations; this results in legislation varying from country to country, and there seems, at the moment, to be no appetite for alignment. Despite this fragmented regulatory landscape, one common approach across countries is to consider major platforms as holding great influence over social discussions; yet user experiences and harm (e.g., misinformation and abuse) are often overlooked. In this sense, there is a pattern in regulatory approaches where instruments now pay greater attention to harm on social media platforms over the bigger picture of internet regulation. With regards to the protection of freedom of expression, the Inter-American human rights system serves to help safeguard this right across the region.

Regulatory instruments in Africa have also shifted their focus in recent years: the main concern has changed from ICT access to heavily politicised legislation. The most common approach consists of exercising control over the platforms and, subsequently, their users, which provides greater power and protection for states and their respective governing regimes. This approach is in contrast with standards where the overarching aim is to provide and guarantee protection for all. Such control is, for example, reflected in the increase in requirements, over the past 18 months, for platforms to formally register; thus paving the way for risks related to licensing, these platforms’ accessibility to the people, and proactive content-moderation requirements by states. In addition, issues surrounding non-compliance with human rights norms in the ‘offline’/real world bear influence online, for example as seen through the prevalence and normalisation of emergency laws, and their effect on online platform governance.

The South Asian regulatory landscape is, at the moment, highly dynamic and evolves very quickly. It does not only comprise legislation directly governing digital platforms; it also includes legislation indirectly affecting these platforms, their users’ activities, and the power of states and governments over them. In India, the dominant approach is characterised by a general sense of distrust of non-Indian platforms, while greater protection is provided for national platforms for fear of external influence on the civic space. Echoing, to a certain extent, the approach adopted in African countries, two draft legislations were flagged as raising questions surrounding the government’s power and control over platforms: the Indian Telecommunication Bill, establishing a licensing requirement and thus raising questions over an open and free internet; and the Digital Personal Data Protection Bill, which would expand the government’s surveillance powers. Pakistan’s regulatory landscape is also heavily focused on control, given that digital platform governance is framed around criminal law, with a particular focus on exercising control over dissidents. Concerns also arise with regards to the mandate conferred on regulators to interpret constitutional provisions, which may often overstep the role of judges. Nevertheless, in both countries, multistakeholder advocacy efforts at preserving human rights and an open, free internet bear strength and influence over the regulatory landscape.

Commonalities and Question Marks

  1. In the absence of a supra-national regulatory body (akin to the European Union), the alignment and eventual harmonisation, within a region, of regulations governing digital platforms remain a challenge. Whether such harmonisation is, at all, desirable remains, however, debatable: in the light of countries’ respective priorities, legislative landscape and regulators’ varying mandates, the adoption of global standards working for all constitutes a challenge. This fragmented landscape makes it difficult for digital platforms to navigate different, and sometimes competing regulatory instruments across countries; especially with regards to enforcement and implementation. 
  2. Concerns arise surrounding the increase of regulatory tools conferring the responsibility to moderate and respond to online content towards the platforms (in contrast with an independent regulatory body); oftentimes threatening, if not ‘hostage-taking’ platform employees with risks of individual criminal liability, while paving the way for deteriorating compliance with human rights norms. 
  3. Digital platforms remain, at times, the last civic space available and accessible to all. This is particularly the case in the light of licensing requirements and other restrictions surrounding other forms of media (e.g., radio, television broadcasting, etc.); thus, in certain countries, these platforms ought to be maintained as the ‘last fortress’ to enable open, democratic and participatory civic engagement.

Risk Mitigation & Solutions

  1. Discussions and deliberations surrounding the regulation of digital platforms, as well as the eventual establishment of international standards and other soft law must be inclusive and of multistakeholder nature. There is a particular desire for governments to demonstrate greater political will in engaging and including civil society in shaping the regulatory landscape surrounding digital platforms. 
  2. Stakeholders with significant resources must facilitate and pave the way for inclusive and multistakeholder discussions and fora, in addition to leveraging these resources to improve the general understanding, across stakeholders, on the dynamics and trends surrounding platform regulation.  
  3. Governance deliberations and analyses must take into account local democratic contexts. These include, for example, local laws and customs, socio-political realities on the ground, human rights approaches, as well as the power relationships between the state and the people. 
  4. There is a need for digital platforms, in particular those bearing great presence over the population (e.g., Meta), to acknowledge the important role and influence they have; exercise responsibility in their approach to content moderation while preserving and safeguarding human rights norms; and show and exercise greater equity in the way they engage with users across regions. 
IGF 2022 WS #411 Move Fast and Fix Policy! Advocacy in an era of rapid change

Updated: Fri, 16/12/2022 - 20:17
Connecting All People and Safeguarding Human Rights
Session Report

 

On Friday, December 2, the Center for International Media Assistance (CIMA), the Center for Private Enterprise (CIPE), and the National Democratic Institute (NDI) hosted a hybrid roundtable discussion, “Move Fast and Fix Policy! Advocacy in an era of rapid change.” 

The objective of the session was to discuss processes and mechanisms for diverse stakeholder groups to provide input on proposed digital policies. The session aimed to explore different modalities to provide this meaningful input. Participants heard from two alumni of the Open Internet for Democracy Leaders Program, Paola Galvez (University of Oxford) and Catherine Wanjiru Muya (Article 19), as well as Mira Milosevic (Global Forum for Media Development) and Constance Bommelaer (Project Liberty’s McCourt Institute). Daniel O’Maley (CIMA) moderated. 

 

Key takeaways

This insightful one-hour discussion focused on the need for inclusive, participatory, and long-term approaches in the development and implementation of policies that impact the digital space. From this session, two main takeaways emerged:

 

Firstly, speakers agreed that successful policy change in an era of rapid digital transformation requires building a foundation of trust between all involved stakeholders. Recommendations for accomplishing this include establishing ongoing, meaningful engagement opportunities (avoiding one-off meetings and events), identifying neutral mechanisms and spaces in which to collaborate, and ensuring participants are informed and have access to the information they need ahead of time in order to provide meaningful inputs.

 

Secondly, when working on policy change initiatives, it is important to conduct a regional mapping of experts and stakeholders. Civil society in particular tends to be treated as monolithic, when in fact there is typically a diversity of opinions and perspectives among that stakeholder group.

 

Identifying the problem: knowledge gaps and disconnected legislation

Mira Milosevic, who leads an international network of journalism support and media development organizations, noted that amid the challenge of an accelerating change in the digital policy landscape, the media sector is facing a huge gap in capacity and knowledge. She pointed to a lack of experience, evidence, research, and understanding of different policies and decisions.

 

Despite these changes in the landscape, laws are being brought forward and collective action is contributing to policy spaces. And although some legislation can be beneficial for users in regard to privacy, Catherine Muya highlighted that legislation can be cumbersome for editors or, in countries with repressive governments, harmful for news outlets, especially freelancers without support.

 

During the question and answer portion, virtual and in-person audiences asked about the issue of speed in developing legislation. Throughout the conversation speakers addressed the process of engagement, but when course correction and feedback loops are factored in, how can stakeholders arrive at a timely solution without letting policies fall behind? Milosevic acknowledged that there are no mechanisms in place to act quickly, given that there needs to be more transparency from key actors such as platforms and big companies. The session left audiences with food for thought.

 

Tackling policy gaps through multi stakeholder approaches

Digital policy takes form in various countries with different governance structures, and there is no one-size-fits-all approach. Paola Galvez, a Peruvian lawyer, said that just because teams are working on understanding legislation does not mean they have adequate knowledge of the digital ecosystem or embrace multistakeholderism. “Building trust and having meaningful relationships with all stakeholders is key. I didn’t realize that at first, but when we tried to approach stakeholders to do roundtables and workshops, that’s very practical, but we need to take a step back and figure out how I am building trust with these stakeholders.”

 

Reflecting on existing fora

On the subject of building “trust,” Milosevic recognized that in order to help build trust over time, there needs to be a space for different stakeholders to come together on a permanent basis while recognizing confidentiality and safety of the issues discussed. Ad hoc and last-minute engagements are “not enough.”

 

Constance Bommelaer, whose organization ensures that digital governance is prioritized in development of new tech and embedded in the next generation of the web, pointed to the IGF as a multistakeholder example where representatives of civil society, consumer societies, youth audiences, and a diverse set of countries come together. She noted that it’s important to not only participate but also support and create impartial platforms such as the IGF where learning can occur. Catherine Muya quickly jumped in after this to agree that when contributions come from both civil society and the private sector, the government can take issues more seriously. She pointed to the time when a strong community of artists came together in Kenya to educate legislators on the impact of a copyright bill. 

 

Participating with a hybrid audience

In terms of participation, the session drew an online audience from a wide range of countries outside of Africa, including India, the Philippines, North Macedonia, and Armenia. Questions both online and in-person came from a range of individuals from organizations that work with the private sector, civil society, and technology education. Guy Berger, former director of Policies and Strategies regarding Communication and Information at UNESCO, live-tweeted the event to his network and highlighted Bommelaer’s comments on transparency and compliance with existing regulation and Muya’s comments on the need for evidence-based advocacy to speed up policymaking. In-person and online participants also noted that the title of the session intrigued them.

 

IGF 2022 WS #254 Trustworthy data flows: building towards common principles

Updated: Fri, 16/12/2022 - 20:17
Governing Data and Protecting Privacy
Key Takeaways:

Cross-border data flows underpin today’s business, government & societal functions. Processing & transfer of personal data are integral to these, making trust a vital element for sustainable growth. Trust is eroding over concerns that government demands to access data may conflict with universal human rights & freedoms or cause conflicts with domestic laws when access transcends borders.


High-level principles & safeguards on government access are a much-needed foundation towards scalable measures and global dialogues.

Calls to Action

Principles for trusted government access must be based on international human rights law, which may demand protections that some countries do not currently have in place. (see more on these principles in the ICC White Paper on Trusted Government Access)


Such principles must lead to effective multilateral and multistakeholder collaboration to foster interoperable approaches and legal certainty to enable data to be exchanged and used in a trusted manner, thereby aiming for high privacy standards.

Session Report

 

Introduction

Global data flows are at the heart of the world’s economy and well-being. This became evident with the COVID-19 pandemic lockdowns in 2020, when companies of all sizes, across all sectors around the world enabled remote work and transitioned their businesses to online-first or online-only. Data transfers are estimated to contribute $2.8 trillion to global GDP—a share that exceeds the global trade in goods and is expected to grow to $11 trillion by 2025.

Companies rely on these flows to conduct day-to-day business with customers, partners, and suppliers; innovate in their business and operations; detect cyber threats and intrusion patterns; and compete more effectively – in sectors as diverse as agriculture, healthcare, manufacturing, banking and shipping. In 2021, the G7 also emphasized the importance of cross-border transfers, noting that the “ability to move and protect data across borders is essential for economic growth and innovation.”

However, trust in international data flows is being eroded over concerns that government demands to access data for criminal and national security purposes may conflict with universal human rights and freedoms, including privacy rights, or may conflict with other national laws when such data transcends borders. These concerns have led to uncertainty that may discourage individuals’, businesses’, and even governments’ participation in a global economy, negatively impacting inclusive and sustainable economic growth. 

Key takeaways

Participants in the session agreed that governments need to access personal data to protect public safety and national security, but warned that access without safeguards inevitably leads to abuse, violations of individuals’ fundamental rights, and a loss of trust in data flows.

The session highlighted the need for cooperation between governments and stakeholders, including business and multilateral organizations, to develop interoperable policy frameworks that would facilitate cross-border data flows and enable data to be exchanged and used in a trusted manner, thereby aiming for high privacy standards. The OECD effort to define principles and safeguards for government access to personal data held by the private sector was referenced as an example that could provide a firm foundation for data-free-flow-with-trust.

The conversation also highlighted various factors that impact trust when governments request access to private sector data. For example, legal barriers to data transfers can arise from differences in laws governing government access and discrepancies in safeguards when data transcends borders. In addition, companies that receive government requests for data they hold must decide (a) whether the demand is lawful; (b) whether any cross-border demand presents a conflict of law between jurisdictions in which they operate; (c) how much data they are compelled to disclose; and (d) what information about their responses to these demands may be disclosed to customers and the public. These concerns also significantly contribute to public sectors’ reluctance to deploy digital technologies broadly, fearing that this would potentially expose their public sector data to third-party governments that may demand access.

Call to action

The session called for the creation of commonly agreed shared principles for trusted government access to ensure that government access to personal data is consistent with the protection of individual rights and the rule of law.

Such principles must lead to effective multilateral and multistakeholder collaboration to foster interoperable approaches and legal certainty to enable data to be exchanged and used in a trusted manner, thereby aiming for high privacy standards.

Policymakers should support open cross-border data flows, while also ensuring that users have adequate privacy, security, and IP protections and that those protections are implemented in a manner that is transparent, non-discriminatory, and not a disguised restriction on trade.

Further reading

IGF 2022 WS #253 Towards Cyber Development Goals: Implementing Global Norms

Updated: Fri, 16/12/2022 - 20:14
Enabling Safety, Security and Accountability
Key Takeaways:

Tension between the need to advance digital transformation versus the lack of a strong cybersecurity posture poses risks to achieving the SDGs and a safe, secure online environment. While doing more to increase the resilience of digital infrastructure is necessary, it is not sufficient. Translating existing international agreements into feasible actions is long overdue.


The international community should explore practical ways to mainstream cybersecurity capacity building into broader digital development efforts.

Calls to Action

To promote safe digital transformation during the Decade of Action and beyond, the international multistakeholder community should come together to agree and adopt Cyber Development Goals (CDGs) to mainstream cybersecurity into the development agenda.


Complementing the SDGs, CDGs can help define global benchmarks and practical activities to support countries in implementing universally endorsed UN norms, and mobilize the UN Development System and stakeholders worldwide to achieve concrete goals and facilitate coordination.

Session Report

 

Introduction

Accelerating digital transformation is essential to achieving the Sustainable Development Goals (SDGs). A secure, trusted, and inclusive digital infrastructure is the backbone of today’s economic and social development. With just over half of the world’s population connected to the Internet, closing the digital divide is essential to reducing inequalities and socioeconomic gaps between those with access to digital services and those without.

Digital transformation and the expansion of the digital ecosystem also come with increased cybersecurity risks, especially in low- and middle-income countries that may lack adequate cyber resilience against constantly evolving digital threats. This tension between the need to close digital divides and advance digital transformation versus the lack of a strong cybersecurity posture can be considered a risk to achieving the SDGs and a threat to achieving a safe, secure, and rights-respecting online environment.

Against this background, the workshop discussed how the international community could explore practical ways to mainstream cybersecurity capacity building (CCB) into broader digital development efforts to empower and protect societies from the increased cybersecurity risks associated with digital transformation.

Key takeaways

Through an engaging and dynamic conversation, the panelists and audience members debated the idea of developing Cybersecurity Development Goals (CDGs), a set of aspirational and feasible goals to rally the international community to collaborate in closing digital divides, bolster resilience by fostering access to digital transformation, and enable the implementation of international law and norms to curtail malicious cyber activities.

The conversation confirmed the pressing need to mainstream cybersecurity into digital development to support a safe digital transformation and thus a better and more sustainable future for all. The idea of coalescing around shared goals received much support, highlighting the need to bring together various policy silos that work on cyber issues, as well as those working on development issues, to create a shared language and to build on the existing work, in particular the various capacity building initiatives.

All participants strongly underlined that any process to develop such shared goals must fundamentally hinge on trust and inclusiveness, and that their success will depend on the ability to translate goals, norms and international frameworks into understandable and practical action through a whole-of-government and whole-of-society approach.

Call to action

The session aimed at promoting a safe digital transformation during the Decade of Action and beyond. In particular, it called for the international multistakeholder community to come together to agree and adopt Cyber Development Goals (CDGs) to mainstream cybersecurity into the development agenda.

Complementing the SDGs, CDGs can help define global benchmarks and practical activities to support countries in implementing universally endorsed UN norms, and mobilize the UN Development System and stakeholders worldwide to achieve concrete goals and facilitate coordination.

Further reading

IGF 2022 IS3C General Meeting: Recommendations to Make the Internet More Secure and Safer

Updated: Fri, 16/12/2022 - 18:55
Enabling Safety, Security and Accountability
Key Takeaways:

The findings of the IoT and the Education and skills research are best practices, in line with IS3C's goals. Key findings are applicable to all stakeholders. To go from theory to practice, the outcomes need to be acknowledged and deployed. Gender balance is still a problem: an extra effort is needed to reach female respondents and key persons' opinions. Our report is here: https://is3coalition.org/docs/study-report-is3c-cybersecurity-skills-ga…

Calls to Action

Outreach to all stakeholders to distribute the findings of IS3C's outcomes. Set up hubs and teams to discuss deployment of outcomes. Maintain a relationship with related stakeholders, as a resource for future work. Reach out to female respondents and key persons from developing countries to restore balance to the research. Cybersecurity and its related issues are long-term work; each day lost is another day lost.

Session Report

IS3C held its general meeting at the IGF in Addis Ababa where it presented on the results of work carried out in 2022 and looked ahead to the near future.

Working Group 1 : Security by design, sub group on Internet of Things

Research has confirmed that there is a large gap between the theory of security and the daily practice of IoT security. The working group focuses on identifying the solutions needed to close this gap. The first results will be reviewed in December and January, after which the final report will be published in the winter of 2023. The IGF open process of consultation with stakeholders worldwide will be announced soon.

The WG's research focused on: (a) a review of current security-related IoT initiatives and practices worldwide, and (b) the development of a coherent package of global recommendations and guidance for embedding security by design in the development of IoT devices and applications. The report will include the outcome of research questions shared globally. One of the outputs of the research is a compilation of all the security best practices that could be collected from the documents. These best practices are divided into four categories: Privacy and Exposure; Update; Non-technical; and Operation/Community.

Attention is also given to the consumer side. What do consumers need to know about IoT security by design when they deal with a device containing IoT? Current labelling schemes have been compared to ascertain this. Once consumer knowledge is upgraded and it is ensured that consumers are fully equipped to use a device securely and are aware of their rights, the focus shifts to the manufacturers of IoT devices and tools. They will feel more strongly the obligation to make sure the device is in a good condition and safe to use, as well as a growing awareness of the need to deliver security updates to the devices they manufacture.

A call for action was launched by chair Nicolas Fiumarelli to all stakeholders to participate in the open consultation process of the draft report.

 

Working Group 2 : Education and skills

A major factor undermining the development of a common culture of cybersecurity is that students graduating from tertiary ICT-related educational programmes often lack the skills that business and society as a whole need in order to understand the benefits of security-related Internet standards and ICT best practices. In order for ICT security to be better understood, it has to be integrated into tertiary ICT educational curricula, at all levels. This may result in the structural development of ICT(-related) products and services that include cyber security Internet standards and ICT best practices. The coalition’s Working Group 2 has therefore identified the following goals:

  • To detect and resolve cyber security skill gaps in tertiary ICT education curricula;
  • To encourage tertiary educational institutions to include in their ICT curricula the essential skills, knowledge and understanding of security- related Internet standards and ICT best practices, building on current best practices, in order to bring tertiary education in line with emerging workforce requirements;
  • To strengthen collaboration between educational decision-takers and policy makers in governments and industry in order to align tertiary ICT curricula with the requirements of our cyber future;
  • To ensure effective collaboration between key stakeholders in order to keep tertiary ICT educational materials in step with new technologies and standards and prevent new skills gaps from developing;
  • To make cyber security education more interesting to young people, and especially women;
  • To make cyber security education part of life-long learning programmes.

The research used two methodologies: first, interviews with cybersecurity experts in multiple countries; second, a questionnaire that was extensively distributed through internet governance fora. This resulted in input from 66 countries from all regions of the world.

The results show the gap between what people learn in formal education and what the cybersecurity industry needs. While technical skills can be learned in formal education, employers also need soft skills, such as creativity and critical thinking. Respondents said that there is a need for collaboration between education and industry, to ensure that knowledge becomes more compatible with employers’ demands. There is also a need for constant knowledge sharing from cybersecurity experts.

The report 'Closing the gap between the needs of the cybersecurity industry and the skills of tertiary education graduates' was formally presented by WG 2 representative Teuntje Manders to MAG chair Paul Mitchell and to the research's sponsors, Mieke van Heesewijk of SIDN Fonds and Julia Piechna of NASK. It can be found on IS3C’s website here: https://is3coalition.org/docs/study-report-is3c-cybersecurity-skills-gap/.

The working group on Data Governance and Security was not able to present but will present its report in the winter of 2023.

 

Global Digital Compact

IS3C has launched a special working group for its response to the Global Digital Compact. Dr. Allison Wylde leads this body of work, which will reflect the outcomes and work underway within IS3C that ought to become part of the GDC to ensure a more secure and safer Internet, and thus a safer world.

 

The future

Two working groups will start their work in 2023: one on procurement and supply chain management, and one on a prioritisation list for procuring secure-by-design ICTs. Others are in the process of formulating their mission statements: a working group on post-cyber encryption; a working group aiming to offer a roadmap for anticipatory governance strategies in the field of emerging technologies, initially focusing on AI and quantum technology; a working group on consumer protection and advocacy; and finally a working group focusing on the barriers preventing deployment of three standards: DNSSEC, RPKI and IPv6.

Everyone is invited to join and/or support the upcoming body of work that IS3C endeavours to undertake in 2023.

IGF 2022 Open Forum #48 Internet Society Open Forum "Protecting the Internet"

Updated: Fri, 16/12/2022 - 18:49
Avoiding Internet Fragmentation
Key Takeaways:

Internet fragmentation is a theme for good reason: this year we have seen an increasing number of geopolitically driven government decisions that raise concerns which could lead to the splinternet.


We all (stakeholders) have a responsibility to protect the Internet from fragmentation.

Calls to Action

Join the Internet Society movement to protect the Internet

Session Report

Session Notes

Augustina and Andrew introduced the topic and speakers.

Panelists' discussion on the topic:

  1. Natalie
  1. Referred to knitting as an analogy to appreciate the effort put into building the internet and making it such an incredible resource. She emphasized that stakeholders could work together to build a bigger, stronger and more resilient internet by following a simple pattern that is bigger than anyone, as it brings value through global connection.
  2. We all have a responsibility to protect the internet from fragmentation.
  3. The internet has seen concerning trends that could lead to a future we do not want: the splinternet.
  4. The internet is made of a foundation of critical properties that together form the internet’s way of working. She compared it to a business model for the internet. It is the simple foundation on which the internet exists, and it is what separates it from other types of networks, like an office network.
  5. Referencing the knitting analogy, the internet is a simple pattern that enables any network to become a part of the global sweater that benefits all.
  6. The internet is not just about technology, every network that wants to participate on the internet must adhere to the foundation that enables us to be globally connected.
  7. The splinternet is the opposite of the internet. It is the idea that the open, globally connected internet we all use splinters into a collection of islands that do not connect to each other.
  8. We are worried about the splinternet because businesses, governments and organisations are increasingly making decisions that can undermine how the internet works and, knowingly or unknowingly, start to unravel the incredible resource that we have put so much effort into creating.
  9. Internet fragmentation is a theme for good reason. This year we have seen an increasing number of geopolitically driven government decisions that raise concerns which could lead to the splinternet.
  10. The splinternet would complicate our ability to communicate with each other by fragmenting the internet into separate networks that do not work together easily. Having Zoom calls would be difficult, and people might have to pay to work on a shared document.
  11. There are many causes of the splinternet, including:
  1. Internet shutdowns: when a government tries to disconnect the networks within its borders from the internet, it has serious consequences for its citizens. It is like unravelling a sleeve from the sweater and disconnecting from the global resource (the internet).
  2. There are political decisions that could lead to the splinternet. For example, in the Ukraine war there have been calls to disconnect other networks from the internet. This goes against the principles that make the internet thrive.
  3. Policies and business decisions meant to mitigate threats but which do not protect the internet and its existence. Governments are tackling hard and complicated issues like misinformation, disinformation, and online harm; in trying to mitigate these harms, they propose decisions without understanding the impact on the internet and what makes it thrive.
  1. To protect the internet, the Internet Society has created an impact assessment toolkit, which is like an environmental impact assessment but for the internet. The toolkit is based on two white papers:
  1. The critical properties of the internet’s way of networking which establishes the internet; and
  2. The enablers of a globally accessible, secure, and trustworthy internet.
  1. The Internet Society has been using the toolkit to analyze decisions and proposals around the world to understand how they impact the internet and to educate businesses and governments on how to mitigate these harms. This has allowed the Internet Society to collaborate with the community and has led to conversations with the said governments and businesses.

 

  1. Emmanuel
  1. Heads the #deargov organization in Nigeria.
  2. He indicated he had applied the Internet Society toolkit, which makes it easy to understand the incentives and motivations of governments around the world, particularly in Africa, that continue to seek pathways for consolidating interests and control over the internet and data resources.
  3. He raised the concern that bits of the internet are being controlled without regard for the long-term effect and impact this will have on the open model of the internet.
  4. They have been supporting the Nigerian government through policy recommendations. Recently they worked on two policy recommendations or regulations that the government sought to propose:
  1. The Social Media Bill, where they recommended that the government address the impact of the regulation on the interconnected model of the internet.
  2. The 2021 Twitter ban, where they looked at the long-run economic implications of the ban.
  1. Governments usually have genuine intentions and interests in tackling internet issues, e.g. cybercrimes, child pornography, hate speech, misinformation, and disinformation. However, the approach is the problem: there is a disconnect between the intentions and the drafting of policies and laws. This leads to stakeholders feeling their interests and concerns are undermined in the drafting or proposal of regulations. Thus, the #deargov organization works with governments and stakeholders to try to build a multi-stakeholder perspective that allows all stakeholders to see the entirety of what a regulation proposes in terms of human rights and accessibility of the internet, and that does not favour certain players more than others.
  2. Mirja
  3. The Internet Architecture Board (IAB) is one of the three leadership groups of the IETF, the main organization tasked with developing and maintaining some of the internet protocols.
  4. The role of the IAB is to provide architectural oversight: not just of the protocols, but trying to get the big picture of how everything works together, identifying whether there are gaps, and having conversations with the relevant organisations to fill those gaps.
  5. It is also a contact point for other standards development organisations (SDOs).
  6. The IAB is a group of experts who sometimes have differing opinions, but the group is monitoring and discussing the global development of internet governance and its impact on the internet and the quest for digital sovereignty.
  7. The IETF’s mission is to make the internet better. But if you look at its mission statement, it defines ‘better’ in terms of where the internet comes from. The mission statement states, ‘We want to make the internet useful for communities that share our commitment to openness and fairness.’ It is what the internet is based on and what the IETF is committed to.
  8. The success of the internet comes from designing it around a set of principles and creating building blocks that can be combined in different ways. This has enabled the success and innovation of the internet, with so many services blooming over time. It is one of the base principles for maintaining the protocols and technology that provide the internet.
  9. Deployment of new technology (protocols) gets blocked in the name of digital sovereignty. This is concerning, especially if the protocols are supposed to provide better security and user privacy. It is also concerning that the blocking is happening on the internet infrastructure, thus fundamentally affecting internet connectivity and interoperability. Further, it limits innovation.
  10. The internet is designed as a global network of networks, and trying to enforce measures that set national boundaries on it goes against some of the basic design principles and puts the future of the internet at risk. Thus, we should all work together to keep it as one, open and globally connected internet.
  11. Noelle
  12. She noted that there are many paths to internet fragmentation and focused on one: the path driven by governments that want to exercise their sovereignty over how the internet works within their borders. It is referred to as digital sovereignty, internet sovereignty, or tech sovereignty.
  13. This has been a subject of study for the Internet Society this year. Digital sovereignty means a lot of different things to different people in different countries; thus, it is unwise to equate digital sovereignty with internet fragmentation. Some well-meaning people use the term while expressing support for digital sovereignty.
  14. There is one approach to digital sovereignty, however, that could fragment the internet. One reason a government or State wants to assert sovereignty in the digital space is concern about its national security: securing the digital space within its borders is seen as a way of making the country more secure. This becomes a threat to the internet when the State implements it by giving itself the power to control how the internet works locally. It wants a greater hand in managing the internet infrastructure or directing how networks operate; for example, a government agency that wants to control the flow of traffic within the country, or to and from the country, comes up with its own routing policies.
  15. Another example is when a government tells everyone to synchronize the clocks in their ICT systems to only one source (government-run servers). Typically, systems synchronize their clocks with multiple time sources to minimize the risk of getting the wrong time; it is part of what makes the internet robust and resilient.
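The resilience described here comes from multi-source cross-checking. A minimal sketch of a conventional client configuration, using chrony as an illustrative implementation (the last server name is hypothetical), shows how several independent sources let a client discard any one server that serves bad time:

```
# /etc/chrony.conf -- sketch of the usual multi-source time setup.
# The client compares all sources and rejects outliers ("falsetickers"),
# so no single server can silently set the wrong time.
pool pool.ntp.org iburst
server time.cloudflare.com iburst
server time.google.com iburst
server ntp.example.net iburst        # hypothetical additional source

# A mandate to use only one government-run server would reduce this
# majority check to a single point of failure and control.
```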
  16. From the above examples, we can see an attempt to centralize the processes and mechanisms that are decentralized and distributed on the internet.
  17. A final example is one country requiring operators to use DNS resolvers that are controlled by the government. This allows the government to change how name resolution works in the country or to create an alternative to the global DNS, which would prompt fragmentation of the global network of networks. Imagine if all governments were doing this.
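The contrast can be made concrete with an ordinary stub-resolver configuration, in which the operator freely mixes independent resolvers (a sketch; the addresses shown are the well-known public resolvers run by Cloudflare, Google and Quad9):

```
# /etc/resolv.conf -- normal case: the operator chooses any resolvers
nameserver 1.1.1.1    # Cloudflare
nameserver 8.8.8.8    # Google
nameserver 9.9.9.9    # Quad9

# Under the mandate described above, every network would instead point
# at a single state-controlled resolver, for example:
# nameserver 192.0.2.53   # illustrative address from the RFC 5737 range
```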
  18. She invited participants to read the Internet Society’s upcoming report, ‘Navigating Digital Sovereignty and Its Impact on the Internet’, to be published on Thursday, 1 December 2022.
IGF 2022 WS #326 Platform Responsibilities for Journalist Digital Safety

Updated: Fri, 16/12/2022 - 17:58
Enabling Safety, Security and Accountability
Key Takeaways:

Gendered online violence against journalists is a structural problem; how to combat it needs to be an integral element of the internet governance discussion, and a more effective institutional response, including by social platforms, is urgently needed.


The UNESCO/ICFJ research "The Chilling" provides key insights into the impact of online violence as well as clear recommendations to different stakeholders on how to address it.

Calls to Action

All actors of the internet governance structure, including social media platforms, must take gendered online violence against journalists seriously as an attack on freedom of expression and put in place effective counter-measures.

Session Report

The session was co-organized by UNESCO and APC and brought together speakers with different areas of expertise on the topic of gendered online violence. UNESCO and APC have both implemented an extensive range of projects on the safety of women journalists and have cooperated on this issue, most recently by organizing a consultation process looking at how gender perspectives can be more strongly integrated into the implementation of the UN Plan of Action on the Safety of Journalists and the Issue of Impunity.

Julie Posetti from the ICFJ presented key statistics from a report jointly published with UNESCO. “The Chilling” highlights the severity and the impact of gendered online violence on women journalists and on freedom of expression more broadly. She specifically stressed the online to offline trajectory and pointed out that 20% of surveyed women journalists said that they had been attacked offline in connection with online violence. According to Posetti, “online violence aids and abets impunity for offline violence”.

Nompilo Simanje confirmed similar findings for the Southern African region, where perpetrators also target women journalists largely without consequence. Due to this situation of impunity, Simanje spoke of a “normalization of online violence”. She particularly emphasized two manifestations of online violence: doxing, the sharing of a victim’s personal information, and digital surveillance.

Guilherme Canela from UNESCO stressed that gendered online violence is a structural problem, calling for an institutional response. He argued that combatting gendered online violence should be considered an integral part of the internet governance discussion and called for an internet which is free, independent and pluralistic, but also safe for all of its users. He introduced recommendations published by UNESCO and ICFJ in 2022 as part of “The Chilling”, which provide actionable advice to different stakeholders on how to effectively address online violence against women journalists.

Building on this introduction of the recommendations, Julie Posetti provided further insights regarding the specific sets of recommendations directed at internet platforms and at political actors and States. In both cases, she emphasized the need to put in place mechanisms and structures that specifically stop actors perpetrating violence against women journalists.

Subsequently, Nompilo Simanje emphasized the need for tech platforms to increase capacities that allow for an understanding of local languages and contexts. Julie Posetti raised the growing issue of extraterritorial attacks against journalists and how violence perpetrated against them online radiates into offline spaces, even internationally.

Finally, UNESCO’s Guilherme Canela introduced a risk assessment framework currently being developed by UNESCO for digital platforms. This risk assessment framework can guide platforms on how to better minimize risks and harm for users, including by taking into account risks of gendered attacks and the proliferation of gendered disinformation. The risk assessment framework will be presented in February 2023 during the “Internet for Trust” conference on platform regulation organized by UNESCO.

The session concluded with a series of questions from the audience.


IGF 2022 Town Hall #91 The war in Ukraine and the disinformation war

Updated: Fri, 16/12/2022 - 17:46
Enabling Safety, Security and Accountability
Key Takeaways:

The war in Ukraine has been a relevant stress test for the European fight against disinformation. The organisational framework put in place and recently augmented by the Digital Services Act, based on the coordination of all interested parties (a Code of Conduct for the platforms, fact checking, access to data, intervention of the regulators and validation of the results), has proved to be robust, even if progress is still needed.


It is useful to stress the importance of having a distributed system, not a vertically integrated one. Interaction and collaboration between the different organisations involved is fundamental, as is the transparency and neutrality of the actions and decisions of those organisations. The Commission, which has the power to impose heavy fines, will only intervene when one of the actors does not respect agreed rules and procedures.

Calls to Action

Measures to counter disinformation need to be improved all around the world, in a multistakeholder form.


Independent regulatory bodies could play an essential role and are a good compromise between respect for human rights and the restrictions required by the fight against disinformation.

Session Report

Giacomo Mazzone introduced the Town Hall meeting, explaining that the title “The war in Ukraine and the disinformation war” was proposed by EDMO (the European Digital Media Observatory) and Eurovisioni. The objective of the session was to understand how the Internet can be used as a weapon, not on the battlefield, but in a battle to influence public opinion about the war in Ukraine, in the rest of Europe and in the world.

Disinformation around Ukraine is also a testbed for the recent measures that the European Union has put into place to fight disinformation, such as the Code of practice and the European Digital Media Observatory (which were presented at the IGF in past meetings) and the newborn network of national observatories of EDMO. The present war is a laboratory of what could happen in a future cyber war.

Krisztina Stump, Head of the Unit in charge of the Commission’s policy to fight disinformation, presented the unique European approach. It is unique because it is based on a strong toolbox whose tools are fully rooted in freedom of speech, combining regulation and industry-led solutions (reflected in the Code) in the form of co-regulation, with the Digital Services Act backing up the Code. It is rooted in a multistakeholder approach, which is also demonstrated by EDMO, its national/regional hubs and the diverse stakeholder community it is assembling.

Within the Code, did we find a single magic bullet to fight disinformation? This is not the case, as disinformation is a complex problem requiring complex solutions. The Code of Practice is therefore a toolbox with a variety of instruments that, taken together, can be efficient in fighting disinformation.

The key areas of the revised 2022 Code of Practice are demonetisation, transparent political advertising, reducing manipulative behaviour (including detecting fake accounts), user empowerment measures (including media literacy), fact-checking coverage throughout the EU with fair financial contributions, and data access for research.

The Code comes with strong transparency measures to allow users to consult how signatories of the Code implemented it. There is a Transparency Center, a Permanent Task-force chaired by the Commission which continues to work on the implementation of the Code, and a robust monitoring framework to make sure the commitments are properly implemented.

There are 35 signatories of the Code, including major online platforms (Google, Meta, TikTok, Twitter, Microsoft, etc.), but also associations and smaller and specialised platforms, the advertising industry, fact-checkers, research organisations and players offering technological solutions to fight disinformation. This puts the multistakeholder approach into practice.

In case signatories – which are considered Very Large Online Platforms – do not live up to their responsibility to mitigate the risks stemming from disinformation, the Digital Services Act offers regulatory intervention (this hard regulation entered into force in all EU countries on November 16th, 2022).

The war in Ukraine and the war propaganda surrounding it is a very specific situation. The Kremlin’s propaganda machine is part of hybrid warfare, and it is in that light that the EU adopted sanctions against certain Russian broadcasting channels. At the same time, the implementation of the Code of Practice by the signatories also offers a variety of measures fighting disinformation around the war. The Commission is working with the signatories to make sure they live up to their commitments, notably to demonetize Ukraine-related disinformation, to increase fact-checking, and to apply all the other measures such as giving users reliable information, labeling State-affiliated accounts, and taking measures against coordinated manipulative behaviour.

EDMO secretary general Paula Gori presented the Ukraine war observatory, which since February has been regularly analyzing and reporting on disinformation campaigns across Europe, while Claire Wardle – EDMO expert for media literacy – explained that disinformation also needs to be tackled in the long term, through regular digital and media literacy efforts and campaigns.

Two fact-checking organizations participating in the debunking activities of EDMO (Tommaso Canetta/Pagella Politica and Adam Maternik/Demagog.org) presented some of the cases of disinformation and toxic information propagated during the war, mainly by Russian sources but also, in a smaller percentage, by Ukrainian sources.

Francesco Sciacchitano from the Italian regulatory body added that the disinformation war has been a very tricky issue for independent regulatory authorities across the EU to handle, because the effort lies on the thin edge that divides freedom of expression from hate speech, and propaganda from reliable reporting.

The panel session was followed by a short but intense question-and-answer session with the audience in the room and online, with interventions from Russian, Ukrainian and Iranian participants.

LINK TO THE PRESENTATIONS SHOWN IN THE SESSION:

https://edmo.eu/wp-content/uploads/2022/09/Stump-IGF-EDMO-2022.pptx

https://edmo.eu/wp-content/uploads/2022/09/Canetta-IGF-EDMO-2022.pdf

https://edmo.eu/wp-content/uploads/2022/09/Maternik-IGF-EDMO-2022.pptx

 

IGF 2022 WS #219 Global AI Governance for Sustainable Development

Updated: Fri, 16/12/2022 - 17:10
Addressing Advanced Technologies, including AI
Key Takeaways:

Approaches to global AI governance should be based on transparent and inclusive multistakeholder processes to render them resilient against changing political interests, and they should acknowledge the different realities in the Global North and Global South with regards to digital inclusion.


There is a need for AI governance structures that actively promote sustainable development. AI governance should be focused on ethical foundations and safeguard human rights.

Calls to Action

Governance for AI must focus on the entire technology lifecycle, from development to application, to assure ethical AI and to safeguard human rights.


The voice of the Global South must be heard in AI governance approaches to significantly decrease digital inequalities between the Global North and the Global South.

Session Report

After opening remarks by the Brazilian Ministry of Science, Technology and Innovation (MCTI) and the German Federal Ministry for Digital and Transport (BMDV), the moderator Onike Shorunkeh Sawyerr (GIZ) introduced the audience questions to be answered during the opening statements of the panellists. The idea was to engage the online and on-site audience early to allow for integrating their responses into the discussion. The questions were:

Q1: What do you associate with the term “AI governance”?

Q2: In which area(s) do you see the greatest potential for AI to contribute to sustainable development?

Q3: Overall, do you expect AI to have rather positive or rather negative impacts on sustainable development?

Q4: What risks do you associate with AI?

 

Urvashi Aneja (Founding Director of Digital Futures Lab, India) described how her institution investigates AI benefits for different areas, including sustainable development. She also highlighted that people still think of AI as a product, a framing she considers too narrow; the Digital Futures Lab looks at the whole life cycle of AI interventions. For example, she argued that the labour conditions for building AI and the energy consumed by AI are also important aspects of the discussion. She concluded by stressing the need to improve understanding of the impact of AI on sustainable development.

As the second panellist, Ledénika Mackensie Méndez González (Executive Director for Digital Inclusion at the Secretariat for Communications and Transport of the Mexico City Metropolitan Area) said that AI is a way of innovating the public sector. It requires increasing availability of data as well as transparency. In her statement, she urged governments to assure that AI respects human rights. To accomplish that, ethical challenges must be solved.

Kim Dressendörfer (Technical Solution Leader Data & AI at IBM, Germany) started her statement by expressing that, as the technology evolves so quickly, developers should open the “black box” for everyone, teaching people how to use it properly. Dressendörfer stated that AI is an opportunity to create something new and better. AI governance has multiple layers and also involves the individual developers, who must consider the ethical implications of their products. In her presentation, Dressendörfer gave examples of the use of AI: monitoring animal health in agricultural applications, assisting astronauts on the ISS, and quantifying carbon sequestration in urban forests so they can store more carbon.

Secretary José Gontijo (Ministry of Science and Technology (MCTI), Brazil) stated that Brazil has already made progress in the field of AI governance by publishing a national AI strategy and starting a discussion on AI regulation within Congress. He cited the thematic chambers for the AI strategy, which bring together government, the private sector, academia, and civil society to discuss transparency and the applicability of AI. Gontijo also pointed to the ongoing debate between lawyers and technical groups about the regulation of AI and how the legislation should be applied. He highlighted that AI has great potential to boost sustainability; in Brazil, for example, AI may be used in water management, in improving efficiency in agribusiness, in disaster prediction, or in public security. Gontijo emphasised the need to reduce the gap between the Global South and the Global North regarding the development and usage of AI. He expressed that science diplomacy has an important role in reducing existing inequalities, by keeping technology in the Global South in pace with the Global North and making tech affordable and available for everyone.

Following Gontijo’s opening statement, panellists responded to the first round of questions. Asked about the building blocks of meaningful regulation of emerging technologies, Méndez González highlighted the importance of exchanging experiences between states to enhance cooperation. Resource allocation and distribution for applying these emerging technologies in the countries is crucial to making these regulatory building blocks a reality. Méndez González expressed that a public policy for sustainable AI should be human-centric and intersectional. She highlighted that the inclusion of minorities and marginalised social groups in the AI ethics debate is crucial to reducing inequalities.

Next, Gontijo emphasised the importance of multistakeholder approaches in establishing governance structures nationally and internationally. He acknowledged that it is challenging to find consensus with diverse stakeholders at the table, but once agreement is reached, it provides a strong, broadly legitimated basis. He gave examples of policies related to the internet and new technologies in Brazil, such as the Brazilian Strategy for Artificial Intelligence (EBIA), e-Digital, the IoT plan (still in development) and the Brazilian internet bill of rights. These strategies, some of them implemented in Brazil since 2010, adopted the multistakeholder approach, which has enabled them to survive subsequent political changes.

The moderator asked Dressendörfer about the role of the private sector in AI governance for sustainable development. Dressendörfer is convinced that every company has the duty to work towards sustainable AI governance. From that point of view, it is important to bring the ethical discussion into developing teams, so that everyone is aware of the potential positive as well as negative impact of their work. Companies need to make sure people can use AI and that the technology boosts sustainability. She highlighted the challenges that come with the “one-size-fits-all” regulatory approach, as AI is applied differently in many different sectors. Rather, Dressendörfer advocates for transparently describing the algorithms behind AI applications. This allows for meaningful discussions on ethics, human centricity, and governance for sustainable development.

Aneja stressed that there are multiple challenges (economic, social, political, and environmental) related to AI. Politicians sometimes do not have the knowledge about the system, so they rely on the private sector. This can lead to a biased approach. Especially in developing countries, it is harder to regulate the private sector’s influence on political frameworks. She emphasised that risks should already be reduced during the development of AI applications. Building people’s capacities is also a crucial aspect to make AI operate as humanly as possible. Aneja also warned that labour issues do not get enough attention. She ended by pointing to the challenge of building technology as green as possible while dealing with ethical dilemmas as well. While this is not an easy task, reconciling these aspects is of utmost importance.

In the next part, the moderator presented the results of the audience survey. Regarding the first question, the audience associated ideas such as cybersecurity threats, fear, security and regulation with AI governance. Regarding the second question, they saw the greatest potential mostly in global productivity and economic growth, followed by climate action and environmental protection. On the third question, the audience expected AI to have rather positive effects on sustainable development. The main fears of the audience related to AI were digital war, surveillance, and misuse leading to human rights violations (fourth question).

Dressendörfer reacted and argued that people often associate AI with dystopian movies – but she was glad to see that people have good expectations of the potential of AI, and that AI will not completely replace humans in labour markets. Rather, AI will take over repetitive tasks and allow humans to focus on complex assignments. Aneja found it interesting that people associated the use of AI with sustainable growth when the economic system itself has yet to be aligned with sustainability in general. Spreading technologies to other parts of the world is important, considering that the majority of the global population still does not even have access to the internet. The hypothesis of a positive impact of AI on sustainable development still has to be proven. Although many promises for the future are being discussed, harms are currently still more evident, even in developed countries – for example, when looking at the labour conditions of platform workers. According to Aneja, we need more scepticism when talking about AI, as there is still a huge gap between its potential and reality.

After that, Davis Adieno (Global Partnership for Sustainable Development Data, Kenya) was introduced to the panel. He stressed that AI is already a reality, but it is in the hands of the private sector and connects the technological avant-garde rather than the masses. For civil society, AI and technology seem rather detached from the real world. According to Adieno, our global society has other, more urgent problems, such as poverty and lack of resources. AI has critical ingredients to be an enabler of sustainable development, but for AI to be a solution for the needs of the day-to-day life of society, we need to consider its potential harms next to its benefits.

After the speech made by Adieno, the floor was opened to the questions from the audience. The majority of questions revolved around the objectives, potential and risks connected to AI governance. Aneja as well as Dressendörfer agreed that the quality and the management of data must be improved as a precursor for more meaningful technological developments – as more data does not mean better data.

 

IGF 2022 Town Hall #43 EuroDIG Messages - Internet in troubled times

Updated: Fri, 16/12/2022 - 16:34
Avoiding Internet Fragmentation
Key Takeaways:

• The European vision of a rights-based, open, accessible, resilient Internet has to be upheld even in times of conflict and crisis in the region and beyond. A strong multistakeholder approach to digital cooperation and Internet governance is the only way to avoid and counter fragmentation, exclusionary processes and harmful effects of existing and new technologies. Open multistakeholder processes need to include youth and marginalized voices.

,

• Sustainability, including environmental sustainability, should be at the core of digital cooperation. This perspective is becoming more pertinent as questions of energy insecurity and resource effectiveness with regard to the Internet are becoming a lived reality for many users, as well as for governments and the private sector. In the current consultation on the Global Digital Compact, environmental sustainability shall be suggested as a key area.

Session Report

Long Report Town Hall: EuroDIG Messages - Internet in Troubled Times

 

Participants onsite: around 30

Online: around 15

Gender split: approximately 60% men, 40% women

 

The session was opened by explaining how the EuroDIG Messages 2022 were drafted and agreed upon. For the first time, the new mode of Focus Areas was being used.

 

The Focus Area surrounding the topic of digital sovereignty was presented first. The main takeaways were that new regulation has brought clarity, but always has to be carefully evaluated against potential harms, such as Internet fragmentation – a topic heavily discussed at IGF 2022 – and harms to human rights and democracy.

In the discussion it was pointed out that a main challenge for Europe at the moment is the war on Ukraine. Peace and sovereignty are not in opposition, but the term sovereignty is sometimes used to describe a closed and territorial approach.

It was mentioned that a European vision of sovereignty has to foster openness and interconnectedness.

In the Focus Area on effective regulation, the importance of the multi-stakeholder approach was championed. Examples where regulation can play a key role in the near future are the green transition and rapid cybersecurity standards/criminal justice.

In the discussion it was commented that the environmental aspects of the digital transformation were a topic that the EuroDIG community had worked on very actively, and one that was also picked up by the IGF community and a subsequent intersessional policy network. It was recommended that EuroDIG should continue to be a strong voice. One suggested way to highlight the topic was to make a remark in the European stakeholder consultation on the Global Digital Compact.

It was noted that due to the energy crisis in Europe, the topic might naturally gain traction again. It was also suggested that regional and global workstreams, especially intersessional work, should be streamlined and integrated, not parallelized.

The Focus Area on effectiveness of governance bodies was presented. Some main points were the call to take a fresh look at the multi-stakeholder approach. Youth should be included more readily. Discussions and regulatory processes around artificial intelligence, digital identities and innovation such as delay-tolerant networks were noted as examples for updated approaches to policy-making.

In the discussion, it was pointed out that the human-centric approach to digital policy that Europe portrays is not the same in all world regions and it is a continuous effort to align values, while not hegemonizing different approaches.

The Focus Area of Internet in troubled times came about in the wake of the war on Ukraine. The main messages were presented. Again, Internet fragmentation has to be treated as a risk, and interconnectedness and openness should be championed by the UN Tech Envoy and by all stakeholders in the Global Digital Compact. Broader engagement and open Internet governance processes are to be centered. Another aspect is the integrity of information, which has to be protected, while disinformation has to be countered.

On the topic of disinformation, it was noted in the discussion, that in a pluralistic society, a diversity of opinions has to be protected, while fostering the accessibility and reach of neutral, fact-based information.

The youth messages were presented and comprised perspectives by YOUthDIG participants on AI, social media, cryptocurrency, and sustainability. Education and literacy were highlighted as one important precondition to all of these aspects. The messages also called for more research and funding to promote a safe, sustainable, innovative digital sphere.

The youth representative thanked the EuroDIG community for acknowledging the current dark times due to the war. Access to the Internet and digital infrastructures in Ukraine is hampered by energy shortages and attacks on infrastructure, depriving people of many important, sometimes life-saving, services and technologies.

In the discussion the importance of youth voices was commended. EuroDIG commits to continuously involving youth in its processes.

In the next thematic section, EuroDIG 2023 in Tampere was presented, specifically the overarching theme “Internet in troubled times – risk, resilience, and hope”. The community was invited to participate in the conference on June 19-21, as well as to contribute to the program. A Finnish member of the European Parliament extended the invitation, pointing out the rich history of the city of Tampere in science and technology.

In the last thematic segment, EuroDIG’s process regarding the Tech Envoy’s survey for the Global Digital Compact was outlined. Messages and outcomes of past EuroDIGs were mapped, with the outcome that almost all topics of the GDC will receive input. The commenting platform for the European stakeholder consultation is still open and all are invited to contribute. Strong multi-stakeholder engagement in the process is important given the high-level nature of the compact.

IGF 2022 Open Forum #58 Promoting Internet standards to increase safety and security

Updated: Fri, 16/12/2022 - 16:27
Enabling Safety, Security and Accountability
Session Report

Report Internet.nl workshop
 
30 November 2022, Caucus room 11
 
This Open Forum focused on the need for modern Internet standards to be adopted in a faster and more scalable way in order to make the Internet and its users more secure and safer. It took the form of a tutorial focused on a testing tool that helps one check whether a website, email, and Internet connection are up to date, i.e. comply with modern Internet standards such as IPv6, DNSSEC, HTTPS, STARTTLS, DANE, DMARC, DKIM, SPF, and RPKI.
 
The Dutch Ministry of Economic Affairs and Climate explained the origin of the Internet.nl tool it created in 2015. It is a multistakeholder initiative intended to create awareness of Internet standards deployment and safety. Any organization can check whether security measures, i.e. deployment of Internet standards, are in place for its own domain name. You can check the level of security of your domain name here: www.internet.nl. Within seconds the level of security is shown to you, including advice on next steps.
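The tutorial format makes it worth illustrating what one such standards check involves. A DMARC policy, for instance, is published as a DNS TXT record at _dmarc.<domain> in the form v=DMARC1; p=reject; rua=mailto:... The sketch below is a simplified illustration only, not Internet.nl's actual code; the function names and the pass/weak/fail verdicts are invented for this example. It parses such a record and flags non-enforcing policies:

```python
# Simplified illustration of one Internet.nl-style check: evaluating a
# published DMARC record. (Not the actual Internet.nl code; function names
# and verdict labels are invented for this example.)

def parse_dmarc(record: str) -> dict:
    """Split a DMARC TXT record like 'v=DMARC1; p=reject; rua=...'
    into a tag/value dictionary."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip().lower()] = value.strip()
    return tags

def dmarc_verdict(record: str) -> str:
    """Return a rough verdict: 'pass' for an enforcing policy
    (p=reject or p=quarantine), 'weak' for p=none, 'fail' otherwise."""
    tags = parse_dmarc(record)
    if tags.get("v", "").upper() != "DMARC1":
        return "fail"  # not a valid DMARC record at all
    policy = tags.get("p", "").lower()
    if policy in ("reject", "quarantine"):
        return "pass"
    if policy == "none":
        return "weak"  # record exists but does not enforce anything
    return "fail"

print(dmarc_verdict("v=DMARC1; p=reject; rua=mailto:dmarc@example.org"))  # pass
print(dmarc_verdict("v=DMARC1; p=none"))                                  # weak
```

A real checker such as Internet.nl would first resolve the TXT record over DNS, and would also validate related standards (SPF alignment, DKIM signatures, DNSSEC) before scoring a domain.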
 
The software behind Internet.nl is open source and can be used by other organisations willing to run a local version in their respective countries. You can find the information on Github. Three other countries have adopted the tool: Australia, Brazil and Denmark. The former two presented on their experiences in adopting the process into their local environment.
 
What stood out from the three presentations is that local customs and perceptions of standards determine the way the tool can be used and presented. These differences did not, however, stand in the way of building a local version of the tool and launching it.
 
In the first presentation, Gerben Klein Baltink of Platform Internetstandaarden (Dutch Internet Standards Platform) stressed the importance of Internet.nl being a Public Private Initiative. All participants cooperate without commercial intent, joined by the intention to create a more secure Internet that is open, transparent and safe. He showed the audience how the tool works and pointed to its hall of fame: all organisations showing a 100% score can apply for “membership”. (The local IGF connection scored 10%.)
 
Bart Hogeveen of the Australian Strategic Policy Institute (ASPI) presented .auCheck. It is technically a full copy of Internet.nl, but the organization behind it is not. It proved harder to create a PPI; the current result was four years in the making. Research had shown that Australia is not in a position where the need for the deployment of Internet standards is broadly understood and accepted. There is a lot of education and awareness raising to be done. The tool was only launched quite recently, so it is hard to show any effects at this point in time. The test outcomes, however, show the need for more awareness: deployment percentages are on average (too) low.
 
Gilberto Zorello of NIC.br presented TOP, Teste os Padrões (Test the Standards). The programme was launched in December 2021. TOP is a collaboration between the NIC.br environment and experts. Tests show that average scores are below 25% for those who have tested for all standards. TOP is promoted at technical events in government and academia, and it works closely with ISP associations. Although it is still rather early to truly measure effects, TOP already sees organisations coming back with better scores.
 
Maarten Botterman presented on behalf of the Global Forum on Cyber Expertise (GFCE) on its Triple-I initiative (Internet Infrastructure Initiative): “This GFCE initiative is meant to ‘facilitate’ awareness raising and capacity building events in different regions of the world in order to ‘enhance justified trust’ in the use of Internet and/or email in those regions. Local and regional actors are stimulated and supported in setting up and running local/regional events between regional stakeholders, bringing in local expertise.” If you need help, reach out to the GFCE; it has all the toolkits and information you need. (See: https://thegfce.org/ for more information.)
 
Moderator Daniel Nanghaka added that this initiative started in 2017 by way of a campaign, after which some CERTs started to work together. In 2023 the trusted Africa Internet Initiative will start. It is expected that through cooperation with the GFCE all regions will be reached.
 
Gerben Klein Baltink pointed out that in the Netherlands results are measurable: there has been a clear uptake over the past eight years, where naming and faming has an effect. Around the world, uptake is still far too limited as the situation now stands; the world has to step up to make itself more secure and safer. He made a call for action: “Modern Internet standards are essential for an open, secure and resilient Internet that enables social progress and economic growth. These standards are readily available, but their use needs to rise significantly to be fully effective. The UN is called upon to help accelerate the global uptake of key standards, by including their promotion in the Global Digital Compact, and supporting advocacy and capacity building, as well as initiatives to test and monitor deployment, especially where many people aren’t connected yet.”
 
From the room Mark Carvell pointed to the work undertaken by the IGF Dynamic Coalition on Internet Standards, Security and Safety (IS3C) that is working on recommendations and toolkits on the goal of faster and massive deployment of security-related Internet standards and ICT best practices.

 

IGF 2022 WS #354 Affective Computing: The Governance challenges

Updated: Fri, 16/12/2022 - 14:56
Addressing Advanced Technologies, including AI
Session Report

Session Report
IGF 2022 WS #354 Affective Computing: The Governance Challenges

Tuesday, 29th November 2022 (12:05 UTC) - Tuesday, 29th November, 2022 (13:05 UTC)

Speakers: Dr. Diogo Cortiz, Dr. Lisa Feldman Barrett, Dr. Javier Hernandez, Dr. Jessica Szczuka, Mrs. Marina Meira
Moderator: Dr. Henrique Xavier

Rapporteur: Mrs. Pollyanna Rigon Valente

The moderator opened the session by introducing the theme of the discussion: How we can use computers to interpret and simulate human emotions and its potential, issues and other challenges.

Dr Diogo Cortiz, researcher at the Web Technology Study Center (Ceweb.br), a center of the Brazilian Network Information Center (NIC.br), and professor at the Pontifical Catholic University of São Paulo (PUC-SP), opened his initial contribution by presenting some inputs and concepts about Affective Computing (AC) and how it fits into the IGF agenda: Affective Computing is not a specific technology, but an area of knowledge. It is possible to develop different types of applications to recognize, detect, simulate, and organize data about human emotions. Dr. Cortiz stated that AC is close to AI when the discussion is about governance and regulation, because both are not a specific technology but a broad area of knowledge that can involve different types of applications. Dr Cortiz also made an important point: affective computing does not always use AI (it can rely on technology for self-report, for example), but the most important cases at the moment are based on AI models for emotion recognition. Dr Cortiz ended his initial talk by presenting two sensitive challenges that need to be addressed:

  • When using affective computing with AI, how is it possible to be sure that an AI application is right? When inferring about subjectivity, AC may be wrong but make us believe it is right.
  • Global models: in most cases we use models that were trained with data from users in the global north, but these models will have an impact and will be used in other regions and cultures of the world. How can we ensure they will work? What are the risks?

Dr Lisa Feldman Barrett, professor at Northeastern University, spoke about one specific subject within affective computing: automating emotion recognition. Using images from a research study as an example, she showed how wrong AI can be in recognizing human emotions. With more examples covering facial expressions and emotion, Dr. Barrett argued that it is important to remember that facial movements are only expressions and are not necessarily related to an internal emotional state. That is the challenge for affective computing and for AI models that use facial expressions to detect emotions. If we really want to be able to use technology to our benefit, affective computing must measure many signals, not just one, two or three. Dr Barrett ended her initial talk arguing that for emotional AI to be successful, the entire ensemble must be measured across different situations, different people and different cultures.

Dr. Hernandez, a researcher at Microsoft, highlighted that we need to have a discussion across multiple disciplines, because many of us are probably both excited and very worried about the potential applications of this technology. Adding to what Dr. Cortiz had shared previously, Dr. Hernandez gave more context about research on affective computing: it started around 1995 and is the study and development of systems and devices that can recognize, interpret, and simulate human affects. Talking about his role at Microsoft, he explained that his team works in several areas, including comfortable sensing: sensing with wearable devices to find ways to capture information from users. They do a lot of work on AI and on how it can be used to better understand what emotional states really mean and how to sense them in real settings, so that affective interactions and experiences can use that information in unique ways to help achieve certain goals. Looking at all of this, he said one of the core mission statements is supporting emotion regulation and helping users become better at managing their own emotions. It was in 2015 that affective computing took off as an emerging technology; while it seemed a good research opportunity, companies also started to look at it as a commercial opportunity.

  • Challenges: The theory of human emotions is evolving; Human emotions are difficult to describe and label; A lack of representative and generalizable data; Oversimplified language is used to communicate system capabilities; There is blurred boundary between what should be private and public.
  • How to minimize challenges: communication; consent; calibration; contingency.

Dr Jessica Szczuka presented a subject within affective computing that many in the audience probably haven't thought much about: intimacy and sexuality, inviting us to explore how important emotion can be. There are three different ways in which we can come to intimacy and sexuality with technologies: through, via and with. The last one can look very futuristic or sci-fi, an actual intimate or sexualized interaction with the technology itself, but we are not that far away. One of the challenges she presented was how affective computing relates to short-term and long-term interactions. Presenting part of a research model, she highlighted one element relevant to this question: sexual arousal, which shifts attention and cognitive resources away from reflection and towards sexual fulfillment, so that in that specific moment the user lacks the capacity to reflect that the machine may not be understanding their emotions correctly. Dr. Szczuka also presented other research showing that recurring interactions that evolve in their dynamics are key to what makes an artificial entity, a chatbot or otherwise, comparable to our daily contacts and how we perceive things; this is very hard to implement, and we really need to make sure that companies using this technology are aware of the potential consequences. To give more context about the consequences, examples were shown: affective computing is in fact a way to nudge users into using a specific technology, as we have a need to rely on others and use our emotions for this; and when people interact in an emotionally intense state, which affective computing can obviously get wrong, this also comes along with much very sensitive data.
As ways to minimize the challenges: we should stay technology-positive, providing platforms for satisfying needs for intimacy and sexuality, and be responsible, anticipating and addressing possible consequences and vulnerabilities.

Mrs. Marina Meira spoke about regulation of AI in general, into which emotional AI is inserted. The first thing to think about is why we regulate AI and technology in general: because regulation can support the development of technologies while protecting people's rights, individually and collectively. Regulating technology is a big challenge, especially when it comes to AI, because there are not many regulations throughout the world, so in general regulators are learning while technologies evolve. Looking to the past, when these technologies started evolving, principles and ethical guidelines began to be formulated around the world, but they had no binding effect, which created many challenges when it came to being followed. Those guidelines mostly concerned transparency, explainability, safety, security, fairness, non-discrimination, responsibility, privacy, and human control over technology, and they were not followed by companies: for example, companies established ethics councils or nominated specialists in AI ethics without changing their practices. All of this showed the big challenge of regulation through law, meaning laws that can be enforced, that carry sanctions if not followed, and that translate into very specific measures. Today a similar scenario can be seen: several laws are being discussed, and most of these regulations follow what is called a risk-based approach, meaning that the more risk to human rights a technology presents for those affected by it, the more obligations fall on those developing the technology.
Under this risk-based approach, risks must be assessed in advance, and a very important instrument in such regulation is the impact assessment, which must be conducted under a strong and scientifically solid methodology to understand the actual risks the technology can present and to think of ways to mitigate them. She also highlighted that those risks should be assessed with broad participation of society. Despite all these challenges, Mrs. Meira closed her presentation by saying that regulation is possible and positive: we can achieve a better society through regulation as well as through technology, but first we need to consider the most vulnerable groups and how emotional computing affects them.

 

IGF 2022 Open Forum #77 Implementing the AU Data Policy Framework

Updated: Fri, 16/12/2022 - 13:26
Governing Data and Protecting Privacy
Key Takeaways:

The overall objective of the Africa Data Policy Framework is to raise awareness about data and its growing importance as a strategic asset for Africa's economy and society, and to lay the foundations for the development of coherent, harmonized and integrated data governance systems that facilitate data access and cross-border data flows.

,

To build a shared data ecosystem across the continent, close cooperation of regional and national stakeholders is necessary in order to align the different strategies and policies that already exist with the AU Framework and allow a free flow of data across and within countries. Global nexuses and cooperation also need to be considered to ensure that African perspectives are represented at the international level as well.

Calls to Action

In order to harness the opportunities of the digital economy, it is necessary to harmonize data governance systems across Africa and thereby enable a single market for data, which will allow for increased private and public data value creation.

,

To enable data sharing among countries and sectors, there is a need to develop data sharing and data categorisation frameworks that take into account the different types of data and their associated levels of security and privacy.

Session Report

UN IGF Session Report

 

Session/event: Open Forum 77: Implementing the Data Policy Framework

Date: 1 December

Time: 10:45 – 11:45

Moderator: Souhila Amazouz

Reported by: Pierrinne Leukes (GIZ)

 

Name of panelists:

Dr Alison Gillwald                   Executive Director of Research ICT Africa

Mrs Aretha Mare                      Project Manager in charge of Data Governance at Smart Africa

Mr. Guichard TSANGOU         Director of Postal, Telecommunication and Digital Economy ECCAS 

Mrs. Stella Alibateese               National Personal Data Protection Director: Uganda

Mr Torbjorn Fredriksson           Head, E-commerce and Digital Economy Branch at UNCTAD

 

The purpose of this panel, as introduced by Mrs Souhila Amazouz (moderator) as the representative of the African Union Commission (AUC), is to raise awareness of the African Union Data Policy Framework (DPF) and discuss the readiness of Africa as a continent when it comes to data usage, data governance, data ownership and how that will support the development of digital economy in Africa. The DPF is the continent’s strategic framework for data governance and aims to set the priorities, vision and principles with regards to data in order to harness its transformative potential. It also aims to empower African countries and citizens whilst safeguarding their rights, to achieve equitable and equal opportunities for all African citizens in the digital space. The objective of the DPF therefore is to provide guidance to African countries in developing comprehensive, coherent and harmonized data systems across the continent which will enable the efficient use of data and enable data to flow across countries in support of digital trade and also data driven businesses. Now in its second phase, the DPF is supported by an Implementation Plan which has been validated by member states, inclusive of a Self-Assessment Capacity Tool to help countries gauge their various levels of readiness as well as identify the support they need for successful domestication.

Mrs Stella Alibateese, as Director of the National Personal Data Protection Authority in Uganda, underscored that domestication of the Framework requires member states to make sure that they provide for the DPF recommendations within their own policies. Once policy development is completed and the necessary legislative processes have been finalized, it will become easier for the implementing ministry to cascade it to the other ministries, considering that this policy framework requires a lot of collaboration across different sectors. What follows then is a review of standards to enable interoperability and an assessment of the relevant infrastructure required. Ensuring that those needs are included in the national development plans is another imperative step to ensure that the necessary resources are made available to support implementation.

Approaching the discussion from a more regional perspective, Mr Tsangou, as Director at the Economic Community of Central African States (ECCAS), emphasized that the Regional Economic Communities (RECs) are the building blocks of the African Union. In this vein, they play a prominent role in actualizing the goals of the DPF, and his recommendations included operationalizing regional Internet Exchange Points, building regional data centre capacities and ensuring that Model Laws are aligned to the continental framework whilst taking into account regional specificities and needs. Key to achieving these objectives is addressing the obstacles currently prohibiting cross-border data flows between member states. According to Mrs Aretha Mare, Data Governance Project Manager, Smart Africa has conducted extensive research revealing that barriers to data flows can be broadly encapsulated by three main challenges: lack of trust, lack of infrastructure and lack of technical capacity. Foundational institutions such as Data Protection Authorities are further hamstrung by the lack of financial resources needed to increase and ensure the required enforcement capabilities.

The principle of harmonization – a central tenet of the DPF and its Implementation Plan – will be a key driver in addressing the abovementioned challenges. Dr Alison Gillwald explained that harmonization is essential for enabling and harnessing the benefits of the data economy. It creates the economies of scale and scope needed to ensure equitable participation in the global data economy and in so doing helps guard against uneven development. The DPF makes a principled commitment to the realisation of a Digital Single Market, an integrated trade environment that we are going to see take shape with the African Continental Free Trade Area, while also creating a rights-preserving environment for users from the continent. By adopting an approach informed by progressive realization, low-hanging fruit such as setting standards for integrated national data systems in order to unlock the public value of data can ensure that Africans share in the benefit of the data that they are producing. For too long, Africans have been data subjects excluded from these markets, and it is really the commitment to harmonization that will allow stakeholders from Africa to create this enabling and trusted environment.

This scale is also needed for Africa to assert its place in the global data economy. Mr Torbjorn Fredriksson, who leads UNCTAD’s E-commerce and Digital Economy Branch, highlighted that data can help to address many of the world's and Africa's major development challenges, such as green transitions, food insecurity, pandemic preparedness, as well as more transparent and accountable governance. It holds the potential to transform research and development and to improve the quality of decision-making at all levels. However, should data be mishandled, the growing reliance on data may result in greater and greater inequalities. Additional risks are the continued fragmentation of the global landscape of data governance, which will in turn exacerbate rising tensions among major powers such as China, the US and the EU, as well as the increased fragmentation of the internet triggered by growing use of data localization requirements as an attempt to protect data inside the country, which reduces the opportunities for internalizing the benefits of said data. These factors inform the call for a balanced global approach to data governance to help secure inclusive development gains. Reaching agreements on definitions and taxonomies for establishing terms of access to different types of data, dealing with data as a public good, exploring new forms of data governance and agreeing on principles as well as standards all require the active participation of African member states to ensure equitable benefits.

Questions from the floor related to data sharing agreements and mechanisms, increased civil society participation and the inclusion of local languages to give expression to governance in various settings – are all indicative of the desires of various actors to take up their roles in the shaping of global data flows. This is further evidence of heightened awareness of the strategic value of data and an enthusiasm to ensure that the digital dividends of data are shared in by all.

IGF 2022 Open Forum #89 Enabling a just data-driven African digital single market

Updated: Fri, 16/12/2022 - 13:24
Governing Data and Protecting Privacy
Key Takeaways:

The achievement of a Digital Single Market in Africa by 2030, as envisioned by the AU Digital Transformation Strategy, is a bold and long-term vision which requires the commitment of all stakeholders to bring it to fruition. This entails the collaboration of multiple actors with different levels of knowledge, interests, readiness and capacity.

,

Harmonization of legal and regulatory frameworks across the continent remains imperative in order to materialize the benefits of the digital economy in Africa. Creating a conducive digital ecosystem, namely digital connectivity, digital platforms and interoperable online payment and digital ID systems, is a pre-condition to foster intra-Africa digital trade.

Calls to Action

As part of the second phase of AfCFTA negotiations on digital trade protocol, Member states are called upon to come up with agreements that consider cross-border data regulations, innovation, privacy as well as cyberspace security issues.

Session Report

UN IGF Session Report

Session/event: Open Forum 89: Enabling a Just and Data-Driven Single Market

Date: 30 November

Time: 09:30 – 11:00

Moderator: Souhila Amazouz

Reported by: Pierrinne Leukes (GIZ)

Panellists:

Dr Ify Ogo: Regional Coordination Specialist for the African Continental Free Trade Area (AfCFTA), United Nations Development Programme (UNDP)

Mr Jean-Paul Adam: Director, Technology, Climate Change and Natural Resources, UNECA

Mr John Omo: Secretary General, African Telecommunications Union

Mr Kenneth Muhangi: Lecturer in Intellectual Property; Partner, KTA Advocates; World Economic Forum 4IR Committee Member; Chair, Technology, Media and Telecoms Committee, East Africa Law Society

Eng. Daniel Murenzi: Principal Information Technology Officer, East African Community, Tanzania

Mr Samatar Omar Elmi: Chief ICT Specialist, African Development Bank Group

Dr Talkmore Chidede: Digital Trade Expert, AfCFTA Secretariat


This session was primarily devoted to creating a platform for the panellists to share their views on how the African continent can work towards enhancing digital trade and facilitating cross-border digital trade. The African Union Commission (AUC), represented by Souhila Amazouz (moderator), initiated the discussion by highlighting that the AUC has taken great strides in recent years, evidenced by the development of the Digital Transformation Strategy (2020 to 2030), whose main objective is to achieve this digital single market in Africa. To strengthen capacities in managing data and to facilitate the movement of people and goods across the continent, the AUC has also developed the Data Policy Framework and the Digital ID Framework, following extensive consultation and collaboration with both international and national organizations on a harmonization strategy to create an enabling environment for a Digital Single Market in Africa.


Dr Talkmore Chidede, as Digital Trade Expert for the African Continental Free Trade Area (AfCFTA) secretariat, reported on the progress of the Digital Trade Protocol since the Committee on Digital Trade (comprising all State Parties) was established in May 2021 to coordinate and facilitate the negotiations of the Protocol on Digital Trade under the AfCFTA. Since its inception, this committee has conducted exploratory and preparatory work: it consulted with non-state actors through high-level brainstorming sessions with digital trade experts from the continent to hear expectations and the key issues the protocol should address, and hosted regional stakeholder consultations to hear the views of businesses and civil society organisations. Formal negotiations are set to take place between 5 and 9 December 2022, where a report on these hearings will be submitted for consideration and validation by the negotiators, together with a situational analysis of digital trade across the continent which maps the state of digital trade and of policy and regulatory frameworks. Once consensus has been reached on the rules of engagement and guiding principles, the Protocol will establish a continent-wide legal and regulatory framework governing intra-Africa digital trade.


Mr Jean-Paul Adam of UNECA recognized and congratulated the AU for the exemplary partnership and leadership it has displayed, because digital transformation is the tool that will allow the acceleration of the implementation of the Sustainable Development Goals. He cautioned that collectively we must address the gaps which exist between the promise, the present and the potential of the AfCFTA. Among the regulatory challenges that need to be addressed are ensuring infrastructure and connectivity, cybersecurity, and the use of Artificial Intelligence for the enablement of trade. Initiatives such as the Africa Trade Exchange (operated by Afreximbank), a platform which facilitates African companies' access to trade in goods, as well as the African Regional Centre on Artificial Intelligence launched in the Republic of Congo earlier this year (with a priority focus on trade facilitation), are already in place to incentivize continued investment in the harmonization of the regulatory environment across the continent.


In his reflections on the role that political leadership can play in enabling data to flow within and across countries, the Secretary General of the African Telecommunications Union, Mr John Omo, asserted the importance of imbuing political leaders with a sense of urgency and the necessary knowledge regarding the importance of data for the management of national, communal and regional economic systems. He emphasized that we need to address the asymmetry that exists between the technical stakeholders and the political class, which has access to grassroots networks and can initiate skills transfers. Since no single organization or individual, whether politician or private sector, has a monopoly on knowledge in this domain, it is imperative that everyone in the ecosystem be brought together for the purposes of data management; this will also bring to light institutional overlaps and jurisdictional conflicts in partnership engagements in Africa.


Mr Daniel Murenzi of the East African Community shed light on the domestication of the DTS within the region, underscoring the importance of coordinated implementation and highlighting the gains made thus far under the umbrella of the "EAC Single Digital Market Vision and Digital Agenda". Buttressed by four pillars, namely Online, Data, Connectivity and an Enabling Environment, this Regional Economic Community is promoting digital trade by ensuring all foundational components work across borders, removing trade and customs barriers, ensuring data protection and privacy laws allow cross-border data transfers, sharing cybersecurity resources and removing cross-border barriers to infrastructure and connectivity (in both wholesale and retail). Echoing the importance of addressing supranational issues in collaboration with the various economic communities as well as the African Union Commission in order to ensure overall consistency, Mr Samatar Omar Elmi of the African Development Bank introduced the Upstream Project for Digital Market Development in Africa. This $9.73 million project, which supports the implementation of both the AfCFTA and the DTS, kicks off in Addis Ababa in January 2023. It contributes to the implementation of digital enablers such as universal access to broadband infrastructure, a sovereign African cloud, an African digital market, and e-commerce and digital trade promotion programmes for micro, small and medium enterprises and start-ups. In the main, it aims to facilitate the creation of a conducive ecosystem for digital trust, skills and African expert networks.


Within this context, Dr Ify Ogo, drawing on her extensive experience supporting member states in the first phase of AfCFTA coordination in her role at UNDP, accentuated the reality that states will negotiate from their own interests, which must be effectively reconciled in order to give expression to the Protocol. She called on these actors to reflect on the constructs of these rules, the reality they are intended to create, and whether they fully serve the interests of the African continent. Similarly, when asked about the importance of provisions in the Intellectual Property and Competition Chapters of the AfCFTA, Mr Kenneth Muhangi (a World Economic Forum 4IR Committee Member) stressed the importance of member state buy-in for minimum standards, since this consensus provides the foundation needed for harmonization. To further underscore the importance of reciprocity, he shared the view that Intellectual Property will be the driver of digital trade because it gives companies the confidence to trade freely, in the knowledge that their goods and brands will be respected within the countries on a mutual basis.


This session highlighted the incredible opportunity which currently exists to galvanize the synergies in the work of the multiple agencies working on this topic across the continent. By strengthening cooperation, raising awareness about shared priorities to ensure complementarity between the different initiatives and building individual and collective capacities, the African Continental Free Trade Area (AfCFTA) can facilitate an integrated approach which promotes the shared prosperity from global digital dividends by enabling a just and data-driven Digital Single Market in Africa.


IGF 2022 DC-DT Fact-checking the DNS: towards evidence-based policy-making

Updated: Fri, 16/12/2022 - 13:16
Governing Data and Protecting Privacy
Key Takeaways:

The session focused on access to DNS-related data for informed decision making at a time when the DNS is receiving a lot of attention from policy-makers. Tensions discussed included disrupted measurements in the face of encrypted DNS, emerging proposals such as the DNS4EU initiative, and the distribution of roles and responsibilities among actors in the DNS value chain. The various speakers commented on the rich nature of the DNS.

Calls to Action

Our approach to tackling these policy questions has to be forward-looking, thinking 20 years ahead about how we want the space to evolve. Participants highlighted the importance of maintaining multistakeholder conversations on the subject, taking into account views from civil society, governments and the technical community.

Session Report

Fact-checking the DNS: towards evidence-based policy-making

IGF Report 2022


The DNS is receiving increased attention from policy makers. This session sought to explore to what extent the DNS ecosystem relies on data to make informed decisions, what tensions have been created by increased DNS encryption in terms of accessing data and how to develop evidence-based solutions to tackle DNS Abuse. The session collected views from various stakeholders from the ecosystem.


We started with Geoff Huston from APNIC (technical community). He spoke about how the DNS was not designed with the expectation that it would evolve into a global network, and therefore it was built with virtually no security features; it was 'trusty.' This became a vulnerability when the Internet consolidated as a global communications network: any adversary could intrude upon the DNS, observe what was happening and tamper with the answers. Following the Snowden revelations, a series of protections were built around the DNS (DNS messages are encrypted, sources of information are authenticated, DNS content is now verifiable, etc.). However, as a result, the DNS has become obscure, "gone dark", generating problems of its own in preventing abuse and keeping tabs on the drivers of centralization. In his words, when we speak of evidence-based/data-based policy making around the DNS, "there is no DNS data to talk about, it just does not exist."


Mallory Knodel from the CDT (civil society) challenged the notion that the DNS has gone dark. Her view was that just because we had not secured the data before does not mean there was a good reason for DNS queries to be global data. The data was visible before, and we have now found ways to make it private. She agreed, however, that this has generated issues and has broken things, and to her it is important, from a public interest and human rights perspective, that we acknowledge those issues. These include the initial centralization of services needed to make DNS lookups private, challenges for abuse mitigation, and censorship becoming more blunt in regimes that previously relied on DNS data for blocking and filtering.


Emily Taylor from OXIL asked about the availability of data for researchers to study the impact of encrypted DNS, highlighting how hard it is to get that data when studying the resolution space. Mallory Knodel noted that measurement initiatives tracking censorship are confident they will be able to overcome that challenge. Geoff said that the reason query data is not shared is that it has serious privacy implications; most operators do not release it, for good reasons. When you strip personal, sensitive data out of query data, you are left with something quite limited. As a result, our window into what is happening at the level of the DNS is small and getting smaller, and no regulation will change that. The more functions are picked up by applications (QUIC, DoH), the smaller the role of networks will become. This push is the result of the interests of large operators and of what they perceive users want in terms of privacy.
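The point about stripping sensitive fields from query logs can be illustrated with a minimal sketch. This is purely illustrative: the field names, the /24 and /48 truncation widths, and the salted-hash pseudonymization are common practices assumed here, not any operator's actual pipeline, and the output shows how little analytic detail survives.

```python
import hashlib
import ipaddress

def anonymize_query(entry, salt=b"rotate-me-daily"):
    """Strip personal data from one DNS query log entry.

    Keeps only a salted hash of the client's network (so repeat queries
    can still be counted) and the last two labels of the queried name,
    discarding full hostnames that may encode user behaviour.
    """
    ip = ipaddress.ip_address(entry["client_ip"])
    # Truncate to /24 (IPv4) or /48 (IPv6) before hashing, so that
    # individual hosts cannot be singled out even if the salt leaks.
    prefix = 24 if ip.version == 4 else 48
    net = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    client_id = hashlib.sha256(
        salt + str(net.network_address).encode()
    ).hexdigest()[:12]
    # Keep only the registrable-domain-level labels of the query name.
    labels = entry["qname"].rstrip(".").split(".")
    domain = ".".join(labels[-2:])
    return {"client": client_id, "domain": domain}
```

After this pass, two clients in the same /24 querying different hosts under one domain become indistinguishable, which is exactly the "something quite limited" left for researchers.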


The conversation then moved on to pick up on existing industry practices to deal with abuse on the DNS with the participation of EURid, CENTR, .ng and Verisign and contributions from academia from Latin America.


Peter Van Roste from CENTR spoke about DNS4EU. The initiative seeks to create a European-wide public recursive resolver. The reasoning has to do with concerns by European institutions that some dominant players (especially those interested in the valuable data generated by public recursive resolvers) have captured a significant market share, and that public recursive resolvers are typically not European. DNS4EU was probably informed by market or commercial concerns related to the value of resolver data. CENTR welcomes the initiative, as long as use of the resolver is not made mandatory, and noted that nearly a dozen European ccTLDs are running local instances of public resolvers, contributing to the diversity and resilience of European networks.


Jordi Iparraguirre from EURid spoke about actions taken by EURid to prevent harm to users of the .eu space. These actions are evidence-based policies, but they are also informed both by the existing legal framework (the contract with the European Commission, Belgian law and the GDPR) and by EURid's commitment to the .eu brand and to customer protection. Concrete existing actions include keyword detection on domain names (for example, searching for specific strings related to the COVID pandemic) and analysis of domain names at the time of registration; improved Know-Your-Customer procedures to check Whois data; and information sharing with law enforcement on domain names suspected of harmful activity.
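A registration-time keyword check of this kind can be sketched in a few lines. This is an assumption-laden illustration, not EURid's actual system: the watchlist, the accent folding and the digit-for-letter substitutions are all hypothetical choices made here to show the general technique.

```python
import unicodedata

# Hypothetical watchlist; real registry keyword lists are not public.
SUSPECT_KEYWORDS = {"covid", "vaccine", "login", "bank"}

def normalize(label: str) -> str:
    """Fold accents and case so variants like 'C0VID' still match."""
    folded = unicodedata.normalize("NFKD", label).encode("ascii", "ignore").decode()
    # Undo common digit-for-letter substitutions used to evade filters.
    table = str.maketrans({"0": "o", "1": "l", "3": "e", "4": "a", "5": "s"})
    return folded.lower().translate(table)

def flag_at_registration(domain: str) -> list[str]:
    """Return the watchlist keywords found in a candidate domain name."""
    label = normalize(domain.split(".")[0])
    return sorted(k for k in SUSPECT_KEYWORDS if k in label)
```

A match would not block the registration by itself; in the practice described above it feeds a human review or Know-Your-Customer step.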


When considering reliance on data for abuse mitigation, Keith Drazek from Verisign highlighted the importance of recognizing that there are different actors with different roles, responsibilities and operational capabilities. He highlighted ongoing activity at ICANN to identify improvements in DNS abuse mitigation focused on threats that are not content related. The gTLD registries and registrars have recently sent a letter to ICANN to say they are prepared to take on additional responsibilities to deal with DNS-related security threats. But there is also a need to focus on content-related abuse and to consider additional tracks for dealing with abuse in a multistakeholder way, which may belong outside of ICANN. He also highlighted the need to work with other actors so that they understand what it means to take action at the DNS level when trying to mitigate broad abuse. Mark Datysgeld, chair of the DNS abuse group in the GNSO, supported Keith's points about work at ICANN and mentioned that the group has also submitted a letter to ICANN asking to renegotiate contracts to change responsibilities.


Beyond experiences from the global North, there were contributions from African and Latin American perspectives. Carolina Aguerre explained how the technical community in the LAC region is aware of the level of centralization that exists (the region relies on large, international providers). She also pointed out that concerns around privacy on the DNS are not being matched with initiatives to deploy privacy protections and protocols at the architecture level. APNIC has done a good job of mapping the adoption of privacy-protecting protocols for the DNS; in Latin America there are some initiatives, such as in Brazil and Chile, but very little data. The community is currently focusing on raising awareness of this particular issue among users and policy makers. It will likely generate tensions in the region, as DNS blocking is common practice.


Biyi Oladipo from .ng spoke about how ccTLDs are managed not just in Nigeria but across Africa. He expressed concerns about recent developments where governments take over the running of ccTLDs, and the potential implications for how freely and easily users access domain names. The regulatory environment is far more complex in Africa, with each country having its own data protection laws; he sees data protection as a potential opportunity for evidence-based policymaking to take place. Lastly, in practice few domain names are taken down due to abuse on the continent; this is an additional area for collaboration with law enforcement and one where an evidence-based system would be important. Some developments are taking place: a coalition is forming to collaborate with law enforcement on abuse and takedowns.


Lastly, Nigel Hickson from DCMS added a government perspective. He highlighted the importance of government officials being involved in these discussions to address valid government concerns, as they impact government policy and regulatory development. He also called on the group to reflect on ongoing UN processes and our vision for the DNS, particularly in the face of the WSIS+20 review and the UNGA.


IGF 2022 WS #58 Realizing Trustworthy AI through Stakeholder Collaboration

Updated: Fri, 16/12/2022 - 12:23
Addressing Advanced Technologies, including AI
Key Takeaways:

- Key takeaway 1: As more and more countries are planning to introduce some type of regulation over AI, all relevant stakeholders should seize this window of opportunity for collaboration to define concepts, identify commonalities and gather evidence in order to improve the effective implementation and enforcement of future regulations before their launch.


- Key takeaway 2: Ensuring that all actors, from both technical and non-technical communities, exchange and work together transparently is critical to developing general principles flexible enough to be applied in various specific contexts and fostering trust for the AI systems of today and tomorrow.

Calls to Action

Stakeholder collaboration remains critical as the global community continues to grapple with how to tap the benefits of AI deployment while addressing the challenges caused by the rapid evolution of machine learning. Ongoing human control remains critical in the deployment of AI advancements, to ensure that "the algorithms do not get out of our control". Critical to this is breaking down silos between engineers and policy experts.

Session Report

Speakers:

  • Norberto Andrade, Global Policy Lead for Digital and AI Ethics, Meta
  • Jose Guridi, Head of the Future and Social Adoption of Technology (FAST) unit, Ministry of Economy, Government of Chile
  • Clara Neppel, Senior Director of European Business Operations, IEEE
  • Karine Perset, Senior Economist/Policy Analyst, Artificial Intelligence, OECD
  • Mark Datysgeld, Internet Governance and Policies consultant, São Paulo State University, Brazil


  1. Stakeholder cooperation is at the core of the development of frameworks for trustworthy AI

As a general-purpose technology, Artificial Intelligence (AI) presents tremendous opportunities as well as risks, while having the potential to transform every aspect of our lives. Alongside the development of new systems and uses of AI, stakeholders from the public and private sectors, as well as civil society and the technical community have been working together towards the development of value-based and human-centred principles for AI.

The Internet Governance Forum is the perfect place to discuss the different existing initiatives to create policy frameworks for trustworthy and responsible AI, including the work conducted by UNESCO with the “Recommendation on the Ethics of Artificial Intelligence”, or by the Council of Europe with its “Possible elements of a legal framework on artificial intelligence based on the Council of Europe’s standards on human rights, democracy and the rule of law”.

The OECD AI Principles developed in 2019, as the first internationally agreed principles, have set the path towards an interesting and necessary process involving different stakeholders to develop a policy ecosystem benefiting people and the planet at large. Similarly, global standards developed by the Institute of Electrical and Electronics Engineers (IEEE) aim at providing a high-level framework for trustworthy AI, while giving the possibility for the different stakeholders to operationalize them according to their needs.

  2. Standards versus practice: applying frameworks to real use-cases

In that regard, both public and private sector organizations face unique challenges in relation to AI, according to their specific situations and requirements. It is therefore critical for policy makers to aim to bridge the gap between policy and technical requirements, as Artificial Intelligence systems undergo constant improvements, changes, and developments. The case of Generative AI is especially representative: in less than a year, it superseded the discussion on deep fakes, which shows how fast the technology is evolving and the need to involve engineering and coding communities from the very start of policy discussions.

When moving from principles to practice, organisations must take difficult decisions due to value-based trade-offs, as well as technical and operational challenges that depend on the context. These challenges are often dealt with in silos within the technical community and are not documented. Breaking down these divisions is essential for companies to implement the principles in a holistic manner and to better understand the conflicting nature of some of the principles.

For example, ensuring fair and human-centred AI systems may conflict with the requirement of privacy. In some cases, AI developers need to access sensitive data and detect whether specific biases occur in order to know if models are impacting people with specific attributes, but this calls into question the privacy of people's data. A similar tension can be seen between requirements of transparency and responsible disclosure regarding AI systems, and the explainability of predictions, recommendations or decisions coming from AI systems, as specific technical knowledge might be required to fully understand the process.
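The fairness-versus-privacy tension is easiest to see in a concrete audit: even a minimal demographic-parity check requires the auditor to hold the sensitive group attribute for every individual. The sketch below is illustrative only; the metric choice (demographic parity) and the data shape are assumptions, not a prescribed method.

```python
from collections import defaultdict

def selection_rates(decisions):
    """Per-group positive-decision rates.

    `decisions` is a list of (group, approved) pairs. Note that the
    audit itself needs access to the sensitive `group` attribute,
    which is exactly the privacy tension described above.
    """
    totals = defaultdict(int)
    positives = defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        if approved:
            positives[group] += 1
    return {g: positives[g] / totals[g] for g in totals}

def demographic_parity_gap(decisions):
    """Max difference in selection rates across groups (0 = parity)."""
    rates = selection_rates(decisions).values()
    return max(rates) - min(rates)
```

Techniques such as aggregation, differential privacy or secure computation can soften the trade-off, but some access to group-level statistics remains unavoidable for any bias measurement.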

  3. Towards implementation: operationalizing and monitoring policy frameworks in practice

To ensure the implementation of frameworks for trustworthy AI, international organizations and associations are developing frameworks to effectively manage AI risks by defining, assessing, treating, and monitoring AI systems, but also by working on the definition of a common language and understanding, designing different agile instruments to fit the different stages of the AI life cycle, and fostering training and skill-development opportunities.

As different use-cases and applications of AI carry different risks, defining the most important values and principles depending on the specificities of a situation is critical to ensuring that AI systems are applied in a trustworthy and human-centric manner. Further, assessing the risks in a global, interoperable and multistakeholder way would allow the identification of commonalities to improve the effectiveness of implementation and facilitate the enforcement of value-based principles. Alongside this risk-assessment approach, the OECD is proposing to collect good practices and tools to share knowledge between different actors and help a wider implementation of its principles. Finally, monitoring existing and emerging risks related to AI systems through different means (legislation, standards and experimentation, for example) would make it possible to create a governance system informed by past and present AI incidents while providing foresight on future AI developments.

Regulatory experimentation is of the utmost importance to ensure multistakeholder participation and to resolve a number of technical challenges, including the opaque and dynamic nature of AI systems, the difficulty of measuring impacts, and the uncertainty around the effects of regulation on technology. In the case of Chile specifically, the successful use of regulatory sandboxes benefitted from a participatory process involving the expert community, including engineering and coding communities, and policy makers in a transparent manner, which proved to bring down the prejudices and barriers both groups had prior to working together. Other initiatives exist to connect policy makers, academics, and technology companies, such as Open Loop, a project looking at understanding how guidance resonates in the real world before being implemented.


Working towards the realization of standards for trustworthy and human-centred AI proves timely and ahead of the curve as regulators start to design and implement regulations. Strong evidence-based analysis remains essential to feeding the policy dialogue, which must be conducted at a truly global level by involving developing countries. While the different stakeholder communities bring unique insights, objectives, and sometimes conflicting views, their common objective remains to develop a holistic framework ensuring trustworthy and human-centred AI.

IGF 2022 WS #183 Digital Wellbeing of Youth: Selfgenerated sexualised content

Updated: Fri, 16/12/2022 - 12:20
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Since legislation usually refers to consensuality in order to differentiate images of abuse and sexual violence from usual adolescent behaviour, a common definition of what "consensual" means is necessary, taking into account cultural differences.


General Comment 25 on the rights of children in the digital environment provides a framework to address the issue of sexualised content, which needs to be translated into national legislation and transnational measures.

Calls to Action

In order to address these issues properly, consider the wording regarding self-generated sexualised content, the definition of "consensual", and the wording regarding sexual abuse, sexual exploitation and sexualised violence.


Make the voices of young people heard in all matters that affect them and give the views of the child due weight in accordance with the age and maturity of the child. Take into account that sexual orientation and the formation of one's own sexual identity are developmental tasks in adolescence.

Session Report

The first step in the workshop was to define the term self-generated sexualized content. To this end, Sonia Livingstone (Professor of Social Psychology, Department of Media and Communications, London School of Economics and Political Science) differentiated three definitions of self-generated sexual content and the implications that each has for youth and the law. Self-generated sexual content can be produced in:

  1. An exploitative situation with a remote adult. This includes e.g. extortion or pressuring the young person into sending sexual material of themselves. General Comment 25 emphasizes the importance of safeguarding, protecting, and rehabilitating the victim and criminalizing the abuser. It also highlights the platforms' responsibility, as well as regulation for both prevention and redress.
  2. An exploitative situation in which the perpetrator is also a child. In this case, restorative and non-criminal measures for the perpetrator are encouraged when possible.
  3. A fully consenting situation between children. Here a non-punitive approach based on evolving capacity should be taken.

In all these cases, the state and business bear responsibility for any sharing of such images, for which prompt and effective takedown is vital, so that children who have been subjected to abuse are supported and helped by knowing the images are no longer there, avoiding re-traumatization.

Children say that the digital environment is critical to their capacities to develop and explore their identities, both as individuals and as members of communities. They do understand, nonetheless, that the digital environment is strongly connected to the offline environment. Thus, in addressing the risks of sexual abuse and exploitation online, children recommend not only measures that can be done within the online space, but also actions that transcend digital boundaries. Many local languages are not popular online, which is why Hazel Bitana (Child Rights Coalition Asia) also emphasized the need to make reporting sexual abuse easily understood for children and possible in their local language.

In order to address the question of which answers legislation provides, Gioia Scappucci (Executive Secretary to the Lanzarote Committee, Council of Europe) summarized the new monitoring report adopted by the Council of Europe's Lanzarote Committee in March 2022, which aims to address the challenges raised by the significant increase in, and exploitation of, child self-generated sexual images and videos. The report covers 43 European state parties to the Lanzarote Convention and highlights ways to improve their legal frameworks, prevent this particular form of sexual exploitation of children, investigate and prosecute it, and enhance victim identification and protection. The report shows that only 11 of the 43 countries specifically address self-generated material in their legislation, and they do not distinguish between consensual and non-consensual material. The Committee strongly recommends that children should not be prosecuted for possessing or sharing self-generated sexual images and/or videos of themselves when the possession/sharing of such material is voluntary and intended only for their own private use. The report calls for measures to assist child victims of sexual exploitation and abuse, short and long term, in their physical and psycho-social recovery. It also calls to abandon the terminology "child pornography" and instead use "child abuse material".

Martin Bregenzer (klicksafe Germany) explained that, since last year, the distribution, acquisition and possession of sexual pictures of minors has been a criminal offence in Germany, and the penalties have been increased accordingly. On the one hand, this is a major achievement in the fight against child sexual abuse. At the same time, the legislation creates significant hurdles for consensual sexting by young people, so teenagers are in many cases committing a crime when sexting. Since the new law came into effect, policy makers have noticed that it could backfire, so there will probably be a revision of the law in the future. He also pointed out that consensual sexting between young people can be seen as a regular and healthy part of sexuality.

Tebogo Kopane (YouthxPolicyMakers Ambassador) emphasized the role of young people as active agents and participants, but said that there is a culture of silence in much of Africa, which means very few open conversations about the sexual abuse of children are had with caregivers, teachers, parents, etc. This shows that a common approach to children's protection needs to be flexible enough to be adapted to different cultural and political contexts. She mentioned that a space for open discussion, questions, and education has to be created. Many sensitive questions are asked online instead of being asked of parents, so ensuring high-quality content as well as children's digital literacy is necessary.

An audience member mentioned the project Love Matters, which runs regional sex-education websites using youthful, pleasure-positive language that attracts more young people: https://www.rnw.org/?s=love+matters

Considering further national policies and transnational strategies, Chloe Setter (Head of Policy, WeProtect Global Alliance) pointed out that more children have internet access nowadays, go online at a younger age and use new chat platforms, while offenders keep learning, all of which makes the risk of abuse much higher. However, online sexual abuse is not inevitable; it is a preventable problem.

The speakers also discussed how Internet governance can support a common approach across different political systems and cultural backgrounds. The need for a common cross-cultural definition of consent that takes children from different backgrounds and situations into account was highlighted, and at the same time acknowledged as a challenge. There is no simple solution to these complex problems. To strike the right balance between privacy and data protection on the one hand and child protection on the other, the cooperation of all stakeholders is crucial to create safe, child-appropriate and empowering spaces.

At the end, the speakers and audience members discussed how to involve young people directly, and many ideas were mentioned, such as creating a children’s domain (.kids) as a safe space for children. Ensuring that children from different backgrounds and situations get a place in the decision-making process was highlighted: not all children have supportive parents who can help them with everything, so different perspectives need to be considered.

Number of participants: 69 overall; 26 on-site (12 female, 14 male) and 43 online (20 female, 8 male, 15 not specified).

IGF 2022 WS #454 Accountability in Building Digital ID Infrastructures

Updated: Fri, 16/12/2022 - 12:03
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Authorities need to right historical wrongs and promote civic education before introducing and rolling out new digital identification regimes.


Legislation that ensures these systems are inclusive and respect privacy and data protection principles needs to be built into these frameworks.

Calls to Action

Governments should stop rolling out digital identification programs without taking the realities of their demographics into adequate consideration.

Session Report
  • On the colonial legacy of exclusionary digital ID systems: Colonialism was designed to dominate and extract, and its first targets were border communities. The manual ID was designed as a visa system to control people’s movements; the IDs contained information such as people’s names, tribes and so on. These dynamics of control and domination are also present in contemporary identification systems. They require people to have their biometrics and data collected in order to access essential services, so people have no choice but to comply. Marginalized communities suffer the most in this new iteration of colonialism.
  • On the conflation of digital identity and legal identity: Digital identity is just a tool; legal identity can be established in various ways. We need to ask whether this is the right way to accomplish that, and we need to reframe the narrative.
  • On how policy infrastructure can move from systems that enable surveillance to systems that enable social protection: Advocacy efforts have to be directed at all arms of government, and we also need to understand what the executive wants to achieve with its agendas, as the goalposts are always shifting. In litigation on digital ID, we need to push the courts to define what the irreducible minimums are and to clarify what cannot be abrogated. There also needs to be a definition of national security, because this concept has been the most abused in limiting human rights; it has to mean something that threatens the integrity or the human rights of a country.
  • On moving away from enabling surveillance capabilities: More needs to be done to create systems with privacy by design, to prevent a panoptical view of everyday interactions. If digital identification is a precondition for access to essential services, many people, e.g. migrants and elderly people, will be excluded.
  • On the obligations to perform human rights due diligence: We need to define the actions and effects of the actors, but we do not find this in human rights due diligence (HRDD), which makes attribution for actions difficult. There is a lot of obfuscation, hidden from public view, in how digital ID is deployed: we do not know the tender processes, which are shrouded in national security protection. This makes it difficult to gather evidence on who is doing what, and that makes obligations difficult to establish. Most of the time the client is a state, which matters for HRDD because it differs from contracting with a private entity: the state has control over private data and a monopoly on violence, so there is a real distinction in obligations.
  • On private sector accountability: Corporations need to be accountable, and this goes beyond making sure that the systems are rolled out; they need to ensure that all people are able to access government services. In Uganda, the digital ID system has been tied to mandatory access to essential services, meaning it becomes the single source of access to social protection. With most of the country living below the poverty line, accessing these services becomes a matter of life and death, and people such as persons with disabilities, cross-border communities and rural communities are being left behind. These systems are exclusionary by design for people who are seen as people who should not be included; some tribal communities are denied registration because they are not considered Ugandan enough. The corporations rely on the provision of biometrics or identification documents to authenticate people so that they can access services. This speaks to the duties corporations have: while they are providing social protection services, they have an obligation to ensure that those services are inclusive. These systems have been touted as more inclusive and as leading to more government accountability, but they are very removed from the real situation on the ground; there need to be conversations about alternatives.
IGF 2022 Open Forum #40 An internet that empowers all women: can we make it happen?

Updated: Fri, 16/12/2022 - 11:48
Connecting All People and Safeguarding Human Rights
Key Takeaways:

The session highlighted the importance of addressing the mobile gender gap: mobile internet empowers women and supports achieving the SDGs. This requires a focus on the key barriers women face, with speakers highlighting the importance of ensuring the affordability of internet-enabled devices, providing women with the required skills and confidence, ensuring accessibility (for persons with disabilities) and addressing safety and security concerns.


Speakers highlighted the importance of partnerships and including women from the start in projects or initiatives, in particular those who are marginalised, and setting clear goals and targets.

Calls to Action

1. Commit to specific gender digital inclusion targets. This requires understanding women’s needs and their barriers to mobile internet access and use, and taking targeted, collaborative action to address key barriers such as affordability, digital skills, accessibility, and safety and security concerns.

Session Report

This session titled ‘An internet that empowers all women: can we make it happen?’ brought together representatives from the private sector, international development community and civil society to discuss the unique challenges that women face in accessing the internet and what can be done to bridge the digital gender divide.

GSMA opened the session and set the scene for the discussion by providing the latest data on the mobile gender gap in low- and middle-income countries (LMICs) which stands at 16%. This means that women are 16% less likely than men to use mobile internet in LMICs. GSMA noted that progress towards closing the mobile gender gap has stalled and highlighted the need for urgent action by governments and a range of other stakeholders to work together to address women’s needs and barriers to accessing and using mobile and mobile internet.

Speakers unanimously reiterated the importance of addressing the digital gender divide and noted that the internet offers an opportunity to transform women’s lives, including women with disabilities. They highlighted that improving women’s digital inclusion also creates opportunities for economic growth, and can improve women’s well-being and society at large. The cost of exclusion was also raised and a speaker cited research which estimates that women’s unequal access to and use of the internet has cost low-and lower middle-income countries $1 trillion over the past decade.

The barriers to women’s digital inclusion were discussed at length. The session speakers identified a number of key barriers to women’s internet use in Ethiopia and across LMICs, related to:

  • Affordability of internet-enabled devices;
  • A lack of digital skills and awareness of the benefits of mobile internet;
  • A lack of access to devices and a lack of accessibility of devices and online platforms;
  • Safety and security concerns including online harassment; and
  • Lack of relevant content and services including content in local languages.

Panellists shared practical examples of how these barriers can be addressed and how their organizations are working to address them. For example:

  • Digital Opportunity Trust (DOT) Ethiopia’s programs equip women with technology, business, entrepreneurial and digital skills so that they can create opportunities for themselves and participate fully in the economic and social development of their communities.
  • Vision Empower designs and teaches curriculum that is accessible for children with visual impairments on digital literacy and STEM related subjects.
  • Safaricom has implemented device financing programs to lower the upfront cost of purchasing a smartphone and to date has sold around 1 million devices using a pay-per-use model.
  • Africa 118 supports small and medium sized enterprises in Africa with cost-effective digital services to reach their target audiences and improve their online presence.

Additional noteworthy calls to action that speakers shared were around:

  • The need for stakeholders to hold themselves accountable by setting targets to reach more women through their initiatives and to consistently monitor and evaluate against their key performance indicators;
  • The need to mainstream gender in ICT policies;
  • The need for policymakers to focus on and improve the implementation of legal frameworks and policies that aim to protect women's online safety;
  • The need to include diverse women, particularly marginalized women and women with disabilities, in the design of projects, initiatives and policies aimed at improving women’s digital inclusion from the start;
  • The need for private sector organizations to have the courage to pursue initiatives that drive women’s digital and financial inclusion, particularly initiatives aimed at improving the affordability of internet-enabled devices; and
  • The need for digital skills training content that is tailor-made to women’s needs and preferences and use cases.

UNCDF concluded the session by summarizing the discussion and highlighting the importance of partnerships to catalyze collective action to tackle the barriers women face to becoming financially and digitally included. UNCDF shared the work they are doing in this space and invited stakeholders to join the Women's Digital Financial Inclusion Advocacy Hub (WDFI) network in Ethiopia.

 

The session put gender front-and-centre. This was reflected by the diversity of the panellists (4 women, 3 men) as well as the topic of discussion, which concerned the barriers to women’s digital inclusion. The session was attended in-person by approximately 70% women and online by 85% women. 

IGF 2022 WS #395 Creating a safer Internet while protecting human rights

Updated: Fri, 16/12/2022 - 11:47
Connecting All People and Safeguarding Human Rights
Session Report

There is a clear need for greater regulation of online safety, but human rights, and specifically the preservation of freedom of expression, must be a major consideration throughout. The panel noted the dangers of internet shutdowns, especially for marginalised groups, as well as the importance of collaboration with the tech sector and the potential of new technologies such as AI when considering solutions for improving online safety. The panel also stressed the importance of privacy and of avoiding government overreach, noting that technologies such as end-to-end encryption should not be compromised.

IGF 2022 WS #393 Protect the Digital Rights and Data Security for the Elderly

Updated: Fri, 16/12/2022 - 10:40
Governing Data and Protecting Privacy
Key Takeaways:

The convergence of "digitalization" and "aging" has become a defining feature of the new era, and making digital technology suitable for an aging population has become a global social governance topic. Governments, international organizations, Internet enterprises, non-governmental organizations, individual citizens and other actors urgently need to provide more reasonable, sound and effective protection measures for vulnerable groups such as the elderly.


At present, different actors have formulated and taken various measures, but "three big gaps" remain for vulnerable groups such as the elderly: in network access, in knowledge and skills, and in risk awareness. China's experience and practices in data security and personal information protection for the elderly can serve as a reference for the international community.

Calls to Action

In response to the United Nations' call to "leave no one behind", we will further explore, study and formulate laws, plans, policies, standards and norms, and continuously improve the policy system for governing the digital divide affecting the elderly and the rules and standards for making the Internet age-friendly. The IGF should exert greater influence, guide more stakeholders and responsible parties to participate, and deliver greater value.


Countries and entities should maintain close exchanges and cooperation, learn effective measures and methods from each other, effectively protect the digital rights and data security of the elderly, and create a more friendly digital environment. Domestic and international practice and cooperation should be strengthened, connecting all sectors for joint consultation and cooperation on relevant policy formulation.

Session Report

1. Views by Stakeholders

(1) In recent years, relevant Chinese government departments have issued work plans and carried out special actions. The Cyberspace Administration of China, together with four other departments, jointly issued the 2022 Work Points for raising the digital literacy and skills of the public, taking multiple measures to raise the digital literacy and skills of all people. The next step should be to strengthen work targeting the elderly and other vulnerable groups, further pursue international exchanges and cooperation, and analyze excellent practical experience.

(2) To improve the digital environment for the elderly and protect their digital rights, we should safeguard their legitimate rights and interests in personal information and data, help improve the quality of age-friendly products and services, and strengthen the formulation of international rules and international practice and cooperation.

(3) Governments and international organizations (standardization and technical organizations) should strengthen coordination and collaboration and establish dialogue mechanisms.

(4) China attaches great importance to ensuring the data rights and data security of Internet users, and has formulated the Implementation Plan on Effectively Solving the Difficulties of the Elderly in Using Intelligent Technology. Relevant departments have issued more than 20 policy documents on the elderly's travel, medical treatment, payment and other aspects, striving to solve the problem of the "digital divide" facing the elderly.

(5) Learn from China's experience and establish a China-Africa cyberspace think tank; carry out network security training for the elderly; and call on Internet companies to further adapt their services for the aging.

(6) Work concerning the elderly should be carried out with humanistic care, considering their needs in all respects, giving full play to the role of multiple actors to improve the overall digital literacy of the elderly, innovating data circulation, and using the underlying architecture to strengthen data security.

(7) To protect the digital rights and interests of the elderly, China has continued to take legal measures to provide basic protection, and has deepened concrete results through training and through the younger generation "feeding back" digital skills to the older generation.

(8) As an Internet enterprise, Kwai has established an anti-fraud governance system combining pre-incident education, in-incident blocking, and enforcement operations, contributing to protecting the rights and interests of vulnerable groups such as the elderly from infringement.

2. Feedback by Remote Participants

About 70 guests from China, the European Union, Africa, the United States, Russia and other countries and regions participated in the workshop remotely via Zoom and Tencent conferences. About 26 of the online participants were women. During the workshop, the remote participants actively asked questions and exchanged views on the theme.

IGF 2022 Town Hall #54 Help! The Kill switch is taking away my limited agency

Updated: Fri, 16/12/2022 - 08:52
Connecting All People and Safeguarding Human Rights
Session Report

The session had speakers from regions impacted by internet shutdowns, with reflections from activists, researchers, and lawyers on shutdowns in their parts of the world. A major theme was how shutdowns affect people: internet shutdowns exacerbate humanitarian crises and cut off support for journalists and human rights defenders. On network measurement, participants felt that it helps increase the visibility of internet shutdowns around the world. A recurring pattern is that access to social media platforms is blocked in times of conflict and unrest. The impact is felt hardest in marginalized communities, where it often goes unnoticed. In the case of Myanmar, it was reported that the shutdown had lasted for 18 months across 54 townships, and that 12% of the population does not have access to data. Loss of the Internet means loss of access to each other; for women this is especially hard, as they are married off at an early age for lack of information. In the case of Balochistan, it was reported that the general public, especially students and young professionals, is becoming more and more frustrated with internet shutdowns. Beyond outright shutdowns there is also throttling, where speeds are downgraded. There is a lack of regional- and national-level cooperation. In the case of India as well, there is no government documentation of shutdowns, which leaves the onus on civil society to document them and to advocate and litigate against them.

The key takeaways of the session were -

1. The need for more documentation to understand the nuanced impact of shutdowns.

2. The need for coordinated efforts to properly understand shutdowns.

3. More transparency on shutdowns from both government and private institutions.

IGF 2022 WS #318 Gen-Z in Cyberspace: Are We Safe Online?

Updated: Fri, 16/12/2022 - 07:45
Enabling Safety, Security and Accountability
Key Takeaways:

The safety of children and young people is not solely the parents' responsibility; it requires the involvement of many distinct stakeholders, including government, technical experts and civil society. There was consensus that the Internet is a double-edged sword for children: it is imperative not only to give them the safety they deserve on the internet, but also to create a safe space for them to develop and grow.


There are many frameworks regulating child online safety, including digital competencies. Essentially, more awareness has to be instilled in children and young people, as ultimately they need to know how to protect themselves. To improve child online safety, stronger punitive measures must be enacted and all stakeholders must move from policy to action.

Calls to Action

All stakeholders have a vital role in facilitating the implementation of suitable policies to improve child online safety.

Session Report

 


Rapporteurs: 

Dalili Nuradli, [email protected]

Ariff Azam, [email protected]

 

Before introducing the speakers, the moderator opened the session by asking everyone for their perspectives on child online safety. Child online safety can be understood as the frameworks, policies and regulations aimed at ensuring that children's online engagement is safe and secure and that young people are in no way exploited, taken advantage of, or even lose their lives as a result of being active in the online ecosystem. It is well known that the internet, readily accessible via mobile phones and other electronic devices, has given children and young people levels of access to information, culture, communication and entertainment that would have been unimaginable 20 years ago. As a result, we face a myriad of challenges today. 

In the domain name space, child online safety is a complex notion that requires participation and responsibility from different stakeholders, education and platforms. It is imperative not just to restrict certain types of content, such as extreme violence and child pornography, which should unquestionably be unavailable to children. On the other hand, it is also essential that children be able to express themselves, create content appropriate for them, learn, and benefit from what the internet can bring to their education and development. So from the tech community's standpoint there is a double-edged sword: children need space to grow and learn in a safe environment, while we must also be more responsive to abuse and questionable material. This takes a lot of coordination with law enforcement, trusted partners, trusted notifiers, and child rights organisations such as the Internet Watch Foundation, which are experts in taking down child sexual abuse material. However, when it comes to child safety, there ought not to be a differentiation between online safety and offline safety: both carry the same weight of concern, and children's safety in both settings must be treated the same. Therefore, protecting children's safety online is not a responsibility placed solely on parents; other stakeholders, including government, civil society and society as a whole, should also partake in championing children's online safety. 

On updating the current frameworks governing child online safety so that its policies are practical to follow in different regions, it must be emphasised that these challenges spiral partly because awareness among parents and the older generation is inadequate. This ignorance stunts the growth of legislation on child online safety, as parents and older generations perceive it to be a 'silly matter' not worth pondering. Rules or frameworks that promote awareness among parents and others should therefore be enacted. 

Before discussing the effectiveness of regulations, it is necessary to consider their implementation, e.g. of data protection and privacy acts. Although such legislation is in force, implementation is still lacking. For instance, the United Kingdom has the General Data Protection Regulation, which is considered the strongest data protection act. Notwithstanding that, TikTok was fined 92 million in 2021 and 13 million in 2022 over privacy concerns affecting children. Thus, the yardstick of effectiveness for child online safety is not the existence of laws regulating it but their implementation, which should be stressed in order to ensure a safe digital space for children's and young people's use. 

One initiative we can pursue is to provide parents with awareness programmes from the outset. The older generation is often not 'tech-savvy', which makes it difficult for them to decipher their children's behaviour in cyberspace. Hence, parents' participation can also take the form of joining awareness programmes or webinars that shed light on what current technology can do and its effects on their children.

Moreover, one mechanism that stakeholders can employ is an age verification system. To illustrate, United States websites relating to alcohol, gambling and 18-plus-rated movies compel users to verify their age. However, this raises the problem of how service providers are to verify somebody's age when a service provider accessing a certain database may itself constitute a data breach. The next problem concerns the absence of universal standards for tackling this issue across different jurisdictions. By way of illustration, users of social media are required to confirm their age, yet such verification is never really verified, as users do not have to tender their identity cards; most of the time, verification relies on an honour system. Thus, the first step to solve this issue is to create secure standards under which service providers are able to access certain databases, while still respecting data privacy laws and the jurisdiction of the country. However, such secure standards are still flawed, as they only cater to documented persons: they would compel every online platform user to tender identification, precipitating a crisis for undocumented persons. It is important to return to the fundamental attribute of the internet, namely that it should be open to everyone. Finally, the age verification step raises the problems of data breaches, data privacy, data regulations and how to implement them. 

Given the scattered standards of child online safety, one mechanism that application developers may utilise is the creation of minimum standards and norms for child online safety. This will ease the problems developers encounter in appraising internet content such as pornography and sexual abuse material. For instance, applications could have different modes enabling parents to indicate whether a device will be used by a child, making it easy to monitor and determine who accesses it. Nowadays children have devices as early as age 3 or 4, and the content a 5-year-old is exposed to may not be the same content that should be exposed to a 15-year-old. So segregating what content should be accessible at different ages, and categorising children's demographics by age, would help mitigate child online safety issues.

Moving on, one of the vital questions that arose in this session is what possibilities exist to eventually achieve child online safety on the internet. It is feasible to provide a safe online environment for children, yet there is a long way to go before children can safely use the internet without external threats. The problem relates to the huge gap between digital immigrants and digital natives, digital literacy issues among parents, and the lack of control over the technology itself. One alternative is therefore to create an Internet for kids. Dot Kids, inaugurated by DotAsia, is a conspicuous example of a viable way to keep children safe online: children can express themselves and explore that internet space without constraint, because it gives them a space free of extreme content such as pornography, sexual abuse material, gambling, or violence. However, it is worth reiterating that preserving children's safety on such a platform demands the utmost reliability and integrity from the educators, teachers, parents and other people around the kids. 

Another solution is to go down to children's level of awareness, for example by creating comic strips that show how to deal with the Internet. Children will be invested in reading them, as the images, graphics and dialogue convey the safety measures required of them on the Internet, and they will be more inclined to actually read such reference materials. It is also critical to address the problem arising from parents' unfamiliarity with technology, as parents are the first point of contact for children's development. Regulations concerning online child safety must be developed in a way that parents can receive and absorb, with measures tailored to parents' levels of understanding and cognizant of parents' lives, which differ according to factors including experience and region. 

On a further note, in answering the question of how to develop a mechanism to ensure that stakeholders fulfil their responsibility for child online safety, one important mechanism is to make global efforts and establish an international convention on the issue. This matters because children are a vulnerable class of society. To be effective, strategies need to incorporate measures and messages appropriate to different ages and levels of understanding. This also requires children's and young people's participation, to learn their opinions and gather feedback. We also need to empower adults and educators, and support them where they lack understanding of these issues. The golden rule of "a little less talk, a little more action" should be applied, and cross-regional collaboration should be consistent and increase globally in order to combat child online safety issues. Moreover, synergy and collaboration between government and civil society are needed in the promotion, protection and fulfilment of the rights of children. Hence, all stakeholders must strengthen the capacities and processes that relate to the realisation of children's rights and bring everyone to the table for such discussions. 

All speakers and participants agreed that social media companies, i.e. Big Tech, are accountable on child online safety issues. Given the centrality of the private sector to the internet, Big Tech bears major responsibility for child online protection. Social media companies have an obligation both to respect human rights and to prevent or mitigate adverse human rights impacts directly linked to their operations, services, products and by-products, such as advertisements shown to children online. Child abuse and exploitation are manifestly adverse human rights impacts, so social media companies should be held accountable. The data stored about children, and the advertising of certain products to them, must be regulated. For example, applications such as WhatsApp, Telegram and Facebook store data on children and teenagers. Hence, practices around storing and removing texts, images and videos relating to children and teenagers need to improve. Furthermore, to ensure children's safety online, responsibility for global platforms must be introduced at the legislative level. Responsibility should be comprehensive: all stakeholders, including online platforms and social media companies, should be responsible, since children and young people spend much of their time on these platforms. This issue should be raised in other Internet Governance forums, with the aim of involving Big Tech in this crucial discussion. Additionally, breaches by Big Tech concerning child online safety should be addressed in upcoming forums, with remedies targeting what hurts companies most, namely financial penalties and, to a certain extent, shutting down businesses, in order to secure compliance.

In conclusion, the three takeaways from this session are the dire need for awareness, the push for implementation, and the need for action from all ends, regardless of sector, to safeguard children online. It is vital to strengthen all stakeholders' responsibilities, starting with the laws already in place on online child safety and ensuring their implementation with the cooperation of all stakeholders. Moreover, it is imperative to raise awareness by educating parents and children. Most importantly, punitive measures must be emphasised to stop online predators and offenders. Finally, all must move from policy to action, as children's online safety is a universal issue that must be addressed.

 

IGF 2022 WS #406 Meaningful platform transparency in the Global South

Updated: Fri, 16/12/2022 - 07:10
Addressing Advanced Technologies, including AI
Session Report

 

At our session at IGF, we sought to identify key factors for regulators in the Global South to consider as they contemplate transparency regulations for social media platforms. We discussed the kinds of regulatory interventions being contemplated, the kinds of harms sought to be addressed, and safeguards and other considerations that regulations must account for. The following are the key themes and conversations that emerged over the course of the discussion.

Participants and panelists spoke of the importance of a two-pronged approach to transparency mechanisms for platform governance, one that imposes transparency obligations on both platforms and governments. They also highlighted that States in the Global North and the Global South face challenges specific to their own contexts. Transparency regulations must be framed so that they do not become tools for enhanced control over speech, and applying transparency requirements to States is essential in this regard.

The panelists spoke about the importance of recognising that transparency is an iterative process which will have to adapt to the changing technological and regulatory environments, and will evolve based on insights gained from information provided by platforms. As first steps, it would be important to develop enabling frameworks to see what kind of information would be useful for platforms to provide, and to incorporate measures such as data audits and researcher access to platform data.

Fernanda Martins spoke about experiences on platform behaviour during elections, and on the importance of working together to reduce political violence and misinformation in Brazil. She highlighted how the harms of disinformation or political violence are not limited to election periods, but are rather spread across broader timelines, meaning that platform efforts to tackle these behaviours could not be restricted to election times. Fernanda also spoke about the unpredictable nature of social media platforms – changes in governance or ownership structure have significant implications on online speech, and harms such as political disinformation and violence. Platforms can change behavior in significant ways if they are bought or sold, and such decisions can have massive effects on political speech and have other real-world consequences.

Shashank Mohan walked through some of the goals and challenges of operationalising transparency. Ideally, transparency would lead to a fair and unbiased experience for users on social media platforms, and a system that respects user rights. Any measures to operationalise transparency would have to account for contextual factors such as the scale of the relevant populations and their level of heterogeneity. Information provided without such considerations could be incomplete or of limited utility. For example, broadly worded transparency requirements for content takedowns may mean that platforms provide broad metrics on their moderation efforts without accounting for nuances in local contexts that may be necessary to address harms. This would not serve the purpose of transparency regulations, so regulatory interventions need to balance the level of granularity required by transparency mandates. Shashank also highlighted the importance of the Santa Clara Principles in developing standards in this context.

Emma Llanso outlined the history of transparency mandates and provided an overview of various approaches to transparency that are currently being adopted. She spoke of the different kinds of regulatory interventions and their goals – the Digital Services Act, for example, sets out different obligations for platforms, and requires that they provide information on the automated tools they use, and also on the considerations behind the design of their algorithms. Such information would provide insight into the content that gets promoted on various platforms, and how these assessments are made.

Emma pointed out that another core focus area for transparency regulation is on digital advertising, particularly on how targeted advertising works online. Another avenue of reporting targets users, and requires platforms to provide notices for content moderation, and policies and processes for content takedowns and appeals. Such measures, and others targeted at making websites more accessible, are aimed at helping users understand platform behaviour and empowering them to seek redressal. Emma also pointed out that another large bucket of regulation focuses on researcher access to data, and the use of audits to understand the internal processes driving algorithmic decisions. Measures that require platforms to share access to information with independent researchers are crucial in understanding the relationship between platforms and harms, and to identify areas for further intervention. In this context, regulations would need to find ways to provide necessary information to independent researchers while also maintaining privacy of users of platforms. Emma pointed out that it is currently difficult to assess what the consequences of such interventions would be, and that transparency regulations would need to be iterative and responsive to information that is provided.

Chris Sheehy stressed the importance of using a multi-stakeholder approach in transparency regulations. In part, existing efforts have been a response to previous multi-stakeholder collaborations. Chris highlighted the role of multi-stakeholder forums in checking the transparency commitments of various platforms and in auditing the frameworks of information and communication technology companies. In this context, he spoke of the Action Coalition on Meaningful Transparency (ACT), a global multi-stakeholder initiative led by civil society that aims to identify and build collaboration across various streams of digital transparency work, as well as to increase the diversity of perspectives considered in those efforts.

In response to a question about the role of government in the context of transparency requirements, panelists spoke of how more granular reporting (such as which category of law was violated, or clarity on when takedown requests have been made by governments) would provide more useful information. They stressed the importance of requiring governments to be transparent about takedown orders, and of including States in such obligations, as a way to make transparency obligations effective and to centre users and their rights. Panelists pointed out that existing transparency requirements in this regard could be strengthened across Global North and South countries. The challenges of instituting such mechanisms in countries with a history of State censorship were also discussed, along with ways to balance speech and other considerations.

IGF 2022 WS #269 Data privacy gap: the Global South youth perspective

Updated: Fri, 16/12/2022 - 07:05
Governing Data and Protecting Privacy
Key Takeaways:

Leakage of personal information increased with COVID-19. Economic problems affecting natural areas and violence, as examples from Brazil show, also affect data protection, for example in Amazonia, including connectivity cut-offs.


It’s urgent to have training programmes that are contextual and sensitive on data privacy and data protection.

Session Report

 

Data protection must be mandatory in school education so that young people understand the concept of protecting their privacy. Most African countries either lack data protection laws or do not enforce them. They have schools on internet governance, but this is not sufficient, so these programmes need to be brought down to the school level.

The issue of non-English languages was raised, as was the fact that people do not understand how big tech companies handle their data. The concept of peer-to-peer education was also touched on: recent research shows that when children and adolescents encounter problems on the internet, they learn from each other.

Games could also help children better understand the implications of privacy, and are a good way to engage them. When people download apps, they do not pay attention to privacy issues and cannot easily tell whether they are safe or not.

In the Global South specifically, protection lags behind: the European region has more protection than the Global South, and that is a reality, including in countries like Brazil. A commitment is needed from Global South developers to provide applications and services through which people can communicate properly and safely.

 

IGF 2022 DC-SIG Role of Schools of IG in sustaining efforts to support SDGs

Updated: Thu, 15/12/2022 - 23:42
Connecting All People and Safeguarding Human Rights
Key Takeaways:

1. Recommendation that schools make multistakeholder contributions towards the GDC, so that the consensus voices of the young and active learners can be part of the considerations.


2. Capacity building efforts, including Schools of Internet Governance, must keep promoting gender balance, especially promoting STEM and ICT careers and education among the young, without bringing in outdated notions about appropriate gender roles and gender limitations.

Calls to Action

1. Schools of Internet Governance, and other educational institutions in the digital age, cannot be created or succeed where there are Internet shutdowns. End Internet shutdowns for education’s sake.


2. Schools of Internet Governance should spend as much time teaching the critical thinking necessary for the digital age as they do teaching about existing technology and governance models.

Session Report

 

DC-SIG 2022 IGF session

 

This year’s session, hybrid in nature, combined the regular yearly DC SIG review of schools, the yearly activities and planning for the next year, with an issue-based focus: Schools and their contributions toward the achievement of the SDGs. The meeting agenda was divided into discrete sections.

  1. Self-introductions: at the start of the session there were 30 participants in the room and 15 joining remotely. While a specific count was not taken, the room had a good gender balance.
     
  2. Report from New Schools
    1. Zimbabwe IGF and Zimbabwe ISOC have been organising their SIG for 3 years now. First as a consultative moment, second as a remote session, and lastly as a physical school.
    2. ISOC Comoros organised its SIG in the year 2020, sponsored by PRIDA. They want to have a consultative moment for everyone to know their roles in the ecosystem.
    3. Côte d’Ivoire IGF organised their SIG in 2019 and 2020. They got funding from PRIDA but, because of Covid, did not organise a SIG this year.  They need support for speakers and to organise the Western School on Internet Governance and Western IGF this year. It should be noted that, as we did not have translation services, interpretation was done by a bilingual participant. IGF would be well served if translation services were provided for all sessions.
       
  3. Details on new things existing Schools have done:
    1. AFRISIG: it is a leadership development school and it is for people already playing a role in the ecosystem.  What was done differently at this year’s SIG was they brought very experienced people from the Cyber Security Authorities across Africa for them to have discussions and have a working paper on Cyber Security.  This was presented in New York and the Africa Union also used it.  The important thing about this effort is that the schools are learning places for negotiation processes.
    2. APSIG: they created a separate programme for people with disabilities.
    3. SouthSIG: the new thing is they did 2 months of online learning and then one-week hybrid learning. Then the last stage was a collaboration with a university and those who got to this stage got a fellowship and a diploma from the university in internet governance. Everything was done for free.
    4. EuroSSIG: this year’s practicum was the most relevant since it contributed to the Global Digital Compact discussion and it is on the website of the UN.  They made recommendations for other schools to make contributions towards the compact and make contributions towards other global discussions.
    5. GhanaSIG: the new initiative is using their fellows as expert speakers to make proposals for events at local and global events. 
    6. InSIG: they reserved seats for people with disability.
    7. RussiaSIG (Illona): they are now using their SIG to research on internet fragmentation issues. 
    8. Chad (tdSIG): they intend to close the gender gap in this year’s SIG.
    9. VSIG: they intend to bring in GDPR for citizens that match various countries’ needs.
    10. North America (Andre):  they intend to make the NASIG process multilingual in future events.
    11. Bangladesh-SIG: hands-on learning with an on-demand learning process.  They also intend to run the SIG in local languages.  They requested to be added to the DC-SIG, having already run 6 schools.
       
  4. Discussions on SIGs and SDG:
    1. What has been done and what could be done:

While this section of the session was intended to allow discussion of SIG contributions in several areas, including SDG 5 on Gender, SDG 7 on Energy, and SDG 13 on Climate Change, the meeting only managed to have a single extended discussion on SIGs and SDG 5. Further sessions, yet to be scheduled, are planned for 2023 to cover the other SDGs.

Regarding SIGs and SDG 5, the following points were discussed.

  1. Increase gender inclusion: this was done by the Russia SIG.  They keep receiving more applications from women.
  2. AFRISIG: they have questions on what applicants think on gender equality and they include sensitive issues such as the LGBTQ in their application process.  They encouraged all schools to include the practicum session in their SIGs.
  3. EuroSIG: the vast majority of their applicants are female; they receive far fewer applications from men.
  4. BrazilSIG was done which includes a legal school dedicated to lawyers and judges, which has been a very good experience. 
  5. RSIG: SIGs must keep promoting gender balance, especially promoting STEM and ICT careers and education among young students.
  6. ZimbabweSIG: they get more female applicants, but female engagement is lower in actual participation.  Anriette responded that AFRISIG does not have that experience; ZimbabweSIG should therefore look at what works for them, include people in the moderation process to make it inclusive, and design the school so that fellows know they have expertise that will be acknowledged.
  7. Chad Youth IGF coordinator (Khouzefi): how can schools engage in discussions when they are being subjected to internet shutdowns?
  8. Joshua (GhanaSIG fellow): be deliberate and keep pushing female fellows to participate in the fellowship process.
  9. Liana (online): more than 90% are females in the Armenia SIG.
  10. Sarata (online): we are making conscious efforts to include females in the community. 
  11. RRSIG (online): according to Illona they select everyone with clear interest in their SIG.

Because the session was running out of time, there are plans to continue the discussion at some point during 2023.

  1. AOB:  
    1. The chair encouraged all SIGs to come together and have a global faculty, and global fellows.  
    2. There are no schools in Japan and they would want to have one. 
    3. There will be a networking session for all fellows of all SIGS in next year’s IGF.
    4. Given the number of issues that those attending wished to discuss, consideration will be given by the DC SIG to a Day 0 event at IGF2023 and/or an intersessional event during 2023.
       
  2. At the end of the session, we had 45 participants in the room and 24 people remotely.
IGF 2022 WS #482 Internet Shutdowns: Diverse risks, challenges, and needs

Updated: Thu, 15/12/2022 - 19:06
Connecting All People and Safeguarding Human Rights
Calls to Action

Need more support to activists in countries for longer-term coalition building, training and advocacy to prepare for and prevent shutdowns.

Session Report

 

Participatory research for internet shutdown advocacy

 

The OPTIMA project works with advocacy communities all around the world to predict, prevent, prepare and respond to internet shutdowns. What we’re going to present today is research we have done over the past 6-9 months regarding the assessment of needs from our partners in different countries, which feeds into an internet shutdown toolkit that you can find online.

As the report shows, every context is different politically, socially, and in terms of resources and capabilities. There are also many different kinds of stakeholders involved in shutdown advocacy strategies (governments, ISPs, activists, technologists, journalists, industry). Normally when we do this kind of work we are in crisis mode; shutdowns occur in times of crisis such as elections or political uprisings. When we can plan for something, such as an election, we will attempt to mobilize resources in advance, but this is very often not possible.

We wanted to understand different stakeholders’ perceived experiences of shutdowns, as well as their perceived skills and gaps in doing this kind of work, in order to understand how to broaden coalitions, provide support mechanisms, and develop better campaigns.

We first conducted a survey (snowball sampling), then followed up with workshops in specific settings to discuss the results with people. Finally, we held co-design workshops, which allow the people involved to help design the outcome and tell us what they really needed.

 

Bangladesh (Miraj Chowdhury)

  • From 2012 to April 2022, we have seen at least 17 shutdowns. Just in the past month, we have seen at least 4 throttling events targeting political rallies. Targeting mobile networks and blocking Facebook are common practices, and internet censorship has been growing since 2018.
  • There is an election in 2024 and they are already anticipating some sort of censorship event for that time.
  • All of these events were mostly linked to political tensions and communal riots.
  • This outcome comes from talking to hundreds of different people from different organizations.
  • 88% of people said they have experienced internet shutdowns in the past three years.
  • Most said that the largest impact of shutdowns is in business and economy.
  • Shutdowns are often justified as necessary to contain disinformation, yet what we found is that during a shutdown people are unable to get proper information.
  • Whenever there is a shutdown there is no accountability because neither the telecom service provider nor the government issues any kind of statement justifying why the internet is being shut down.
  • Civil society doesn’t have the technical capacity nor the larger understanding of digital rights to respond to shutdowns properly. 
  • Most people do know how to use VPNs, but they are not responding to shutdowns in a way that creates advocacy at a national level.
  • We need to build digital rights capacity and broaden the communities engaged in these issues, as well as support technical skill-building. In remote areas, a shutdown might never be reported; we need to document these cases and bring them into the discussion.
  • For a country like Bangladesh, internet shutdown advocacy will need to start from scratch.

 

Senegal

  • Senegal has long been seen as one of the most stable democracies in Africa, but has been backsliding under Macky Sall. High rate of internet penetration; high rate of mobile devices.
  • There was an incident on March 4, 2021, when, following a day of protests across the whole country, social media platforms became unavailable. The shutdown was not well documented because there were not enough people on the ground measuring the disruption, and not enough technical skills to document it.
  • We found that civil society is not well prepared on internet shutdown topics, which is why we do not have good data on the 2021 event. When the internet is not working, people just assume it is a technical issue and leave it at that. The skills are not there on the ground to prove whether it is a technical fault or a shutdown.
  • 64% of respondents believe that a shutdown is very unlikely within the next year.
  • 61% reported civil society capacity as low or nonexistent.
  • There is low general awareness of circumvention tools, a dire need of skill-building regarding network measurement, low levels of awareness amongst lawyers and judges for censorship and digital issues, and a need to develop strategies to engage the government and the private sector.

 

India (Chinmayi SK)

  • India continues to top the list of countries for most internet shutdowns carried out in one year (106 shutdowns in 2021 alone).
  • There are many different reasons why shutdowns happen in India (law enforcement, exams, etc).
  • Media freedom is decreasing, internet censorship is increasing.
  • There are certain laws that enable internet shutdowns to be used as tools in certain scenarios.
  • They wanted to build and plan based on how people are affected and what their needs are.
  • They needed to add interviews on top of the surveys, because the surveys targeted English speakers and access was needed for people without that skill.
  • They were able to involve people from 14 states in India, including students, researchers, journalists and activists. These were people who had experienced shutdowns and, in some cases, challenged them; the groups were small so the discussions could be free-flowing.
  • 76% of people had experienced at least one internet shutdown in the past three years.
  • Internet shutdowns disproportionately impact certain states and communities. Even within the same state, some people had very different experiences than others.
  • 60% are familiar with shutdowns, but do not understand how they occur technically or legally, or how they are implemented.
  • Certain pockets of the country have certain capacities: in some places, people do have the capacity and understanding to fight shutdowns, reporting a capacity of 33% in network measurement, which we can consider high. They also have the shutdown tracker, so documentation capacity can be considered good.
  • They have been able to engage in litigation and fight cases questioning the necessity and proportionality of shutdowns, and in some cases courts have delivered good judgements.
  • There is a lot of hesitancy regarding the usage of any sort of circumvention tool. “Are VPNs illegal? Is it risky?”
  • It is important to document the events even if we’re not able to fight back.

 

Tanzania (Rebecca)

 

  • Restrictions to posting on social media, restrictions to NGOs.
  • Even after the change in president, people are being very cautious.
  • The 2020 shutdown was the first of its kind in Tanzania and it means that now, the communities are thinking more about that as something that can happen again. 
  • It seems that media and activists have been figuratively taken out of “prison”, but the laws that put them there in the first place haven’t changed. 
  • Awareness about shutdowns is high, but knowledge is low. 71% of the respondents reported having experienced an internet shutdown, but 46% of them said that they can’t tell —or aren’t sure if they can tell— the difference between a shutdown or an internet connectivity problem or a technical issue.

 


 

(Miraj) In order to advocate against shutdowns, we need to document the impacts, because that is the only way in which we can develop the arguments and the evidence to fight the shutdowns. I think this is where we are lacking the most in Bangladesh. On the other hand, we need to engage businesses; sometimes businesses have a stronger voice than civil society, depending on the social and political context of the country. What kind of advocacy is needed to empower and engage businesses to also advocate against shutdowns from their perspective and their interests?

 

(Chinmayi) In the case of India, there is enough documentation to start conversations with, regarding the effects, the consequences, the problems caused by shutdowns. It’s now for us to have the government look at this documentation and really think about necessity and proportionality.

 

IGF 2022 DCNN Internet Openness Formula: Interoperability + Device + Net

Updated: Thu, 15/12/2022 - 16:12
Avoiding Internet Fragmentation
Key Takeaways:

Internet Openness is essential and instrumental to foster the enjoyment of Internet users' human rights, promoting competition and equality of opportunity, safeguarding the generative peer-to-peer nature of the Internet.

,

Internet Openness is a multifaceted concept, and the debate on Net Neutrality and non-discriminatory traffic management is only part of the broader openness debate. Net Neutrality is necessary but not sufficient to guarantee internet openness. Besides net neutrality, to guarantee Internet Openness it is essential to promote and preserve infrastructural interoperability and data interoperability, as well as device neutrality.

Calls to Action

Stakeholders should stop analysing internet openness threats based on an Internet layer approach and should instead work together to understand how internet openness is threatened and can be preserved via a systemic approach.

Session Report

The session was opened by DCNN coordinator Prof Luca Belli (FGV Law School) and session co-moderator Ms Smriti Parsheera (CyberBRICS Project). They stressed that over the past decade the DCNN coalition has been advocating for an open, secure, and non-discriminatory Internet, affordable and accessible to all people, and promoting Network Neutrality, as this fundamental principle plays an instrumental role in preserving Internet Openness, fostering the enjoyment of Internet users' human rights, promoting competition and equality of opportunity, and safeguarding the generative peer-to-peer nature of the Internet.

Since its creation, this Coalition has explored the various dimensions of Net Neutrality and Internet Openness, acknowledging that Internet Openness is a multifaceted concept and that the debate on Net Neutrality and non-discriminatory traffic management is only part of the broader openness debate.

Over the past years, Internet Openness analyses have increasingly focused on interoperability and device neutrality, acknowledging that net neutrality is a necessary but not sufficient ingredient of a successful internet openness formula, which includes Interoperability + Device Neutrality + Net Neutrality.

Yet net neutrality debates remain popular in policy circles, especially at the EU level, with recent discussions on the introduction of a "fair share" proposal based on the "sending party network pays" model and its compatibility with network neutrality principles, but also in South Korea and Latin America, as discussed by participants.

A large number of Internet Openness related issues were explored by the speakers, who included:

  • Lina María Duque del Vecchio, Commissioner at the Colombian Communications Regulator (Comisión de Regulación de Comunicaciones), Government, Latin America
  • Maarit Palovirta, Senior Director Regulatory Affairs of ETNO, Private Sector, Western Europe
  • Thomas Lohninger, Epicenter Works, Civil Society, Western Europe
  • Angela Daly, University of Dundee, Academia, Western Europe
  • Sabelo Mhlambi, Founder of Bhala, Private Sector, Africa
  • Kyung Sin (KS) Park, Director of Open Net Korea, Civil Society, Asia-Pacific

Stakeholders expressed diverging views on the so-called "fair share" proposal, stressing that it might undermine Internet Openness, as emphasised in the Open Letter addressed by DCNN members to EU Commissioners in October 2022: https://internetopenness.info/29-internet-experts-and-academics-send-a-letter-to-the-commission-urging-to-abandon-the-sending-party-network-pays-proposal/

Stakeholders also broadly agreed on the usefulness of the elements defined in the DCNN 2022 Outcome, the Open Statement on Internet Openness https://www.intgovforum.org/en/filedepot_download/92/23885

Namely, the Statement stresses the importance of:

1)      Network Neutrality is the principle according to which Internet traffic shall be treated without discrimination, restriction, or interference, regardless of its sender, recipient, type or content, so that Internet users’ freedom is not restricted by favouring or disfavouring the transmission of specific Internet traffic. Exceptions to this principle shall be necessary and proportionate to achieve a legitimate aim.

2) Interoperability is the ability to transfer and render useful data and other information across systems, applications, or components (horizontal interoperability) and for third parties to build upon a certain technology (vertical interoperability). The combination of transmission and analysis involves several layers of interconnection, requiring the achievement of various levels of interoperability. At a minimum, one should distinguish between the lower (network) and the upper (application) layers, pointing to a division between infrastructural interoperability and data interoperability.

3) Device neutrality is the property ensuring users’ right to non-discrimination in the services and apps they use, notwithstanding platform control by hardware companies. That means users can choose the applications they prefer, regardless of the brand of device they are using. In other words, device neutrality is instrumental to the ability to run any application, so that users can access and share all applications, content, and services, as long as they are deemed legal in a given jurisdiction, which is essential to achieving an open Internet.

Lastly, participants stressed that stakeholders should stop analysing Internet openness by focusing merely on the Internet access layer, and should instead work together, via a systemic approach, to understand how Internet openness is threatened and can be preserved.

IGF 2022 Open Forum #38 Data as new gold: how to avoid ‘gold rush’ and create value for all

Updated: Thu, 15/12/2022 - 15:57
Governing Data and Protecting Privacy
Key Takeaways:

1. Enforcement of data policy regulation should be adequately addressed and ensured. Enforcement is crucial, but a common understanding and a harmonised approach across the globe are important too.

2. Civil society organisations play a very important role in making sure human rights are defended. Multistakeholder engagement throughout the process is crucial. A human-centric approach and human rights should be embedded in legislation.

Calls to Action

Participants in the session called for models that benefit society, starting from the principle that the most vulnerable should be protected: if they are, the whole of society is protected too.

Session Report

The forum brought together six leading experts from the EU and Africa representing the private sector, academia/think tanks, civil society and the public sector/government.

The speakers of the session - Marek Havrda, PhD (Deputy Minister for European Affairs, CZ Presidency), Bridget Andere (Access Now, Africa Policy Analyst), Alberto Di Felice (Director for Infrastructure, Privacy and Security, DIGITALEUROPE), Maria Rosaria Coduti (DG CNECT, Policy Officer for Data Policy and Innovation), Chloe Teevan (Head of Digital Economy and Governance, ECDPM), Johannes Wander (Policy Advisor on digital development & innovation at GIZ to AUC) - shared their views on the forum topic.

 

Human-centric approach – the role of governments and how to strike the right balance between making more data available for reuse and guaranteeing privacy and data protection

 

Dr. Marek HAVRDA noted that the CZ Presidency presented a joint policy statement, "Human-centric approach at the core of standardisation and connectivity", at the ITU Plenipotentiary Conference (September 2022) on behalf of the EU27, signed by 57 countries. He stated that in the modern world we are dealing with trade-offs: privacy is one criterion; understanding and being aware of the use of technology and respect for individual autonomy are other important criteria; and finally there is the overall impact on human well-being at large. Combining different types of data brings much larger risks, especially to privacy, and we need better checks and balances. On the digital divide: a human-centric approach should also help to bridge it. We need to know what data is out there and how it is used in communities.

 

Ms. Maria Rosaria CODUTI expressed the opinion that data protection and data access, sharing and use should not be treated as contradictory: they are two sides of the same coin. The EC builds data protection into legislation when drafting, and complements already existing legislation. You cannot just protect data and impede its use. We are creating a trustful environment and ecosystem for data subjects, data users, etc.; the basis of our data governance model is a human-centric approach. We have put this into the legislation, into the Data Act for example. This empowers users, e.g. of IoT objects: by interacting with and using IoT objects, individuals produce valuable data, and they need to have a say over that data. We need to invest in technologies that include privacy by default.

 

Digital transition and data economy – how to ensure no one is left behind?

Alberto di FELICE noted that companies in Europe, big or small, do not have monopolies in the market. These are all central points for the EC, whose data strategy has been centred around sovereignty and how data can protect the economy. In the EU we are in the middle of several proposals around data: the AI Act, which addresses safety and fundamental rights; data sharing across sectors and players in the economy (Data Act); and vertical proposals such as the EU health data space. It is a complex environment because there are many proposals and also many regulations already in place (e.g. the GDPR). "Gold is great but it's also heavy": we need to consider whether the amount of regulation, particularly as it grows, can still facilitate data sharing. We are also in the middle of a global crisis (pandemic, war in Europe). One aspect underlying data discussions is connectivity. We are building partnerships worldwide, e.g. with the US through the TTC, strengthening joint initiatives built on the Global Gateway.

 

How to build a data governance model that benefits both the economy and society? Regulation versus enforcement: margins for improvement.

 

Bridget ANDERE expressed the opinion that the human being is central: if we have to choose between society and the economy, we should always pick society. Importing laws and infrastructure from other places without looking at the consequences in one's own country is detrimental, so we need to build models that benefit society and consider what impact they will have on the end user. Human rights due diligence is very important: look at the people whose data will be collected and who is most at risk, and make sure they are protected, so you do not have to go back and fix things in retrospect. We need to engage in public participation in these processes. Regulation versus enforcement is not just a problem in Africa. We have lots of amazing regulatory frameworks, with policies and regulations that are supposed to accompany them, yet these laws often exist in a vacuum. There is no absolute protection of rights and no opt-in/opt-out mechanisms; national security is simply used as a justification. Limitations in the laws are formulated very broadly and unclearly, which creates lots of gaps. We find ourselves with really good laws but bad implementation. We need mechanisms that allow people to complain about infringements of their rights.

 

Chloe TEEVAN argued that the GDPR, one of the most established European regulations, was based on multistakeholder consultation (including civil society), but it is not without its faults. It has become a model around the world but is not necessarily adhered to in other contexts/countries. Ireland hosts many tech companies, which has an impact not only on EU data protection but also globally, as other countries have to go through the Irish Data Protection Commissioner. Where there are improvements, it is because of civil society applying pressure. Data protection commissioners across the EU have been discussing how to improve enforcement and have also invited civil society to the table. Multistakeholderism is an important element, and if enforcement improves it is because civil society constantly holds governments accountable. Active CSO participation is essential; it also means adapting to the context you are in and bringing voices in. In certain contexts there are not even data commissioners in place to enforce such regulation, or they do not have the resources and independence they need. She also questioned whether big tech, even in the EU, takes government seriously, as companies are pushing the limits there; it is even more difficult for smaller African countries. These are just a few issues with enforcement, also when talking about the GDPR as a gold standard.

 

Key challenges at the regional level (focus on Africa) – how to act efficiently to close digital gaps. The role of data in bridging the digital divide and ensuring strong data protection and inclusive economic growth: the case of the EU-AU Data Flagship and the Digital Global Gateway.

 

Johannes WANDER presented a perspective from his work with the African Union. Many countries in Africa are interested in more data localisation, which tends to lead to a locked-in approach in which the benefits for the economy and society cannot be leveraged. The AU developed a Data Policy Framework and endorsed it this year, with EU support. The AUC has a leading role in formulating such policies at the continental level, and the EU should support this endeavour, also as part of the Global Gateway. Harmonisation is of course an issue: the AU has twice as many members as the EU, and we all know how many years it took for the EU to reach a consensus. Enforcement is also an issue. This is something we will work on over the next three years; the plan is to align existing stakeholders to bring the framework alive. We need to find solutions at the national and community levels. How do we ensure just data governance? How can the African continent manage and use data for itself, without localising data across more than 50 countries? Data may well be the new gold or oil, but the main question is how to make use of it. Some countries have legislation in place; the second step should be enforcement to ensure economic growth.

 

 

Question and answer round.

 


·      Tony Blair Institute: how can we simplify protection evaluation to actually make cross-border data transfers possible under the GDPR?

Maria Rosaria CODUTI answered the question by addressing the international provisions of the Data Governance Act concerning the EU and third countries. There are provisions to ensure that sensitive, publicly held non-personal data will not be subject to unlawful access. We have a regime based on intervening acts for non-personal data that we think will create a bottleneck. We extend the provisions of the Data Governance Act to cloud service providers and customers. These rules are similar to Schrems II. We do not create data localisation, but encourage data sharing.

 

Chloe TEEVAN mentioned the South African example: it took a certain amount of time to develop the Protection of Personal Information Act (POPIA), and by that time the EU had moved to the GDPR and South Africa was not granted adequacy. So this is a really big question.

 


·      Question (government representative): knowing data is gold, how do we manage our data if there is no international standard framework?

 

Bridget ANDERE: there is no one standard framework that works for everyone. So how do we ensure adequate data protection? Start from the person who is most at risk, and you will have data protection frameworks that work more widely.

Johannes WANDER added that the AU data framework provides principles that can be interpreted at the national level, so countries can see what works for them.

 


·      Question (representative of the Ministry of Technology, Ethiopia): which values can be incorporated at the national level in African countries?

 

Johannes WANDER replied that in the EU the interpretation of law is focused very much on the individual, whereas in some countries in Africa the communal aspect tends to be more important, or a higher priority, than the individual. Health data, for example, is very private and should not be held to a communal standard. But societies really vary across the continent.

 

Dr. Marek HAVRDA added that there is a need for new methods to monitor enforcement, especially for privacy rules that differ between countries.



IGF 2022 WS #497 Designing an AI ethical framework in the Global South

Updated: Thu, 15/12/2022 - 14:54
Addressing Advanced Technologies, including AI
Key Takeaways:

The AI ethical framework in the Global South relies on hard and soft law. Countries like Brazil, Chile and China are further along in developing hard law, while in other regions, such as Africa and India, the soft-law approach predominates. In any case, AI is intensely connected with development and innovation, and regulation needs to consider ethical guidelines, human rights, diversity and multistakeholderism.

Calls to Action

Government: be more transparent and inclusive, considering the most vulnerable groups in the debate. Civil society: keep strengthening underrepresented voices and raising issues related to the impact of AI use and development on human rights.

Session Report

The moderators Cynthia and Alexandra started the panel by introducing the regulatory context of artificial intelligence. Then, they introduced the respective panelists, the dynamics and objectives of the workshop, which, in short, was intended to explore how the regulatory landscape of AI has been developed in the Global South.

 

The panel's initial question was asked by moderator Alexandra: “What steps have your State taken in creating a regulatory framework for AI? Are there any legislation, bills, policies and/or national strategies seeking to establish rules or recommendations for the development and use of artificial intelligence? If so, what are their main features?”

The first panelist to respond was Smriti Parsheera, from India, representing civil society. She is a Fellow with the CyberBRICS Project at Fundacao Getulio Vargas.

Smriti responded that the focus of discussion in India has been issues of promotion, innovation and capacity building. She also mentioned liability and regulation, noting, however, that these have not been the main focus. She argued that the main processes are not legislative and binding, but soft law mechanisms. She also mentioned that government committees have been created that look at privacy and security aspects. She mentioned the 2018 “AI For All” document, which sets out principles for responsible artificial intelligence, and stressed that people are already beginning to talk about the need for risk-based regulation aligned with principles such as those enshrined by UNESCO. Despite the existence of discussions and proposals for regulation, she stated that the focus is still on compliance and self-regulation, with a long way to go before binding legislation emerges in India.

 

The next panelist was Wayne Wei Wang, from China, representing the technical community, through the University of Hong Kong and Fundação Getúlio Vargas.

 

Wayne argued that China has a governance model for AI and data protection, mentioning that Oxford held a conference called “The Race to Regulate AI” in mid-2022, where three approaches to AI regulation were discussed. He pointed out that regulation is often not AI-specific and can be applied together with data protection legislation. He argued that Chinese regulation encourages the large population to participate in the digital transformation, for example in 2015 with the Made in China 2025 Plan and the Internet Plus Initiative. In 2017, a formal document called the New Generation AI Development Plan emerged that defined the commercialisation of AI as a market goal. And in the last two years, China has established a government AI committee that, although centralised, allows multistakeholder participation. Wayne summarised that China regulates AI through hard-law mechanisms, such as trade and data protection legislation, and also soft law, mentioning national incentives and strategies. He ended by mentioning that China has introduced specific legislation, “The Preventions of Internet Information System”.

 

Continuing, the panelist Thiago Moraes, from Brazil, representing government through the Brazilian Data Protection Authority, replied that the debate in Brazil includes a national strategy (2020) and a bill in progress, also mentioning the importance of the OECD guidelines in this process.

 

He mentioned that the national strategy is based on the horizontal axes of legislation, regulation and ethical use, AI governance and international aspects, in addition to six vertical axes with related themes. In the legislative field, he mentioned the bill 21/20 in which there is a Commission of Jurists with 18 specialists to prepare a substitute text based on the debate with public hearings and a multisectoral approach, in order to understand the socioeconomic reality of Brazil. In the field of supervision and governance, he argued that the idea of multiple authorities coordinated by a central authority is a possible proposal for Brazil.

 

The panel continued with Bobina Zulfa, from Uganda, representing civil society through Pollicy.

 

The panelist pointed out that she would give an overview of what is being discussed in Africa as a continent, since unfortunately in Uganda there is still not much regulation on the subject. She mentioned that few countries in Africa have written about the subject and that progress in this field is still slow: only about six countries have national AI strategies, and only one country, Mauritius, has AI legislation. Much of the regulation stems from data protection and soft-law discussions, such as the Malabo Convention of 2014 (which only thirteen countries have signed) and Resolution 473 of 2021, which aims to study AI and robotics in terms of benefits and risks. At the moment, attention is being paid to the principles being developed in other regions, in the hope that they will reach people on the African continent in a positive way. On the other hand, she mentioned that there is still a lot of opacity in these discussions, making it necessary to add transparency and participation.

 

The last panelist was Juan Carlos Lara G., from Chile, representing civil society through Derechos Digitales.

Juan Carlos points out that technology has been seen in his country as an opportunity for development and participation in the global dialogue with countries that are at the forefront of this implementation process. In Chile there is a national artificial intelligence policy for the years 2021-2030 and an action plan to implement the policy, coming from the Ministry of Science, Technology, Knowledge and Innovation. The Chilean experience is to assess the country's capacity and stage in the implementation of AI, in an optimistic and economic view that says very little about the boundaries of the technology, being much more focused on assessing potential to the detriment of ethical and responsibility challenges. There is still an ethical gap regarding the discussion of risks, impacts, accountability and damages. However, recommendations are being made and it is important to include new voices and participants in the debate, in addition to deepening it to understand local needs.

Moderator Cynthia highlighted the use of hard law by some of the countries presented and soft law by others, as well as the need for inclusion, participation and ethical guidelines. Next, she asked Smriti and Thiago a question: “Are diversity and multistakeholderism taken into account in the regulatory debates, and how? (race, gender, territory, vulnerable groups, academia, civil society, private sector, public sector, specialists)”

Smriti responded that diversity and multisectoriality can be analyzed at various levels. The first level is who has a seat at the table when the discussion is taking place; the second level is who participates in the debate with deliberative capacity; and the third level is who is producing knowledge in this process. She argued that India has a very diverse social context, which highlights the concern with non-discrimination and bias.

In this context, she stated that the government sector has involved the private sector and part of academia as multistakeholders in the discussion, and that Centers of Excellence have been implemented in technical institutes around the country, where startups, entrepreneurs and academia are called to dialogue about innovation. Furthermore, the National AI Portal is being developed, the result of government collaboration with the industrial sector, which aims to be an institutional repository of AI in the country. She also mentioned government committees, which include people from academia and industry. However, she concluded that the discussion is still not open to all who represent the diversity of academic perspectives, and that the participation of civil society is critical because it has been little heard. Therefore, it is necessary to improve the transparency and participation of the process.

The moderator Cynthia emphasized that this gap is a point in common with Brazil, due to the difficulty of including some groups in the debates, which are dominated by the preponderant presence of the private sector. She highlighted the relevance of the discussion to ensure the participation of affected vulnerable groups who are not included and who need space for deliberation.

Thiago recalled the importance of multisectoriality in Brazil, as in the case of the “Comitê Gestor da Internet”, in which all participants must be taken into account, as in the process of drafting the “Marco Civil da Internet” and the “General Data Protection Law". The challenge was highlighted in a country of great diversity in Brazil and that indigenous peoples are still little heard, despite being an extremely important part of the country. Thiago pointed out that in Brazil there is an effort on racial and gender diversity, but that there are still many challenges to face. It was pointed out that Bill 21/20 is an interesting experience because it was proposed in 2020, in the pandemic year, and this discussion was not deepened, so that the private sector took on a lot of prominence in the debate. Only in 2021 was the Commission of Jurists proposed, where more voices were expanded.

Next, a question came from the on-site audience about the Chinese case: how has inspection been carried out, considering the legislation on algorithmic transparency and recommendation systems? Representatives of civil society were also asked about experiences of participation and diversity.

Juan Carlos responded to the last question that it is important to highlight public consultations that rely not only on individual responses and external experts, but also on inviting people from civil society, who do not necessarily have technical knowledge, to contribute and create. On the other hand, he mentioned a participation process that started on digital platforms and that lacked accessibility and support for other languages, including indigenous ones. Furthermore, 70% of the people who answered the consultation were men. So there is still a lack of processes to overcome inequalities.

Wayne, in turn, replied that in China there is the Cyberspace Administration of China (CAC), which adopts a routine of supervision activities called the Clean Cyberspace Campaign. Other supervisory authorities, including those behind the “Companion-Life Enforcement Activities” and the Ministry of Industry and Information Technology, also adopt this type of campaign. These authorities examine, for example, applications in terms of data protection, security, etc. China has also released a guideline on algorithm registration systems.

Another question from the on-site audience asked whether there are specific examples of how the government has engaged target groups in discussions.

Bobina responded that she is not aware of specific legislation in Africa, but that civil society groups and academia, as well as bodies such as the African Commission on Human and Peoples' Rights, have tried to broaden the debate at initial levels.

Juan Carlos, for his part, responded that Chile has the example of a national cybersecurity policy for the years 2015-2021, for which some groups focused on these issues were heard. It was still a restricted, government-led initiative. He went on to add that this kind of initiative is not just up to the government; it can also be promoted by civil society. Such collaboration can come from promoting training, petitions or invitations to participate in procedures, and it is fundamental to think of ways to collaborate and also to cultivate academic knowledge.

Smriti pointed out that technology policy must be transparent with civil society.

Next, a question was asked by the remote audience to the panelist Thiago, about transparency and how it works in terms of consultation and engagement.

Thiago replied that in Brazil there is an effort towards transparency; there are challenges, but also some examples of what might work, for instance the Access to Information Law, which is about ten years old and can help with these challenges. This legislation deals with the collection of, access to and requests for information from the government. He pointed out that it may conflict with data protection in some cases, but there is still optimism about how it functions. The central question is what degree of transparency is achieved, combined with the difficulty of technical capacity and financial resources, which usually favours the private sector; this is a difficulty that other countries may face as well.

Another question from the on-site audience asked Bobina how the debate on facial recognition in public safety has been conducted and what the main concerns related to the topic are.

Bobina replied that facial recognition technology for public safety purposes has already been used on the African continent, in Zimbabwe, Uganda and elsewhere, and that such mechanisms have ended up as instruments of mass surveillance, with many researchers emphasizing the harm resulting from this use.

Juan Carlos said that the issue touches on public interest and that in Latin America, regardless of any regulatory debate, facial recognition in public safety is inadequate for fundamental rights and is usually being questioned in the courts. He also argued that the systems are neither technically robust nor legally authorized and that this has been happening in cities like Santiago and São Paulo.

The last question came from the on-site audience and asked what would need to be done and discussed in the future for the AI regulatory framework in the Global South.

Wayne replied that one of the biggest challenges to be resolved, and one that has featured in the Chinese discussion, is the paradox of transparency versus commercial secrecy, which makes it difficult to assess accuracy while protecting secrets. He also mentioned the co-regulation model and the expansion of stakeholder participation. Finally, he raised the point of “ownership” of data and algorithms, which have been commercialized in China.

Thiago replied that there are many steps to be taken and that it is necessary to think about intelligent regulation that can actually be applied. He mentioned that hard law often fails to keep pace with innovation, so it is necessary to think of alternatives. He also spoke about partnerships between sectors that can foster dialogue and have been promoted in the last five years, such as hackathons, innovation hubs and regulatory sandboxes. All of them have specific characteristics and give the public manager an opportunity to approach the regulated field. But there is still a need to think about designs for transparency, diversity and amplifying voices.

Moderator Cynthia concluded by pointing out that the use and development of artificial intelligence is usually accompanied by the rhetoric of innovation, yet it is necessary to talk about the risks and impacts on fundamental rights. It is a challenge, and the panel ended by raising several questions and reflections. Finally, the use of facial recognition in public safety was mentioned, considered a discriminatory and insecure measure that harms vulnerable groups in Brazil.

The panel ended with a reflection on the balance between the protection of rights and innovation. Moderator Cynthia reminded the audience that they could ask questions through the institutional contact of the Laboratory of Public Policy and Internet, and moderator Alexandra closed the panel.

GENDER INFORMATION - In the virtual audience, about 7 women were present, apart from the moderator and the rapporteur, along with about 7 men. In the on-site audience, two women were present, apart from the moderator, along with about 8 men.

IGF 2022 WS #309 Access to remedies in safeguarding rights to privacy & data

Updated: Thu, 15/12/2022 - 14:25
Governing Data and Protecting Privacy
Key Takeaways:

More resources are required to inform public authorities of their responsibilities towards the data protection and privacy rights of data subjects.

Capacity-building efforts must focus on informing data subjects of their rights.
Session Report

Workshop #309:

Title: Access to remedies in safeguarding rights to privacy and data

List of panellists, chairs and moderators:

Panellists: Cynthia Chepkemoi (Data Protection Counsel (Advocate), Association of Privacy Lawyers in Africa, Kenya); Mosa Thekiso (Executive Head: International Legal & Regulatory Digital Services & Platforms and AI at Vodacom South Africa); Maureen Mwadigme (Senior Human Rights Officer: Kenya National Commission on Human Rights); Stella Alibateese (Director: National Personal Data Protection, Uganda)

Chair: Dr. Jonathan Andrew (Danish Institute for Human Rights)

Moderator: Cathrine Bloch Veiberg (Danish Institute for Human Rights)

Rapporteur: Line Gamrath Rasmussen

 

The session was moderated by Dr. Jonathan Andrew, representing the Danish Institute for Human Rights (DIHR), which is a national human rights institute (NHRI). The DIHR works closely with other national human rights institutions globally, a number of whom travelled to attend the IGF 2022. The DIHR continues to work on the theme of access to remedies, which is part of a broader project and initiative of the Action Coalition on Responsible Technology, an initiative funded by the Danish Foreign Ministry that brings together different stakeholders from civil society, nongovernmental organizations, public authorities, businesses, and other interested stakeholders who are participating in a yearlong program of events to strengthen the use of technologies responsibly on a global level.  The Action Coalition on Responsible Technology incorporates a work stream on policy coherence which is reviewing how regulations and different initiatives in relation to legislation are creating alignment in oversight, including in relation to access to remedies.

Substantive Report and Main Themes Raised:

Data Protection and Privacy Rights in Kenya

- Kenya's legal framework, comprising the Data Protection Act of 2019 and the Computer Misuse and Cybercrimes Act of 2018, provides the basis for regulating data collection, processing and retention. The Data Protection Act has provided Kenya with regulations that put in place the procedural rules on how the registration of data controllers and processors must be conducted. The Act also provides for a complaint-handling procedure and outlines how data subjects can file a complaint with the office of the Data Protection Commissioner. Whilst a process exists, the mechanisms to seek redress where there is a violation of privacy have taken time to evolve into viable means of remedy.

- Enforcing data protection law in Kenya has proven to be a painstaking process, and larger tech companies continue to be responsible for some of the infractions that occur. It remains the case that many citizens are not aware of the procedures that they need to follow, such that much remains to be done in terms of sensitisation and capacity building to ensure a citizen is aware of, and can actually follow, the legal procedures in place to seek redress.

- The Data Protection Act also establishes an intricate system of rights and obligations that operationalise the right to privacy. Data protection authorities have a duty to receive and act on all complaints by individuals, and the authority can also investigate issues it has identified on its own initiative. The first stage of compliance work is the DPA conducting privacy audits of organisations, reviewing their level of compliance in terms of data governance and whether they are actually registered as data controllers or data processors. Organisations that have not registered are not yet in compliance.

- A major consideration in relation to finding remedies is the range of reporting mechanisms available with respect to violations of privacy. Most frequently, the first port of call for any institution receiving a complaint is to attempt to resolve the dispute in-house. In certain cases, a party may have an alternative dispute resolution (ADR) mechanism in place and outlined in its privacy and data protection policy: where there is a data breach, this mechanism can be used to attempt to resolve the violation or breach.

- A second port of call for a violation of privacy rights is the Office of the Data Protection Commissioner (the Kenyan DPA): this is the authority in Kenya tasked with setting the rules and regulations on how personal data is handled, processed and stored, and it is the authority to which all data controllers and data processors are required to report any data breach or data loss.

 

- Personal data of data subjects in Kenya has on occasion been shared with a third party without consent having been given. Where information is shared with a third party without consent, this amounts to a violation of the data subject's rights. From experience, when a complaint has been filed with the DPA, the office takes around 14 days to respond. The DPA will then ask the party that is the subject of the complaint to respond and provide evidence: this reflects the importance of fair administrative procedure, whereby each party must be given an opportunity to defend itself. At this point, it is frequently realised that the data controller or processor actually had policies (also known as 'agreements') under which the data subject consented to the processing. As such, consent to wider processing is often very broad: data subjects simply haven't read the terms and conditions of the agreements, which can be extremely long and convoluted. A final avenue for redress is the courts.

- Public authorities, such as hospitals and schools (processing sensitive children's personal data or patient data), are often advised to have data-sharing agreements in place with respect to any transfers of personal data. These agreements can protect the organization from liability, and from the risk of court proceedings or complaints filed with the Office of the Data Protection Commissioner.

 

Access to Remedy in Uganda: Role of the Personal Data Protection Office

- In Uganda the right to privacy is enshrined in Article 27 of the Constitution of Uganda. In 2019, the Ugandan government enacted a comprehensive law, the Data Protection and Privacy Act, set up to further enhance the protection of personal data, and it introduced specific digital rights in Uganda. For example, the Act has an entire chapter on data subject rights, including the right of access to one's personal information, the right to erasure of personal information, the right to make corrections, the right to stop automated decision-making, and many others. Prior to the Act, Uganda had other laws that provided for privacy protection more generally. The law also provides for the Personal Data Protection Office. Part of its mandate includes resolving complaints from data subjects: if a person finds that her/his rights have been infringed upon by a data controller or data processor, the law gives them the right to make a complaint to the data protection office.

- The Personal Data Protection Office in Uganda also provides guidance, particularly to data controllers, on the interpretation of the law in respect of compliance issues. The legal framework also gives the DPA powers to investigate, and it can prosecute where it finds non-compliance. Under the same laws, the Ugandan DPA is required to register all data controllers and data processors: currently the entire system is online (including payment and certificate issuance). Under the online system the office also receives automated updates on complaints filed. The office activated the system around May 2022, and over 2,000 complaints have since been raised against various data controllers. Crucially, data subjects must be able to access their rights under the Act. The Ugandan Act is very specific: it provides for those rights within the regulations and sets out mechanisms for how data subjects can raise their complaints. Within the regulations there are specific provisions that require data controllers to respond to those complaints within certain timelines, ranging from 7 to 14 days.

- Under the guidance notes that the Ugandan DPA issued for data subjects to raise complaints, data subjects are required first to engage with the data controller or the data processor before they come to the office of the DPA (this aspect of the process is also enabled through the online system). If a data subject has a complaint to raise, she/he can use the system to generate the letter to submit to the data controller (it is generated automatically by the system). This was put in place to ease the complaint-filing mechanism, because it was known that many people may have challenges writing letters.

- Ugandan data protection law also requires data controllers and processors to have in-house complaint resolution mechanisms. The Ugandan DPA provides training for data protection officers, who are focal points of contact in these organizations, including on how to deal with various complaints. Regarding the Ugandan DPA's own role, its mandate under the law allows it to investigate complaints.

- In terms of the current legislation, given that the regulations were passed only in 2021, the country has not yet had any prosecutions brought under the new laws; however, the DPA does have a number of investigations currently under way.

 

A Business Perspective on Access to Remedy: Vodacom Group

- In its business activities Vodacom Group manages a number of privacy-related issues across the continent and across various countries. The issues the business deals with on a daily basis are broad, given its drive to bring Africa as a region fully into digital inclusion and financial inclusion: these are the main topics that are top of mind for Vodacom Group, which wishes to avoid a scenario where Africa and African consumers are left behind from a digital economy perspective.

- Much emerging and innovative technology requires large amounts of data and data processing. With these data-rich technologies, a key balancing act is how to use them whilst also looking after the rights of consumers.

- Vodacom has undertaken a study on how the business achieves this balancing act in the current regulatory environments present across Africa. It is important to point out that remedies differ from jurisdiction to jurisdiction, which poses many challenges for Vodacom as a big business. Robust, relevant measures are in place at Vodacom; by contrast, it is difficult to convey just how complex it is for a smaller entity trying to operate across Africa to grapple with these different laws as they change from country to country.

- The main barriers that Vodacom has identified with regard to rolling out data-rich technologies are data localization laws. Vodacom observes that Cloud service providers, whose technology it leverages when dealing with big data or AI, tend to take a regional approach. As such, to use these technologies Vodacom has to decide where to centralise the operation of the technology in question: for example, would it use the Amazon Cloud in Cape Town or perhaps another Cloud in Kenya? However, because Vodacom wishes to move its businesses forward throughout those jurisdictions at the same time, it tends to use one hub, which means that data is always moving across borders. Thus, the first critical issue is data localization. The second is that many countries across Africa have data protection laws in place, but others do not yet. In some countries there is, however, a constitutional right to privacy, which a business such as Vodacom obviously has to take into account.

- Vodacom conducts its own studies to determine how it develops and responds to emerging factors relating to data protection and privacy laws. Taking into account the rights protected from a constitutional perspective and also under data-specific or data protection-specific laws, it has reviewed a number of best practices contained in policies and in digital agreements. It has reviewed the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (Convention 108+). Vodacom has also looked at the EU's digital transformation strategy and the data policy framework.

- From a standards perspective, which forms much of the business's focus, it also looked at Mauritius, a good example of a robust data protection act that takes care of the rights of data subjects and of signatories. From a bilateral perspective, the business also reviews preferential trade agreements: Singapore, for example, has robust bilateral agreements with Australia and with New Zealand. Vodacom also undertook to examine the African Continental Free Trade Area (AfCFTA). This outlines a broad approach by a business to evaluating policies so as to develop best practice: the recommendations the company makes are thus what it understands it needs in order to protect data subjects and respect the laws.

- Taking the technologies into account, it is important to consider how to take a regional approach; thus, the business also evaluates policies from a regional perspective. Another option that can be considered is regional cooperation through trade agreements, where provision can be made for rights and for regulatory reform.

- Vodacom Group takes a regional approach in cases where there aren't any data protection laws in place. It also encourages the ratification of international conventions such as Convention 108+. Further, it understands that a foundation is the right to privacy, which exists in most constitutions. In addition, Vodacom has in place specific business measures such as privacy by design (PbD): this is part of its approach whenever it is dealing with any data technology, and is essentially a constant in the current operating environment. Privacy impact assessments (PIAs) are also used, including internally for jurisdictions that don't have laws in place: this has been developed as internal best practice. Whenever the business is working with any kind of data processing, it starts with a privacy impact assessment, with adaptation to each jurisdiction as required given the different laws.

- Vodacom Group, when given the opportunity to comment on the various policies or laws that are still in draft, provides input, e.g. on a new bill in Tanzania. Vodacom aims to take a robust and balanced approach in its activities: this is key to protecting the rights of consumers from a privacy perspective. On the AI side, the picture is broader: other constitutional rights are impacted, such as freedom of expression, equality and non-discrimination, and one also has to consider how to deal with biases in data. Vodacom is therefore constantly thinking about these rights, whilst at the same time trying to cater for digital inclusion and financial inclusion.

 

The Role of National Human Rights Institutions – the Kenya National Commission on Human Rights

- The Kenya National Commission on Human Rights (KNCHR) is an 'A' status national human rights institution according to the Paris Principles. The Commission has a clear mandate to speak on matters of digital rights. With regard to emerging digital technologies, it has become very clear that even seemingly neutral technologies can replicate pre-existing inequalities and contribute to marginalisation. Technology impacts human rights positively, and at the same time may have a negative impact: this is where oversight institutions, such as the Kenya National Commission on Human Rights, can function alongside other bodies such as data protection authorities.

- As a national human rights institution, the KNCHR is very keen to provide oversight of online spaces to ensure that the milestones achieved in the physical world are not lost in digital spaces. It notes that many issues and human rights concerns have been arising in online spaces, and unfortunately most are often not regarded as human rights issues. For example, the KNCHR conducted a case study in Kenya after receiving a large number of complaints on matters of freedom of expression, where activists have been arrested and charged (frequently with offences under Kenya's Computer Misuse and Cybercrimes Act 2018). This proved particularly the case during the COVID-19 pandemic, when human rights defenders took to expressing themselves online as opposed to going on the streets, given the familiar limitations on public protest.

- Censoring and blocking are also key issues. For example, public institutions that have a Twitter handle or Facebook page have unfortunately taken steps to avoid criticism by seeking to censor negative comments about their actions and activities. In certain cases these authorities have taken steps to ensure people are blocked from receiving any messages or interacting further on particular platforms.

- Surveillance is also a key concern, including government surveillance and surveillance by businesses. The targeting of consumer decisions and the gaining of insights into activities through the processing of personal data, such as by FinTech companies, is considered a huge problem in Kenya. The Central Bank of Kenya has been spearheading regulation of this sector so as to ensure a sensible approach with respect to FinTechs. There has also been progress with regard to oversight of government surveillance activities targeting civil and political rights in Kenya, including voting rights.

- Kenya has also experienced a number of massive data protection breaches. Prior to the elections in August 2022, a large number of Kenyans found themselves registered as members of political parties with the Office of the Registrar of Political Parties even though they had not in fact registered themselves. This instance reflects a very interesting finding: political parties will go the extra mile to obtain very specific information on individuals in order to meet the threshold required by the Office of the Registrar of Political Parties for registration as a political party.

- Another observation in Kenya is that the country is seeing quite a lot of movement in terms of compliance in the private space. In fact, the Office of the Data Protection Commissioner recently set out requirements on regulations and compliance procedures for private companies. For government, however, the situation is quite different. In essence, there also needs to be awareness in government of the need to follow data protection laws: government is in effect the largest data controller. There still exists a misguided belief that the public sector cannot infringe personal data laws, and this view must be challenged. Fundamentally, in Kenya, state departments, agencies and the government in general should lead by example and implement data privacy programmes within their organizations. On the issue of access to remedy itself, national human rights institutions (NHRIs) are very independent and trusted entities and can thus be engaged successfully. The KNCHR already receives a lot of complaints and feedback from communities and from users of particular technologies. Providing legal advice and holding public awareness forums thus continue as activities conducted by the NHRI so that citizens are actually helped to understand their rights, especially digital rights.

Conclusions: How should digital accessibility issues be tackled so as to safeguard the digital rights and access to remedies that the different stakeholders are all working to achieve?

Response from the Kenya National Commission on Human Rights (KNCHR):

NHRIs, such as the KNCHR, can work to ensure that vulnerable and marginalised groups are not adversely impacted when it comes to access to online services. Unfortunately, what is happening right now is that technologies are often marginalising the vulnerable even further. Thus, it is important for an NHRI to consider how it can work with ISPs and other companies to understand the needs of the specific areas that have been mapped out. Secondly, it is important to ensure that services are equitably distributed across the population. However, the business angle must also be taken into account: whether companies will be able to recoup their costs when they go into more rural areas. A key question is therefore how best government incentives can be used to ensure that such companies can reasonably reach out to these offline areas while mitigating the higher costs of doing so. This is an issue that requires a multi-sectoral approach: it cannot be dealt with by one sector alone. It is a challenge that requires mapping, monitoring and reporting, all of which must be performed so that vulnerable and marginalised groups actually benefit from increased network connectivity. Stakeholders in the digital sector need to work very closely together, as human rights are interdependent and the different roles that actors play in their respective capacities all complement each other. Working in silos doesn't work; it is clearly necessary that the respective stakeholders in their different capacities come together in order to impact positively on protecting the rights of users of the technology under development.

 

Response from the Ugandan Personal Data Protection Office:

Digital connectivity and access are a valid concern. For its part, the Ugandan DPA is trying to address the issue by creating awareness in the local languages of the country (there are over 50 tribes speaking different dialects). At the Ugandan DPA office only 3 or 4 of the dialects are spoken: this presents a large challenge when creating digital literacy programmes, since it is clearly important to communicate in the languages most people understand. As such, this remains a challenge the DPA is working out ways to address. First of all, it interprets the laws and then develops materials that can create awareness of them among the population. Secondly, in terms of access, whatever technology is developed, the Ugandan DPA makes sure it provides for communication through current and future smartphone devices. The complaint system, for example, is one that clearly interfaces with the population. The DPA has also enabled SMS and other technologies that allow an individual, even with a basic device, to reach the authority and communicate. Obviously, the issue of engagement and connectivity is a journey, and government needs to continue these efforts until the gaps are bridged.

 

Response from Cynthia Chepkemoi (Advocate):

Digital literacy is a broad challenge. Working with different institutions, it is clear that in creating awareness and improving digital literacy among marginalised communities, and more especially women and children, the best approach is to work through associations: that is where many people and institutions can be reached. For example, in Kenya classes have been provided to train children on digital literacy and on cybersecurity and the skills they need to stay safe online. Also important is identifying the specific groups that are most marginalised in the digital space. At times one of the major challenges is the infrastructure itself: in trying to roll out services to marginalised communities, it is often found that they lack the infrastructure, which makes it even more difficult to enhance digital literacy. Working through associations and civil society organizations therefore calls for a multi-stakeholder approach; a collaborative approach is required to actually attain the digital literacy levels we need to see in our communities.

 

Response from Vodacom Group:

Vodacom has a very robust social contracting programme, and a big part of its function, when rolling out various products and services, is to consider connectivity: for example, with its mom-and-baby app (essentially a healthcare product that tracks pre- and post-natal development). For such services to actually go into the market, users need a smartphone; thus smartphone penetration is key, as is the relevant digital literacy. As part of Vodacom's social contracting programme, as it rolls out its various products that cut across different sectors (e.g. healthcare, education), it partners with Cloud service providers on areas such as education. Vodacom continues to look at specific issues and identify new areas, and this approach goes hand in hand with educating consumers and users of those products on their rights, on what the business does with their data, and on how the company secures their data. In addition, it is also important to inform them how they can hold the business accountable if they are not comfortable with how their data is being processed, or if they don't understand what the company does with it. It is important that consumers have at their disposal a resource or various channels to approach the company so they can learn and be informed.

 

- - -

 

IGF 2022 Open Forum #56 Enhance International Cooperation on Data-Driven Digital Economy

Updated: Thu, 15/12/2022 - 12:50
Governing Data and Protecting Privacy
Key Takeaways:

All parties, including governments, civil societies, private sectors, etc., should cooperate to jointly build a community with a shared future in cyberspace. Efforts should be made to deepen digital exchanges and cooperation, expand economic and trade exchanges in the digital field, promote the interconnection of digital infrastructure, enable digital access for all, and jointly promote the sustainable development of global digital economy.


The establishment of international rules for digital governance based on consultation and consensus will enhance the development of digital economy. More engagement in establishing international rules for digital trade helps reduce trade barriers and facilitate the sound and orderly development of international trade.

Session Report

During the session, most speakers emphasized the necessity and urgency of international cooperation on digital economy, digital connectivity and data governance. They shared practices of different stakeholders in boosting digital economy while ensuring data security. It is suggested that governments, civil societies, private sectors and other stakeholders should unite and cooperate to jointly build a community with a shared future in cyberspace. Efforts should be made to deepen digital exchanges and cooperation, expand economic and trade exchanges in the digital field, promote the interconnection of digital infrastructure, enable digital access for all, and jointly promote the sustainable development of global digital economy.

Experts also focused on the establishment of international rules for digital governance. They maintained that the establishment of international rules for digital governance based on consultation and consensus would enhance the development of digital economy and that more engagement in establishing international rules for digital trade would reduce trade barriers and facilitate the sound and orderly development of international trade.

Some expressed that, in order for all to develop and benefit globally from cyberspace, international cooperation must take place in cyberspace, and that the Initiative on China-Africa Jointly Building a Community with a Shared Future in Cyberspace should become a global initiative because of its comprehensiveness, clarity, transparency and global mutual benefits.

In addition, some advocated more engagement in the international governance of digital economy such as joining the G20 Digital Economy Development and Cooperation Initiative, the Global Development Initiative (GDI) and other international cooperation agreements.

Reflection on Gender Issues (Gender Report):

The number of participants in the open forum was more than one hundred, and the percentage of women and gender-diverse people that attended the session is estimated to have reached over one third. The session managed to engage with gender as a topic. The onsite moderator and keynote speaker Ms. Yik Chan Chin delivered a speech under the theme of “Building Gender-Inclusive Digital Ecosystems”. She analyzed the challenges and difficulties facing women in the digital technology ecosystem and shared her insights into enhancing gender inclusion in the digital world, which were echoed by the participants at the session. She maintained that woman-centered design could enable digital innovation hubs to tailor their services to women’s needs and that collaboration should take place between digital ecosystem stakeholders.

IGF 2022 DC-Jobs Responsible Internet Usage

Updated: Thu, 15/12/2022 - 05:42
Enabling Safety, Security and Accountability
Key Takeaways:

Key takeaways: 1. Focus on Decentralization, Localization, and Governance: Technologies of the future are going to have a significant intersection of the Internet and ESG principles. The next generation of social media is going to be more decentralized, providing more empowerment and local governance. The Federated model of the Internet creates a lower digital footprint by building an internet that is more scalable and conscious in its power and

Calls to Action

Action items: 1. Quantify the carbon footprint of digital activities, create labels for the matrix of usage like those of the food and aviation industries, and embed the standards in platforms / the Internet Protocol. For example, emails could carry a label indicating their environmental impact in terms of carbon emissions. Currently, mobiles have systems to show us screen time and have parental controls; a similar mechanism could be initiated for the carbon footprint of o

Session Report

Session: Responsible Internet Usage

Date: December 2nd 2022

Time: 10:45- 12:15 UTC+3

Theme: Enabling Safety, Security, and Accountability

At Banquet Hall A

Session Chair: Dr. Rajendra Pratap Gupta, Chairman- Dynamic Coalition on Internet & Jobs, Internet Governance Forum (IGF).

Rapporteur: Ms. Smriti Lohia

 

The session started with opening remarks from Dr. Rajendra Pratap Gupta, Chairman of the Dynamic Coalition on Internet & Jobs, Internet Governance Forum (IGF). He released the report on ‘Responsible Internet Usage’. He discussed how digitalization is becoming an integral part of our lives and how it impacts the environment. Dr. Gupta offered a striking perspective on the topic: if we take an average lifespan of 70 years and assume humans sleep around eight hours a day, we sleep roughly 24 years of our lives; and if we estimate nine hours on the web per person per day, we spend almost 21 years of our lives on the internet. That means the majority of our waking time, particularly for office workers, will be spent on the net, and that will certainly have an impact on our carbon footprint.
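The back-of-envelope arithmetic behind these lifetime estimates can be sketched as follows. This is an illustrative calculation only: the 70-year lifespan and the daily hours are the speaker's stated assumptions, and his figures are rounded (his 21-year web estimate presumably discounts early childhood).

```python
# Rough check of the lifetime estimates quoted in the session:
# hours spent per day on a habit, scaled to a whole lifespan.

LIFESPAN_YEARS = 70  # the speaker's assumed average lifespan

def lifetime_years(hours_per_day: float, lifespan: int = LIFESPAN_YEARS) -> float:
    """Convert a daily habit into the total years it occupies over a lifetime."""
    return lifespan * hours_per_day / 24

sleep_years = lifetime_years(8)  # 8 h/day asleep -> about 23.3 years (quoted as ~24)
web_years = lifetime_years(9)    # 9 h/day online -> about 26.3 years if kept up for a whole lifetime

print(f"Sleeping: {sleep_years:.1f} years; on the web: {web_years:.1f} years")
```

Sustained over an entire 70-year lifespan, nine hours a day would actually exceed the quoted 21 years, which is consistent with the estimate applying only to the years a person is actually online.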

 

Mr. Gunjan Sinha, Executive Chairman of MetricStream and a tech pioneer, has been closely involved with the Internet since the early 90s, when it was still a research network. The Internet has become foundational to our lives, societies, and nations globally. He talked about the intersection of the internet and ESG principles: the environmental dimension focuses on the carbon side, while the social dimension focuses on the digital divide. The environmental, social, and governance aspects of the Internet need to be considered very carefully at both the policy and individual levels. According to him, social media will be more decentralized in the next generation, and when you decentralize the internet, it creates more empowerment and more local governance. So we have to move towards more decentralization, more localization, and more governance at the local level, even though the internet is assumed to be a global network. As we move to a Federated model, we will create a lower digital footprint.

We have to label our digital activity properly. Labeling standards must be used in the digital world; for example, a label at the bottom of an email about its negative impact. As we label, we start to create awareness, which leads to the right change of behaviour, which in turn leads to a more ESG-centric internet, something that has been missing in the overall architecture of the Internet of the future. This requires governments to come together, and it also requires standards bodies to come together to create labeling standards.

Dino Cataldo Dell’Accio, Chief Information Officer at the United Nations Joint Staff Pension Fund (UNJSPF), talked about the fund’s Digital Identity project, a move towards digital transformation. The system is now based on new technologies such as biometrics and blockchain, and the challenge is how to provide assurance about these technologies to governing bodies and other stakeholders.

He emphasized that a lot needs to be done to make sure that technology is used in an accountable manner, providing assurance about whether and how it is being used while taking into consideration all of its implications, whether environmental sustainability, social responsibility, energy consumption and so forth. Governments, international organizations, the private sector, and professional associations should work together to create a set of standards in order to provide assurance of, and reliance on, the responsible use of these technologies.

Mr. Erik Solheim, Former Environment and Development Minister of Norway, and former executive director of the United Nations Environment Programme (UNEP), touched upon the IT industry’s responsibility for solving the global environmental crisis. The industry should aim to go net zero, sourcing its data centers from renewable energies and buying carbon credits for emissions that cannot be abated. It should use its enormous outreach on various social media platforms to facilitate a broad dialogue on how to solve environmental problems.

Dr. Pooran Chandra Pandey, a senior visiting fellow at The Institute for Democracy in Taipei, gave a brief on the scale and impact of the carbon emissions we create by doing very small things, like sending an email, which nonetheless matter a lot. The serious problem with any technology arises when we use it without really knowing the consequences it will have for health, climate, and society. Knowledge is therefore important, because most of us do not know the impact of a 1 MB email and the data created over its life cycle. In aggregate, individuals contribute more than 300 million tons of CO2 annually, from sending emails and texts to playing games online, without being aware of the negative impact this creates for the people around us and for the environment, which is an intrinsic part of life.

A few things need to be considered: we have to educate children about these issues from school onwards, and companies that produce technology need to be more transparent in their sustainability reports about the value they create, not only by selling technology but also by being aware of how technology affects people’s wellbeing.

 

Osama El Hassan, a senior digital health expert on Smart Health at the Dubai Health Authority, talked about excessive use of the internet and its association with people’s health. The consequences of excessive internet use are both physical and psychological, and can damage the economy as well as affect cognitive abilities. The key health issues clearly associated with excessive internet use are high blood pressure and obesity; sitting for long periods, or focusing on internet devices for long stretches, especially among people addicted to gaming, affects blood pressure. On the psychological side, anxiety is becoming more and more prevalent, especially among adolescents, who now have more difficulty interacting with the real world. This area needs much consideration and raises many governance issues.

We also need legal governance around misbehaviour, bullying, and shaming, at the local, regional, and country levels. We need a framework to ensure that these interactions will not have a psychological impact on users.

 

Smriti Lohia, co-author of the Responsible Internet Usage paper, talked about how the internet will soon become a basic human necessity and why it is now important to understand responsible usage of the Internet. People need to start focusing on their digital carbon footprint. She emphasized that while using the internet we need to consider what is necessary and what is not, for what purposes we are using it, which platforms we are using, and what kind of content we are accessing. We need to consider our responsibility towards the internet at a very granular level.

The session ended with a vote of thanks from the chairman, Dr. Rajendra Pratap Gupta, and with a promise to come out with more reports on the labeling of our digital activities, to create mass awareness about the carbon impact of our digital footprint, and to work towards making people responsible in their internet usage.

To learn more about our work,  visit: https://www.intgovforum.org/en/content/dynamic-coalition-on-internet-jo…

IGF 2022 Town Hall #82 Sustainable Automation as SDG-18?

Updated: Thu, 15/12/2022 - 05:26
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Bridge the skill gaps for future jobs through lifelong learning. Technology is here to stay, and it is important that lifelong learning and upskilling are made integral to our education.

,

Sustainable Automation should be considered by the UN as SDG #18

Session Report

Sustainable Automation as SDG-18?

 

Key Takeaways:

  • Disruption in the labour market due to automation
  • Identify the gap between current and futuristic skilling
  • Control of the private sector over different areas/sectors of automation
  • Focus on job creation
  • Bridging the skills needed for future jobs
  • Life-long learning process to life-long skilling
  • Balanced use of automation

Calls to Action

  • Bridge between job opportunities and academics
  • Continuous Lifelong Learning to Continuous and Lifelong Skilling
  • Focus on Up-skilling, Re-skilling, and Futuristic skilling
  • Balanced use of Automation
  • Basic social safety and security for people
  • Creating ideas for capacity-building of futuristic jobs
  • Sustain-Able Automation as SDG 18 or included in SDG 9 – Sustainable industrialization

 

Session Report

Session: Sustainable Automation as SDG-18?

Hosted by Dynamic Coalition on Internet & Jobs, Internet Governance Forum & Digital Health Associates ( https://www.digitalassociates.health/ ). 

IGF 2022 Town Hall #82

Press Briefing Room

13:15 IST (1 Dec) -14:15 IST (1 Dec)

Rapporteur: Ms. Rahatul Jannah

Session Chair: Dr. Rajendra Pratap Gupta, Chairman- Dynamic Coalition on Internet & Jobs, Internet Governance Forum (IGF).

 

The session started with opening remarks from Dr. Rajendra Pratap Gupta, Chairman of the Dynamic Coalition on Internet & Jobs, Internet Governance Forum (IGF), who released the report on ‘Sustain-Able Automation as #SDG 18’. Dr. Gupta remarked that technology should not just focus on productivity, profits, and proliferation; there is also a need to keep people at the core. He discussed whether we should use automation indiscriminately or discreetly, and briefed the audience on the Sustain-Able Automation report, which looks at automation in countries large and small and in sectors including agriculture, manufacturing, and retail services. He then moved to the panel to discuss the effect of automation on jobs and whether Sustain-Able Automation should be #SDG 18.

Dr. Rishi Mohan Bhatnagar, President, Aeris India, and co-author of the book, Enterprise IoT, which covers Internet of Things (IoT) project management frameworks, talked about the time when the industrial revolution started in the western world and all the people who spun the wheel in India lost their jobs. He mentioned that whenever there is a technological advancement, there will be a transformation and a disruption. In 1986, when the computerization of railways was initiated in India, people were completely against computerization. However, advancement in technology will not stop, and we will have to adapt ourselves. He pointed out that policymakers need to come up with ideas for skilling resources and creating capacity-building for futuristic jobs.

Asish Thakur, Executive Director, Glocal Pvt. Ltd., talked about automation in Nepal and mentioned that a lot of work is going on in terms of automation in different sectors. He cited research conducted by UNICEF in 2019, which found that by 2030 more than half of students would leave school without the skills needed for future jobs. So we need to adapt to technological advancements and focus on skilling. He added that many young people have now started taking up upskilling and reskilling programs and are getting into coding, design, and other activities, which at times provide them with equal or even greater opportunities and resources. There therefore needs to be a bridge between academics and job opportunities. He concluded that people will need to learn futuristic skills to adapt to technological advancements.

Mr. Suresh Yadav, Deputy Head, Secretary-General’s Office, The Commonwealth Secretariat, quoted the Sustain-Able Automation report’s observation that the total market value of Apple is much higher than the combined GDP of many countries, which puts into perspective how much control companies have over automation and how they define and decide its direction and areas. He mentioned the shift of power from government to the private sector, and how the private sector controls important assets and determines changes in policies. Mr. Yadav also remarked that technology has the power to disrupt, and neither governments nor individuals can stop this disruption; hence there is a need to look for the best ways to adapt and prepare ourselves for those changes. He emphasized shifting from ‘continuous lifelong learning to continuous and lifelong skilling.’

Mr. Pooran Chandra Pandey, former member of the Board of Trustees, United Nations World Food Programme, and currently International Visiting Fellow, Taiwan Foundation for Democracy, Taiwan (R.O.C.), discussed how to balance technology so that it does not undermine human centricity. He elaborated on how technology is evolving and creating disruption at a very large scale, throwing people out of jobs and bringing upheaval to the labour market, a disruption waiting to be accelerated by excessive technological application in offices. He remarked that it is the joint duty of government and the private sector to identify the areas, sectors, and people to skill, and to provide basic social safety and security. He concluded by saying that automation is not only going to displace people from several sectors but may also create social disorder; therefore, there is an urgent need to mitigate the downsides of automation.

 

Toward the end of the session, the majority of participants opined that it is time to consider ‘Sustainable Automation’ either as SDG-18 or under the existing SDG covering sustainable industrialization, i.e. SDG-9.

The session ended with a vote of thanks from the Chairman, Dr. Rajendra Pratap Gupta.

For more information on Sustainable Automation, please visit the webpage https://www.intgovforum.org/en/content/dynamic-coalition-on-internet-jo…

 

IGF 2022 Open Forum #46 Strengthening MS collaboration on DNS Abuse

Updated: Thu, 15/12/2022 - 00:39
Enabling Safety, Security and Accountability
Key Takeaways:

The multistakeholder model includes a role for users, and it is important to reflect this in due process and to consider the impact on human rights, for example through notification of and engagement with the user when their domain name is implicated in a form of abuse. Recourse mechanisms also form part of due process when prior notification of action is not possible, for example due to severe, well-evidenced concerns of harm to the public.

,

There is growing regulatory pressure on the need to address content-related abuses at the platform level, and increasingly through the DNS. Given the technical limitations of the DNS in targeted interventions, there is a need for a multistakeholder process on developing principles, criteria, and thresholds to demarcate the limited set of situations where the DNS may be used to remediate specific types of content. However, it is important to note that the purpose is not to legitimize content restrictions through the DNS.

Calls to Action

Continued multistakeholder dialogue on definitions of when to act at the DNS level on which types of abuses, and strengthened due process, towards ensuring an open, accessible, and safe internet for all.

Session Report

The DNS Abuse Institute and Internet & Jurisdiction Policy Network Open Forum on Strengthening Multistakeholder Collaboration on DNS Abuse took place on Wednesday, November 30, 2022, from 06:30 to 07:30 UTC. It engaged a multistakeholder panel composed of industry representatives (both generic Top Level Domain (gTLD) and country code Top Level Domain (ccTLD) registries), government, law enforcement, and civil society representatives on the question: “when is it appropriate to act at the level of the DNS to address online abuse?”

The session was structured around the following key pillars:

  • What does DNS abuse look like? 
  • What does acting at the DNS layer mean? 
  • When do you think it's appropriate to act through the DNS? 
  • What is the role of multistakeholderism in DNS abuse? 

This session focused on the role of the multistakeholder model in relation to when it is appropriate to address online abuses through the Domain Name System (DNS). Domain registries and registrars are part of a centralized system of Internet infrastructure that provides an addressing system for the Internet. 

The Internet Corporation for Assigned Names and Numbers (ICANN) is a multistakeholder organization where consensus policies are developed by the community including the contracted parties (e.g., Registries and Registrars). These contracted parties who operate generic Top Level Domains (gTLDs) can be impacted and ultimately bound by these consensus-driven policies. Country Code Top Level Domains (ccTLDs) also fit into the ICANN ecosystem but have their own systems of policy development which can also include multistakeholder engagement on a national or regional level. 

There are various definitions of ‘DNS Abuse’. Sometimes the term can be used as shorthand to indicate ‘action is appropriate at the DNS level’. However, the conversation benefits from a more granular discussion and the consideration of context for specific types of online abuse.

The DNS is a tool that allows users to connect to specific addresses on the Internet (often websites), but it is separate from the content on those websites, which is not within the control of a registry. Acting at the DNS layer often entails deleting or suspending an entire domain name, which can have far-reaching, often unintended consequences for the registrant and website users. For a registry, remedying abuse often means working with registrars or other service providers.

For governments and law enforcement around the world, it can be challenging jurisdictionally to address harm occurring on the Internet, with its transnational nature, when the actors and intermediaries are distributed globally and laws are not aligned across the world. 

There are more actors involved in the ecosystem beyond registries and registrars, for example hosting providers and the registrants themselves (the users of the domain names).

On what DNS abuse looks like, panelists delved into what different operators consider to be appropriate to address at the level of the DNS and the wide distinction between gTLD registries and ccTLD registries. It was identified that ccTLD registries are often much closer to national laws and may be required to follow national procedures. 

When considering action at the DNS level, it is important to differentiate technical abuse from content abuse, to assess evidence, and to consider principles. Is the ‘tool’ available to that operator effective in mitigating the specific harm, precise, proportionate, and limited in its potential for collateral damage? If action is taken in error, it can often be reversed (e.g., a domain name can be restored), but the consequences of the error may not be reversible. Those consequences could come with an unacceptable impact on fundamental human rights, for example by causing a loss of connectivity for critical health or informational services. The potential to cause more harm than the initial issue detected should be taken seriously.

It is essential that law enforcement in particular have respect for due process as they investigate and report harm. It is also essential that operators of infrastructure are aware of the potential human rights impacts of potential actions. 

In addition, it was identified that ​​the role of the multistakeholder model is important but is better suited to some tasks than others. There is currently a movement within ICANN by the contracted parties to request changes to their contract. In particular the request is for focused and targeted amendments to take reasonable and appropriate action to mitigate or disrupt malicious registrations when reports are properly evidenced. 

There is also a role for the multistakeholder model within ICANN to undertake further, more detailed work on the topic of DNS Abuse. In addition, there is also a need to engage with actors outside the ICANN DNS Community ecosystem. 

The multistakeholder model includes a role for users, it is important to reflect this in due process and consider the impact on human rights—for example, through notification and engagement with the user that their domain name is implicated in a form of abuse. Recourse mechanisms also form part of a due process when prior notification of action is not possible, for example due to severe, well evidenced concerns of harm to the public. 

There is growing regulatory pressure on the need to address content related abuses at platform level (where the content ‘lives’), and increasingly through the DNS. Given the technical limitations of the DNS in targeted interventions, there is a need for further work through the multistakeholder process to develop principles, criteria, and thresholds to demarcate the limited set of situations where the DNS may be used to remediate specific types of content. However, it is important to note that the purpose is not to legitimize content restrictions through the DNS which is neither technically possible nor recommended.

IGF 2022 WS #235 Dialogue on the 'Declaration for the Future of the Internet'

Updated: Wed, 14/12/2022 - 19:09
Connecting All People and Safeguarding Human Rights
Session Report

On April 28, 2022, the United States announced the Declaration for the Future of the Internet (DFI), a commitment signed by 61 like-minded nations to reclaim the promise of the early internet in the face of 21st-century challenges. The IGF session, Dialogue on the Declaration for the Future of the Internet, was held on December 1, 2022, and gathered experts from signatory and non-signatory countries to debate the policy questions the Declaration raises.

Milton Mueller from Georgia Tech's Internet Governance Project opened the session by asking panel members whether their host countries signed the declaration and, if not, to explain why they didn't.

The Cyber Ambassador for Germany, Regine Grienberger, opened by stating that EU member states had all signed the Declaration. Grienberger highlighted how EU stakeholders are reaffirming their stance on ongoing digital transformations through other initiatives such as the Declaration on Digital Rights and Principles for the Digital Decade, as well as Germany's engagement in the Freedom Online Coalition.

BRICS-bloc countries did not sign the Declaration, taking issue with both the process and the substance. According to Dhruva Jaishankar from Observer Research Foundation (ORF) America, India did not sign the DFI because its drafting process did not include sufficient consultation with Indian officials or an emphasis on national security. Jaishankar noted that while nations like India may pay lip service to multistakeholder principles, digital nationalism will be India's predominant form of internet governance going forward.

According to Anriette Esterhuysen from Civil Society, African Group, South Africa did not sign the DFI due to a customary position of not signing international agreements it did not negotiate. That said, South Africa is broadly aligned with most DFI principles, except for multistakeholder governance. Esterhuysen added that the DFI sends a geopolitical signal of alignment between like-minded and democratic nations, and that the DFI language makes it difficult for many other states to align themselves with the document. Esterhuysen hopes civil society will use the DFI to hold countries accountable.

According to Louise Marie Hurel from Civil Society, Latin American and Caribbean Group (GRULAC), Brazil does not typically sign international agreements when it was not part of the conceptual and negotiation phases. Further, Hurel highlighted how Brazil's foreign policy stance could best be described as strategic ambiguity, where cooperation with different geopolitical blocs is made on an ad hoc basis. For example, Hurel noted that Brazil cooperates with the West on the international counter-ransomware initiative while maintaining a working relationship with other countries such as China and Russia. Hurel said the DFI achieves its objective of sending a government-to-government political signal but was less sure about its adherence to a multistakeholder process. For Hurel, the DFI is about creating trust with countries in the middle geopolitical ground, and a stakeholder-mapping effort for the US to gauge willingness to support from its allies.

Assistant Secretary of Commerce for Communications and Information and National Telecommunications and Information Administration (NTIA) Administrator Alan Davidson noted that the DFI's intent was to signal a shared vision and renewed commitment around its seven core principles, given the rising trend of digital authoritarianism. Davidson noted the DFI was conceived as an intergovernmental declaration because it started as a contribution to the Summit for Democracy, where governments were initially approached. Davidson hopes the document can attract more countries to become signatories and allow the multistakeholder community to advance the DFI vision and hold nation-states accountable.

Milton Mueller asked whether the Declaration means the US will be more willing to distance itself from forms of digital sovereignty. Mueller also emphasized the difficulty of navigating the tension between creating an exclusive geopolitical bloc of like-minded nations and allowing countries that cannot yet agree with the DFI principles the possibility of signing on and adhering to the document in the future.

Finally, the panel decided to bring the DFI principles into future IGFs to gain more support for, and discussion of, those principles. The Q&A portion of the session included the following interactions:

  • An unnamed UK government representative asked what could practically be done with the DFI.
    • Esterhuysen answered that the DFI principles could be used to engage in public, open review processes of a country's recent online legislation.
  • Moira Whelan from the National Democratic Institute (NDI) in Washington, DC, emphasized civil society's engagement in developing the DFI. Whelan also noted that government colleagues were reluctant to provide civil society with opportunities to contribute meaningfully, and asked the panelists for a detailed explanation of the mechanisms for civil society to participate.
  • An audience member noted that since DFI was a project early in the Biden administration, it suffered from an "objective creep" problem. He suggested that UN member states declare their aspirations, intent, vision, and commitments to the future of an open internet and engage in that process through the General Assembly or the Secretary-General.
  • Yik Chan Chin, from the Oxford Global Society and Beijing Normal University, noted that the DFI was a geopolitically driven initiative and questioned the intent behind the Declaration.
  • Matthew McNaughton from Kingston Jamaica's SlashRoots foundation indicated that a declaration of values by like-minded actors would not necessarily be the best vehicle for achieving an open, un‑fragmented internet. He noted the DFI might have the opposite effect of further highlighting distinct divisions and separate visions.
  • Izaan Khan noted that the Shanghai Cooperation Organisation, of which India has been a member since 2017, has made digital sovereignty the foundation of internet governance and is promoting it through an international code of conduct on information security.
IGF 2022 WS #494 Cutting Ties: Citizens caught between conflict and tech

Updated: Wed, 14/12/2022 - 10:50
Avoiding Internet Fragmentation
IGF 2022 WS #260 Protecting Shared Computation (Cloud Security)

Updated: Tue, 13/12/2022 - 21:49
Enabling Safety, Security and Accountability
Key Takeaways:

Data is the lifeblood that guides the decisions of most organizations, but old ways of thinking about data protection are not fit for the era of digital transformation.

,

Cloud security and protection start with complete visibility into the security and compliance posture of every resource you deploy into the cloud.

Calls to Action

It's time to devise a strategic plan to protect data so that organizations can reap the benefits of working in the cloud without increasing the risks of exposure.

Session Report

Organizations and the public face security concerns regarding cloud environments. Despite the fact that many organizations have decided to move sensitive data and important applications to the cloud, concerns about how they can protect them abound. The technical community therefore needs to help reduce the risk of data exposure, while the private sector and civil society need to play key roles in educating the public and raising awareness of security concerns.

Data sovereignty and residence control have created major concerns around data control. With protection regulations such as the GDPR limiting where EU citizens' data can be sent, the use of data centers outside approved areas could place organizations in a state of regulatory non-compliance. Other regions, such as Africa, the Americas, Asia, and Australia, adopt different jurisdictions and laws regarding access to data for law enforcement and national security, which can also impact the data privacy and security of nations.

IGF 2022 WS #471 Addressing children’s privacy and edtech apps

Updated: Tue, 13/12/2022 - 20:24
Governing Data and Protecting Privacy
Key Takeaways:

The use of edtech apps by children and adolescents generates various risks, especially with regard to privacy and the protection of their personal data. The large corporations that create and provide these services, some of which are free, can collect massive amounts of data and use it for personalized advertising and for behavioral modulation based on children's vulnerabilities.

Calls to Action

It is necessary that governments put children's best interests at the center of the debate, including hearing their opinions and experiences. Governments must also pass legislation to protect children's data and monitor and penalize any violations of children's data, privacy, or rights. The tech industry bears the primary responsibility for child data protection.

Session Report

 

  • Millions of students have returned or will return to a new academic year in which they will largely use technology adopted during the pandemic. Just a few months ago, Human Rights Watch published a report, “‘How Dare They Peep into My Private Life?’: Children's Rights Violations by Governments that Endorsed Online Learning during the Covid-19 Pandemic”, which investigated the educational technologies endorsed by 49 governments around the world. The investigation covered the majority of children who had access to the internet and devices.
  • Every government except one authorized the use of at least one online learning product that surveilled children online, outside of school hours and deep into their private lives. This was the first time evidence was collected showing that the majority of online learning products harvested data on who children are, where they are, what they are doing, who their family and friends are, and what kinds of devices their families could afford for them to use.
  • The critical point of the HRW report is that the products did not allow students to decline to be tracked. The monitoring happened secretly, without the child's or family's knowledge or consent. Because the online apps were mandatory tools, it was impossible for kids to opt out of surveillance without opting out of school and giving up on learning.
  • This pairing of the edtech industry with the attention economy and targeted advertising industry, as it’s clear from Han’s research, has been promoting a clear violation of the students’ rights to privacy and to the protection of personal data. And on top of that, it promotes children’s behavioral manipulation to an extent that we are still unaware of (neither in terms of present, future, individual or collective impacts).
  • Children are human beings going through a developmental stage. They need to be able to make mistakes and learn from them, as well as to experiment throughout this development in order to understand and mold their own personalities.
  • The need for children to experiment with their personalities is completely undermined by the attention economy and its profiling and aggregation techniques. In the end, what we see today is that the content that reaches children online and therefore influences their personality shaping is, to some extent, dictated by private and commercial interests. So besides behavioral manipulation, this aggregation and specific content targeting can also reinforce discrimination.
  • To confront the current problematic scenario of the edtech industry, we need to understand that the protection of children's rights will only be achieved once responsibility for it is shared across all of society. Much is often said about families being responsible for educating children in the use of digital devices and services. Families should of course support children in their use of edtech apps as much as possible, but that cannot be all. How do states choose the edtech tools adopted in public education? How do schools themselves choose the tools adopted in the private education sector?
  • We need to address the responsibility of the private sector, both the edtech companies themselves and other companies from other sectors that are buying student data from them.
  • When addressing the responsibility of States, schools and the private sector, we need to bring the concept of the best interest of the child to the table, as determined by the UN’s Convention on the Rights of the Child, the most ratified international treaty in the whole world. All actions that directly or potentially affect children must be undertaken in order to fulfill their best interests.
  • The first and foremost way to protect children online is to be aware of what data they provide and whether the apps they use are putting their data in unwanted hands. Check the company's reputation and reviews, take advice from parents and teachers, and check online if in doubt before using an app. Teachers could also be trained in schools to help students understand how to keep their data safe.
  • The other way is to ensure that best practices are enforced. Companies that offer solutions to children must be mandated to collect only relevant data, and those that violate these rules should face severe consequences. This is where the IGF can play a role and convince governments to enforce such rules universally. Governments should come together and make laws that ensure that children stay safe online and that their data is protected. Technology is not going away, and children are increasingly going to use the internet and online apps for their educational needs and other social media requirements. We should work collectively to bring laws across national boundaries, encouraging organizations, government agencies, and international institutions like the United Nations to mandate rules that will help protect us and our privacy online.
IGF 2022 WS #369 Harmonising online safety regulation

Updated: Tue, 13/12/2022 - 16:47
Avoiding Internet Fragmentation
Key Takeaways:

Legislators around the world are increasingly engaging with online safety questions, and implementing novel regulatory regimes aimed at enhancing online safety and addressing various online safety risks. In this context, more and more independent online safety regulators are emerging, whose job it is to implement and enforce novel online safety regulations.


To ensure people are protected online and that regulation is effective and consistent across borders, international collaboration amongst regulators is essential. While substantive rules may differ across the world, there is significant scope for alignment around regulatory toolboxes and for the sharing of best practices and expertise. The new Global Online Safety Regulators Network will serve as a crucial vehicle for collaboration.

Session Report

Digital technologies are at once both global and local, and as a result international regulatory collaboration has always been essential to keep people safe online. With the global regulatory landscape for online safety rapidly evolving and more and more countries devising and implementing novel regulatory approaches to improve online safety, international collaboration will become even more important.

In this session which took place on Day Two of IGF, four regulators who are at the forefront of the new drive for online safety came together to discuss the latest online safety regulatory trends, how greater international regulatory cooperation can improve outcomes for everyone, and the role that initiatives like the new Global Online Safety Regulators Network can play. 

The panel featured senior representatives from the eSafety Commissioner (Australia); the Online Safety Commission (Fiji); the Broadcasting Authority of Ireland (Ireland); and Ofcom (United Kingdom). It was moderated by Matthew Nguyen, Digital Governance Lead at the Tony Blair Institute for Global Change.

During the discussion panellists shared updates on how their respective jurisdictions are approaching contemporary online safety policy issues. The major theme was international cooperation, and extensive discussion was devoted to how regulators need to work together to enhance enforcement capabilities and ensure they benefit from best practice globally. The panel also discussed the Global Online Safety Regulators Network, which was recently launched by the four regulators and aims to be a vehicle for greater regulatory alignment and cooperation amongst online safety regulators across the world.

Stakeholders, both in the room and online, were engaged throughout the discussion. Participants expressed support for greater coordination amongst regulators and many were especially interested in understanding the Network’s plans to invite additional regulators to join in the coming year. Stakeholders also opined on the relationship between government and independent regulators, and the panellists noted that regulatory independence was a key principle of the new global network.

Conversations in the room continued well after the end of the panel discussion, with stakeholders highlighting the challenges from regulatory fragmentation and the importance of involving civil society and broader internet governance stakeholders in regulators’ work.

The four regulators were grateful for the opportunity to present and discuss their work with a diverse range of stakeholders at the IGF.

IGF 2022 WS #350 Why Digital Transformation and AI Matter for Justice

Updated: Tue, 13/12/2022 - 14:08
Addressing Advanced Technologies, including AI
Key Takeaways:

Judicial operators play an important role as guardians of justice in the digital age, and need to have the latest knowledge on how technology can help them strengthen access and delivery of justice while being mindful of the associated human rights, democracy and rule of law related risks of technologies like artificial intelligence.

Calls to Action

UNESCO’s Judges Initiative empowers judges, lawyers and policymakers to better fulfil their responsibilities as duty bearers to protect human rights and the rule of law. UNESCO supports judicial operators worldwide in this endeavour through knowledge sharing, open educational resources, and capacity-building efforts.

Session Report

As uses of AI proliferate, whether through the use of surveillance tools or algorithms that amplify disinformation, judicial operators as duty bearers play an important role in protecting human rights and the rule of law. Through partnerships, capacity building efforts, open educational resources, and standards related to new technologies like AI, UNESCO supports judicial operators in creating open and accessible justice systems.

 

As judiciaries worldwide face large backlogs of cases, they are working to make the administration of justice more efficient, timely and people-friendly. For instance, a panellist underlined that in a West African country, the number of cases filed is more than four times the number of cases closed. However, technologies like AI are helping address this challenge: electronic law reports, AI-powered document review, e-registries, e-payments, and a range of digital transformation measures led to a 200% increase in the number of cases resolved by the Court of Appeals between 2009 and 2019.

 

Faster case resolution is necessary, but there are significant challenges to the inclusive, sustainable, and transparent use of AI. These include a lack of transparency and trust in the judiciary, challenges with digitization and infrastructural capacity, and privacy concerns.

 

At the same time, the role of judicial operators is evolving. They need to be aware of the use of technology in the justice system, technological bias, and varying levels of digital literacy in order to improve access to justice. In the courts, the focus is on building the infrastructure to support digitization, but more can be done in hiring for digital skills.

 

A key capacity-building finding of UNESCO’s AI Needs Assessment Survey in Africa is the need to create localized data sets to inform AI. Currently, AI systems are often trained on low-quality and unrepresentative data sets and then deployed in the African context.

 

The discussion emphasized the role of interoperability within judicial systems and with law enforcement. The judiciary, law enforcement, and administration should have interoperable systems. Currently, discussions about interoperability are conducted in isolation: those responsible for digital policy in government are not working with those responsible for digital transformation in the justice system.

IGF 2022 Open Forum #50 Global Conference on CCB: Cyber Resilience for Development

Updated: Tue, 13/12/2022 - 11:54
Enabling Safety, Security and Accountability
Key Takeaways:

The Global Conference on Cyber Capacity Building (GC3B) is needed to bring stakeholders together and mobilize effective, sustainable and inclusive stewardship of international cooperation for cyber resilience, bridging international development with international cyber capacity building.


A priority for cyber resilience is ensuring sufficient support and sustainability. The opportunities and challenges of financing cyber resilience through different sources need to be tackled, in addition to ensuring that the resulting global public goods remain sustainable.

Calls to Action

Sessions under the Operationalizing Solutions pillar will be opened up to the global multi-stakeholder community in an Open Call for proposals for session leads, starting 15 December, at gc3b.org. Interested parties should submit a proposal for session leadership.

Session Report

The Global Forum on Cyber Expertise (GFCE) held an Open Forum (#50) on the ‘Global Conference on Cyber Capacity Building (GC3B) 2023: Cyber Resilience for Development’ on Wednesday 30 November 2022, during the Internet Governance Forum (IGF) 2022 in Addis Ababa. The session presented the concept, aims and foreseen outcomes of the conference while highlighting the opportunities for global cooperation through the GC3B.

Tereza Horejsova, Outreach Manager of the GFCE Secretariat, opened the event by summarising that the GC3B will be a key global gathering of leaders and experts to mobilize effective, sustainable, and inclusive stewardship of international cooperation for cyber resilient development and cyber capacity building. Its overarching aim is to catalyze global action to elevate and mainstream cyber resilience and capacity building in the international development agenda. She explained that this Open Forum was convened to consult with the global multi-stakeholder community on the conference program.

Chris Painter, President of the GFCE Foundation, outlined the conference’s aims and objectives. He noted that much has been done to promote best practices in cyber capacity building, but insufficient awareness among key decision-makers and a lack of resources and coordination sometimes hinders implementation. This is why the GFCE partnered with the CyberPeace Institute, World Bank, and World Economic Forum to work together in convening the Global Conference on Cyber Capacity Building: to advance, operationalize and collaborate on cyber capacity building. He affirmed that the need for cyber capacity building as a key enabler of sustainable and resilient digital development will be highlighted, reflecting the key theme of the conference for 2023: ‘Cyber Resilience for Development.’ Lastly, he highlighted the two main objectives of the conference: elevate and mainstream cyber resilience and capacity building as a first-order, strategic and operational priority in international cooperation and development, and to support middle- and low-income countries in incorporating cybersecurity and cyber resilience into their national strategic plans, including their digital and infrastructure strategies and investments. These objectives will be achieved through several concrete outcomes.

Theoneste Ngiruwonsanga, Project Manager in charge of Cybersecurity & Data Privacy at SmartAfrica, reiterated the importance of cyber resilience for development and the need for the GC3B, highlighting that cyber resilience requires a deeper understanding of risks and communities at the regional level and that developing countries should build resilience into their critical functions. He explained that this is also because individuals will always strive to live and invest in countries that are resilient. Finally, he spotlighted the importance of the conference’s aim to be multi-stakeholder and inclusive, to mobilise effective, sustainable, and inclusive stewardship of international development and CCB, and of remaining open to input from the global community in order to catalyze global action.

Dr. Towela Nyirenda-Jere, Head of the Economic Integration Division at AUDA-NEPAD, highlighted that a key outcome of the conference is a Global CCB Agenda that can be linked to Regional CCB Agendas. She zoomed in on the Africa CCB Agenda, stating that it is currently being written and finalized with contributions from the community, such as through the GFCE Africa Regional Meeting 2022 which took place in the margins of the IGF 2022. Moreover, this process is being led by the Africa CCB Coordination Committee, which represents key institutions with various stakeholder interests in Information and Communications Technology (ICT) and cybersecurity in Africa. The Agenda relies on a demand-driven approach for the coordination and implementation of cyber capacity-building programs and initiatives on the continent. She also affirmed that a whole-of-society and whole-of-government approach to cyber capability building is needed. The key themes identified to be addressed in the Agenda are: political willingness from governments; revision of legal frameworks on cybercrime and technical capacity building for CERTs and DFLs; coordination at national, regional, and international levels; and cyber awareness and skills development. Lastly, she explained the next steps, which include finalizing the proposed Africa Agenda on CCB together with the Africa CCB Coordination Committee, and subsequently submitting it for endorsement by the African Union, after which it will be presented at the GC3B.

Francesca Bosco, Senior Advisor at the CyberPeace Institute, presented an overview of the GC3B program, highlighting its four pillars: Making International Development Cyber-Resilient; Collaborating to Secure the Digital Ecosystem; Cyber Capacity Building for Stability and Security; Operationalizing Solutions. All the pillars will involve sessions and discussions on sub-topics and the 4th pillar, Operationalizing Solutions, will be further divided into four tracks: Empowering Better Program Management for Cyber Capacity Building and Cyber Resilient Development (Track A), Implementing Successful Cyber Capacity Building and Cyber Resilient Development Actions (Track B), Using Global Public Goods for Cyber Capacity Building (Track C) and Coordinating at the Regional Level (Track D). It was explained that the conference program has purposely left space for up to 12 session slots for members of the community to propose and/or lead sessions, under the cross-cutting theme of “Operationalizing Solutions”.  An Open Call for proposals for session leads or session topics under this pillar will be launched on December 15th 2022. Prior to this, the conference co-organizers are looking for feedback from the global multistakeholder community on the topics included per track (A-C).

Following the presentation of the GC3B and its program, participants were invited to partake in an interactive discussion regarding which topics should be prioritised under each of the tracks of Pillar 4. For Track A (empowering better program management for cyber capacity building and cyber resilient development), bringing cyber expertise into development programs and upskilling/reskilling development staff on cyber issues was identified as the preferred priority. Diversity and inclusivity were also identified as principles the conference should aim to promote and represent. Secondly, under Track B (implementing successful cyber capacity building and cyber resilient development actions), participants proposed that the track prioritise the opportunities and challenges of financing cybersecurity and cyber resilience in developing countries through different sources. In this way, the conference can serve as a launching pad for assessing the way in which resources are used and involve additional donors. Lastly, under Track C (Using Global Public Goods for Cyber Capacity Building), participants highlighted the importance of giving wider access to existing resources and ensuring global public goods are designed and used sustainably.

Participants were thanked for their contributions and invited to visit the website at gc3b.org for more information, or to get in touch with [email protected] with any questions or further input.

IGF 2022 Town Hall #63 Enabling a Safe Internet for Women and Girls

Updated: Mon, 12/12/2022 - 09:56
Enabling Safety, Security and Accountability
Key Takeaways:

- Online gender-based violence creates a negative feedback loop because of its silencing effect. Because of how widespread the problem has become, it creates social norms that enable this behaviour to continue over time.
- This problem is prolific and spreads to institutions such as national elections and even the IGF itself. More must be done to ensure that online spaces are safe.

Session Report

Online gender-based violence is prolific. It affects women, girls, and gender diverse people across different countries and social environments.

 

People who call this problem out make themselves more vulnerable to further abuse. Honourable Neem Lugangira spoke about her experience as an elected official in Tanzania, the exposure that meant for online gender-based violence, and how raising this issue made her even further targeted. Another activist based in South Asia noted the self-silencing impact of this abuse for targeted individuals.

 

Women are helping themselves overcome this problem. Irene Mwendwa gave the example of Pollicy’s peer-to-peer communities and local-level training for resilience among women policymakers at multiple levels of government.

 

Norms against this kind of violence already exist. Clear analogies to the illegality of this behaviour in the streets should be translated into the online world, said Onica Makwakwa of the Global Digital Inclusion Partnership. Governments are falling short of their mandate to address this: a huge amount of public education against gender-based violence is required, and failure to act costs governments over $1 billion in lost productivity due to the digital gender gap.

 

Platforms are taking action, but transparency is missing, said Kat Townsend. This comes from the experience of the Web Foundation and its Tech Policy Design Lab, working with activists and major social media platforms on this issue. While the platforms are making changes that users can see, there is still an underwhelming amount of transparency for us to understand what the potential impact of this change might be.

 

Resources:

Amplified Abuse, from Pollicy https://pollicy.org/projects/amplified-abuse/

Tech Policy Design Lab, from Web Foundation https://techlab.webfoundation.org/ogbv/overview

Meaningful Connectivity, from A4AI https://a4ai.org/meaningful-connectivity/

IGF 2022 WS #283 Capacity Building for Safe & Secure Cyberspace: Making It Real

Updated: Mon, 12/12/2022 - 09:08
Enabling Safety, Security and Accountability
Key Takeaways:

Even though there are various factors involved in CCB and countries and regions have different needs and issues, stakeholders involved in CCB can explore best practices and solutions implemented in different regional settings, as they will often be transposable and adjustable to different contexts.


Cybersecurity is a shared responsibility between governments and other actors involved in this space, such as the private sector, technical communities and civil society. CCB should thus provide a multi-stakeholder response to challenges, e.g. through calling on the private sector to provide input on trends and threats, or empowering civil society to take an active approach in CCB, such as through the potential role of academia in filling identified knowledge gaps.

Calls to Action

Stakeholders are encouraged to make use of the GFCE ecosystem and tools, such as the Clearing House, that not only matches needs with resources, but also supports stakeholders through clarifying their needs and developing their CCB roadmap.

Session Report

IGF WS #283 “Capacity building for safe & secure cyberspace: making it real” looked at cyber capacity building (CCB) as a priority on the international cooperation agenda. The session discussed regional dynamics and challenges as well as the role and intersections of different actors in a multistakeholder approach to CCB, particularly in workforce development.

APNIC mentioned that capacity-building efforts faced three main challenges in the Asia-Pacific region, exacerbated by the COVID pandemic: the growth in the number of users and networks put additional pressure on existing operators; multilingual diversity required the continuous translation of manuals and documents, such as updated best practices, into different languages in order to maintain engagement; and the increased reliance on internet access for livelihoods and the delivery of government services made the internet a much more critical resource for organisations and businesses in remote areas. A reliable, accessible, affordable and stable internet has become essential for securing growth.

The OAS (Organisation of American States) – CICTE, as focal point for the GFCE Liaison, highlighted its focus on identifying gaps in CCB. The challenges identified in the region relate not only to the gap between decision makers and the technical community, but also to the disconnect between decision-makers, often from an age bracket that does not necessarily identify with the information required to make cybersecurity-related decisions, and the cyber domain they are called to rule on. On workforce development, it was noted that there is insufficient reference to education as regards the digital skills gap, and an insufficient offering of university courses to reduce it. On the labour market, recent graduates face obstacles or are unable to get cybersecurity jobs because of their lack of practical experience. Moreover, gender parity in the workforce dropped after the pandemic, which highlights the need for policies geared towards including women in digital and cybersecurity roles. CCB is a strong focus of the GFCE Liaison at OAS – CICTE. Mapping ongoing projects and efforts is an important step undertaken by the GFCE Liaison through analysing information on the Cybil Portal. The mapping so far indicates increased interest from donors and implementers in the region. The OAS, as a GFCE hub, is in an ideal position to seize this opportunity by coordinating efforts and developing a regional roadmap for CCB implementation. Potential overlap between projects can thus be deconflicted by steering efforts towards different identified priorities.

The GFCE’s Clearing House process was illustrated through examples from the significant number of requests stemming from the Global South, particularly from African countries. The match-making mechanism goes beyond connecting members and partners who have identified CCB needs with resources within the GFCE community, be it expertise or financial. The process is in many cases a first step in a country’s CCB’s journey. The Clearing House process facilitates the discussions on CCB needs in an expert community, supporting countries in identifying and prioritising their needs, which results in developing national CCB roadmaps for the medium-long term. This exercise is essential to mobilise the resources and expertise available.

Cross-stakeholder engagement in CCB is vital, as each stakeholder group is called to represent different viewpoints and play specific roles. From the outset, panellists focused on cybersecurity as a shared responsibility between governments and other actors involved in this space, such as the private sector, technical communities, civil society and academia. CCB should thus provide a multi-stakeholder response to challenges, be it through calling on the private sector to provide input on trends and threats with predictions across a longer timescale, or empowering civil society to take an active approach in CCB, such as through the potential role of academia in filling identified knowledge gaps. It was proposed that the shared-responsibility mantra should in turn broaden the concept of CCB beyond the state view, addressing local industries and civil society that need support and could benefit from CCB.

Further on the role of the private sector, it was noted that the industry is not made up only of large companies; it is in fact mostly composed of small companies that operate domestically and implement solutions locally. Even though they do not have the same resources as major industry players, they are still essential partners and potential beneficiaries of CCB.

Regarding civil society and academia, it was mentioned that investing in domestic research-based academic programs and engaging with academic communities at national level will provide a better understanding of the national context and can also help countries develop the knowledge required to build their national cyber capacity through a bottom-up approach.

These two stakeholder groups can collaborate in meaningful and practical ways, for example through the private sector offering placements, fellowships, and internships, as certification programs often need to provide practical experience.

It was concluded that a capacity-building approach connecting industry and educational institutions ensures there is no supply-demand mismatch in workforce development. Panellists underlined that workforce development strategies should be comprehensive and connect education, government and the private sector, so that skilling content is industry-aligned and based on a common set of needs, and so that all stakeholders speak the same language, whether describing university courses or drafting job descriptions. However, it was stressed that these strategies should be country-specific, as the need for cybersecurity personnel varies with a country’s level of industrialisation and digitalisation; career paths should therefore be promoted in a country-specific way.

As a concrete example of a cross-stakeholder, country-specific approach, Microsoft mentioned the implementation across 23 different countries of cybersecurity skilling campaigns, aiming to bring in traditionally excluded or less represented communities in the cybersecurity workforce, including women. By partnering with local governments, education institutions and local businesses the campaigns aimed to ensure that the programs developed fit the unique needs of their own context.

Panellists reiterated the importance of having a cross-stakeholder approach to cyber capacity building, understood in the regional context and implemented at national level.

IGF 2022 Open Forum #57 Digital skills for protection and participation online

Updated: Mon, 12/12/2022 - 08:52
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Children are actors of change. But in order to become active digital citizens who are able to safely navigate the online environment, access support online and provide support to their peers, they need to be equipped and empowered with the necessary digital skills and digital literacy education.


Online digital skills and safety education through engaging trainings is one way to empower women and girls everywhere and allow them to access opportunities and rights online. International partnership is essential to deliver comprehensive and relevant education. The drafting of national strategies should complement these efforts and focus on privacy, security, the inclusion of girls and women, and learning digital technologies.

Calls to Action

We need to involve children in drafting policies and have an understanding of what children are doing and what they need in order to feel safe online. Young people should be at the center of collaboration and this will help to improve cooperation and partnership on digital skills and online safety for all.

IGF 2022 Open Forum #97 Adopting Data Governance Framework: From Silos to Ecosystem

Updated: Fri, 09/12/2022 - 23:04
Governing Data and Protecting Privacy
Key Takeaways:
Speakers and participants at the Open Forum reflected on issues of data governance vis-à-vis national settings and individual experiences. Cutting across all interventions was the message that trust is the bedrock of data governance, and of digital transformation in general: without that trust, it is impossible for frameworks or user experiences to be genuinely useful.
Calls to Action
Data governance needs to be considered with respect to its processes and institutionalization, which should be nimble and tackle innovations head-on to harness their opportunities and address challenges quickly.
IGF 2022 WS #214 Blurred lines between fact & fiction: Disinformation online

Updated: Fri, 09/12/2022 - 10:13
Enabling Safety, Security and Accountability
Key Takeaways:

Disinformation circulating on the internet and in private groups can interfere in democratic processes. All panellists pointed out that investment should be made in the population’s media literacy skills as a way of empowering people through critical analysis of the information they receive; that is, giving people instruments that help them distinguish disinformation and misinformation and make informed decisions.


Apart from impacting democratic processes, disinformation also affects the mental health of activists and young people, given the emotional effects it provokes. It was also mentioned that we should be very careful with legislation, because of the threat of suppressing free speech and pluralism in these conversations.

Calls to Action

One call to action is to support media literacy education and raise awareness about online safety. Particularly for youth, we have to come up with different ways of raising awareness than the usual ones, because young people tend to ignore the side effects of disinformation on democratic societies and only touch upon the surface of the problem.


Support journalistic media as a way to avoid giving ground to disinformation; that is, support good practices that follow the journalistic deontological code.

Session Report

There were great insights and discussions at #IGF2022 on 29 November during the workshop on online disinformation.

Participants agreed that the internet is the first source people turn to when they need information on a specific topic, and that it has provided unprecedented amounts of information to huge numbers of people worldwide. At the same time, however, false and decontextualized information has also been disseminated. The rise of digital platforms has given people more direct access to content, and has thus, in a way, replaced mediated professional journalism and editorial decisions with algorithms that prioritize clickbait content in order to maximize engagement. Anyone with a social media account can create and spread disinformation: governments, companies, other interest groups, or individuals. Research has suggested that human users, not bots, are the main amplifiers of online propaganda. Consequently, online influence operations are extremely fuzzy, as they largely depend on the broadcast of data by many private actors to reach their target audience. On top of that, inaccurate or misleading content has potentially damaging impacts on core human rights and the functioning of democracy.

The workshop panel included:

Sérgio Gomes da Silva, Director of International Relations and Communications at the General Secretariat of the Council of Ministers Presidency, member of the Board of Cenjor (Protocol Centre for Professional Training for Journalists), member of the Executive Board of Obercom (Communication Observatory) and member of the National Electoral Commission.

Rodrigo Nejm, Awareness director at Safernet Brasil, PhD social psychology Federal University of Bahia (UFBA). Coordinator of the Brazilian Safer Internet Day since 2009.

Samuel Rodrigues de Oliveira, PhD candidate in State Theory and Constitutional Law at the Pontifical Catholic University of Rio de Janeiro. Master in Law and Innovation. Attorney at Law.

Marina Kopidaki, an 18-year-old medical student at the University of Crete. Five years ago she joined the Greek Safer Internet Youth Panel, and since then internet safety has been one of her main interests. At the international level, Marina has participated in various activities such as the BIK Youth Panel, the Internet Governance Forum (IGF) 2019, and the Youth Summit, among others.

During the workshop, Sérgio Gomes da Silva gave some examples of how fake messages circulating on the internet and in private groups can interfere in democratic processes. Regarding the balance between fighting disinformation and the right to free speech, he pointed to support for journalistic media as a way to avoid giving ground to disinformation, meaning support for good practices such as the journalistic deontological code.

Another aspect pointed out was the investment that should be made in the population’s media literacy skills as a way of empowering people through critical analysis of the information they receive; that is, giving people the instruments that can help them distinguish disinformation and misinformation and empower their decisions.

Rodrigo Nejm stressed how WhatsApp can be the only channel for accessing information for part of the population in Brazil.

Media literacy is a key point, but in Brazil basic literacy is not yet ensured, which raises the question of how people can move on to media literacy.

Brazil has good national guidelines on media literacy, a great approach, but the big challenge is how to scale these guidelines to the whole population.