IGF 2019 Reports

Issues on the Free Flow of Data, ICT Products and Services in a Digitally Connected World

Main Session
Updated: Fri, 08/05/2020 - 13:28
1. Key Policy Questions and Expectations
  1. Coordination and policy coherence are of fundamental importance to enhance cross-border data flows and create trust among governments, businesses and consumers. How are the different processes articulated today to ensure policy coherence? What trends do you see, and what are the potential roadblocks today?
  2. How can we ensure that the different regimes and frameworks adopted around the world are compatible, and even interoperable, in the future? What are your recommendations?
2. Discussion Areas:

Both the G20 and G7 Chairs introduced their outcomes and activities on the digital economy and trade, especially in relation to the free flow of data across borders, while addressing various challenges including, but not limited to, privacy and data protection, hate speech and fake news, and discrimination.

Speakers from the private sector emphasized the importance of maintaining and facilitating cross-border data flows and argued that restrictive measures such as data localization have a significant negative impact on the expansion of the digital economy and trade. At the same time, they recognized the importance of privacy, data protection, and cybersecurity, and expressed support for the EU’s General Data Protection Regulation (GDPR). One speaker said the Internet should not be fragmented and that strong encryption would be critically important for Data Free Flow with Trust (DFFT). A panelist from the Polish government added the importance of interactions among different regulations and further explained how the GDPR balances the free flow of non-personal data with the protection of fundamental rights while taking the reality of SMEs into account.

A panelist from the WTO outlined the history and current state of the WTO negotiations on e-commerce, noting that various fora currently deal with the digital economy and trade. Another panelist with trade negotiation experience used various examples to explain the inherently closed and high-pressure nature of trade-related negotiations, and advised conducting open and closed discussions in parallel, depending on the points at issue, in order to avoid deadlocks.

3. Policy Recommendations or Suggestions for the Way Forward:

Although there is no single, one-size-fits-all solution, multistakeholder dialogue should continue, involving both trade-related and data-related stakeholders. In particular, governments often lack an up-to-date understanding of technological developments, so diverse and inclusive discussion among public and private stakeholders is necessary to advance harmonization and interoperability and to expand the digital economy.

It may be difficult to achieve interoperability on every point at issue or to reach detailed consensus on each aspect, so the discussion should aim at finding a middle ground at the level of principles.

4. Other Initiatives Addressing the Session Issues:

The newly released Internet & Jurisdiction Global Status Report 2019 was highlighted as a key policy tool, demonstrating a lack of coordination between regulatory initiatives. According to the Report, only 15% of surveyed stakeholders (governments, companies, IGOs, civil society) say we have the right frameworks and standards in place to address cross-border legal challenges in cyberspace.

When discussing the value of multistakeholder cooperation to ensure cross-border data flows and trust, panel discussants suggested concrete actions to support the cross-border Internet, such as building pathways for enhanced coordination between actors and policy processes. The role of the incoming G7 and G20 Chairs, the United States and Saudi Arabia, in stepping up to these tasks and building on the efforts of France and Japan as previous hosts was underscored as important.

5. Making Progress for Tackled Issues:

As the moderator, Mr. Paul Fehlinger from the Internet & Jurisdiction Policy Network (I&J), rightly indicated, “the multistakeholder model can provide a pathway into different policy processes around the world from the perspectives of different stakeholders, so that we have the necessary coordination for cross-border data flows in the future”. There are indeed various fora, as discussed during the session, but the most important thing is to continue broader multistakeholder dialogue on digital trade while involving decision-makers such as governments and international organizations.

Unfortunately, we could not invite next year’s G7 and G20 Chairs to this session. To ensure the continuity of multistakeholder dialogue on digital trade, we will have to invite the G7 and G20 Chairs of both 2020 and 2021 if we have the opportunity to hold a similar main session on digital trade at IGF 2020 in Katowice.

6. Estimated Participation:

An estimated 50-100 people participated in the main session (both onsite and online), and about half of the participants were women.

7. Reflection to Gender Issues:

About half of the speakers were women, and all of them made professional and invaluable contributions to the panel discussion. Ms. Salwa Toko from France, this year’s G7 Chair, emphasized the importance of multistakeholder dialogue and, in that context, raised facial recognition, pointing out the need to discuss its ethical problems since the technology could cause gender discrimination. Further, Ms. Luiza Brandão from Brazil stressed the importance of stakeholder diversity when discussing global coordination on digital trade.

8. Session Outputs:
  1. Internet and Jurisdiction Policy Network, “I&J Deputy Executive Director Moderates UN IGF Main Session”, December 4, 2019, at https://www.internetjurisdiction.net/event/i-j-deputy-executive-director-moderates-un-igf-main-session .
  2. Konosuke Matsuba (Mercari Inc.), “包摂的なデジタル社会を実現するために必要な国際協調とは何か (What international coordination is needed to achieve an inclusive digital society)”, December 3, 2019, at https://merpoli.mercari.com/entry/2019/12/03/070000 .
  3. Ministry of Internal Affairs and Communications, “インターネット・ガバナンス・フォーラム(IGF)2019の結果―大阪トラックの推進に向け、信頼性のある自由なデータ流通を議論― (Results of the Internet Governance Forum (IGF) 2019 – Data Free Flow with Trust (DFFT) discussed for the promotion of the Osaka Track)”, December 6, 2019, at http://www.soumu.go.jp/menu_news/s-news/01tsushin06_02000193.html .
  4. Japan Network Information Center, “IGF2019フォトレポート (IGF 2019 photo report)”, December 6, 2019, at https://blog.nic.ad.jp/2019/3684/ .
Governance Challenges in the Digital Age: Finding New Tools for Policy-Making

Main Session
Updated: Mon, 09/12/2019 - 10:48
1. Key Policy Questions and Expectations

Policy questions

  • What should be the perspectives, and which stakeholders and disciplines need to be considered, to enable policy-making approaches that are truly multidisciplinary for Internet Governance?
  • What are the underlying structural conditions that facilitate truly multidisciplinary policy-making processes?
  • What are examples of attempts to build multidisciplinary policy-making processes for public policy already being developed on Internet Governance across the globe? What worked well? What needs improvement? What lessons can we learn from private sector policy-making?

Overall expectations from the session

We aim to highlight conceptual frameworks and good practices drawn from the concrete cases presented in the session, to illustrate ways to go beyond working in silos and to create policy-making approaches that are truly multidisciplinary and involve a full range of perspectives and actors across a wide range of substantive topics, covering the full life cycle of policy-making from design through implementation and evaluation.

2. Discussion Areas:

The discussions highlighted a number of key elements for successful policy-making in the digital age. It was felt that processes that are inclusive, transparent and make use of 21st century tools can lead to increased trust in the process, provide more legitimacy and result in better informed and more balanced outcomes.

Full Report available HERE.

3. Policy Recommendations or Suggestions for the Way Forward:

The following specific elements of a successful process were put forward:

  • Being thoughtful about the design of a process at the outset
  • The importance of an open, inclusive and accessible process
  • Transparency
  • Accountability
  • Finding the right stakeholders for the specific issue at hand, who can bring the expertise necessary to produce informed and evidence-based decisions
  • Flexibility
  • Practical ways to engender trust and create genuine dialogue
  • Measuring impact and disseminating results
4. Other Initiatives Addressing the Session Issues:
5. Making Progress for Tackled Issues:
6. Estimated Participation:

Onsite – around 250 participants, 35% female

Online – around 10 participants, 30% female

7. Reflection to Gender Issues:

A majority of the panelists were women.

Women were discussed as one group, alongside others such as young people, disabled people, refugees, and former prisoners, that policy-makers should consider when designing inclusive consultative processes.

8. Session Outputs:
Addressing Terrorist and Violent Extremist Content Online

Main Session
Updated: Thu, 20/08/2020 - 12:29
1. Key Policy Questions and Expectations

This main session focused  on the different responsibilities, responses, rights, and risks involved in policy approaches to dealing with terrorist and violent extremist content (TVEC) online.  Regulatory and non-regulatory approaches were considered, including how Internet platforms deal with TVEC uploaded to their services by end users. Panelists addressed various policy questions, including:

  • What are the different responsibilities of the different stakeholders to each other and to the broader public in developing strategies to fight the spread of TVEC online?
  • How have governments responded to the spread of TVEC online?  How has the private sector responded?
  • What are the different human rights that are relevant to the discussion of TVEC online, and why?
  • What are the potential risks to different human rights posed by TVEC regulation and how are these risks being addressed?

 

2. Discussion Areas:
  1. Different government policy approaches: New Zealand, Germany, and the United States
      • New Zealand
      • Germany
      • United States
  2. How tech companies are responding to TVEC: Kakao, Facebook, Microsoft and GIFCT
  3. Policy implementation: Challenges for resource-poor languages in the Global South
  4. Rights and Risks: The tension between TVEC regulation and Freedom of Expression

Should TVEC be treated as Hate Speech? Does TVEC equal “hate crime”?

New Zealand PM Ardern addressed the audience with four main messages: 1) respect for international law (human rights and counterterrorism frameworks); 2) a free, open, interoperable, global Internet to preserve the benefits of connectivity; 3) collaboration and consultation in a multistakeholder approach is key; 4) strengthening and engaging in collaborative efforts.

 

Park introduced the responsibilities and responses section of the agenda and then all panellists presented their initial remarks.

 

Park: Although there is a proliferation of violent content across platforms, services and applications, content cannot be banned simply because it is violent, but only on the basis of the external harms it may cause. Understanding hate speech is fundamental to addressing TVEC online, as is understanding the impact of trying to counter it. "TVEC" is a "misnomer" and the discussion should instead be couched in the well-established analytical framework for “hate speech”: "What we really mean by violent and extremist content is 'hate speech.'"

 

Park’s statement summarizes two US Supreme Court decisions as holding that “content can be banned only for its external harms” (see https://www.supremecourt.gov/opinions/10pdf/08-1448.pdf ; https://www.supremecourt.gov/opinions/09pdf/08-769.pdf).

 

Park’s statement drew disagreement from both the US representative and GIFCT/Facebook. Gerd Billen, State Secretary at the German Federal Ministry of Justice and Consumer Protection, also disagreed with Park and explained that the German Network Enforcement Act (NetzDG) does not focus on hate speech but on hate crime, i.e. content that falls under the German Penal Code, such as Nazi symbols.

 

Ash: Identifying stakeholders and concretely defining their roles and responsibilities is far from easy, given each stakeholder's different understanding of what responsibility entails, the different vocabularies stakeholders use, and the differing resources they have to engage effectively in the response.

 

“We're working in difficult, uncharted territory, where work like legislation can have significant unintended consequence unless it's rooted in a collaboration of parties.” It is essential to work with the companies, government partners, and a broad swath of civil society actors, from those concerned primarily with freedom of expression to those who work to protect the rights of victims. This collaboration led to the Christchurch Call, at the heart of which is respect for international law, counterterrorism law, and human rights law. It is a response to “someone murdering 51 people peacefully at worship and livestreaming that across the world.”

Billen: The extremists’ ultimate goal is to destroy democracy and a pluralistic approach to society. Freedom of expression is at stake, as well as the freedom to exercise individual rights. He highlighted how many tech companies reduce their response to TVEC to the application of community standards. Germany therefore responded with the Network Enforcement Act to increase transparency and accountability with reference to Germany’s criminal law. Under the Act, only content that violates Germany’s criminal law must be deleted or removed, since community standards, however valuable, are not approved by parliament and amount to private rules. It is important to protect the victims of these crimes, to invest more in digital literacy programs, and to better understand the criminals and their networks.

 

Germany is working with France to deliver ideas to the European Union on this issue in the discussions on the upcoming Digital Services Act legislation.

 

Clark: Building long-term resilience and responses to terrorist messaging, not only short-term content removal, is at the core of the US approach to tackling TVEC. The US Constitution and a strong commitment to freedom of expression, expressed through the First Amendment, inform the US approach, as do international obligations and a commitment to human rights. The guiding principles of the policy approach are: 1) US law does not compel the removal of content unless it clearly violates US law; some types of TVEC, beliefs alone however extremist, are protected under the First Amendment; 2) the US approach encourages voluntary collaboration with tech companies, through strengthening and expanding terms of service and community standards, instead of designing new regulations; 3) in the US view, the most effective means is not censorship or repression but more speech that promotes tolerance, cultivates critical thinking skills, and raises awareness through a multistakeholder approach. In the US there is a line between hate speech and violent extremist content, so only speech that calls for violence is not protected under the Constitution.

 

It is important to note that some governments have used counterterrorism as a pretext to crush political dissent. Experience from working with former members of groups focused on racially or ethnically motivated terrorism is that they describe government censorship as one of the best recruitment tools, because it reinforces the group’s narrative of oppression.

 

Choi: User protection is at the center of the design and use of Kakao’s products (social media, messaging, and news) as part of the company’s social responsibility. This human-centric design is shared by many other companies in South Korea concerned about Korean culture, values, and online safety, with a strong emphasis on encouraging users to report issues of concern and to acquire the digital skills needed to engage effectively in conversations and dialogue online. The company continues to adjust the design of its products to counteract bad behavior and protect user rights. Kakao encourages users to use reporting tools to help prevent the spread of harmful content. Kakao’s policies are developed in conjunction with other Korean companies through KISO, the Korean Internet Self-Governance Organization (https://www.kiso.or.kr/%EA%B8%B0%EA%B5%AC%EC%86%8C%EA%B0%9C/), established in 2009.

 

Fishman: Facebook works to coordinate efforts across different departments of the company (e.g. legal, engineering) and brings in independent experts to help develop policy. More than 350 employees globally are focused on TVEC, and 10 million pieces of content were removed for TVEC alone in a six-month period.

 

Facebook’s efforts cover five main areas around dangerous organizations (terrorist, hate, and criminal groups): 1) enforcement of Community Standards and Terms of Service; 2) engagement with law enforcement in response to information requests or credible threats of violence; 3) support for counter-speech; 4) looking after staff dealing with TVEC; 5) engaging industry partners beyond competition. The biggest challenge is the scale and how to take context into account, even using AI and machine learning. This is why the policy is global: building enforcement infrastructure tailored to the national legal structures of every country would be an extraordinarily difficult challenge.

 

The GIFCT was established by tech companies (Facebook, Microsoft, Google and Twitter) to share best, and worst, practices, and evolved into sharing hashes of known terrorist content and coordinating responses. Now that it is an independent organization, its main priorities are: 1) remaining an industry-led effort, with strong engagement with governments through an advisory structure (limited to governments that are signatories of the Freedom Online Coalition and respect human rights); 2) good, effective training for smaller companies’ platforms and online services to define their own terms of service and technology development, including AI; 3) continuing collaboration to coordinate responses; 4) sharing capacity; and 5) sponsoring research. Terrorism is a strategy of the weak to provoke responses that are not conducive to long-term interests. As we think about the long term, it is more important to find strength in what we stand for than in what we stand against.

 

Gregoire highlighted that clear definitions of the roles and responsibilities of each stakeholder are as important as the responses we come up with. Those definitions are the basis for building effective collaboration and partnerships that lead to concrete responses. For Microsoft, responsibilities vary, relating not only to its social media platform but also to the productivity tools the company provides.

 

Microsoft organizes its work around advocacy, internal policy, technology and tools for enforcing that policy, and partnerships with the broader ecosystem; for example, Microsoft is a strong supporter of the Christchurch Call and its multistakeholder approach. The Christchurch Call is unique in articulating the roles and responsibilities of the different stakeholders. Governments’ role is to counter the conditions that breed violent extremism, including lack of economic opportunity. The private sector needs to focus on enforcement, transparency, and knowledge sharing, while upholding human rights.

 

Wijeratne: Responses to TVEC have to deal with the fact that people are dying while the conversations about how to deal with what is happening are taking place. Watchdog was a civic-tech, first-responder, civil society response. Given the scale and speed at which hate speech and harmful content grow in Sri Lanka, combined with the scarcity of linguistic and etymological resources available to understand how hate speech manifests in those languages, enforcing terms of service in non-English-speaking countries becomes almost impossible. The connection between local expertise and research capacity in the Global South and where the data sets are (where the companies are) needs to be supported; since it is impossible to build technical systems without biases, multidisciplinary teams should be analysing the datasets. It is also necessary to consider the technical aspects of translating those policies into concrete action.

 

He would estimate that a majority of users in Sri Lanka engage in hate speech, compared with Germany, yet the enforcement of policies in countries like Sri Lanka and elsewhere in the Global South is relatively low. This is because the technical implementation of policies is not designed for “resource-poor” languages. What an engineer can do, in terms of building enforcement tools, in a well-researched language belonging to the West Germanic language tree is ten years ahead of what can be done in Sinhala or Tamil; algorithmic design problems are rooted in language. For example, removal policies for hate speech may focus on specific threats, things that will occur in the future, but Sinhala does not have a future tense. Understanding when hate speech occurs in different countries requires analysts who understand the ethnicities involved in the conversation. Collaboration is needed between the parties who hold the data sets and local academics.

 

Lanza introduced the rights and risks section.

 

Lanza: Moderation of TVEC online is probably one of the most difficult areas of content regulation due to the incompatibility of current policies and technical tools with the existing human rights framework. Protection of freedom of speech is falling to the tech companies, without enough understanding of what their responsibility is or of what is at stake for society, not for the companies. Companies are recommended to assess their terms of service and community standards against the human rights framework, especially the responses (remedies) and appeal mechanisms they design, in strong collaboration with local experts. Censorship is not an effective response to violent extremism. Content filtering policy has to meet the requirements of legality, proportionality, and necessity.

 

It was mentioned that in Latin America the problem is not so much terrorism as incitement to violence by gangs. It is important that states refrain from applying speech restrictions in a broad manner, including words like “glorifying, justifying, or encouraging”, which results in broader criminalization of speech than allowed under international law. Companies should conduct human rights impact assessments of content moderation policies; these policies should be necessary and proportionate; transparent and accessible appeal mechanisms should be provided; and the country context should be taken into account. Another suggestion was to develop international standards for content moderation policies.

 

Panelists then offered their views on the rights and risks involved.

 

Clark: Regulations, and the enforcement of conflicting regulations, can restrict innovation and commerce. She emphasized how difficult enforcement can be when definitions are so unclear and contradictory. Focusing on technical solutions takes attention off the actual perpetrators and may even violate human rights, as is the case with upload filters developed by companies. She stressed that voluntary collaboration is a better approach. Regulation (like the German Network Enforcement Act) may serve as inspiration for less democratic regimes to apply censorship and restrict freedom of speech. A response to this threat should not put the open Internet at risk.

 

Park: Laws that impose liability on platforms for not removing content, oblige them to engage in general monitoring, and incentivize the use of upload filters or other forms of prior censorship conflict with the EU e-Commerce Directive and ignore the intermediary liability principles included as part of freedom of expression safeguards. When required to comply with mandatory takedowns, operators rush to delete content, as they may not have enough time to make an evidence-based decision. This can suppress counter-speech and take down lawful content prepared as part of awareness campaigns countering misinformation.

 

Billen: The experience in Germany is that companies do not delete everything related to a complaint they receive, only about 20 to 25%. That shows they are deliberating and working out what is illegal and what has to be tolerated to preserve freedom of speech. Pressure from civil society and victims in Germany has led not only to content takedowns but also to the cancellation of extremists’ platform accounts. As these extremists search for other platforms to continue on, they have not succeeded in taking their followers from the big platforms with them, which has the positive effect of fewer people being exposed to this type of content.

 

Gregoire: Freedom of expression is important, as are the right to access information and the right to privacy. There are concerning trends in notice-and-takedown regulations, particularly laws with extraterritorial implications and requirements to remove mirrored content, because context matters. How to address narrowly defined harms while upholding the global framework is something that requires deep thinking.

 

Ash: This is hard, but not acting is a major risk. We have to put the victims at the center of any response. There is a big risk to the Internet as a whole: the perception of those affected drives calls for additional protections, which might lead to losing the benefits the open and free Internet can bring.

 

Wijeratne: Bad actors are not the only ones spreading TVEC online. In the hunt for bad actors it is important to acknowledge that the virality of this content is in most cases caused by terrified people trying to warn their loved ones. It is mathematically impossible to design systems without bias, so we have to bring in humans to deal with the false positives and false negatives. The datasets and protocols used to design these systems at the tech companies should be open to civil society for multidisciplinary interrogation, so that the companies get advice on local cultures and languages, human rights frameworks, and legal frameworks.

 

The moderator opened the floor for questions and comments.

 

  1. What is Facebook doing in the context of the war in Afghanistan? Fishman answered that Facebook’s policies are global and do apply in the Afghan context. As the policies start from what the company calls “dangerous organizations”, one of the issues it faces is how to deal with content that such organizations might produce as part of peace negotiations.
  2. Are we close to a real-time global response for the takedown of harmful content? Wijeratne answered that due diligence is really important when actually taking down content, as acting in haste risks causing more harm.
  3. Online content is only a portion of the TVEC content available. Some of the required responses will conflict with the business models of tech companies, and it is the role of governments to make sure effective action is taken. It is important to understand how people become radicalized and what motivates them in order to deal with the effects. Platforms should take responsibility, as publishers have to.
  4. There is a bias in terrorism research, and we are failing to learn the lessons of the last few decades. It is important to acknowledge that terrorism may have roots in state-sponsored terror. Some organizations tagged as terrorist in the past have been able to negotiate peace, and their evolution is part of the process of growing as a society. Historical records should be preserved so that we can understand and learn from them. Fishman agreed on the need for a multistakeholder discussion, agreement, and real collaboration for a historical record to be kept.
  5. Do you believe nations should be held accountable for the actions of companies and organizations within their borders for producing and promoting TVEC?
  6. How do you see traditional news organizations playing a role? Clark responded that a lot of radicalization takes place through traditional media. She highlighted the role of news organizations in providing clarifications, correcting misinformation, and supporting the production of counter-speech that promotes tolerance. Wijeratne noted that blocking content online is not going to stop humans hating each other, since much of the harm done to individuals during war is not mediated by technology, and that not everyone is acting in everyone’s best interest. Lanza highlighted the need to protect journalists, researchers and their sources. He reminded the audience that government officials should also be held accountable for the impact of their words, as they have a duty to respect the right to protest. He said that the legal framework of one country should not be the only test for TVEC responses; they should also be reviewed under international frameworks.

 

Closing remarks

 

Park: Although it is welcome progress to have principles of intermediary liability incorporated into national law and international frameworks, it is important not to take them too far and use them as an excuse for companies to avoid responsibility. Takedown of content should be the last resort. Mandatory takedowns may also hinder innovation and diversity in the market as new platforms addressing TVEC in a different way may not have enough resources to compete with the current platforms that dominate the market.

 

Billen: We should not limit the conversation to TVEC online but should also support research into why people become extremists and how to prevent it. It should not be only about the platforms’ responsibilities.

 

Choi: The private sector should take seriously its responsibility to solve this problem, and its response should be agile. Respect for law and human rights is key to doing business. Digital and media literacy is key for a safe Internet.

 

Gregoire: The tools of opportunity that the Internet has provided can be weaponized. The IGF is a key space to seek common ground to define the responsibilities of each stakeholder, to clearly articulate what is the problem that we are trying to solve and what are the values and the rights we are trying to preserve.

 

Ash: This is hard; a holistic approach that addresses the harms and victims’ rights is the right thing, and it has to be done together.

 

Wijeratne: More research is required to fill the gaps. Civil society and the technical community should be encouraged and supported to develop an in-depth understanding of local contexts.

 

Fishman: Ambiguity in law may cause conservative responses from companies, for example platforms not giving academics and civil society access to datasets because they do not fully understand what the GDPR implications might be. Clear definitions of responsibilities are needed.

 

Clark: What does success look like? What is it that we expect to happen? A balanced response between strong security and fundamental rights. The IGF’s role is key in discussing these issues and deepening understanding of other types of harmful content across many sessions.

 

Lanza: A rise in criminalization and censorship is not the best way forward. Clearer definitions of what constitutes content that incites violence, and of the responsibilities of each stakeholder, require a multistakeholder approach to arrive at global solutions.

 

In closing, the moderator called for continuing the discussion to deepen understanding of the issue and of the reasons why there are different approaches to solving this problem.

3. Policy Recommendations or Suggestions for the Way Forward:

Both in PM Ardern’s video address and throughout the panelists’ interventions, there was strong agreement that the Internet is a powerful force for good, but that terrorist and violent extremist content online at a global scale requires a multistakeholder, inclusive, and concrete response that takes into account the risks, rights, and responsibilities involved.

There was also strong agreement that human rights need to be upheld: no extremist should have the right to neglect or destroy the human rights of any other person, as the Christchurch terrorist did to his victims through the murders and the live-streaming.

Legal regulation was considered one option, in particular to avoid companies bearing sole responsibility for deciding what should be deleted and what should not. Nonetheless, a balanced approach was demanded, since governments might misuse regulation to suppress free speech.

4. Other Initiatives Addressing the Session Issues:

Links to:

  • Christchurch Call
  • GIFCT
  • German law: Act to Improve Enforcement of the Law in Social Networks (unofficial translation)
  • US law
  • Kakao policy
  • Microsoft policy
  • Facebook policy
  • Watchdog
  • OAS freedom of speech
  • Australian law
  • Korea law
  • Manila Principles
  • French law

5. Making Progress for Tackled Issues:

The IGF’s role was considered key to discussing the issues of TVEC and deepening understanding of other types of harmful content. The IGF is also seen as a key space for seeking common ground to define the responsibilities of each stakeholder, and to clearly articulate what problem we are trying to solve and what values and rights we are trying to preserve.

Civil society and the technical community should be encouraged and supported to do more research for an in-depth understanding of local contexts.

 

The IGF should be the platform to develop international standards for content moderation policies.

6. Estimated Participation:

400 people, around 100 women on-site.

7. Reflection to Gender Issues:

The session did not address gender issues in particular.

8. Session Outputs:

Transcripts and video of the session are available.

BPF Cybersecurity - Exploring best practices in relation to recent international cybersecurity initiatives

BPF Session
Updated: Thu, 28/11/2019 - 09:57
Security, Safety, Stability and Resilience
1. Key Policy Questions and Expectations
  • What is the role of norms? And can and should we then move forward from discussions on cybersecurity norms to operationalization of those norms?
  • Are cyber norms cascading into the international system and what are some challenges that arise? What are some processes or norms that do not exist today but are still required?
  • How can cybersecurity norms be assessed to evaluate whether they are working?
3. Policy Recommendations or Suggestions for the Way Forward:
4. Other Initiatives Addressing the Session Issues:

Success examples, projects, and initiatives mentioned during the session:

  • GCSC norms
  • Technical examples related to the routing system and anti-spoofing standards and best practice
  • APC publications on network and society organisations into the Human Rights Council procedures
  • AccessNow publications
  • Open Observatory of Network Interference (OONI)
  • OIA American research
  • AR2018
5. Making Progress for Tackled Issues:
6. Estimated Participation:

There were an estimated 60 onsite participants, of whom about 20 were women.

There were 8 online participants, of whom 3 were women.

7. Reflection to Gender Issues:

The session discussions were gender-neutral and did not discuss gender issues.

BPF Internet of Things (IoT), Big Data and Artificial Intelligence (AI)

BPF Session
Updated: Mon, 16/12/2019 - 08:32
Data Governance
1. Key Policy Questions and Expectations

The first part of the session focused on opportunities for IoT, Big Data, AI to address societal challenges.

  1. Convergence of IoT, Big Data and AI has a huge potential to benefit society.
  2. There is need for improved outreach and digital literacy on IoT, AI and Big Data to gain traction as people fear what they do not understand.
  3. Collective intelligence is a promising area of research that can be exploited for the benefit of society.

The second part of the session dealt with policy challenges and best practices.

  1. Uptake and trust are linked. Improving trust will improve the adoption of these three technologies.
  2. AI deals with more data than personal data alone. Data protection is an issue to be addressed. Regulation is needed, balancing market adoption and human rights.
  3. A human-centric approach is needed if these technologies are to play a role in the SDGs.
2. Discussion Areas:

The panel supported the potential of IoT, Big Data, and AI to address societal challenges. Improving awareness of benefits and risks allows users to make informed decisions. Government and grassroots activities, as well as SME involvement, are needed. Improving infrastructure is key to access and uptake.

David Salomao presented case studies for disaster management and pointed to the challenge of quickly analyzing huge amounts of data to find patterns. Big data produced by telecom subscribers can be used to gather information during disasters. This needs regulation to avoid misuse.

Christine Tan pointed to the fragmentation in the IoT area. She presented two applications, in rural villages in China and mountainous areas, where simple sensors provide useful information about the environment and allow remote monitoring. 

Raymond Onuoha observed that AI, IoT, and Big Data have not yet gained traction even though they could help address the SDGs. He referred to their use during the Ebola outbreak in Liberia.

Olivier Bringer noted that these technologies are key pillars of digital transformation. He presented EU projects, highlighting how huge amounts of data gathered through collective intelligence can be used to improve, for example, transportation. 

Evelyne Tauchnitz spoke about risks and opportunities in peace and conflict. She pointed out that if AI is trusted, it can help to prevent future conflicts. Agenda setting is an important issue as it shapes the debate. 

Emanuela Girardi gave examples in health, inclusiveness for the disabled, and drug discovery. It is important to bring AI to people through outreach activities. She presented the CLAIRE project to create a European AI ecosystem.

Bruna Martins dos Santos presented AI projects that bring transparency to the public space. She pointed to the need for a national AI strategy that considers all voices and is regionally sensitive. She talked about facial recognition for public safety in Brazil.

3. Policy Recommendations or Suggestions for the Way Forward:

Activating pilot projects and sharing lessons learned can help drive policy and regulation. An active platform for all stakeholders to share the outcomes of pilot projects could support the improvement of policy and regulation. Use of open-source technology, adoption of open standards, and a certification process could improve system security and trust.

Considering voices from all players and being sensitive to regional needs were highlighted as ways to improve trust. All the speakers agreed on the need to open a debate on the adoption of these new technologies and to explain to the community at large what they imply.

4. Other Initiatives Addressing the Session Issues:
5. Making Progress for Tackled Issues:

As IoT, Big Data and AI are evolving technologies, there is a need to continue the BPF on this subject. More case studies are needed, as they can drive policy and regulation.

6. Estimated Participation:

150

7. Reflection to Gender Issues:

70

It has been pointed out that in AI technologies it is important to consider gender. 

8. Session Outputs:
NRIs Collaborative Session on Privacy Online

NRI Session
Updated: Wed, 27/11/2019 - 11:09
Security, Safety, Stability and Resilience
1. Key Policy Questions and Expectations

Policy Questions: 

  • How can end users’ rights and their capacity to protect themselves and their data be reinforced?
  • What are the practices of privacy protection on national and regional levels?
  • What role should Internet platforms play in defining the standards for privacy protection online?
  • Are nationally developed standards globally acceptable?  
2. Discussion Areas:
3. Policy Recommendations or Suggestions for the Way Forward:
4. Other Initiatives Addressing the Session Issues:
5. Making Progress for Tackled Issues:
6. Estimated Participation:
7. Reflection to Gender Issues:
8. Session Outputs:
NRIs Collaborative Session on harmful content online

NRI Session
Updated: Fri, 20/12/2019 - 09:39
Security, Safety, Stability and Resilience
1. Key Policy Questions and Expectations

Policy Questions: 

  • How can risks of contact and content be addressed successfully by legal and regulatory approaches as well as by technical instruments and how can digital civility be increased?
  • What role should Internet platforms play in defining the standards for acceptable content in light of freedom of speech?
  • How can globally accepted standards be developed?
  • What kind of collaboration could be created among Internet platforms and media outlets to fight disinformation and fake news?
  • Where is the middle ground between increasing demands for proactive content policing by digital platforms and the necessary neutrality and legal certainty for platforms?
2. Discussion Areas:

The discussion developed around regulatory and legal responses to combating harmful content online at the national level. Different practices were shared. The Japan IGF highlighted online child pornography as a part of online content that the country’s institutional mechanisms successfully address. The Armenian IGF followed, centring its contribution on the content of blogs as a key topic in the country. The Bolivia IGF noted the importance of capacity building and digital literacy, which should be a priority for communities. The French IGF added that the EU is focused on addressing harmful content online.

3. Policy Recommendations or Suggestions for the Way Forward:
4. Other Initiatives Addressing the Session Issues:
5. Making Progress for Tackled Issues:
6. Estimated Participation:
7. Reflection to Gender Issues:
8. Session Outputs:
NRIs Collaborative Session on Data Protection

NRI Session
Updated: Sun, 01/12/2019 - 11:52
Data Governance
1. Key Policy Questions and Expectations
  • Are there concrete data protection policies in NRIs countries and regions?
  • What are the issues of relevance for your communities?
  • How can we develop internationally accepted standards on data protection?

Elaborations on these questions available at: https://docs.google.com/document/d/1GJxPYtNNbvQSSdg-4AIHFVoZlbbm1S1lK2d…

2. Discussion Areas:

The panel made the important point that governments should take greater responsibility and initiative in tackling data protection issues. It was mentioned that there should be more pressure on policymakers and politicians to adopt and update data protection laws. The adoption of an international common standard across all regions, which would help develop and foster economies, was underlined as important. These standards should follow the principles laid out in the 2009 Madrid Declaration of the ICDPPC and the Resolution on the Promotion of New and Long-term Practical Instruments and Continued Legal Efforts for Effective Cooperation in Cross-border Enforcement, adopted at the 41st International Conference of Data Protection and Privacy Commissioners in October 2019 in Tirana, Albania. A multistakeholder approach could be one way to adopt those common standards. The level of independence of the bodies that regulate data subject rights should be increased. In that sense, more dialogue is needed between the IGF and the ICDPPC (now the Global Privacy Assembly).

3. Policy Recommendations or Suggestions for the Way Forward:

The panel made the important point that governments should take greater responsibility and initiative in tackling data protection issues, and that there should be more pressure on policymakers and politicians to adopt and update data protection laws. Another remark, made by the North Macedonia IGF, concerned the adoption of an international common standard across all regions, which would help develop and foster economies. These standards should follow the principles laid out in the 2009 Madrid Declaration of the ICDPPC and the Resolution on the Promotion of New and Long-term Practical Instruments and Continued Legal Efforts for Effective Cooperation in Cross-border Enforcement, adopted at the 41st International Conference of Data Protection and Privacy Commissioners in October 2019 in Tirana, Albania. A multistakeholder approach could be one way to adopt those common standards. The level of independence of the bodies that regulate data subject rights should be increased. In that sense, more dialogue is needed between the IGF and the ICDPPC (now the Global Privacy Assembly).

4. Other Initiatives Addressing the Session Issues:

The North Macedonia IGF representative mentioned some of the initiatives she has led, including research on breaches of data protection legislation and GDPR training for SMEs and startups at the Startup Space Community Center. The results showed that the business sector is more aware of this problem than the public sector. And, ironically enough, they are the biggest data processors.

5. Making Progress for Tackled Issues:

During the session, the panel mentioned that governments and legislators have to take responsibility for ensuring a good legal framework against unlawful data collection and processing. One challenge is that there are data protection officers (DPOs) but no information security officers. On the main issues tackled, progress might be achieved through cross-border partnerships and collaborations aimed at the main goal of standardizing data protection regulations.

Data protection should remain a major focus of IGF and of NRIs sessions in the future. NRIs are important fora where civil society can discuss and monitor progress made in data protection legislation, but also hold data protection authorities to account.

Unlike the ICDPPC, the IGF is good at bringing together international civil society. However, partnerships between the network of NRIs and data protection authorities (DPAs) on data protection issues may need to be reinforced in the future.

6. Estimated Participation:

There were about 35 onsite participants.

There were about 15 women present.

7. Reflection to Gender Issues:

The North Macedonia IGF mentioned that an analysis of the profiles of nominated DPOs in public and private entities found that over 65% are women. A partnership meeting with the President of the Women Entrepreneurs was established. There is women’s empowerment and a high level of accountability when a woman is managing personal data.

8. Session Outputs:
NRIs Collaborative Session on Human Rights

NRI Session
Updated: Tue, 19/11/2019 - 11:19
Data Governance
1. Key Policy Questions and Expectations

Policy Questions: 

 

  • What are the priorities regarding the human rights for local communities?
  • How do we protect privacy and free speech on the Internet?
  • Should national approaches to regulation be internationally harmonized and how?
  • Are there concrete examples of digital cooperation on national and regional levels for protecting human rights on the Internet?
2. Discussion Areas:
3. Policy Recommendations or Suggestions for the Way Forward:
4. Other Initiatives Addressing the Session Issues:
5. Making Progress for Tackled Issues:
6. Estimated Participation:
7. Reflection to Gender Issues:
8. Session Outputs:
NRIs Collaborative Session on Cybersecurity

NRI Session
Updated: Thu, 05/12/2019 - 15:58
Security, Safety, Stability and Resilience
1. Key Policy Questions and Expectations

Policy Questions: 

  • Cybersecurity norm-making initiatives - how to bridge the gaps and make them work?
  • How can cybersecurity policy and regulation address emerging technological challenges?
  • How can cooperation and collaboration on national, regional and global levels help to increase cybersecurity?
  • What legal regulations are already in place but potentially need to be enforced and what new legal regulations should be created to address upcoming threats?
  • What role can institutional arrangements play?
  • What role should different stakeholders play in cybersecurity capacity building approaches?
2. Discussion Areas:
  • Mechanisms for enforcing cybersecurity norms: government and military intelligence enforcement was suggested as one option.
  • Communities are becoming the norm-making machines, but there is no mechanism in place for norm implementation. Although norms should remain voluntary and non-binding, as opposed to regulation, there are questions as to why such an approach does not work.
  • Norms should be developed in a multidisciplinary way from the ground up. Different stakeholders should be involved at different levels of the process, and the technical community should be involved from the beginning.
  • Complete set of report notes: https://docs.google.com/document/d/1gJKNkUGIyMSuWPM3cn3VxPxOLRFTQ8j51c7sBVLTOZ4/edit?usp=sharing
3. Policy Recommendations or Suggestions for the Way Forward:

Recommendation: the session should not last one hour but probably two. National and regional IGFs should also embed the regional discussion in their own agendas, so that each has space for regional discussions concerning cybersecurity.

4. Other Initiatives Addressing the Session Issues:
5. Making Progress for Tackled Issues:
6. Estimated Participation:
7. Reflection to Gender Issues:
8. Session Outputs: