IGF 2023 WS #497 Platform accountability in times of crisis

Organizer 1: Eliska Pirkova, Access Now
Organizer 2: Marlena Wisniak, European Center for Not-for-Profit Law (ECNL)

Speaker 1: Rizk Joelle, Intergovernmental Organization, Western European and Others Group (WEOG)
Speaker 2: Gabriel Lindén, Government, Western European and Others Group (WEOG)
Speaker 3: Chantal Joris, Civil Society, Western European and Others Group (WEOG)
Speaker 4: Marwa Fatafta, Civil Society, Asia-Pacific Group


Onsite Moderator

Eliska Pirkova, Civil Society, Western European and Others Group (WEOG)

Online Moderator

Marlena Wisniak, Civil Society, Eastern European Group


Rapporteur

Marwa Fatafta, Civil Society, Asia-Pacific Group


Debate - 60 Min

Policy Question(s)

1) When does International Humanitarian Law apply to social media and content governance issues?
2) What are states’ negative and positive obligations under International Human Rights Law (IHRL) and International Humanitarian Law (IHL) with respect to content governance?
3) How should the freedoms of expression and information be balanced against conflicting rights under IHRL and IHL, and what factors should be taken into account in content moderation decisions (e.g. context, speaker, visibility, victims, form of content)?
4) What are social media companies’ rights, risks, and obligations under IHL and IHRL, including the risk of liability, in situations of armed conflict?

What will participants gain from attending this session? The session will explore the responsibilities of social media companies under international humanitarian law (IHL) and corporate liability under international criminal law. In addition to companies’ responsibilities to respect human rights, prevent risks, and remedy abuses, the UN Guiding Principles on Business and Human Rights highlight the risk of human rights abuses in conflict-affected areas and call on companies not only to respect international human rights treaties but also to “respect the standards of international humanitarian law”. While platforms’ responsibilities under international human rights law have been discussed extensively, little research has been done on how IHL applies to these companies when they operate in or respond to situations of armed conflict. Existing human rights due diligence frameworks and toolkits do not provide a sufficient roadmap for tech companies to navigate such complex contexts. The session intends to fill these gaps.


In times of conflict and other situations of violence, human life must be protected. While social media platforms are not a root cause of conflict, they can worsen the situation on the ground. This can happen in many ways, including through inadequate and shortsighted responses to the spread of incitement to violence, discrimination, and hostility online. Companies have often failed to respect human rights, or to mitigate adverse human rights impacts of their services and activities, well before a crisis escalates. Their responses (or lack thereof) have disproportionately impacted marginalized communities and historically oppressed groups, and have facilitated serious human rights abuses as well as violations of humanitarian laws and principles.
This session will introduce and discuss guidelines setting out the main legal principles that companies should implement when operating in areas experiencing armed conflict or a fragile post-conflict situation, as well as areas marked by weak or nonexistent governance and security, such as failed states, and by widespread and systematic violations of international law, including human rights abuses. The session will focus primarily on the intersection of international human rights law, international humanitarian law, and international criminal law, the main fields of law applicable in times of conflict. It will show how these fields can contribute to meaningful platform accountability and protect human rights offline and online.
The session will be informed by the work of Access Now, including the joint Declaration on content governance and platform accountability in times of crisis launched at IGF 2022, alongside follow-up reports launched this year. It will support individual recommendations with concrete examples from around the world. The session will gather leading experts from international organizations, state representatives, a former director of content governance at Twitter, and civil society organizations that have long documented platforms’ unequal, opaque, and inconsistent approach to content governance.

Expected Outcomes

The expected outcome of this session is, first, to clarify the main legal concepts of international humanitarian law and international human rights law in the context of platform accountability and content governance. Second, the session will build on existing standards developed by expert bodies and civil society organizations, and on best practices of private companies operating in conflict zones and areas impacted by crisis. The session will provide a set of recommendations on how platforms can identify and mitigate risks of liability, complicity, and human rights abuse in such contexts. Recommendations will be discussed in detail by the invited experts and then subjected to direct feedback from the audience.

Hybrid Format: The session will be structured in three parts. First, the invited speakers will provide background on the human rights impacts of social media platforms during times of crisis, along with recommendations and key challenges, especially in the Global South. Second, participants will share their reflections through a guided conversation that includes both remote and in-person attendees. The organizers will facilitate in-person and online breakout groups, co-moderating the online conversation. Third, the organizers will summarize the discussion, open questions, and ideas for future work based on the group conversation. The in-person moderator will lead conversations in the room, working closely with the online moderator, who will facilitate the virtual breakout groups; remote participants will contribute via the chat function.