IGF 2022 WS #52 De-platforming as Censorship Means in the Digital Era

Organizer 1: Civil Society, Eastern European Group
Organizer 2: Civil Society, Eastern European Group
Organizer 3: Civil Society, Eastern European Group

Speaker 1: Amir Alaoui, Civil Society, African Group
Speaker 2: Nayla Abi Karam, Civil Society, Asia-Pacific Group
Speaker 3: Chahrazed Saadi, Private Sector, African Group
Speaker 4: Anna Starkova, Government, Eastern European Group
Speaker 5: Charlemagne Tomavo, Civil Society, African Group


Round Table - Circle - 90 Min

Policy Question(s)

1. Does content moderation, and de-platforming in particular, actually help fight extremism and prevent verbal and physical violence, or does it instead foster exclusion and the fragmentation of the digital space?
2. Who is being targeted most by de-platforming, canceling, and other means of digital censorship?
3. What is being moderated on the Internet? Explicit definitions and delimitation criteria for harmful and malicious content.
4. How can mechanisms be designed to mitigate moderator bias and ensure moderator impartiality?
5. Devising transparent and holistic guidelines for social media and digital services users: freedom of speech vis-à-vis hate speech.
6. Enforcing external unified social media regulations: could the state be an effective arbiter?

Connection with previous Messages: Message 1. Economic and Social Inclusion and Human Rights
• Policies implemented by Internet platforms to deal with harmful online content need to be transparent, acknowledge the limits of automated content moderation, and ensure a proper balance with the right to freedom of expression.
• To ensure that human rights are enforced and upheld in the digital space, a careful reflection is needed on how technology serves humanity, as opposed to simply putting in place safeguards around the edges and waiting until harms occur. States’ duty to prevent potential harm of human rights (e.g., through regulation and enforcement) needs to be complemented with (a) effective access to remedy when people are victims of human rights violation, and (b) responsibility on the part of relevant actors in integrating human rights due diligence and impact assessments throughout the entire life cycle of a technology.

Message 2. Universal Access and Meaningful Connectivity
• Ensuring that all people everywhere have meaningful and sustainable access to the Internet must be a priority. Access to the open Internet is key for bridging the digital divide, as well as fostering democracy and human rights.
Message 3. Emerging Regulation: Market Structure, Content Data and Consumer Rights and Protection
• In the debate on digital sovereignty and digital autonomy, more focus needs to be placed on the individual autonomy of Internet users within the digital realm.

Message 5. Inclusive Internet Ecosystems and Digital Cooperation
• While the Internet contributes to social, cultural and economic growth, questions of governance, accountability, misuse, trust and access still exist. As the Internet cannot be dealt with from a one-dimensional perspective, collaborative, equitable and inclusive Internet governance is imperative and requires well-structured coordination and consolidation.

Message 6. Trust, Security, and Stability
• Neutrality holds significant potential as a force for stability in cyberspace and - in times of lively global discussions - can advance the understanding of key conditions for implementing rules of responsible behaviour. Greater clarity about state views, which have been the traditional focus under the law of neutrality, has the capacity to create safe spaces for non-state actors that assist vulnerable groups.
• A responsible use of AI algorithms ensures the preservation of human rights and avoids biases that intensify inequality. Policies to deal with misuses should be developed where needed.


SDG 9. Industry, Innovation and Infrastructure

Targets: De-platforming and unconditional bans not only raise serious questions about whether freedom of the press is being observed but also create information bubbles. Essentially, these are tools of alienation that cut individuals off from the interoperable global digital space by denying them access to social media and video-hosting platforms. De-platforming significantly reduces one’s ability to rely on the globally interconnected Internet infrastructure, forcing users to shift to national Internet services and digital platforms as substitutes for international ones and consequently deepening Internet fragmentation along the lines of national borders in the digital dimension. Along with this fragmentation process, UN SDG 9, “Build resilient infrastructure, promote sustainable industrialization and foster innovation”, comes under fire and is being compromised.


The Internet infrastructure has long been considered a neutral foundation on which debates on various social, economic, and political issues could take place independently of companies’ and states’ stances on the topics discussed.

Yet the rapid evolution of new technologies, powered by 5G connectivity and artificial intelligence (AI), has altered the significance of the Internet infrastructure in an unprecedented way, and the major social media platforms have become irreplaceable and unique services, de facto resembling digital communication monopolies.

Historically, the development of and control over communication infrastructure was a state prerogative. However, advances in the broader Internet infrastructure and digital communication services, together with the social media platforms themselves, have been managed primarily by private enterprises, which also set the current moderation policies, including the definitions of and criteria for so-called hate speech and harmful content.

Now we observe how the Internet infrastructure and scores of digital platforms are being instrumentalized through the ambiguous practice of de-platforming: the outright (and sometimes simultaneous) erasure of users’ accounts on popular social media and other digital services for expressing opinions that contradict the dominant narratives.

As a result, the social media domain now presents a problematic situation: a few private enterprises exert control over the most popular platforms, providing unique and inelastic communication services (in terms of content format, style, and means of distribution) to tens of millions of users, while also possessing the unchecked power to arbitrarily exclude anyone from these complex communicative networks. Those who hold alternative opinions on significant social, political, cultural, and economic topics are thus shunned and relegated to the status of radicals and extremists, whether justly or not.

Having the last word in de facto censorship practices grants the IT moguls immense agenda-setting and agenda-framing powers, thereby undermining the pursuit of democracy, neglecting social inclusion, marginalizing social groups, and driving further fragmentation of the digital space.

The workshop is set to address the problem of arbitrary and unchecked de-platforming policies exercised by the private businesses that control the junction points of the modern Internet communication infrastructure. At the session, speakers are encouraged to discuss the issue and propose middle-ground solutions that mitigate the major stakeholders’ conflicts of interest while adhering to fundamental human rights and the principles of freedom of speech and democracy.

Expected Outcomes

As a tangible outcome of the workshop, the organizers expect to produce a set of policy recommendations on de-platforming and content moderation policies on the major social media platforms, while also raising awareness of the issues discussed among the public and the expert community.

Hybrid Format: AICESIS and the Civic Chamber deem digitalization and the introduction of ICT to be among the top priorities of their activities. Therefore, we have substantial experience in organizing and holding public events in a hybrid format.

Firstly, we will designate two moderators: one present in person at the event site and one administering the discussion of the speakers participating online via Zoom or other digital communication platforms.

The onsite moderator, Mr. Malkevich, will open the session and then steer the discussion from topic to topic, delegating speaking slots of approximately 8-10 minutes to the onsite speakers and then turning to the online speakers overseen by the online moderator, Ms. Mikheeva, and so on. During the free-discussion part of the workshop, the same rotation scheme will be implemented, with the provision that the allotted time per speaker will be shorter.
To ensure the best possible experience for both onsite and online speakers, we will collect their reports and topics prior to the event and devise the speaking order according to the discussion plan, doing our best to rotate the onsite and online speakers while at the same time preserving the workshop’s internal logic and coherence.

The online moderator will rely on the official IGF online chat to collect questions and relevant topics to be covered by the speakers more extensively during the free-discussion part of the workshop. As for the onsite attendees, the onsite moderator will collect their questions and comments and forward them to the speakers for discussion.

Finally, the moderators will use their well-known Telegram accounts and the associated public chats, open to all Telegram users (Mr. Malkevich: https://t.me/alexandr_malkevich; the Civic Chamber of the Russian Federation: https://t.me/oprf_official), to collect additional feedback and questions for the workshop participants.

Online Participation

Usage of IGF Official Tool.