IGF 2022 WS #234 Pushing for IT Companies’ Moderation Policy Accountability

Organizer 1: Alexander Malkevich, Civic Chamber of the Russian Federation
Organizer 2: Lydia Mikheeva, International Association of Economic and Social Councils and Similar Institutions (AICESIS) || Civic Chamber of the Russian Federation
Organizer 3: Nikita Volkov, Civic Chamber of the Russian Federation

Speaker 1: Nayla Abi Karam, Civil Society, Asia-Pacific Group
Speaker 2: Amir Alaoui, Civil Society, African Group
Speaker 3: Chahrazed Saadi, Private Sector, African Group
Speaker 4: Anna Starkova, Government, Eastern European Group
Speaker 5: Kirill Ignatov, Government, Eastern European Group

Moderator

Alexander Malkevich, Civil Society, Eastern European Group

Online Moderator

Lydia Mikheeva, Civil Society, Eastern European Group

Rapporteur

Nikita Volkov, Civil Society, Eastern European Group

Format

Round Table - U-shape - 60 Min

Policy Question(s)

1. The effectiveness of contemporary content moderation policies in fighting extremism vis-à-vis their side effects, including digital space fragmentation.
2. Defining malicious and harmful content: clear criteria and straightforward, transparent moderation guidelines.
3. Mechanisms of control over and oversight of content moderators on social networks.
4. Alternatives to bans and blocking: promoting an inclusive moderation policy towards disseminators of non-mainstream views.

Connection with previous Messages: Message 1. Economic and Social Inclusion and Human Rights
• Policies implemented by Internet platforms to deal with harmful online content need to be transparent, acknowledge the limits of automated content moderation, and ensure a proper balance with the right to freedom of expression.
• To ensure that human rights are enforced and upheld in the digital space, a careful reflection is needed on how technology serves humanity, as opposed to simply putting in place safeguards around the edges and waiting until harms occur. States’ duty to prevent potential harm of human rights (e.g. through regulation and enforcement) needs to be complemented with (a) effective access to remedy when people are victims of human rights violation, and (b) responsibility on the part of relevant actors in integrating human rights due diligence and impact assessments throughout the entire life cycle of a technology.

Message 3. Emerging Regulation: Market Structure, Content Data and Consumer Rights and Protection
• In the debate on digital sovereignty and digital autonomy, more focus needs to be placed on the individual autonomy of Internet users within the digital realm.

Message 5. Inclusive Internet Ecosystems and Digital Cooperation
• While the Internet contributes to social, cultural and economic growth, questions of governance, accountability, misuse, trust and access still exist. As the Internet cannot be dealt with from a one-dimensional perspective, collaborative, equitable and inclusive Internet governance is imperative and requires well-structured coordination and consolidation.

Message 6. Trust, Security, and Stability
• Neutrality holds significant potential as a force for stability in cyberspace and - in times of lively global discussions - can advance the understanding of key conditions for implementing rules of responsible behavior. Greater clarity about state views, which have been the traditional focus under the law of neutrality, has the capacity to create safe spaces for non-state actors that assist vulnerable groups.
• A responsible use of AI algorithms ensures the preservation of human rights and avoids biases that intensify inequality. Policies to deal with misuses should be developed where needed.

SDGs

9. Industry, Innovation and Infrastructure


Targets: Unaccountable moderation policies exercised by the IT giants bring multiple negative outcomes, e.g., arbitrary bans that alienate and cut off many users from the interoperable global digital space and deny individuals access to social media and video hosting platforms. Such measures significantly reduce one's ability to rely on the globally interconnected Internet infrastructure and foster mistrust towards digital services. Unaccountable moderation policy thus undermines people's inclusion in the digital space and discourages their reliance on digital services, thereby compromising UN SDG 9, "Build resilient infrastructure, promote sustainable industrialization and foster innovation."

Description:

The Internet infrastructure has long been considered a neutral foundation on which debates on various social, economic, and political issues could take place. Yet the rapid evolution of new technologies has altered the significance of the Internet infrastructure, and the major social media platforms have become irreplaceable and unique services, de facto resembling digital communication monopolies.

Millions of users across the globe post vast amounts of digital content on a daily basis, including malicious and harmful material, which obliges the social media platforms' administrations to react accordingly. As a result, major social media platforms, e.g., YouTube, Twitter, Facebook, and Instagram, have gained the capacity to control the global discursive space by exercising moderation policies based on unilaterally imposed, arbitrary, and non-transparent criteria, values, and beliefs.

The social media platforms possess the power to delete content expressing views that diverge from the dominant narratives, or even to block media outlets and user accounts that disseminate non-mainstream views, thus violating the values of freedom of speech as well as the core principles of free information flow and unimpeded access to it. Furthermore, one of the overlooked yet crucial implications of the IT giants' abrupt moderation policies, which shun dissenting users, is accelerated Internet fragmentation: banned or blocked users tend to shift to alternative, mostly national, social media platforms that are localized in their country of residence and compliant with local digital regulations. Alienating or driving hundreds of thousands of people out of the global digital communication infrastructure creates information bubbles and limits the reach of internationally operating digital ecosystems, thus reproducing national borders in the digital space.

Having the last word in de facto censorship practices grants the IT giants immense agenda-setting and agenda-framing powers, consequently challenging the pursuit of democracy, neglecting social inclusion, marginalizing social groups, and deepening digital space fragmentation. In this regard, it is necessary to initiate a public discussion aimed at developing new mechanisms to protect freedom of speech internationally and to preserve the integrity and interoperability of the digital space.

Under these circumstances, the paramount role of civil society institutions is to hold digital corporations accountable, demanding the establishment of transparent, clear, and exhaustive moderation policies.

The workshop is set to address the problem of arbitrary and unchecked moderation policies exercised by the private businesses in control of the social media platforms – crucial junction points of modern Internet communication infrastructure.

At the session, the speakers are encouraged to discuss the issue and propose middle-ground solutions that mitigate the conflicts of interest among major stakeholders and, eventually, prevent further Internet fragmentation.

Expected Outcomes

As a tangible outcome of the workshop, the organizers expect to produce a set of policy recommendations on the accountability of IT companies' content moderation policies, while also raising awareness among society and the expert community of the issues discussed at the workshop and advocating for more transparent content administration practices on social media.

Hybrid Format: AICESIS and the Civic Chamber deem digitalization and the introduction of ICT to be among the top priorities of their activities. Therefore, we have substantial experience in organizing and holding public events in a hybrid format.

Firstly, we designate two moderators: one present in person at the event site and one administering the discussion among the speakers participating online via Zoom or other digital communication platforms.

The onsite moderator, Mr. Malkevich, after his opening remarks, will steer the discussion from topic to topic, delegating speaking slots of approximately 8-10 minutes to the onsite speakers and then turning to the online speakers overseen by the online moderator, Ms. Mikheeva, and so on. During the free discussion part of the workshop, the same rotation scheme will be implemented, with the provision that the allotted time per speaker will be shorter.
To ensure the best possible experience for both onsite and online speakers, we will collect their reports and topics prior to the event and devise the speaking order according to the discussion plan, while doing our best to alternate onsite and online speakers and at the same time preserving the workshop's internal logic and coherence.

The online moderator will rely on the chat of the IGF online broadcast to collect the questions and relevant topics to be covered by the speakers in greater depth during the free discussion part of the workshop. As for the onsite attendees, the onsite moderator will ask for their questions and comments and then forward them to the speakers for discussion.

Finally, the moderators will use their well-known Telegram accounts and the chats connected to them, open to all Telegram users (Mr. Malkevich: https://t.me/alexandr_malkevich, and the Civic Chamber of the Russian Federation: https://t.me/oprf_official), to collect additional feedback and questions for the workshop participants.

Online Participation



Usage of IGF Official Tool.