IGF 2023 WS #526 Dark Algorithm, from personal bubbles to continental ones


Cybersecurity, Cybercrime & Online Safety
New Technologies and Risks to Online Security

Organizer 1: Salvatore Orazio Agatino Giannitto
Organizer 2: Kateryna Bovsunovska, Internet Society Youth Standing Group
Organizer 3: Muhammad Kamran, KP Bar, Provincial Youth Assembly, Peace Club Pakistan, Helping Youth Pakistan

Speaker 1: Angelo Alù, Civil Society, Western European and Others Group (WEOG)
Speaker 2: Athanase Bahizire, Technical Community, African Group
Speaker 3: Laura Pereira, Civil Society, Latin American and Caribbean Group (GRULAC)
Speaker 4: Sasaki Reina, Civil Society, Asia-Pacific Group


Onsite Moderator

Salvatore Orazio Agatino Giannitto, Civil Society, Western European and Others Group (WEOG)

Online Moderator

Kateryna Bovsunovska, Technical Community, Eastern European Group

Rapporteur

Muhammad Kamran, Private Sector, Asia-Pacific Group


Panel - 90 Min

Policy Question(s)

A. What is the reason for algorithmic differentiation?
B. Should state participation in companies producing multimedia content be excluded entirely, given that states can so profoundly influence the behaviour of these companies, and the companies' content in turn shapes that of their users?
C. What laws and international bodies could offer a solution?

What will participants gain from attending this session? Starting from a brief analysis of social network algorithms and their regulation to date, the workshop will trace the development of so-called internet bubbles up to the current diversity of algorithms adopted across geographical areas. Finally, it will highlight the social effects and risks faced above all by younger users, and shed light on the current legislative situation and its possible developments.


While the positive effects of social networks were warmly welcomed in every part of the world a decade ago, the time has come to shed light on the other side of the coin. With growing awareness of the social effects of social networks, the concept of a bubble has risen to the fore. We have realised that ranking algorithms keep us locked within a narrow set of our interests, fed by the massive volume of content posted every day. These algorithms are why no two users will see the same social content, even if they follow all the same accounts.

This phenomenon has been exacerbated by the emergence of vlogging social platforms, which leave even less space for interaction and debate in favour of mere assimilation and emulation. The world-famous TikTok app is not, in fact, a social medium encouraging social connections between users but a media channel showing content the algorithm flags as attractive to the user, based on their inferred interests.
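The narrowing effect described above can be shown with a deliberately simplified sketch. All names and numbers here are hypothetical and do not represent any real platform's algorithm: a toy ranker scores items by similarity to a user's interest profile, and the profile is then reinforced by whatever the user consumes, so the feed drifts toward a shrinking set of topics.

```python
import random

# Toy interest-based feed ranker (illustrative only; not any platform's real algorithm).
# Each content item and each user profile is a weight vector over a few topics.
TOPICS = ["sports", "politics", "music", "science", "gaming"]

def score(profile, item):
    """Rank an item by its similarity (dot product) to the user's interest profile."""
    return sum(profile[t] * item[t] for t in TOPICS)

def build_feed(profile, pool, k=5):
    """Show only the k items the ranker predicts the user will like most."""
    return sorted(pool, key=lambda item: score(profile, item), reverse=True)[:k]

def watch(profile, item, lr=0.2):
    """Reinforce the profile with whatever the user just consumed."""
    for t in TOPICS:
        profile[t] = (1 - lr) * profile[t] + lr * item[t]

random.seed(0)
pool = [{t: random.random() for t in TOPICS} for _ in range(200)]
profile = {t: 1.0 / len(TOPICS) for t in TOPICS}  # start with uniform interests

for _ in range(10):  # ten feedback rounds of "rank, consume, reinforce"
    for item in build_feed(profile, pool):
        watch(profile, item)

# By now the profile has drifted toward the topics of the items it was shown,
# so later feeds keep resurfacing similar content: a minimal "bubble" loop.
print({t: round(w, 2) for t, w in profile.items()})
```

The point of the sketch is the feedback loop, not the scoring function: any ranker that optimises predicted engagement and learns from consumption will exhibit the same self-reinforcing narrowing, which is what the bubble debate is about.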

Such a selective approach to content ranking may pose risks to Generation Z, considering that 45% use these platforms as primary search engines. At the same time, it seems that not all young people are exposed to it in the same way: in some areas, socially dangerous content is almost absent from the platform, while in others, the algorithm gives such content free rein.

Considering the emerging legislative initiatives on the regulation of social media, one wonders whether there should be a state influence on social media algorithms at all. If not, how should private companies be incentivised to eliminate dark patterns? If yes, may individual bubbles become state bubbles prone to censorship and propaganda?

Expected Outcomes

Stimulate reflection on the differentiation of content across social platforms, with a view to a basic homogenisation of the content shown by fully excluding content that can have negative effects on society; and solicit regulation of the algorithm wherever it operates on the internet tout court.

Hybrid Format: The session aims to facilitate a panel discussion where participants are able to ask questions and leave comments both online and onsite. For this purpose, the session will feature both online and onsite moderators, who will be in regular communication to keep the participants equally engaged. While the onsite moderator will hear the questions of participants physically attending the session, the online moderator will keep an eye on the questions and comments shared online and bring these into the discussion by communicating them to the onsite moderator.