IGF 2022 DC-Gender: Who's Watching the Machines? New Tech, Gender, Race & Sexuality

Wednesday, 30th November, 2022 (08:15 UTC) - Wednesday, 30th November, 2022 (09:45 UTC)

Dynamic Coalition on Gender and Internet Governance

Round Table - Circle - 90 Min


The DC on Gender and Internet Governance session will be a roundtable discussion on the intersection of advanced technologies, gender and sexuality, human rights online (especially freedom of expression), and marginalised communities. Facial recognition software is biased against people of colour, queer and trans persons, persons with disabilities, and women, and is discriminatory. In spite of this, it is increasingly used for identification and verification, for ‘security’, and in justice systems. Research on artificial intelligence has shown that it is embedded with the same racial and gendered prejudices that exist in society, and is exclusionary towards marginalised people due to the algorithms and the limited datasets used to train it. Algorithms are also increasingly used by social media platforms to address online gender-based violence and hate speech. This does not work well, as a lot of hate speech is in non-English languages, and colloquial misogynist, xenophobic and casteist abuse often slips through the cracks. Researchers working on addressing online gender-based violence have pointed out the need for human moderators in reporting processes. Deepfakes are ever more easily available and are abused to harass women and LGBTQ persons. Machine learning and AI have also been used to manipulate votes and election results, directly affecting democratic processes in several countries. It is clear from the current state of advanced technologies, and research around them, that there is an urgent need to pay close attention. Facial recognition, surveillance tech, AI, machine learning, data systems and the like are all inherently prejudiced. It is therefore important to understand, sooner rather than later, how this affects structurally silenced and marginalised communities and people in the digital age and online spaces.
This session will bring together researchers, practitioners, and people working on digital rights and freedoms to share and discuss the impact of advanced tech on different communities, and what needs to be done to course correct. Do we want to reject certain technologies, as some places have done by banning CCTV cameras with facial recognition? How is advanced tech undermining our right to freedom of expression, among other freedoms? What does the current landscape look like? Who is profiting off these technologies? How do we understand and address the role of big data and data systems in this? These are some of the questions that will be deliberated upon.

To facilitate interaction between all participants and speakers, we will have a detailed plan for the roundtable, with clearly set out roles and responsibilities for online and onsite moderators and speakers. Both our online and onsite moderators have prior experience moderating sessions. We’ll ensure that online and onsite moderators are thoroughly briefed prior to the session, so that they’re able to ensure meaningful and equal participation for both online and onsite participants. Any visual materials used will be made accessible to online participants through the virtual meeting platform and to onsite participants through a screen at the venue. Any resources shared will be accessible to online participants through the chat; our Rapporteur will help consolidate the list of resources, which the onsite moderator can share with onsite participants.


Shohini Banerjee, Point of View, Civil Society, Asia-Pacific
Smita V, Association for Progressive Communications, Asia-Pacific
Debarati Das, Point of View, Civil Society, Asia-Pacific
Zahra Gabuji, Point of View, Civil Society, Asia-Pacific
Riddhi Mehta, Point of View, Civil Society, Asia-Pacific
Maduli Thaosen, Point of View, Civil Society, Asia-Pacific


Srinidhi Raghavan, Rising Flame, Civil Society, Asia-Pacific
Chenai Chair, Mozilla, Civil Society, Africa
Liz Orembo, KICTANet, Civil Society, Africa
Smita V., Association for Progressive Communications, Civil Society, Asia-Pacific
Sheena Magenya, Association for Progressive Communications, Civil Society, Africa

Onsite Moderator

Smita V

Online Moderator

Shohini Banerjee


Maduli Thaosen



Targets: The session focuses on SDG target 5.b: “Enhance the use of enabling technology, in particular information and communications technology, to promote the empowerment of women”. It will look at the intersections of advanced tech, gender and sexuality in how they relate to the human rights of women and people of marginalised genders and sexualities. It will also identify and reflect on biases in facial recognition, surveillance tech, and AI that are barriers to women and LGBTQIA+ people’s use of such technology. Speakers and participants will share recommendations and reflect on intersectional perspectives on what enabling new tech can look like for women and gender minorities, how we can rethink tech to make it more meaningfully inclusive and intersectional, and how this can promote the rights of women and gender minorities.

Key Takeaways

The idea that technology is a great equaliser is flawed, as technologies often amplify inequalities and harms against gender and sexual minorities, people with disabilities, and people from other marginalised communities. Queer and disabled people often cannot access healthcare services because those services are built on identification systems that exclude them. Technologies are built by a homogenous group of people (white, cis men) using biased data.

Holding tech companies accountable is a big challenge, as they have strong lobbies with governments and collective public awareness of digital rights issues is limited. Geographic dynamics also shape these tech accountability processes; tech companies mainly respond to policies in the global north, as there are big costs to defying them. Further, how and why given tech policies are being implemented needs to be considered.

Call to Action

The interests of people, their lived experiences and local contexts, and their visions for these technologies need to be front and centre. Tech companies and developers ought to be intentional not only about privacy, safety, transparency, representation, and benefits to the community; they should also centre people’s choice and agency. We should be able to decide the extent to which we want to participate in and engage with technologies.

To think of technologies as merely practical solutions to problems, or as a means to an end, is a myopic understanding of technology. Emerging technologies should centre community, care, pleasure, and fun. People’s joy and community building should be motivations for technological innovation.

Session Report


Smita Vanniyar inaugurated the discussion by highlighting the importance of bringing forth the experiences and visions of women and queer people in building a resilient internet, and stressed the need to de-binarise discussions on internet governance. With governments introducing emerging tech, digital ID systems, and surveillance tech, we need to think about the contexts in which these are being pushed and who will be most affected by them. They pointed to how facial recognition technologies and CCTV cameras are promoted for the ‘protection’ of women and children, even though facial recognition technologies have an inherent colour bias. When such already-biased tech is used for safety, we need to ask: who are we throwing under the bus? Research shows that non-binary, trans, and Black people are significantly misidentified. We also need to ask: what determines the kind of policies that govern tech? Are these being pushed by civil society, or are there other forces at play? A lot of the time, policies come from the perspective of research and development and are inevitably connected to state security and the military. Tech policies are often informed by global relations; for example, many privacy conversations came from the EU, often ignoring local contexts.

Liz Orembo reflected on how algorithms contribute to the lack of representation of women in governance and politics today. She highlighted how algorithms and social media’s business models amplify attacks on women seeking political seats. What needs to be done is to discourage such biased and harmful business models, and to reimagine them in ways that enable other communities. Technologies are biased because they are built on data that ignores the realities of women and gender minorities, especially in the global south, who have limited access to digital technologies. In turn, women and gender minorities in the global south do not use these technologies because the technologies are not built for them; it is a continuous loop. Liz also spoke of the many barriers to holding technology companies accountable. Civil society members act out of passion, while companies are driven by the profit motive and have the money and power to maintain strong lobbies with various governments. Moreover, it is a challenge to mobilise the general public to hold these companies accountable, as understanding tech and policy is a slow and difficult process. When people do not understand how tech works, it becomes a challenge to ask for accountability.

Chenai Chair deliberated on concerns around surveillance and tech, noting that in many parts of Africa, surveillance cameras are increasingly deployed for military and state security. Who has the data collected by these technologies? How are these technologies being secured? What is the data used for? There are also serious concerns around data surveillance, and there is a need for greater transparency. How does my device know where I am, and who is collecting such information? What steps are being taken to ensure privacy? For example, the mental health apps with which we share our most intimate thoughts and information are often quite irresponsible with our data and do not prioritise data privacy. Chenai also elaborated on the biases that exist in voice technologies, which respond to European and American accents but misidentify African languages and accents, and even women’s voices. These voice technologies are being pushed as solutions that enable greater access to information and engagement with digital technologies, but these biases need to be addressed. There needs to be diversity in the datasets these solutions are built on.

When speaking of tech accountability, Chenai highlighted that geo-location dynamics play an important role in determining tech platforms’ responses to accountability demands. These companies respond to political representatives in the regions they come from. For example, platforms take the EU more seriously because they will be charged a hefty fine, whereas fines imposed by African nations are seen as a small cost; the forex rate works against African nations. The community standards that govern these platforms are also used to silence people from many communities. Mass reporting is used to silence feminists, the LGBTQIA+ community, and activists, who rely heavily on these social media platforms for their work. Apps like Telegram are used to mobilise such mass reporting.

Chenai also emphasised the need to centre fun and joy in our approach to technology and innovation. People first used the internet to date. When TikTok arrived, a lot of people thought it would not work because people did not have to sign up to see content; however, TikTok has reach across classes because it centred joy and entertainment for as many people as possible. A lot of innovation has been driven by the SDGs and similar agendas, but how do we ensure greater participation in and awareness of technology among people because there is joy and fun in the process?

Srinidhi Raghavan examined conversations on data, surveillance, and privacy at the intersection of technology and disability. In India, there is a strong push toward the systematisation and creation of digital IDs for easier, streamlined access to health services. Some of the central questions that need to be addressed are: Where is this data being kept? How is the system built to allow people with diverse disabilities to access it? Access to these services is often linked to India’s highly contested biometric ID card, the Aadhaar card, which was built with non-disabled bodies in mind. People with cerebral palsy, retinal detachment, or other physical disabilities cannot access the Aadhaar system. Technology is seen as a solution to bridge gaps in access, but often it works the other way around. Disability needs to be front and centre when we create these technologies and services.

Srinidhi also highlighted how many emerging technologies are actively harming people with disabilities. Technologies are being built to detect disabilities and to look for how disability presents itself, but as we know, these systems of identification are often used to discriminate against people with disabilities. While some might say that these technologies allow for the streamlining of access to healthcare, the same technologies are also used to deny disabled people, such as those with psychosocial disabilities, access to medical insurance. We also need to think about these technologies from the perspective of privacy: who gets to decide when a person’s disability is disclosed and identified?

Srinidhi also noted that the question of AI increasing accessibility, spaces, and possibilities for persons with disabilities is a complicated one. As activists, we realise we do not have enough data on disability issues. But if data is collected without paying attention to privacy or transparency, it can be weaponised against the groups the data is collected about. For example, a study in the US showed that a company was collecting user data on disability, and one of the things it ended up doing was deprioritising disabled persons’ applications in the hiring process. Data collection is often not in the best interest of the community. How do we imagine good use of data? We often do not know what data is being collected. There is a power imbalance at play: how do you challenge something when you do not even know what is being collected about you? We would like data to build inclusive systems, but it is used against us. Our relationship to data is a very complicated one. Moreover, conversations at the intersection of tech and disability are always about fixing disability, and it is important to think about where that comes from. We need to think about people, community, and care, not just a means to an end. We need to think about the human aspects of engaging with technology: how can it nourish human life and relationships?

Sheena Magenya reminded us to be wary of narratives that present tech as a great equaliser. Tech is seen as a big solution to all our problems, especially in the African context; however, these technologies are devoid of accountability, equity, and representation. In the technology landscape, community members have limited agency and choice. As we saw during COVID-19, technological ‘solutions’ are often imposed on communities, with little room to refuse or reject their use. Communities that are criminalised in certain contexts and nations are denied access to spaces when these technologies constantly track them or fish their personal information out of them. If you are a trans person and your identity is criminalised in a certain region, you can only participate in certain spaces by lying about your identity.

Sheena also asserted that we need to shift focus away from what is horrible and towards what we want more of. We need to remember that tech and innovation are not new to the human experience; humans have always been developing technology, but now the pace is simply much faster. We are always trying to do more and better, but this inherent interest in technology is manipulated by corporations and states. Where is the choice about the extent to which we want to participate? Who gets to take up space? Young women and queer people face an incredible amount of violence simply for existing, for saying things that are not in line with societal, patriarchal standards. They are truly at the battlefront, with no support, solidarity, or recourse. It is important to recognise that young people are defending not just their own freedoms but everyone else’s as well.