IGF 2020 - Day 4 - DC Future Unclear: data and bodies in the post-pandemic times

The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

    >> MODERATOR: Hello, everybody, and welcome to the session of the Dynamic Coalition on Gender and Internet Governance.  This is our first online DC session, and we are pleased to have a full session looking at one of the key themes of IGF 2020, which is the question of data.  What we try to do at the Dynamic Coalition on Gender and Internet Governance is really ensure that the issues we discuss related to internet governance are looked at through the lens of gender.  So that is what we will try to do in this session. 

And before I introduce our panel, there is a tradition that we follow at each and every meeting of the DC-GIG, as the Dynamic Coalition on Gender and Internet Governance is called for short: we have a small presentation on the gender report card, a tool used in at least the last nine IGFs or so to look at the gender composition of the IGF and to think critically about gender composition in internet governance as well.  Before we move to the future, I'm pleased to invite Debarati Das to share the gender report card from IGF 2019. 

     >> DEBARATI DAS: Thank you.  Hi, everybody.  I'm Debarati.  I will just share my screen.  So, every year at the IGF, as part of our work in the Dynamic Coalition on Gender and Internet Governance, we do an analysis and summary of the gender report cards that sessions at the IGF fill out after they have been completed.

     I will take you through the observations from IGF 2019.  Here you can see the gender distribution of online and on-site participants: last year at the IGF, only 39% of participants were women.

     That was a two percentage point dip from IGF 2018, the previous year, when we had 41% women's participation.

     Here you can see gender diversity, measured as the percentage of female participants, in brackets: 86 sessions filled out this field, and most sessions had a female presence in the bracket of 25 to 50%.  In 2018, most were in the greater-than-50% bracket, followed by the 50% bracket.  So even here there was a bit of a dip.

     We also look at how many sessions fill out the gender report cards and how many do not.  Last year at the IGF, gender report cards were filled out for 94 sessions out of 118.  This was quite a positive finding, because it showed an increase in the number of reporting sessions of about 40% compared to IGF 2018.

     We also look at how sessions engaged with gender.  93 sessions at IGF 2019 provided information on this, and 24 sessions did not fill out this field.  As you can see here, 41 sessions, the highest number, had a partial focus on gender or on topics that intersect closely with gender, followed by sessions which did not engage with gender at all, at 38; only 14 sessions directly engaged with gender as a topic.  That's what I had.  Thank you.

     >> MODERATOR: Thanks so much, Debarati.  I put a message in chat saying that if anybody has any questions on the gender report cards they can either be added on chat or on the Q&A tab at the bottom of the screen.  And I will keep a watch on it.  Nothing so far.

     So let's move on then to Future Unclear: Data, Bodies, Gender, and Sexuality.  And I'm pleased to introduce three fantastic panelists, all of whom are leaders of their organizations.  Alphabetically: Anja Kovacs is the director of the Internet Democracy Project and is based in New Delhi, India.  The Internet Democracy Project does really amazing research and advocacy through a feminist lens on a number of issues, in the last couple of years really focusing on bodies and data.  Then Joana Varon, the Executive Director of Coding Rights, based in Brazil, which works at the intersection of technology and social justice, and which I think is known, among many other things, for really strategic and creative communication and for ways to really engage audiences around these issues.  Welcome, Joana.  And our third speaker is Sadaf Khan from Media Matters for Democracy, based in Pakistan.  Sadaf is one of the leaders of Media Matters for Democracy, which is known for a lot of policy work around media freedoms and internet freedoms and rights in Pakistan.

     So thanks all three of you.  And the reason we invited all three of these speakers is because all three of them actually wrote really powerful essays looking at the intersection of bodies, data, gender, and sexuality.

     And I'm going to put the link to the essays in the chat.  But I would like to start by asking each of our speakers to briefly summarize the key ideas they raised in those essays, and if any of you would like to read out a little excerpt, please feel free to do that as well.  We have allocated roughly 10 minutes per speaker.  Anja, may I ask you to begin.  I also wanted to say that the name of the essay Anja has written, which we are referring to, is When Our Bodies Become Data, Where Does That Leave Us?  I'm putting a direct link to it in the chat.  Okay.  Over to you, Anja.

     >> ANJA KOVACS: Thanks so much and also for the invitation.  It is nice to be with feminist friends and talk about data governance after a long time.  Excited to be here.

    As you mentioned already, in the last two years quite a lot of the work that we have been doing at the Internet Democracy Project has focused on data governance.  What we tried to bring out in the essay was one specific aspect.  We know that the datafication of society is something that all of us are affected by, but what we really wanted to bring out is that for people at the margins, the consequences of losing control over our data are particularly devastating.  And in particular we want to highlight the concept of privacy as boundary management as being central to how we understand those new dangers and what is really happening here.

     Boundary management might sound complicated, but it is something that we are all doing in our everyday lives all the time, right?  It is really about the decisions about what to share or reveal, or not, about ourselves to others.  You do that when you go to the supermarket and they ask for your phone number; you do that when you consider whether or not to share your sexual orientation with your family.  All of these are decisions in which boundary management is taking place.

     Some of these decisions might seem trivial in themselves, but what is important is that collectively they really affect our ability or our capacity for self-determination, and in that way also to live a life of dignity.  Especially for those whose identity is already marginalized, it is important to have the space to step back from the social environment and develop a critical perspective on that environment, to validate your own experiences and beliefs and desires even if they are not aligned with what society values.

     That is really important if the social is not to overwhelm our own sense of identity, if we are fully to become ourselves, so to say, right?  And boundary management, the thing we engage in every day, is what allows us to create that room for ourselves.

     So that understanding is really important, even though we don't see it reflected very much in traditional debates around privacy, which talk about privacy in the home, where it is something you either have or don't have, and where it is static as well.

     Even though a lot of the traditional perceptions of privacy are of that kind, privacy as boundary management is a concept that has been understood for many years by people who think about the lives of marginalized people, and I wanted to read a brief part of the essay quoting Dr. Ambedkar.  In the 20th century, before the internet came into being, one of India's foremost intellectuals was already sharply aware of the importance of being able to dynamically control what information about ourselves we share, with whom, and when.

     While Gandhi called upon his followers to relocate to India's villages, Ambedkar recommended that those of the country's people who were oppressed within the pervasive caste system move to cities instead.

     The anonymity of the city would give them a better chance to escape those rigid requirements and to build better lives for themselves.  In villages, your caste would always already be known; in the cities, you would still retain some measure of control over whether or not to reveal this information.  Ambedkar recognized that boundary management is not only important because it allows us to control what we share with others; he also understood that this control is crucial to living a life of dignity.

     And that is the end of the quote.  The challenge that all of us face today, to some extent, is really that in their quest to make us more and more legible, transparent and predictable, governments and companies are fundamentally undermining our capacity to engage in that kind of boundary management, to autonomously manage our bodies, ourselves and our lives as we see fit.

     How did we get there?  First of all, a big challenge is that in most dominant discourses data is treated as a resource, as this thing that is out there, up for grabs, that sits on top of everything but is somehow independent from the medium that generates it, and that is why you can just take it if you get access to it.

     At the same time, data is also increasingly seen as telling the ultimate truth: your body's data, and only the data, will tell us who you really are.  You see this, for example, in the growing number of countries relying on digital IDs based around biometrics.  If the machine that has to verify your biometrics fails to verify you, you are not who you say you are, even if you are standing right there making that claim.

     So what we see is that increasingly the data about our bodies is actually privileged over our living and breathing bodies in telling others who we are.  We can no longer represent ourselves.  This is, in many ways, really the end of agency.

     There are many examples in the essay, but here I thought I would briefly touch on one.  There were two researchers who trained an algorithm to tell whether someone is straight, gay, or lesbian, so basically to read people's sexual orientation from their faces.  That study was highly criticized, very controversial.  First of all, because it has the potential to out people on a massive scale, right?  And since in so many countries not being heterosexual is still a crime, this has really, really potentially deadly consequences for people.

     The algorithm wasn't always accurate in its categorizations either.  But because of that aura of truth that there is around data, people often tend to believe it anyway over what a person tells you, and again, seeing that sexual orientation is criminalized in many countries, this is really, really risky.  This example also shows what is happening to the expectation we still do have, as was referred to earlier, that at least in urban spaces our faces are in some ways also private, that we have spaces of privacy even when we are out in public.

     What algorithms such as these do is undermine exactly that.  When you go out in public space now, increasingly you do not have that anonymity anymore that was considered so empowering.

     These are really important harms to do with anonymity and privacy, but there is a second set that the example brings out, and that raises the broader question of dignity also.  That is about biometrics showing who we really are, right?  In a way, what the two researchers were doing was really reminiscent of the pseudo-sciences that were popular in the 18th and 19th centuries, which claimed it was possible to determine somebody's personality or even their moral character from their faces or their skulls.

     Many of the most horrible aspects of these theories were debunked, especially in the 1960s, but some of their impact actually remains.  So, for example, if you look at the genocide in Rwanda in 1994, one of the origins of the genocide has been traced to Belgian and German colonizers who divided the population based on measurements of people's faces.  What is scary is that the basic premise of the colonizers of that time has been brought back again in research.  This might sound like an extreme example, but facial recognition technologies, at least in a country like India, already affect in many different ways how freely we can live.  Just one example: they are increasingly used by police to identify habitual protesters, even though protesting habitually is not a crime under any law in India.  This is just one more example of how that space is really shrinking.

     There are many more examples in the essay, and some of them also build on the points I just made.  But the overall point that I wanted to bring out here, and that the essay is trying to make, is that the capacity for boundary management is really crucial to living a life of dignity in the digital age, and if we want to keep having that capacity going forward, it is really crucial that we reclaim it.  Thank you.

     >> MODERATOR: Thank you so much, Anja, for opening up the conversation with really a very thought-provoking concept and sort of, you know, the boundary management concept, right?  Really taking the whole question of data, privacy, dignity, agency into a whole other realm.  Thanks very much. 

I will invite Sadaf to go next.  And again, as before, I'm just going to -- give me a moment.  The name of the essay that Sadaf wrote is Data Bleeding Everywhere: A Story of Period Trackers.  I'm just going to post that link and invite Sadaf to start as well.

     >> SADAF KHAN: Thank you.  I think when we were writing and working on this essay last year, we had no clue that this basic question of privacy versus healthcare would become so important in just a year.

     When I started working on the essay, and I hope the people participating will read it later, the premise dealt with how women are using period trackers, what kind of information these period tracking apps are collecting and sharing, and how users are engaging with that premise, whether they are conscious of it or not.

     I will start with today, because the session is called post-pandemic, which seems like a futuristic thing, especially in countries like Pakistan, where internet penetration is only 17%.  The people I reached out to interview for this essay hadn't really thought about what they were doing, and they didn't see the sharing of their data as something that invades their privacy.  I found it fascinating, Anja, when you talked about boundary management, because in the minds of various users of period tracking apps, that boundary doesn't exist.  They had not thought about the fact that the data they are entering, which is so personal that they usually don't even talk to their doctors about it, is actually being shared as part of a big data operation, and not just with the one third party they give consent to, but with everybody else those parties are connected to.

     The question is paramount in a lot of people's minds, especially because we have seen the same thing being carried out and exploited in ways much more invasive than this.  Pakistan, like various other governments, created a COVID tracking app, and ours is linked to the GPS signals of users diagnosed with COVID and allows other users to enter data on people they think or know have tested positive or are displaying symptoms of the virus. 

So the main question that I pose in the essay has become kind of paramount and has come to the forefront as well.  I will read a bit from the essay before going into the key questions that came up when I was working on this.  I keep saying questions because I left a lot of questions in there as well; that was one of the key things when I engaged with other people.  I come from a journalistic background, so my essay includes a lot of journalistic investigation into the users.

     And so what I found was that the questions I was asking them were bringing forth questions in their own minds, and I think that is a good thing as well.  But nobody had really explored the answers before the questions were posed to them.  I will read a bit, just the beginning, to explain why I was interested in writing this article, and why somebody who works in this field, who really engages with data and explores this area, still uses these apps, because convenience at times becomes paramount for us.

     So my own initiation into the menstrual apps was an emotional one.  Having just gone through an early term miscarriage, I switched from Glow Nurture, a pregnancy tracker, to Glow Eve, a period tracker.  The miscarriage had triggered an irregularity in my cycle that had not been there before, and Eve helped me track the change and maintain a detailed health log. It also helped me connect with other women going through the same thing and facing similar emotional and physical fallouts.  In other words, Eve offered a space that helped me heal. 

It was months later that I actually started thinking about the details of the health log that I was maintaining on Eve, which ranged from my height and weight to my moods.  It was also clear to me that much of what I was noting down would never ever make it to my gynecologist.  However, in the vulnerable state that I was in, it was helpful to just click on options that helped me label my own emotions as anything but grief.

     Eve's how are you feeling prompt asks users to choose from a variety of emotions.  You can be happy, emotional, stressed, calm, tired, in love, angry, sad, energetic, confident, motivated, excited.  And once you tell Eve which emotional state fits your current mood, Eve will ask you to deconstruct your emotions even further.  Why are you feeling this way?  Users can choose from options including love life, family, schoolwork, PMS and others.  Every time I completed a health log on Eve, the application's algorithms were able to add more detail to my health records, based on which the app made better and more accurate predictions and pushed more relevant articles and posts to me.

     As long as I gave Eve regular updates, every bit of information I received, from the gems that Eve selected to the community posts that made it to the top of my timeline, was completely customized to my own needs at that moment.  No wonder I felt that Eve was a friend, a therapist and a gynecologist all rolled into one free, convenient bundle.

     And I chose to -- this was not really the beginning that I had planned when I started thinking about the essay.  I chose to go with it because of this vulnerability, and because a number of the users of these apps were so emotionally involved and connected.  Whether they were trying to conceive, whether they were using the app as a contraceptive method, or whether it was just a way to track their periods, there was a vulnerability that came in: they were not in the best of health or the best of moods at that time, and they needed some kind of support and some kind of community.

     And it made me wonder whether, if this were not the very nature of period tracking apps and pregnancy tracking apps, we would as users be offering this data so freely and openly to the applications.

     And I want to highlight two or three things that came up in the essay.  One was the fact that when I talked to the users about why they felt in control of their own data, the main premise that came up was the use of anonymous digital identities.

     So people were simply putting in an anonymous name and feeling secure about the information they were entering.  This becomes very important because a lot of people feel that the kind of information period tracking apps ask for is of no use to others, and that even if it is leaked it will be anonymized.  But what I found in the terms and conditions is that a number of the apps treat user-generated content as data as well.  And when you think about the fact that the stories shared with the community and the comments left on the app are much more personal, much more identifiable and much more intimate than just clicking an option would be, your whole feeling about offering that kind of transparency, that data, to the app changes.

     However, regardless of the fact that everybody seemed to be using the apps on their own mobiles, meaning they were connected to their real phone numbers, often connected to their Google accounts, and often signing in through Facebook, they still felt secure in the belief that simply using a different, anonymous name on the app somehow made them anonymous and kept their identity a secret from the app.

     The other thing that I probed and found was something people had not been thinking about.  The period tracking app is seen as a community where everybody is doing the same thing, so there is a bit of comfort: I'm talking about my sex life, but a dozen other women are talking about their sex lives, too.  And this is a big deal for women coming from a conservative society like Pakistan, where it is not okay to talk about these things in detail and where it is considered a sin; the fact that so many people in that community are doing it makes it feel okay.  But when I asked about the ads they were getting, and when I looked at my own experience, the moment I started tracking my pregnancy on the apps, all the ads on Google and Facebook and everything else I was using on my mobile immediately changed to pregnancy-related stuff and baby things.  People had not thought about how third-party data sharing works, and that is something coming to the forefront again with the COVID-related apps.

     The third thing that was really prominent to me is the fact that when people think about the consent they are giving to applications, they haven't really thought it through; it is not informed consent.  When I asked if they knew that their content was being shared, or that the apps had permission to share their stories and their comments on articles and whatnot, nobody really knew about it.  And they did look uncomfortable with the fact that this could be happening, even though they tried to kind of cover it.  And that is where the questions I was mentioning earlier came in.  They were like, you know: what would the application really do with this data?  Why should I be concerned, even if they are doing that, when my name is not there?  Why should I be concerned if somebody knows the volume or the consistency of my period?

     It doesn't really do anything to me.  And yet, look at how these apps are progressing.  We are still in a part of the world where internet penetration is not high, but telemedicine has, I don't want to say made a comeback, but taken off in a way that nobody is prepared for yet, at least in Pakistan, and I'm sure in India as well.  What I found out is that in the U.S. and in different European countries, where most of these apps are registered, companies can buy premium accounts for all of their employees.  It is one thing to think about Google and Facebook, where Google looks larger than life, and not consider what they have; but an employer is a small, close entity.  If that is the way things are going somewhere in the world, that is the way it will be everywhere.  Technology progresses in similar ways in different countries.

     The idea of employers asking for access to this data is something that has not been thought about, and something that I feel is becoming increasingly relevant now, because even right now we have a telemedicine health service for our own team because of COVID; the health insurance is there and a lot of people are switching to it.  A few telemedicine services that we connected with told us that their business had increased 500 times just during this one year.

     So because this contradiction, this conflict, between being able to have access to great health services versus what data you are offering and on what conditions you are offering it, is something that has not been thought through by general users, there needs to be more information and more engagement.  And even if people still choose to offer their data without the protections that we necessarily need, or the kind of protections we as advocates argue for, they should at least know what they are doing and have a clear picture of the terms and conditions, the possible implications, and the vulnerabilities in the systems to which they are offering their data.

     I think I will end there.  If there are any questions, please feel free to put them in the Q&A and interject.

     >> MODERATOR: Thanks so much, Sadaf.  What was really interesting is how the issues that you raised speak to the issues around boundary management, how we see boundary management vis-a-vis humans and software, and also the implications for privacy and boundary management given the number of health apps and telemedicine services, et cetera, right?  So thanks very much for contributing to and building on that conversation.

     And our final speaker is Joana Varon, and the title of her essay is The Future Is Transfeminist: From Imagination to Action.  So as Joana starts to talk about some of the major issues that came up in the essay, I'm just going to paste the link here.  Over to you.

     >> JOANA VARON: Okay.  As the article is also kind of visual, I will share my screen and go through it with you showing some of the pictures.

     Can you see my screen?  Okay.  So, this article for me in the beginning was meant to be an interview with Katarina Nolte, who wrote “That Feminist.”  But then, right when the book was going to be launched, the pandemic happened, and my head was turned upside down by the changes that were going to happen from then onwards.

     And I was starting to think about the future, about alternatives, and about how we bring our imaginations about technology and feminism back into practice, into action.

     So that is why the title, and the essay is a mix of personal narratives and experiences with technologies.  I will read some parts of it.  It departs from the question: what would the future look like if the algorithms that command our daily interactions were developed based on feminist values?

     What if the technologies that we cherish were developed to crash, instead of maintain, the matrix of domination composed of capitalism, patriarchy, white supremacy and settler colonialism?  I have been playing with that question together with other colleagues, such as Sasha Costanza-Chock and Clara Juliano, with whom we are developing what we call The Oracle for Transfeminist Futures, a card game with several decks that we use in codesign workshops to envision and change the nature of the technologies that we have now, and to enable anyone, even people who are excluded from the process of developing technology, to get together and think about, for instance, what a transfeminist period tracker would look like.

     And so it opens the space for imagination.  For me, technology has been very connected to my imaginaries since before the internet, when I was playing video games.  I will read you a part of the article in which I describe my experiences with video games before the internet.  I wrote that even while playing and being amazed at being able to live in other scenarios in life, those imaginaries continued to feel very gender-normative, and I was a misfit in them.  I was asking questions like: why did it always need to be Mario or Luigi?  Why did we need to save the princess?  Why was the princess not using her own strengths and tools to unblock her own path?  Why was it always a blonde, white, pretty-dressed princess who needed to be saved?  Why couldn't a princess save another princess?  Why couldn't I save the turtles instead of Mario?

     And then the internet came, and for me it was finally a feeling of autonomy, a moment in which we were transforming from media consumers into media creators, finally able to queer genders, to let our imagination fly loose with those new tools.  It did sound like a tool for revolutions.

     But then we had the monopolies: Facebook, Google, Apple, Microsoft, Amazon, the white-male-dominated big five of Silicon Valley, who started to monopolize most of the platforms and also our imaginaries.  Now they use the term community while they are actually turning us into addicted users, with our desires becoming the product and the targets of exactly those running that matrix of domination.

     And I think that is where we stand now.  So how do we break this?  This matrix of domination is in many hands trying to manipulate us with our data, but also in algorithms censoring our production.  One example is a GIF that we shared on the day of lesbian visibility in Brazil.  Lesbians call themselves sapatão, a word like Dyke that has been reappropriated by the movement, and which also evokes the word for frog, so we posted a dancing frog, and it was censored on Instagram because of that word.  So those monopolies, the monopolies of our imaginaries, are taking away our rights to express ourselves with our own words.

     And the emergence of what Shoshana Zuboff calls surveillance capitalism has also opened the space for practices of gender surveillance.  So, and I will read this part here: whether through threats by partners who spy on devices, a web full of ads that wildly reinforce gender roles, policemen infiltrating dating apps, or period trackers that turn our blood into money for others, or that are even funded by the anti-abortion movement to spread misinformation that prevents access to sexual and reproductive rights.

     So all of these trends on the internet are getting even more complicated, because there is also this trend in which our bodies, as Anja was saying, are becoming data sources, from facial recognition to the collection of DNA.

     And in the next hours we will launch a web series at Coding Rights called From Devices to Bodies, the first part of which is exactly about DNA data collection.

     And one of the key questions here is that if my cousin or my sister does a DNA ancestry test, people can use her DNA to figure out what my DNA is.  So, in terms of the boundary management Anja was talking about, when we discuss DNA data collection, the issue of considering privacy as a collective matter becomes more evident, because it is not only about my consent anymore.

     So watch out for the series, coming soon.  And I think I need to wrap up.  But what I'm focused on now is this: okay, let's problematize and talk about DNA and AI, and about how the trends and monopolies that have conquered a lot of our imaginaries about technology over the internet are moving into other kinds of digital interactions through AI systems, bringing, again, this matrix of domination.

     So I think on the one hand we should map and expose this problem, but on the other hand we should also recover and rethink imaginaries, to trace our own path.  And I will end with this quote from Ursula Le Guin, because I think imagination is a tool for revolution: we cannot change what we do not envision alternatives to.  Ursula Le Guin said: “The thing about science fiction is it isn't about the future, it's about the present.  But the future gives us great freedom of imagination.  It's like a mirror.  You can see the back of your own head.” 

So I will finish here, with an invitation for us to keep problematizing, watching, pointing out and exposing how this matrix of domination is permeating technologies in many of our digital interactions, but also to imagine alternatives, so that from this imagination we can later prototype and live them.  Thanks.  I will stop sharing.

     >> MODERATOR: Thanks so much, Joana.  Again, sort of taking the conversation, you know, back to values, and how do we get out of this moment that we find ourselves stuck in, right?  And what kind of values can we use to build towards the future, really putting imagination back, not as fantasy but actually as a political tool, right?  That we can use to build our futures.

     So I think now -- and it is interesting that the whole concept of surveillance capitalism has also come up in many of these discussions and is clearly something that is very much on our minds.  So I wanted to ask all three of you, and again, we can go in whatever order is comfortable.

     You laid the ground out for us really well.  And these are all aspects of the whole sort of big mosaic, right?

     So looking forward, trying to imagine a future which is not as bleak as it seems at this moment: what is the change that each of you would like to see, based on what you have presented?  It would be great if you could go a little bit into that, and then, if we have time for another round, I would like to ask you all about the policy implications of that.  But for now, if we could just talk a little about the change moving forward.

     And I also want to invite participants if they would -- thank you very much some of you for sharing comments and if you have any particular questions for specific panelists or thoughts, or any general questions, please do use the Q&A function at the bottom or the chat.

     Yeah. Okay.  So should we do the same order, because I kind of -- now my head is stuck in following that.  And I think it is nice because Anja lays out the big broad conceptual frame, and then we get a glimpse into one aspect which is clearly gendered.

     And then Joana talks about the future.  So should we do one more round in the same order, and then we can mix it up.  Anja, are you good to go?

     >> ANJA KOVACS: Sure, yes.  Thanks, Bishakha.  I realize that you can approach your question in two ways, right?  One is, what is the end point of that imagination in your head at the moment.

     But I want to approach it in a more practical way, because I do think we also really need to think about how we make sure that we have ways to collectively imagine, right?  And that is also really about finding a language to do that with.  And at the Internet Democracy Project, what we are arguing is that it is important to put bodies back into the debate, and actually for two reasons.

     One is that it is not just about the datafication of our bodies as such, but that really all of the data that is being gathered about us goes through the instrument of our body in some way, right?  Even our emotions, nobody gets access to until we type them out.  Which places we visit, people don't get access to unless we walk there, et cetera.

     So all of that data is still generated through this very material locus, no matter how disembodied a way we talk about it in.  But it is also about the reverse: the way the data is used, the ultimate goal of that, at the end of the day, is really to affect our behavior, what we do, right?  Strictly speaking, we can still think whatever we want, as long as we fall in line in terms of our behavior.

     And so recognizing that entanglement between bodies and data, and no longer talking about data as a resource, becomes, I think, a really empowering mechanism to start to imagine some of these things.  And the argument is not that we need to go back to more bounded imaginations of what a body is.  In any case, that kind of imagination comes from the dualist thinking about body and mind that was so popular in Enlightenment thought and that has always been criticized by feminism.  Feminism, or intersectional feminism at least, holds that bodies are always relationally constituted.  And what we see then is that through data, our bodies are being relationally constituted in new ways.

     So feminist thinking around bodies and the constitution of bodies becomes a really useful tool to think with, and to develop both a language and a vision that can address some of these harms that we are seeing in a clearer way, so that we can talk about them and name them, but also talk about the alternative imaginaries.

     And maybe I will briefly mention some of the work I did with my colleague Tripti Jain around consent.  We have a paper coming out at the end of the month, and Sadaf, it picks up so many of the things that you were saying, like the fact that the people you spoke to weren't aware, et cetera.

     This is really a big part of the problem, right?  So our starting point for thinking about consent was: well, if we actually see this entanglement with bodies, perhaps we should look at debates around consent in feminism.  And we looked specifically at sexual consent and how it is dealt with.

     What is the difference between data governance, where consent is failing, where it is not doing enough and not working well enough, et cetera, and feminism?  The crucial difference is really that feminism has always argued that to be able to consent at all, if that is a possibility at all, you have to be free from oppression and violence.

     And so what that means is that feminism always puts the power relations in which consent is structured on the table.  That brings up a whole bunch of questions.  Do we have influence on the terms of the agreement we are supposedly entering into?  Can we negotiate before, during, and after entering into it?  Can we walk away?

     In strong conceptions of sexual consent, the answer to all of these questions is yes, right?  Negotiation during an agreement means that it is not because you agreed to kiss somebody that you have agreed to have intercourse with them.

In data governance, though, strangely, all of this is erased, and there is a very black-and-white question of whether you are consenting or not, without really looking at these underlying power structures in more depth.  And what you then get is that consent, which is supposedly -- I mean, if you look at data governance laws, consent is still put forward as a way to enable autonomy.  But what consent is actually doing is cementing power relationships that already exist, by saying to people: right, but you consented, did you not?  You checked that box.

So rather than questioning the underlying power relations that structure all this, what data governance does is really look just at that top level.  And asking the questions from this perspective makes it possible to develop alternatives.  So what we did in the paper, and I saw Joana post their work, which we reference and have tried to build on as well, was to develop a list of feminist principles of consent, on the basis of which we then also came up with concrete suggestions for changes that need to be made in policy, and which we are now applying to laws and technological initiatives in India, for example.  It becomes a way, from a different perspective rooted in history and in bodies of knowledge that we have already developed, to actually make a concrete case to change certain things.  Let me stop here.

     >> MODERATOR: Thanks, that's really helpful, Anja.  And actually, you know, since Joana did post the link to the article that she has done: Joana, would you be able to also pitch in on this and talk about some of the principles and the way you all approached it?  Consent, I mean.

     >> JOANA VARON: Sure, sure.  The article is called Consent to Our Data Bodies: Lessons from Feminist Theories to Enforce Data Protection.

     And one part of the article was exactly to list the qualifiers of consent in feminist theories regarding our bodies, and the qualifiers of consent in data protection legislation.

     So we tried to draw the equivalences: consent needs to be active, clear, informed, specific, freely given, reversible, ongoing as a continuous process, mutual, comfortable, sincere, based on equal power, considering historical and sociological structures, and adapted to contextual situations.  Those were all the qualifiers from feminist theories, when feminists talk about consent.

     When we look at the GDPR, but also at the Brazilian data protection legislation, the list is quite a bit shorter than this.  And mostly, as Anja has mentioned, there is no consideration of power relations, and all of those qualifiers are expressed just by the act of clicking a button agreeing to terms of service or a privacy policy that no one has read.

     So our assessment is that consent as it is in the data protection framework is a very individualistic and neoliberal approach to consent.  And what we need, as I said before, is to consider privacy as a collective matter.  You cannot put the burden on an individual in an unequal power relation to say yes or no, when actually they, or we, don't have the ability to say no, if saying no means some sort of digital exclusion.

     So let me quote Sara Ahmed, and what she says about the impossibility of saying no: “The experience of being subordinate, deemed lower or of a lower rank, could be understood as being deprived of no.  To be deprived of no is to be determined by another's will.”

     That is the kind of consent that we give today with those terms of service, and that is the consent that is written into data protection legislation.

     And as I mentioned, it is going to get worse, because as data collection moves from devices to bodies, we won't even have that button to click and agree.

     >> MODERATOR: Thanks, Joana.  Okay.  Sadaf, bringing us back to the bigger question, of which we made this a little side part, but an important part nevertheless: what are the changes you would like to see?  And you are welcome to answer more broadly, or if you would like to talk about consent, it is all up to you, yeah.

     >> SADAF KHAN: So I would like to connect back to what Anja was saying about data governance.  I think one of the elements that, at least in this part of the world, is not discussed enough is the economy of it all.  There is a lot of talk of policies and multi-stakeholder processes, but the fact of the matter is that economies within a capitalist society are created in a certain way, and it is the nature of that very capitalist economy itself that comes to have an impact on how these technologies are designed and then marketed to men and women.

     And as for the power dynamics and power relations we keep talking about, I think capitalism itself and the economy itself become a part of that power dynamic.  So one element, in terms of change, that I would like to think about is how we start to have an influence on these economic structures of big data, of corporations, of the roles that States play and the role that third-party applications play.

     So that is one thing.  I do feel that there is a lot of discussion, especially in this context, about universal access, and we have seen countries focusing on infrastructure and access.  What I find lacking in these discussions is, again, the issue of the power dynamics that exist.  If the next billion are connected and the same disparities are replicated and amplified as more and more people connect, we are looking at a situation where vulnerability, both in terms of access and actual harm, but also vulnerability in terms of consenting to giving data and in terms of the boundaries that Anja talked about, will be exacerbated.

     Another thing is when I look at policies.  We just started working extensively on digital media and information literacy, which has become a buzzword, and there is talk about the information disorder and whatnot.  But the focus of most of these literacy interventions is on the technicalities of using the internet.

     The media and information literacy programs in schools and colleges talk about the technical aspects and the red flags, but they don't talk about metadata and what goes on behind the screens we see, what is outside of the user interface.

     And a lot of the questions that we are discussing now, AI, biometrics, facial recognition technologies, all connect with things that are not treated as part of the digital media and information literacy interventions that we see being designed.

     Now, I understand that this is a technical field, and I understand that it is difficult to understand these things.  But it is not that difficult to simplify them, break them down and deconstruct them at an accessible level, especially if there is already work at the school level, at the primary school level.

     In terms of what policy interventions and what changes I would like to see, these two areas stand out most to me: the discussion on the economy of technology, which eventually affects the design of technology as well as its governance, obviously; and a broadening of the digital media and information literacy focus to include the questions that we are raising.

     I also think that one thing I would like to see happening is a collective reimagining of privacy in the digital age.  We live in a very strange environment, and I'm sure there are similarities in India and South America as well.  On one hand, we talk about privacy as something that is inherently there: women don't talk a lot about, you know, what they are going through, and things that happen inside the home are kept inside the home, as domestic matters.

     But if you look at home units and family units, there is a lot of invasion of privacy.  Children are told not to lock doors, wives are surveilled, mothers are surveilled.  There are key matters that should be private to an individual that are probed into.

     And so this contradiction exists between what is projected to people, of being a private, safe kind of society, and yet, if you deconstruct what happens within home units, you see a completely different picture.  Technology and the data we are sharing online add a complexity that people are not really equipped to deal with.

     So even if you look at case law and at how legal instruments have been used to address privacy -- and I'm not talking just about data protection laws but about privacy laws that already exist -- I think there is a lack of understanding within the courts and within public discourse of what digital privacy looks like and of what it should be for an individual, and what it should be for the home collective, because there are so many more parties involved: your ISP, your workplace, the internet data you are using, and then the big corporations.  So the different layers, I think, are not well understood, and they have to be well understood for us to collectively reimagine what privacy in the digital age looks like.  Those are the three things.

     I want to go back to something that Joana mentioned earlier, about science fiction being about today and not about the future.  This is something that strikes me every time: I'm a big Star Trek buff, and there is a future for all kinds of alien species, but if you look at the composition of the actual Star Trek crews and the captain, it is the white male leading everything and making the decisions and whatnot, and it is obvious that the minorities and the women and Black men are just there to add to the diversity of it.  But in the franchise itself, that was the future we imagined.

     So somehow, within ourselves, the imagination is still so limited, and yet every alien in the world is welcome and considered equal.  So yeah, what you said really speaks to me: the imagination we need, I think, does come to shape the world that we are creating today.

     >> MODERATOR: Thanks, Sadaf, for raising a bunch of important questions and policy questions.  And with what you said about Star Trek, right, you really pointed out a bias in terms of who the technology subject is, in a sense, right?  And who merely provides the diversity, even while other species are all welcome, kind of thing.

     You know, there are a couple of things that I wanted to raise as part of this, and then we'll move to the last part.  I think one of the points you were making, Sadaf, was about digital and media literacy being very technical.

     I think that is a very valid point, and honestly also a struggle, as we are sitting here, doing our first workshop tomorrow with domestic workers in India, and doing it, you know, online, right?

     And so I think the struggle for us is not just how to use digital media, like how to use Google Meet, et cetera; it is how to understand the concept of data.  We use metaphors and things like that, right?  But often, you know, grassroots communities are thinking about telephone data and mobile data and recharges, et cetera, et cetera.

     What I would love to do at some point is to really sit down and brainstorm further on how we can find ways and means to talk about these very critical questions with communities that are still at the early adoption stage, or just getting into digital technologies.  But a super valid point.

     So the other thing I wanted to say is that there is a very interesting discussion going on around consent in the chat, and I wanted to draw the panelists' attention to one of the comments, which talks about individual consent being overridden by public interest during the pandemic, which I think we have all seen now.

     The comment also notes that the support for this is not necessarily because of capitalism, but can come from a perspective or discourse of healthcare, right?  I'm curious whether any of you have any comments on this, because I think it is a question that we have all faced in the last few months.

     And you know, yeah.  If any of you would like to sort of just comment on that, I thought I would just share it with you.

     >> SADAF KHAN: Slightly unrelated to what we have been talking about: last year we did a study of perceptions of the internet.  We were looking at how men and women perceive the internet and what comes to their minds, a gendered study.  And I was fascinated by the idea of women giving consent to be monitored by the male members of their families.  We were working with a largely working-class group of women, early users and adopters, and they talked about this as if it were simply culture.  And while Anja was talking about how we disembody the concept of data and the body, here it was very physical.

     And women talked about how, if they are living in proximity with their husbands and consent to being physical with them, that consent automatically translates to the things they own.  So there is the idea of being possessed simply because you are having sex with somebody, and somehow that concept automatically extends to what you are saying online, what you are doing online, and that being okay, not questionable or something that has to be thought through or separately dealt with.  That is something that, at least on my team, in our privileged kind of bubbles, we had not thought of in that way: we always see the subjugation, but for these women it was not that their husbands were invading; it was that they had consented to let them have this access.  And that was also related to a protectionist mentality: he is my protector, and if he says something, then it is okay for him to say it.

     So I think it is interesting to see how cultural norms that have been very physical are getting so much more prominent as more and more people switch to digital.

     And I think during COVID I see this becoming, you know, more and more of a problem.  We do get a lot of calls from people who have just acquired their devices, because now they need them for their livelihoods and whatnot.  And often, rather than asking the questions themselves about how to use them, they will direct those questions to their husbands or sons or the men in their families.  And the question of consent doesn't arise, because they see them as facilitators.

     So just in terms of the terminologies, and how we engage with people: I keep wondering whether the concept is there but packaged in a different way.  And then we also need to understand the vocabulary that people who are just adapting to technology are using, and then use that vocabulary to deconstruct the concepts that we are talking about.  I don't know how to do it, frankly.

     There is a huge language issue as well.  Technology is predominantly in English, and we are privileged enough to have that language, but then there is the technicality of it.  I think there is a lot of work to be done on how we talk about data and how else we talk about consent.

     And consent is such a sacred word that I think we shy away from engaging with the reality that people see it in a different manner.  But I think we need to start questioning our own protectiveness of that word as well.  I don't know if that made sense.

     >> MODERATOR: Actually, super helpful, Sadaf.  And I was thinking while you were talking that you surfaced something that we don't think about that much, which is: is there a universal sort of understanding within a culture of something like consent?

     Or does it also correlate with, you know, income, class, and other sorts of things, right?  Like just through the example you shared.  That was what was going on in my head because I think in the feminist movement we sort of assumed a universal kind of understanding of consent, right?  So I think this is extremely helpful.  I think if anybody would like to comment on this, that is fine.  If not, we can move to the last sort of piece of this, because we have about 10 minutes left.

     And the last piece of this, you know, as always at the IGF, is to try and move to governance and policy, but I want to frame it in a particular way.

     Last year I spoke about bodies and data, and everybody found it very interesting.  But there was somebody who turned to the moderator and, quite genuinely, said: look, this is all really interesting, but does this have legs?

     Okay.  And I think what she was trying to say is that conceptually this is really interesting, the notion of us thinking about the body rather than about data as a resource, et cetera.  But is there something we can really do about it in terms of policies and laws?  Even if we take something like the GDPR, how would we reframe it?  So I'm curious if we can have a little bit of a conversation about how we bring this extremely useful theoretical construct, right, into policy.  Like, how do we give it legs?  Anja, I turn to you again first; I think I'm stuck in this pattern rut.

     >> ANJA KOVACS: So I'm also still thinking about the question.  Let me maybe go back to that paper on consent that I wrote with Tripti.

     One of the things that we started with was actually to go back to a feminist analysis, by Carole Pateman in particular.  She analyzed the social contract at length, and the social contract is really what the modern nation state is based on and what most of our law is based on, right?

     And so the idea there, in liberal theory, is that we are all free and equal human beings.

     And if we enter into a contract, we do that as free and equal beings, and a contract is only valid if we have consented to it as free and equal beings.  Now, what Pateman did in her work is call attention to a particular set of contracts which are about property in the person, and that is a recognized concept; it has been written about and is central to the whole notion of consent.  So the idea is that you can sign away property in your person as long as you consent.  Employment relations today are based on that.

     We are acting as if we can sign a contract in which we commit that our employer can use our skills or talents, as if we could somehow send those talents to work without us actually having to go to work as well. So we make something subject to contract in a piecemeal way, without recognizing that actually the whole person is being made subject to this contract.

     What Pateman highlighted is that the notion of consent is the fundamental difference between slavery and modern employment contracts, because in both cases the entire person has to be there, but in labor contracts we have consented to doing this, and somehow it becomes okay. If you live somewhere where labor laws are really strong, the impact of this is becoming less and less, and you do have a sense of autonomy and agency at your place of work if all goes well, right? But in India, for example, many people, and I am thinking of people like coal miners, still don't have that; the line between labor and slavery there is sometimes really, really thin.

     So I think those questions around the concept of property in the person are relevant to data as well, because when we share data, the point about it not being disembodied is that this is not just something you can manipulate without an effect on me. When you manipulate the data, you also manipulate me. At the Internet Democracy Project we say you cannot think about the internet without bodies. This is me you are manipulating. That connection is that close, right? And so if you then come back to the question of consent and look at it like that, you see that even employment contracts today establish relationships of subordination, which is what we keep coming back to in the examples.

     Certain actors will become rich and powerful on the basis of what they get from us, and we don't even know how they manipulate us through it. It is called nudging in advertising speak. But when does nudging become manipulation, right? These are relationships of subordination being established through consent. If we actually believe in human dignity, then going forward, what are the measures we should take to make sure that the deep inequalities generated through these new relationships cannot continue to materialize?

     And so that is why the question about the broader context is important, because if you bring that into the picture, you start actually to be able to say there are certain things we should not be allowed to do, certain things that should never be asked of us. Take, for example, the third-party data sharing that Sadaf talked about. If third-party data sharing so fundamentally undermines the control we have over how we exist in the digital world, then it should just go.

     To many people this seems outlandish, but let's remember that at some point in time slavery was legal, and yet in no self-respecting democracy today can you sign yourself into bonded labor. Even if you argue: if I want to do that, and consent to it as a free and equal human being, why should I not be allowed to do it? We have recognized that there is something about human dignity so fundamentally undermined by these conditions that we cannot allow it as a society. We need to move to exactly the same position in digital policy as well.

     So the concrete suggestions Tripti and I make on consent in our paper are at three levels. The first is at the moment the data is collected, because, as Joana also pointed out, consent has to be specific; almost every data governance law says that. But how is consent specific if the actor is going to send your data to 150 third parties whose consent agreements you have to go and read separately? This is just ridiculous. How can we even call this specific consent? How can this be legal under that language, under any law, right?

     The second set of changes concerns what can be done with data after it is collected. Think of the behavioral surplus that the book on surveillance capitalism talks about as well: all of the things that they collect from us but don't need in order to improve a service. Why should we, by default, agree to that being collected? Maybe there are some actors I would share it with because I have come to trust them. But how can something that undermines our control by default be allowed?

     And the third set of changes is really about the changes that need to be made for people who are already vulnerable. This can be focused on specific protected characteristics, like sexual orientation, for example. But it can also be focused on specific situations or relationships. So again, somebody already gave the example of what happens in many places of employment, the kind of data that employers can take. At the moment, the draft data protection law in India basically says that your employer can collect any data they deem important to my performance on the job.

     So if they want to collect data on my mental health, that is okay? How could this possibly be legal? They could argue it affects my performance on the job, right? So I think it is really partly about making these statements again and again, and asking the questions again and again, saying: this should not be legal. It is not because we are doing it now that we have to go along with it, no? And it is not only big companies like Google and Facebook; in India, as in some other countries, the government is working very hard to reproduce the same dynamics of collecting as much data as possible from citizens as well.

     Often this is to support the growth of a local economic ecosystem. But it is not because things are going this way right now that we have to accept it. So yes, I really think there is value here; this can be operationalized. But we need to make the effort to imagine along these lines, and then to continue to say: this should not be allowed.

     >> MODERATOR: Thank you, Anja. I think you have definitely shown how this can be given legs.

     And I am also conscious that we have literally two minutes left. So Joana and Sadaf, may I invite you each to make a teeny comment so we can wrap up in time. 30 seconds each, please. Joana.

     >> JOANA VARON: I just posted in the chat a resource developed through the Allied Media Conference that tries to use design justice principles to think about consent, addressed to technologists, the developers.

     It is an interesting resource, but still, I think there are more considerations about power to be thought through and, as Anja said, talked about out loud, because consent is also at the core of many data protection laws. But when it comes to the relationship with governments, there are many exceptions, and so things get complicated. So my last thought would really be to talk more about the centralization of systems, and about technical solutions to track our data, where it is flowing, and then to have proper transparency about what is going on. Thank you so much for the invitation as well.

     >> MODERATOR: Thank you.

     >> JOANA VARON: It was very nice.

     >> MODERATOR: And last words to you, but again, tight.

     >> SADAF KHAN: I will be quick. Something I have been thinking about is the validity and legitimacy we keep giving to global standards, and to global policy standards specifically.

     The more I look at how the same standards are applied differently and have different implications, especially in the Global South, where the context, our relations with the state, the way the concept of democracy is applied, and the stage of development are all different, the more I feel there needs to be a broader discussion about how we adapt these. Yes, there are standards about principles, but there also need to be policies for operationalizing those principles in ways that make the most sense in different contexts. That is one area I would like to see improvement in.

     One term I will leave for everybody, because I'm sure there are people here who do policy work: the actor-centered institutionalism approach to policy research, which looks at how different actors come to play their own part when institutions are created. It is still developing in political science, but because we talk so much about multi-stakeholderism, I think it is one of those approaches we need to start looking at. I'm still trying to learn more about it myself and have read a few articles. It is something I would like everybody to read about, to see if we can apply it to the multi-stakeholder approach, and maybe that would offer solutions for improving policy interventions. Thank you.

     >> MODERATOR: Thank you so much, Sadaf. I put that in the chat; a useful way to think about the box of policy itself. Thanks very, very much, Anja, Joana, Sadaf, for this really valuable discussion. Thank you, Debarati, for sharing the gender report cards of 2019. Thank you, Carolina, for your support. And thank you, Sandra, the captioner, for the captions. Thank you, IGF 2020. See you all at the next session.

     >> JOANA VARON: And thank you, Bishakha.

     >> MODERATOR: This was great. My mind is reeling with ideas, to be honest. Okay. Thanks.