IGF 2019 – Day 3 – Estrel Saal B – DC Gender and Internet Governance

The following are the outputs of the real-time captioning taken during the Fourteenth Annual Meeting of the Internet Governance Forum (IGF) in Berlin, Germany, from 25 to 29 November 2019. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

>> SMITA VANNIYAR: Hi, good morning.  Can I request those sitting a little further away to move closer, because we are hoping that the session will be more of a conversation, so this will be helpful.  Only if it is okay for you; I am not insisting you move first thing in the morning.  Thank you.

Good morning.  Thank you for coming to this session.  I understand it is early in the morning and it's the third day, but I really appreciate that all of you are here.  This is the main session for the Gender Dynamic Coalition, the Dynamic Coalition on Gender and Internet Governance.  Usually the Dynamic Coalitions are free to structure their sessions in the ways they think are most beneficial.

We thought that this year it would be interesting if we make the space a learning and knowledge sharing space, and not to just share updates from the Dynamic Coalition's work alone.

And today I have my colleague, Dr. Anja Kovacs here, who will be presenting her research on the body as data in the age of datafication.

I also have four fantastic discussants with me.  Here is Kyong Miho (phonetic), there is Chenai Chair, there is Dr. Ruhiya Seward, and Professor Park will be joining us shortly.

I also have my colleagues Baldeep and Valejia who are rapporteuring and helping me moderate the session.

So the format we have decided on is that for about 20 to 30 minutes Anja will present the research.  And then we will have a discussion and a conversation to engage better with the topic.

Just a short introduction on why this particular research, and why this particular research in relation to gender and Internet Governance.  One of the main themes of the IGF this year is data governance.  The aim of the data governance track is basically to discuss the fundamental challenges which come in the way of using data for the benefit of the people.

But the truth is also that data collection, data generation, and all of this is not done in a manner which keeps the people at the center of this, right.  So when we are talking about data governance, it is also important to think about the bodies behind data which is why we thought that this session is particularly important.

Specifically with regard to gender: as with marginalized communities in the offline space, in the online space as well, when bodies are removed from the picture, the first groups of people who are affected are marginalized people.  So in this case it would be women, it would be queer persons, it would be people who are from non-English speaking backgrounds, who are not from global countries.  So that is just a brief introduction.  I will let Anja take over now.

>> ANJA KOVACS: Thanks so much, Smita.  Also, for inviting me to present this work here, which is obviously an amazing opportunity to share the ideas with people like you who are experts in the field already.

My name is Anja, and I work with the Internet Democracy Project in India.  We started our work around these issues actually by looking at surveillance from a gender perspective specifically.  And that was basically with the aim of bringing out the more structural aspects of the harms of surveillance, beyond the debate on privacy alone as the solution to those harms.

As we did that work, questions around data just became stronger and stronger.  And what I want to talk about today is a little bit different from the traditional approaches you have to the body and data.  We have seen, for example, a lot of work on, say, menstruation apps, right, where you have a very direct translation of the body into data.

What we are actually arguing, though, is that the way data is connected to our bodies today is leading to a fundamental reconceptualization of what our bodies are.  And if we do not bring that into the debate on data governance, we are not going to find the right protections for human rights in the digital age.

It's not a gendered approach to data, it is a feminist approach to data, where we draw a lot on the research around bodies that has been done by feminism and other progressive academic disciplines, but it is not only about gender.  What I am going to present is slightly different from the usual IGF presentation in that there will be a bunch of social theory to start with, but I will come to the policy implications in the end.  And I'm quite sure that some of this will be familiar to many of you, but I want to run through it because I think the way we stitch these things together matters.

So to come to the point to see the validity of the perspective, I think it is important to run through this.  Also, in between, if you want to stop me at any point, have any questions or comments, please do.  As Smita said, the idea is to try to make it more interactive, although I guess I am going to talk for a bit.  Okay.

I think I still want to start from this idea that this intermixing of technology and human beings has the potential for amazing empowerment.  We wouldn't be here if we didn't believe in that in a way.  But why are we not living in techno paradise yet?

After the Second World War, a discipline called cybernetics developed that actually really heavily influenced how we look at the world today.

And it led to two particular shifts.  The first one is how we think about data metaphorically and how we speak about it metaphorically.  So in cybernetics -- I mean, the image is kind of a hint -- the big metaphor used today is data as oil, but we speak about data as a resource, for example, right.

The idea in cybernetics, where these ideas come from, is that information informs everything.  It is a layer that sits in everything.  But in a way, it is also independent from the medium that generates it.  So you can take the data out of the thing and it is not going to change the data or the thing.

In that way, data then becomes something that is out there, that you can mine, that you can grab.  Because of that conceptualization, we also give data enormous power.  We say data is the truth, right, because it is that layer of reflection of reality.  And in the most extreme forms you see these ideas when computer scientists, for example, talk about one day hoping to be able to download a human brain into a computer.  The idea is that that human brain would still be the same as if it were in a human being.  So the body in which that brain was housed supposedly doesn't matter then to what that brain would be like.

Now, I think as feminists we know that that is not correct in practice.  There is a very close interlinking between data and power relations.  And these two are closely intertwined in many different ways.  The easy examples to see this are cases of context or use discrimination.

So if you think of, for example, a non-consensually shared nude image of a woman or of a man, of breasts or a dick pic: even though in terms of the data it is bits and bytes in the same way, we read it very differently because we give a different meaning to it.  That's context or use discrimination.

There are other forms in which power imbalances play out.  But I want to focus specifically on the ones that have to do with data today most closely.

What we experience today is more and more discrimination by abstraction, right?  Where we are actually made into data points.  And we often have very little control, first of all, over the data as such.  That is problematic because some people have always been surveilled much more than others, and so there might be overrepresentation.  In other cases, we also have underrepresentation.  And this is particularly problematic in the age of big data because we rely on the past to predict the future.  So the data that we have or do not have matters.

Now I want to give an example here specifically from the context of India where in the police manual of the State of Karnataka, which is in the south of India, for four years from 2012 onwards there was a provision that actually allowed the police to predict -- what is it called again -- to arrest transgender people on the suspicion that they might become a disturbance to public order or peace.

And this provision was broadly worded.  It also gave the right for the police to have a register of transgender people so that they would know where to go and find them when anything happened in the neighborhood.

Now if you have a provision that focuses so much on a particular social group, then obviously that group is going to be overrepresented in the crime statistics of that state simply because whenever something happens and the police has to show they are doing something, they are easy people to turn to.

We have a few other examples like that in India, including what in colonial times used to be called criminal tribes.  Those groups are now not officially called criminal tribes, but again, through regulations and police manuals they are still under much heavier surveillance just because of their backgrounds.  And what you get when you build a system of predictive policing is, obviously, that if you don't correct for these biases, those groups will continue to be targeted disproportionately.

But we also have very little control over the categories we are sorted into as well as how the categories are formed.

I used this image.  We all know this routine from airports.  Now, as feminists we have been arguing for decades that gender is not a binary, it is a spectrum.  But when this happens in an airport, there is no space for the spectrum, right; the space is for the binary.  The police officer who sees you has to decide whether you are male or female.  And if they are not sure where you fit, or if they make the wrong choice, you get pulled out of the queue.  What if you identify as a woman and that is not how the officer reads you, and you get patted down as a man because that is how the officer did see you, right?  These kinds of choices, where data basically decides who we are or somebody else uses data to decide who we are, can put people in really, really vulnerable situations.

I don't think you can see this very well, but in any case, there are ample examples like this where even our representation is out of our hands.  The images I used are -- this was a controversy in India a few years ago.  If you Googled North Indian Masalas -- masalas are a specific mix of spices in India -- you got images like this, which are overwhelmingly of food.

And if you Google South Indian Masala, you get images like these which are mostly celebrities, models in various positions.  And so, again, the women in these images did not necessarily choose to be represented like this, right?  But the control we have over how that happens is becoming increasingly weaker.

The problem here is really the absence of context, right?  We just become data points, and then those data points are used to make decisions over which we have no control, which can lead to problems of discrimination, representation, and social justice.

So why I wanted to outline this in some detail is because I think data as oil makes it sound as if gathering and analyzing data by default is always a good thing, it is always something that should be allowed and since it exists we should just use it.  But talking about bringing back context and bringing back people really reminds us that actually perhaps that shouldn't always be done even if it is possible.  And that not all algorithms have equal value.  Right.

Any comments or questions so far?  Okay.

The second big shift that's related to the construction of data as a resource is the way that surveillance works.  Bodies have always been central to surveillance.  But they are no longer central in the same way as in surveillance as we used to know it.

The way we talk about surveillance in many ways, I think at least until the Snowden revelations, was still very much the image on the right, Big Brother is watching you, which is surveillance as monitoring.  It's watching what you have done or what you are currently doing.

When we used to talk about surveillance and security, that was often the approach.  And a lot of the laws in many of our countries actually still cater to that more traditional form of surveillance.  And we had a fairly stable balance, I think a consensus in many countries, about how that should work in the '90s.

Of course, by then, surveillance had already started to shift.  That image on the right is of the Panopticon, which was designed by Jeremy Bentham in the late 18th century as a jail where the guard would stand in the middle and the prisoners would be all around.  And the prisoners would never know whether the guard could actually see them or not.  And the idea is that you would then adjust your behavior, right.  And this model was made popular again thanks to Foucault, who saw this as central to all modern institutions, hospitals, schools, where the idea that you are being watched actually incentivizes some kinds of behavior and discourages others.

And so then surveillance became both preemptive, in that it kind of prevents us from doing things in the first place, so you don't have to go and look at what people did, but also productive, because it makes us do other things; it influences our choices.

And we see now that these possibilities have become even more extensive with big data.  I guess the most controversial example nowadays is around elections and how they actually influence voter behavior.  But there are also the examples of the social sorting, the categories that you are put in, for example, to decide credit scores, for insurance purposes, et cetera, right.  All of these are related to this.

We also know the disciplinary power of the cyber gaze in a much more personal way.  Many of us have at some point thought, oh, why did you put up that picture on Facebook, I look so terrible in it.  It is actually the same logic at work.

But surveillance isn't just that anymore.  It is really fundamentally changing into what two scholars, Haggerty and Ericson, have called the surveillant assemblage.  Does anybody know what that image is?

(Off mic)

>> ANJA KOVACS: Yes, exactly.  It is a rhizome.  So that is like wheat, where on top you can see it, but actually underneath the roots are all connected.

What Haggerty and Ericson argued is that this is really the model of surveillance today, and the fundamental change.  What we have now is no longer like earlier, where you had the guard in the middle or the government watching what you were doing.  Surveillance is actually a system that always has multiple actors working together.  And if, for example, you use a Fitbit or any of these apps that check things that have to do with your bodily data, you are contributing to that data yourself.  You play a role in that.

So you always have multiple actors.  What has become important now is the data flows and who is able to capture those data flows and what they are able to do with that.

Now, that is also fluid.  Not every actor will have the same power to capture those flows, or the same interests, at different points in time.  But we can see that slowly there are nodes emerging that are becoming more institutionalized and have the power to capture a lot of that flow, and then use that flow, or the data they have captured in that flow, to direct or govern the behavior of others.

How is this different from the surveillance that we had before?  Well, first of all, because it is everywhere, which we are quite conscious of, in a way, right, and that is one of the reasons we have this IGF focusing on data.

What is really different, though, is that it doesn't need to see us in our totality anymore.  The idea of total body vision doesn't matter anymore.  There is a specific purpose for which a specific actor wants to be able to surveil us, and what they need is to be able to get hold of that bit of us.  So in that sense also, the context of our life is really denuded on purpose; that's part of the way that that system works.

It's also fundamentally unstable because it's about flows and actors that manage to grab flows in between.  That means that the earlier resistance to surveillance can't work in the same way anymore.  We have to realize this is a different paradigm if we want to be effective in resisting it.

If you look at this from a feminist perspective, the difference then is that our body is broken down into information, right?  That disassembling is really what is different now.  And that is done to make us more comparable and to use the bits to make decisions about our lives, about who we are.

That means that surveillance is much more dispersed but also fundamentally distributed unevenly.  We are not subject to the same gaze anymore; in those earlier two forms of surveillance, there was still much more similarity.  Now even your income can determine what you see on Google, for example, as search results, right?  There are fundamental differences that we aren't even conscious of as societies because it is so hard to see this.

The opaqueness of all of this is really a big problem both in terms of how algorithms work and of how surveillance works.  Right.  So that is all really depressing.

But maybe there is hope.  And so what we are arguing is that the way out of this conundrum really is at least in part to put bodies back at the center.

And why are we saying that?  There is a Dutch scholar who has been arguing that actually the line between our physical body and our virtual body is becoming irrelevant.  And so that is why at the Internet Democracy Project we don't talk about data doubles, we talk about data as being embodied, which is a different perspective, a different emphasis.

Where does our embodied experience of data really matter?  I will give you a few extreme examples.  The first one is a story from a few years ago: an American academic did an experiment with a transgender friend who had just had a sex change operation from male to female and now had breasts.  They were in the State of Florida, where the law says that if you are a man you can drive around topless in a car, but if you are a woman you cannot.

Now the person's driving license still said that they were male, even though they had breasts.  So they decided to do an experiment and see whether the law would pull them up or not.  They were stopped by a police officer, but they were let go because the driving license said the person was male.

This is one example where, between the physical and the virtual, it is clearly the virtual that is really determining who we are, even if you are standing in front of the person as a physical person.

Another case, also from Florida, which has been great in giving us examples, it seems -- this is a really sad case, though.  It's a bit controversial how it played out.  But the bit I wanted to bring out here is that there was a woman who filed a rape case against a man.  The rape had happened at night when they were together in a house, and her Fitbit data was used to undermine her testimony that she was asleep at the time when she claimed the rape happened.

And so in the end, like I said, it was a messy case.  It was thrown out.  But again, this is a fundamental change in how we deal with these things, right?  Because earlier this would not have been possible; for an individual in that situation at that point of the day, there wouldn't have been data to check the veracity of what you were saying.

So again, her story about the physical was dismissed, and the data was what was believed.

And then an extreme case from India, an extreme example perhaps.  This is -- we have in India a unique identification system called Aadhaar.  You need an Aadhaar number to get benefits from the government that people have a right to under the law.

And specifically, in ration shops what has happened -- so this is an image of a ration shop.  People who have a right to these rations go to a ration shop where the owner might have known them for 30 years and know that they are who they say that they are.

But because the biometric identification that comes with Aadhaar fails and the system does not recognize who they are, they do not get their rations.

The absolute numbers are not very large, but half of the starvation deaths in India since 2015 have been blamed on biometric identification failures through Aadhaar and on people who did not get their rations as a consequence.

Again, we are living in a world where increasingly data is used to determine our bodily experiences, to such an extent that we cannot continue to talk about it as something that is out there, independent from us.

And this is really a fundamental paradigm shift.  I think that's really important to understand.  I think it is difficult to see these things while you are in the middle of them.  The more stories we unearth like that, the more you can sense it.

If you look at medieval Europe, for example, before and after scientists started to cut open people's bodies and look at what was going on inside, the way we talked about bodies, the way we imagined bodies, the way we conceptualized them was completely different.  So many of the ways in which we talk about PMS today, for example, or hormones, all of these things which we use to explain and translate our experiences, were not possible until science started to cut open people.

And the way we talked about it before that was much more through like mystical religious metaphors, et cetera, et cetera.  It didn't just determine the body then.  It really determined our experience of the world in a wider sense as well.

Of course, Europe was a bit late.  In the Arab world, for example, they had actually been cutting people open much earlier.  I'm using that example because it is quite well known.

Yes, so really a paradigm shift.  And that is also why I said in the beginning that I think we need to start integrating this.  These are early days, so how that plays out in practice is not that easy to see yet.

But at the Internet Democracy Project we are trying to start to think through what it means.  And I don't think it is radical -- it is not as if we are going to start thinking radically differently about protections.  I do think we can give more depth, and in some cases ask questions that at the moment do not seem to be on the table.

For us, the body becomes a starting point to read that surveillant assemblage, and then to see what, as our experience of data teaches us, we need to change in how that surveillant assemblage works, to actually make sure that as human beings we feel our human rights and our dignity are respected.

And with this line between the physical and the virtual disappearing, it really means we need to start looking at different protections, right, because for all these examples that I give, data protection legislation is not going to solve them, right?  On its own that is never going to be enough.  Some of these are questions of bodily integrity, for example.

Another case in which you can see this really starkly: in most countries we deal with the non-consensual sharing of sexual images as a privacy or data protection violation.  If you speak to women or sexual minorities to whom this has happened, they speak of it in terms of sexual assault.  The harm of assault is never addressed in our law, right?  It is a physical, direct experience that people have of this.  And so, at the moment, our laws do not allow for that to be addressed because we do not make this connection.

To end, I want to briefly run you through some of the thinking we have started to do around consent from this perspective of the embodied experience of data -- consent in the way it is treated in data governance legislation at the moment -- just to give you a teaser of where we hope this will go.

I always found it really interesting that you can sign consent away by ticking a box.  There is no other fundamental right that we can sign away by ticking a box.  You can't sign away your freedom of expression by ticking a box, right?

The reason we can do that is because we treat consent in data protection as we do in contract law.  It is about modulating more narrowly the flows of information and having better informed input from consumers.  But really this is about preventing liability for companies.  It is not fundamentally about the consent of the individuals involved.

What if we actually started to think about consent again the way we do around bodies in feminism, around sexual relations, for example, right?

In India, at some point in the debate around the draft data protection legislation, there was one scholar who suggested that we do away with consent altogether and have only an accountability framework.

People were outraged about this.  But I was the one who asked: on what grounds are you so outraged?  Because if you treat data as a resource in that disembodied way, that is a completely acceptable approach.  Philosophically, you actually don't have a ground to stand on to say that this is not acceptable.  For us, bringing the body back in makes it clear immediately why consent in the digital age and in the age of datafication remains so important.

And so if you think of consent -- I mean, this is from Planned Parenthood, so it's not exactly how we would frame it in the context of data.  But still, consent has to be negotiable to be meaningful, if we think about consent in a deeper way.  You need to have all of the information you need to give informed consent.

It is best understood when it expands only incrementally, and when you build a relationship of trust, right?  In sexual relationships, this is always part of the debate.  It is not because you've kissed somebody that you want to sleep with them.  This goes step by step.  But somehow on the internet, the idea is that you have to sign everything away at the outset and then just accept.

And also, consent is always based on a constant flow of information and transparency, which goes back and forth, and you have moments to evaluate, which again on the internet at the moment we often do not have.

If you look, for example, at the GDPR, there are already attempts to start addressing this.  In India, for example, the draft bill that we have under discussion right now does not allow you to ask for a full copy of the data that a data controller holds about you.  I know that in the GDPR, for example, that is possible.  I think some of these rights are ways to try and implement consent in a more meaningful way.

But I think putting the body back in the debate also really encourages us to take a more structural approach to these questions, right?  Because the problem with how we talk about consent now is also that it really individualizes the issue.  The way we treat consent now in data protection puts the burden completely on you as an individual.

Again, with other human rights, that doesn't necessarily happen.  One interesting, or important, example is that you can't sign away your right to life even if you want to.  In most democratic societies, all that I know of, you can't sign yourself into slavery or bonded labor.  That is not something you can consent to.

The idea is that there is an essence to human dignity that one needs to preserve, and that we are aware that circumstances might put people in a position where they would agree to such things, but as societies we have said we still need to try and work against this.  And in India, for example, we still have bonded labor happening in practice, but it is not legal.

I think this is a way to start thinking about consent as well.  Are there parts where that modulation of information flow should not even be allowed to happen?  Are there questions that should not be allowed to be asked?  Is there data that should not be allowed to be gathered?

And one really striking example for me, and that is also why this image is up there: that image is from the 18th or 19th century, when there were whole disciplines, in part driven by colonialism, that measured people's heads to claim whether they would be criminal, whether they would be, in this case, reliable mothers, et cetera.

There were two scientists, one of whom was at Stanford, who developed an algorithm to predict people's sexual orientation based on their faces.  This was an academic paper published a few years ago.

How on earth is it possible that that question can even be asked in the 21st century?  It is actually a repeat of these things.  When we now talk about colonialism, we are all outraged that these things could happen at that time, right?  That you could measure people's faces and then classify them into categories like that.

And actually, with facial recognition, a lot of these exercises are happening all over again.  So I think really what we need to start thinking of is what are some of the things that should not be allowed to be asked or to be done, et cetera.  And again, I think if we put bodies and the embodied experience of data of different people back at the center of the debate, we are going to find it easier to move in a direction where we can actually find answers to these questions.  Thank you.

(Applause)

>> SMITA VANNIYAR: Thank you very much, Anja, that was fantastic.  We will start with some questions from our discussants.

>> ANJA KOVACS: Or comments.

>> SMITA VANNIYAR: Or comments.  Yeah.

>> KS PARK: Hi, I am KS Park, a law professor from Korea.  Also, Director of Open Net Korea which is a digital rights organization.

Boy, again, confirms my fear that you can never fully understand French philosophers.  Very hard to penetrate.

Now, there are so many questions, I don't know where to begin, because it was very profound and very all-encompassing.

When I was given the idea that you would talk about body and data, I received a text on data sovereignty, as in the sovereignty of a nation.  So I was thinking about that, but the presented research is about, again, body and data.

So, I will ask questions about body and data.  Although, all of this has implications for the data sovereignty talk that the Indian government has been pushing apparently.

So I guess the first question is about strengthening the link between bodies and data, or putting data back into bodies.  I think it carries the risk of actually strengthening the Indian government's propaganda for data sovereignty and the push for data localization.

Which has actually been used and abused by different authoritarian governments to facilitate surveillance of their own people and censorship of their own people.  Because if the servers hosting postings or social media comments are domesticated, then they are much more easily subject to censorship orders from the local government.  So there is that.  So now I have made the connection between data sovereignty and body and data.

Now, coming back to the main theme of body and data.  Data as an extension of a body, I think, can be exemplified in the case of facial recognition, for instance, where it is not just another biometric card.  Now they don't even need your card.  They have extracted enough data points so that just showing up somewhere brings with you all of the data about you that is linked to the facial datasets.

So in that sense, I think that your demand for better protection, or some protection outside the data protection law, does make sense.  Having said that, if you really think about it, facial recognition is something that we have done even from prehistoric ages.  When a human being met another human being in the forest, you know, fully naked, without any tools, the first thing human beings evolved to learn was to recognize one another.

So you know that all human beings have the -- animals -- vertebrate animals have learned to recognize one another by looking between the eyes.  That is where you recognize a person.

So facial recognition -- modern facial recognition is different from that primordial facial recognition only in terms of scale, basically the number of faces that one can recognize.  I can probably recognize fewer than 20,000 faces, but with a facial recognition tool, where we aggregate the memories of faces of, you know, several hundred people or several million people, now I can recognize probably a billion people.  And that is what the Chinese government is doing to recognize their own billion people, pooling facial datasets.

But my question then is, do we need something higher, some other qualitative protection beyond data protection law, when we look at, for instance, facial recognition?  If you really disassemble the problem of facial recognition, there are two problems.

One is that faces have unique identifying power that goes beyond any other national identifier.  At the same time, what is unique about faces is that even with such identifying power, you wear them out in the open; you make them available publicly when you walk around.  See, you don't walk around showing your Social Security number.  You don't walk around showing your Aadhaar number, but your face you wear out in the open.  Actually, you cover every part of your body except your face, even though you know that the sunlight will make you look older through UV rays, but you choose to make only your face available for other people's inspection.  Why do we do that, right?  There is a reason for doing that.

But so how do we deal with this problem of wearing on your sleeve, so to speak, data that has such strong identifying power?  And we can learn a little bit from the literature and study on license plate numbers, because that is another area where data that has strong identifying power is made available publicly to visual inspection.

And there is also the area of -- I mean, a subset of data protection jurisprudence is about prevention of function creep.  The more identifying power data has, the fewer functions should be built upon it.

So like right here in Germany, you need a separate number for medical services, a separate number for the library, a separate number for education.  Back in Korea, with one number -- I mean, when you borrow a book from the library, you need the registration number, and the same number gets you medical services and education services.  All of that.  So Korea is a really bad example of not having prevented function creep.

So what I'm trying to say is maybe we don't need to be so apocalyptic about how data has been disembodied.  There is enough literature and enough movement afoot about how to reduce function creep and what to do with the public exposure of strongly identifying data.

Now, my final point is this -- you talked about doing away with consent.  But the real power of data protection law is not really consent.  The real power of data protection law is what the default position is if there is no consent.

The idea is that -- the fiction that data protection law wanted to put on us is that we own our data, which means that if there is no consent then that data cannot be used by anybody else.  It is the default position that is important.  It is not the consent.  I mean, if the consent was what was important, then we wouldn't have had to push for the mantra that we own data about ourselves, or we own our own data.

But the reason we push for data ownership as the metaphor for data governance is because, you know, when you find something, when you find these glasses lying on the ground, you will think that they are probably owned by somebody, so to pick them up you need consent from somebody else.

So, there is a lot of talk about how consent is not an efficient governance tool.  But when you throw away consent, maybe you are throwing the baby out with the bathwater.  Because you need that default position, the default position that you cannot use data unless you obtain consent from the owner.

Now, my personal view is that data ownership is only a metaphor, and I think it has to be moderated.  If you mechanistically apply the idea of data ownership, you end up with paradoxical results like the right to be forgotten or restrictions on data flow that actually work against vulnerable people.

We in Korea have something stronger than the right to be forgotten, which is truth defamation.  And that has been used by the main perpetrators of sexual assaults to silence Me Too revelations.  And many women have been threatened with defamation lawsuits.

And there the idea is that, you know, everyone owns the data about him or herself.  So even truthful, non-privacy-infringing facts can be suppressed.  So that extension of the data ownership idea has to be somehow contained.

So on that I wanted to seek your opinion.  Okay.  Thank you.

>> SMITA VANNIYAR: More questions?  Comments?

>> Unless you want to talk to discussants first.

>> SMITA VANNIYAR: Okay.

>> AUDIENCE: Thank you so much.  My name is Lisa Rambo from City Action Network.

The idea of data governance is something I am passionate about.  I'm doing research on digital lending -- the concept of collecting data from your phone to determine what amount of loan you might be able to get, just from the data points on your phone.

So I think we should strike a balance between the usage of data to our advantage and the dangers of using the same data.  I'm speaking from the point that the body, with the different genders that you might be displaying, offers various data points which are very connected to society and the cultural norms that are out there.  And if they come out, then they have the potential of stigmatizing or causing harm to the individuals who own this data.

And so I see this subject as a really critical one, to see where we can strike a balance between the harms of data and at the same time allowing the different genders to learn about their bodies and actually take advantage of these data points and the information that they have, regardless of which applications they use -- Fitbit, the menstrual cycle one, and all those kinds of data.

I think maybe when going through the research we could look at the different data protection principles -- the principles of minimization and specification, that is, with time limits and proper specification, and consent.  It would help if you could map your research onto these principles and how they can practically be implemented within the various applications that we are using here.

>> SMITA VANNIYAR: Thank you.  We will take one more question from one of the discussants, and then maybe Anja could answer three of them.  Chenai.

>> CHENAI CHAIR: My name is Chenai, for the transcript.  And I always find this work fascinating.  Liz mentioned something around context and cultural norms, and you had also talked about how context is taken out when you think about data points.

So the question for me, while we are here thinking about it -- personally speaking, from the feminist and end-user perspective -- is also about the actual designers of the systems that require the data.

Have you had the opportunity to actually engage with them -- who may not be feminists -- to see how they think about what they would say is the resource that they need as data?  And I know there are a lot of innovations around AI for good, AI algorithms to respond to social issues.

But I think the question is, from the developers' perspective, do they see the body, or do they see the body as a distraction from what they really want to get, which is data to do good in society?

>> SMITA VANNIYAR: Anja.

>> ANJA KOVACS: So I think maybe to start with, something that speaks in a way to all of the questions and even all comments.

Because I think your input was more a comment than a question, right?  And I completely agree with you.  But I think for us the starting point is, if we think about privacy also, I don't think we are always clear in data governance what we are really talking about.

And I think you also still see in the courts that the specific definition of privacy is actually very vague.  The definition we use at the Internet Democracy Project is that privacy is boundary management.  What that means is that what we need, to preserve the human dignity, autonomy, et cetera that privacy is supposed to protect, is the ability to actually have some control over our boundaries, over what we share with whom at what point in time.

And that is essential to have the space where you can develop.  If you live in a society where norms are such that, for example, your gender identity is not something you can easily develop freely, you need to have that very interior space where you can be on your own -- you are never completely out of society, right, but where you are not under the direct influence of those forces -- to start developing your own perspectives, et cetera.  That is really what boundary management, or privacy, is about for us.

But it is also always dynamic.  You develop relationships with people over time, and then you give them more information as you trust them.  Coming to KS' point about facial recognition, so this works with faces.  And what we see is qualitatively completely different from what existed before.

I could come out on the street, yes, and you could see my face, but you could not come up to me and say, hey, what is your name, and have me just give you that information, right?  That was just not happening.

In India, many people know of Gandhi, who during the freedom struggle would call on Indian people to go back to the villages, because that was supposedly the more authentic way of living an Indian life.

At the same time there was a great public intellectual called Dr. Ambedkar, who is also the father of the Indian constitution and was a Dalit leader, a person from the lowest caste in the Hindu system.  He called on people to go and live in cities, because his argument was that you would never be able to escape if you stayed in villages, precisely because you would always already be known.

You would appear in public and people would have all this other data about you.  And his argument was that in cities, even though you are out in the street and publicly visible, because that entire ecosystem of data around you is not easily accessible in the same way, there is a way to escape that discrimination, that oppression that comes with caste.

This is something we are fundamentally undermining -- the anonymity that big cities at the end of the day actually give many people.  Despite all of the shortcomings of big cities, why do many people still prefer to live there?  It has a lot to do with the anonymity that we perceived was accessible to us there as well.

This is really going -- that is fundamentally different.  On that particular question of facial recognition, I also think it is really interesting how differently we see treatment of this question starting to emerge in different societies, right.

In the U.S., in several states, there are efforts to severely restrict at least the use of facial recognition.  In India, after that famous gang rape that happened in 2012, which many people might have heard of, there was a massive fund set up by the Indian government to actually deal with projects on women's safety.

The overwhelming amount of those funds is being used to install CCTV cameras and facial recognition systems in Indian cities, all in the name of women's safety.  But that most definitely is not just going to be used for women's safety.  The police in the state of Hyderabad have been walking up to workers, laborers standing in the street waiting to get work for that day, and recording their facial data to insert into databases.  Not on any legal ground, but also not illegally.  There is nothing that stops them from doing this, right?

Again, I think we see that the legitimacy of this is questioned in different countries in very different ways.  And the way it is implemented and rolled out is also different in different countries.

There is space to still intervene, but for me this is fundamentally different -- qualitatively, not just quantitatively, different from what used to go before.

Yes, so on the cultural issue, that is where that question of boundary management comes in really importantly for me.  We used to think of -- I mean, that was never perhaps fully correct, but in many ways you think of the boundary of your body as where your skin touches the air.  But I don't think of the boundary of my body like that anymore.  No, I am aware of the fact that there is all this data out there.

And so if you start to think in terms of boundary management and put bodies back into the debate, I don't think we will ever get strict lines.  Even if we imagine it like that, that border is never going to be so neat -- it never was, and it definitely isn't going to be in the digital age.

But then the question is how do you build systems where that fuzziness is such that as an individual you still have a say in how it expands, where it goes, with whom you share what at what point in time.  The problem at the moment is that many of us don't.

I also think it is really problematic that we focus so much, for example, on sensitive data versus other personal data.  Somebody at Facebook told me that they actually can often see that somebody is going to become depressed, heavily depressed, several months in advance.  That is not because you are writing down that you are depressed, right?

So if you are only going to protect sensitive data, the data they use to draw that conclusion is not protected, because it is based on data that will not be considered sensitive enough to get that level of protection.  These are ways in which, through data protection legislation at the moment, we think we have protections which we actually don't.

There was one more comment I wanted to make.  I'm not going to address the sovereignty question because it is a big debate, and we have written a paper about it which KS has read and you have not, so I think it would be a bit tedious.

But we never use data ownership as a metaphor at the Internet Democracy Project.  I think it is deeply flawed and extremely limited.  What matters to us, linked to the idea of privacy as boundary management, is to have a certain level of control and to be able to have trust in the systems that you find yourself in.

But ownership as such just feeds into the very extractive logic of data capitalism today that creates some of these problems in the first place.  Chenai, I'm not sure I addressed your question.

Developers.  I think in all places -- I mean, I'm sure this is going to sound inconvenient, but what has been interesting for me is that in India, it took conversations, but some of the people who have been most vocally supportive of our work are very privileged men who took some time but then saw the point and thought, we need to do something with this, we need to start making space for this somehow.

So yes, not a definitive answer.

(Off mic)

>> AUDIENCE: I will make it short.  Thank you very much.  And thank you for the interesting session.  I have a question.

Are there any suggestions or thinking around also recourse?  So we are talking about sort of protecting and preventing which is, of course, incredibly important.  We know about the right to be forgotten.

But are there other reflections on this?  I give a really quick example.  For six years, I have been trying to get rid of a Twitter handle that is tied to an account I can't access anymore.  Every answer has been stupid because it is coming from a prison in Alabama, who knows.

But there is not much at stake.  But, of course, if you are depending on welfare benefits there is a lot at stake, as we saw in the India case.  What are the reflections in the community around those kinds of issues, in terms of downstream: once data is collected, what measures are in place or are being thought about?

>> SMITA VANNIYAR: Thank you.

>> KYONG MIHO: My name is Kyong Miho (phonetic) from Korea.  Before my question, please note that my questions are based on Dr. Anja's thesis.

The situation in Korea is not too different from that of India, because internet-based technology was established through a government-led industrialization process.  And so your thesis was very interesting.

And I have two questions.  And I think my first question is a bit fundamental.

Both of the questions are from the perspective of feminism.  At the beginning of the article you said you lay out a feminist politics of data as a theoretical framework.  However, this theoretical framework is mentioned briefly in the introduction but not in the rest of the text.  So can you elaborate more specifically on the theoretical framework of feminist politics?  Why did you want to take this theoretical framework?

And I am also curious about the benefit of choosing feminist politics as a theoretical framework.  I think this is very important.

As far as I understand the paper, citizens who are producing data are excluded from the use of that data by government and technology.  In the middle of the paper you gave an example that this arrangement of forces can cause HIV patients to give up treatment.

I have another question.  Have you had any interesting or pessimistic examples involving women in data protection and alienation, like the HIV cases in India?

I ask you like that, but I think I already got answers through what you said at the beginning about the feminist perspective.

And, yeah, okay, that's it.  Thank you very much.

>> SMITA VANNIYAR: Thank you. Baldeep.

>> BALDEEP GREWAL: Hello, everyone, my name is Baldeep.  I'm a researcher in critical media studies and a historian of science.

I wanted to quickly add nuances to some of the things you said, and I have a question for Anja.  So you talked about facial recognition and there is some historical context here that I think would be very useful for the discussion.

First of all, you talked about primal facial recognition, how we recognize each other as humans.  And I personally think it is a little bit of an ableist position to take, because I'm on the spectrum and I read faces differently than you do.  It is a different data point for me.  I would not essentialize that.  And when you said faces are always exposed -- so are hands.

And the history of the relationship between fingerprinting and facial recognition is super interesting.  Like in the case of India, for example, when the British Raj started opium plantations in Bengal, the white officials could not differentiate between Indians because they couldn't read the faces.

And that is the beginning of the British empire's engagement with fingerprinting.  They would take prints of the entire hand, because what the Indian natives would do is, if the white officials or the police officer wanted to arrest someone, they would send somebody else, because the officials were not going to know the difference.

And so fingerprinting became really crucial for the empire to regulate bodies in that sense.  So I feel like it doesn't work if we conflate them, if we treat fingerprinting and facial recognition as the same thing, because there is the colonial context, and we have to recognize that certain bodies were fingerprinted before others.

And this case of the opium plantations is taken in the history of fingerprinting to be the first ever case of biometric identification.  So this is before digital.  This is 300 years ago, very analog.

And the question that I had for Anja comes from like my obsession with thinking about media and how that works as an interface between bodies and data.

And what I was thinking about is -- of course, I agree with everything that you said, but I also have to think about how data has its own body, too.  It is stored on something, and bodies are datafied via something.

And when you were talking about Haggerty and Ericson, I also thought about Mark Fisher, because he talks, in the university context, about how the cell phone screen is a disciplinary mechanism, because students in his class were always looking at their cell phones even though they didn't know why they were looking at them.

So the screen in itself is disciplining them.  And it is very interesting because recently in Silicon Valley, in the super privileged families of tech people, there has been this social move where they want to wean their kids off of screens.

So I feel like the tools that datafy us, or the tools that we consent to being datafied by, also reflect the amount of privilege we have.  In India, people are obsessed with screens, right, because they are taking them somewhere.

So I also -- I was wondering what you would have to say about that: how do we think critically about datafying tools and mediums, and how do they mediate the entire relationship between body and data?

>> SMITA VANNIYAR: Thank you, Baldeep.  Dr. Ruhiya, do you also want to add comments and questions and then Anja can take them together?  Thank you.

>> RUHIYA SEWARD: Hi.  As a political scientist and someone who works in research and policy, I tend to think in kind of a macro way about this contestation over ideas.

And so it kind of picks up on some of the things that people have asked so just bear with me.

Which is, you know, those bodies at the center that we are talking about -- each of those bodies actually has their own opinions about how they should live.  They are entrepreneurs, they are policemen, they are government ministers.  So, for instance, bear with me as I say this, but you know, this idea of data as a resource or data as the new oil really emerged out of a neoliberal discourse that thought that, you know, data is feeding into a lot of innovation ecosystems and actually helping people.

And I'm not necessarily in agreement with this, but it is feeding a lot of potential, you know, innovation in a positive way and bringing money to the people, the entrepreneurs that are using it.

So I see that, you know, even in our legal systems like we have a contestation over ideas.  And how do we as a community of people all over the world, seven billion people, how do we come to some agreement?  Like a feminist activist might say that the CCTV cameras are useful for preventing harms.

So there is this security narrative, there is this innovation narrative, and there is the digital rights or human rights narrative, and these do not always intersect.  So how do we come to some kind of policy and legal systems that balance those narratives, when in fact many people would say that kind of neoliberalism has lifted people out of poverty and done all these amazing things?  We might not agree with that, but we have to exist and develop policies and do research in that kind of framework, where we have to debate and come to things that matter.

So if security on the internet matters to you, then you care about those issues.  And if digital rights matter to you, then you care about those issues.  If you are an entrepreneur, then you want access to that data.  So how do we balance these competing narratives?  That would be my main thing.

With so many things that we do, we stumble into unintended consequences of the laws that we pass.  And we can see that with the CCTV example and the way that it becomes a surveillance tool when it is really meant to be a sort of security piece to prevent harms.  So there are all these unintended consequences to everything that we do.

And then there are data governance challenges.  For instance, when I'm thinking of data as embodiment, we have been talking a lot about data trusts as a tool for better data governance.  And maybe you might not know or have an answer to this, but I'm wondering how the idea of data as embodiment could feed in or be leveraged in a kind of data trust context, as an institutional tool.  So, anyway, I'm throwing a bunch of things at you.  Thank you.

>> SMITA VANNIYAR: Thank you.

>> ANJA KOVACS: So to some of this, I have no answers.  And that is also because we have been working on this -- I mean we started thinking about it a year and a half ago.  We started properly working on it about six months ago.  So it is very new, and that is also why these inputs are really useful for me, because they give a lot of ideas and direction.

On Baldeep's point, that is a good thing to think about.  We haven't yet, so we should chat more.  It made me think also, though, about how people who use apps or Fitbits or things like that say that it changes their behavior and often even changes their intuition.  Even if you feel fine, if you see that you haven't slept enough, you feel tired because the data is saying you haven't slept enough.  You check in with the data before you check in with how you actually feel.

There is really interesting qualitative research where people said, I stopped using it because it was freaking me out that I was privileging the data to such an extent.  This is what we do as a society in general, right?  We privilege data.  At an individual level, obviously, that is going to happen as well.

That is something I have to think about.  I know that is not exactly what you are saying.  That is still different.

And the same with the recourse question: you make me realize we have been focusing on protections and haven't talked about what comes next when it goes wrong.  So we will do that.  If you have ideas, please let me know.  I would love to talk more about that.

On the competing narratives: I do think we still have those competing narratives because this is still a space in flux, right?  I mean, it is not that they are ever going to go away, but I hope there is still space to change some of this.

And it comes back a little bit to the question Chenai was asking as well.  Among the things that we are starting to look at right now, just to give you a sense of where I think part of the pushback will come from, is data colonialism.

There are scholars who have drawn parallels between historical colonialism and data colonialism.  Not at the metaphoric level, but actually at the structural level of how these things happen.  And one of the things they have pointed to is how surveillance capitalism is built on the extraction of resources, in this case data, in a very intense way.  And through data, surveillance capitalism has access to parts of our lives that it didn't have before, because our social relationships, the most intimate thoughts we have, the things we worry about, all of that is now being commodified as well.

And those scholars draw parallels between how that happens, how that data is described as being something that is up for grabs, and how in historical colonialism land where people had been living for millennia was described as sterile, because either the people didn't claim ownership in the right way or they had been wiped out by diseases in any case.

And you have these instances where, I think, the Spanish conquistadores read out announcements in Spanish of what they had set out to do, and there was no response, obviously, because people did not understand Spanish.  And then those lands were taken over.

And they are making parallels with the terms and conditions we are signing today.  Most of the equivalences they draw are actually between land, extraction, and data.  But what we are starting to look into is whether we can also make a connection between slavery, the physical extraction of bodies, and data today.

And I think for me, that is a very complex debate.  But historically we have also seen that kind of commodification of bodies in the whole debate about organ transplantation, for example.  In the 18th century, grave robberies were quite common because you could get money for organs.  And there was no regulation yet.  So there were entrepreneurs who thought this was a great idea, because there wasn't anything stopping them.

And then there was the whole debate about why this is an issue.  How does this affect the dignity of people?  And you have seen that continue with organ donation and bioethics.

In India, the debate around surrogacy has been very strong.  So we are trying to look at all of these debates to see what lessons we can learn there, because in a way this datafication is a different form of commodification, one which, like almost all forms of commodification, we happily partake in, and which to some extent we feel might undermine our human dignity at a very profound level.

Where does that shift, and what do we need to have in place to shift it?  I think for me, when you talk about innovation et cetera, I don't want to stop innovation.  I just want to stop innovation that comes at the expense of people's dignity, right?

And I think that question is just not in the debate enough right now, and it is hard to put it there because with the language of data as a resource, you just don't see the challenges clearly enough.

(Off mic)

>> ANJA KOVACS: We are saying they are winning that debate.

Well, that is all -- I mean -- there are quite a few people in the room, which is great.

And then on the point about data sovereignty issues.  Basically what we are trying to do is take the framework and apply it in different areas, and then see what it would mean for policy, how it would shift things.  And one place we looked at was data sovereignty in the context of India, where it is used as a framework to push for data localization.

What we tried to say in the paper, and I'm sorry if it didn't come out clearly enough, was really that the starting point for the Indian government to make the claims it does is the framing of data as a resource.

And I think I will come back to this again and again: I really think the data protection legislation in India is really weak.  Actually, consent is a ground for processing in very few cases.  There are a ton of other grounds that take up much more space.

And if you ask why India is probably going to pass a fairly sloppy data protection law, it is because we talk about data as a resource.  That really tells you why.  It might make sense in Europe to have much stronger protections, but if you look at data as a resource, and that is the narrative, then in the context of a country like India, which can foster growth by building a data ecosystem within the country, building an industry on top of that, and keeping the barriers to that as low as possible, a weak law is consistent with that narrative.

So in a way, I think what the Indian government is doing is taking the debate on data as a resource to its logical conclusion.  And because there is no other language available, what arguments do you have to stand on in this particular context?  I don't know.  I think we don't have any.

So in a way, on the data sovereignty issue, that really is the point: the language they use is that of a resource.  And then you get a narrative around data sovereignty that doesn't question, for example, that very extractive logic of data colonialism.  And if you don't question the extractive logic of data colonialism, or surveillance capitalism, whatever you want to call it right now, if you don't start questioning that, then bodily integrity is not going to be put back on the table, because that is part of the problem.

The idea is that we can commodify every part of our life, and that there is nothing that should be preserved from being captured in the information flows I was talking about earlier.

So in that way we tried to make the link to say like we need to put bodies back in and the starting point is seeing what happens when you don't.

>> SMITA VANNIYAR: Thank you, Anja.  Just a second.  I have a quick comment and then I can come to you.

When we are talking about the body -- because the question about CCTV cameras for safety came up, and I think I was talking with both of you yesterday about it.

When we talk about CCTV cameras and technology and data collection as of now, it is largely in binaries.  And that raises the question of whether binary code can really hold non-binary identities, right.

Because when you talk about CCTV cameras, the point of surveillance is that you are surveilling certain bodies.  And your assumption is that bodies come in two genders, male and female, and that the female bodies are more often than not harmed by the male bodies.  But then what happens when someone doesn't fit in either of these categories, and what about the people left in the gap?

A small example: a couple of years ago I was taking a cab booked through one of these apps, and something happened while I was in the cab.  When I tried to file a complaint with the company, they took the complaint after much annoyance.  After speaking to them for about an hour and a half, they said they would come back to me with more questions.  At that time I was already presenting the way I appear right now.

In between those calls, they Googled me, saw pictures of me with short hair, called back, and the first question was, are you even sure this is a woman?  And this was a question in spite of ID cards that say female.  So that is one: the way I identify doesn't seem to matter when it comes to who the protection is meant for.

Two, there is the problem with the assumption that certain genders cause harm to certain other genders.  It is not untrue, in many cases it is true, but it is also important to question, when we use technology for safety, who we are leaving out and who falls through the gaps, right?

On facial recognition, there is actually a university in the U.S. which is experimenting with cameras with facial recognition outside of the women's bathrooms.  If anyone whom the camera reads as male comes towards the bathroom, it will sound an alarm inside the bathroom and at the security guard center.  Who does this leave out, right?

In airports, again, the same thing.  The body scanner is said to be gender neutral, but it's not.  The operator selects male or female, and that is how the body gets scanned, right there.  I don't know whether women's and men's bodies can hide suspicious material better or differently.  I don't understand what the point of it is, but that is how it is.

And I think these are things we need to think about in terms of who we are leaving out and what gaps open up in between.

And especially with facial recognition, because at the end of the day -- and this is not just here -- Facebook has six gender options on the front end, but on the back end it is still male and female.  How do they decide?  Especially when someone creates a new account.  Do they scan the millions of photos and then decide?  What happens there?

Even silly things like FaceApp, which makes your face look older.  Based on the way you hold the phone, it determines whether you are male or female and shows you filters only for that gender and not any other.

So the reason I was saying this is because, in India particularly, the push by the police for CCTV cameras is framed around tech alone, and tech for the safety of women.

So who is left behind and who is left out?  Who is harmed more in the process?  Just to highlight that a little.  Someone had a question.

(Off mic)

>> AUDIENCE: Focusing on racial -- but the focus on gender.  This is something I just want to put out there, because they are trying to work with developers and trying to highlight a number of biases, especially in facial recognition, but --

(Off mic)

>> SMITA VANNIYAR: Thank you.  That was Dalia.  Any other questions?

There was someone sitting here who doesn't seem to be here anymore.  But any other questions or comments?  We have another five minutes.  Chenai?

>> CHENAI CHAIR: So I think in all of this we were talking about centering people in this conversation, and then we went and looked at frameworks, and we went and looked at the original perspective.

I was on a panel yesterday where someone was talking about artificial intelligence and stressed that it should be something even their grandmother can understand, something that someone who is not privileged enough to be in these spaces can engage with.

I think that is the kind of work I would like to see this project carry on with.  I remember I posed this question to you before, about thinking about consent and also that balance with efficiency.  Kind of like, if I'm going to give away my data, am I going to get something better?

So the conversation becomes, as we are centering the body: if we were to say you are giving away your body, with the whole idea of people getting compensated for the data they give away, how many people are going to be morally outraged by the idea of selling their body?

And I think that is a way you can keep the conversation going.  Because the people we are trying to engage, our families and relatives and friends, are where I think we could actually mobilize this discussion a little more.  The data protection frameworks seem to be language that sits out there, so how do we switch that language back to ensure that, if we are speaking of the body, we speak in a language such that even my mother, who is engaging with algorithms, is actually able to say, this algorithm is very problematic.

>> SMITA VANNIYAR: Thank you.  KS, you had a question?

>> KS PARK: I wanted to respond to your -- Smita, your point about how technology leaves out some while including others.

I mean, moving on from ontology to pragmatics, what do we do to solve that problem?  Do we include more people's data in the database, so that facial recognition or whatever technology makes more granular determinations about people?  So what are we saying?

That means more data collection.  That means more data analysis about people.  That means less privacy for all of us.  I think that there is -- everything has a cost.

So another way of creating more fair AI is not including more people's data in the database.  Instead of doing that, maybe we can add some sort of counterweighting factor that works against whatever biases we may have detected.

I'm going to find out what the Justice League in Cambridge is doing, whether they are working on a post-processing solution or a pre-processing solution.  And I think there is a vacuum in that discussion about how to make the technology more inclusive without costing people's privacy.

I mean, Amazon shut down its AI hiring system after realizing that it didn't have enough data on women who are successful in their careers.  So it could not hire fairly -- so does fixing that mean including more, right?  Should Amazon go out and collect more data about women?

Second point, just one more, about Anja's paper -- I mean, don't get me wrong, I think it is a great paper.  And it is great because it is risk taking.  So yes, data as a resource is a bad concept.  But we should also distinguish between data as libertarian or capitalistic commodities and data as public resources.

We are representing this feminist, kind of radical feminist, website operator, a website called The Womat, who has used a mirroring strategy to show men and the public the verbal abuse that women take, by turning the narrative around.

So there is a lot of talk, violent talk about cutting off dicks and, you know, all of that.  We see Womat as a public resource.  We see Womat, the website, and the flow of data around Womat as public resources to educate the public about feminism and other gender issues.

Same thing with, you know, revelations.  Those revelations are data flows.  We are getting more information about what men in power do at the higher echelons that we did not have access to before.  So data as a public resource is something we should take a nuanced approach to, instead of just pushing back on all talk of data as a resource.

>> SMITA VANNIYAR: Just a quick answer.  It's a good question, but I don't think more data collection is the answer.

My issue is not with the machine reading me wrong.  My issue is what happens after that, right?  A machine can see me as a man and say that I shouldn't go into the bathroom, but if someone then comes and drags me out, that is my problem.

So what you are doing with the data you are collecting is the issue, because the data is not free of human intervention after that either.

And similarly, the Amazon facial recognition also came under fire because it read a black Congressman as a criminal.  It was about much more than the hiring process; the hiring process itself was also problematic, but the facial recognition was being used more and more in police stations.

My issue is when you rely on data more than on people, when you rely on data alone, and that affects people in very physical and very real ways.  That is where the problem is, and the intervention needed may not be collecting more data.  Because people have done that as well: they illegally, without consent, took videos that trans persons had uploaded of themselves transitioning.  They took them off YouTube and tried to build an algorithm to figure out how transitioning works, whereas the truth is that it is not a linear process in all bodies.  So the idea is not to encourage more data collection; I'm happy being off the grid.  But I do not want the consequences of my data being misinterpreted and misread falling on me and others like me.

>> ANJA KOVACS: But I think that is a really important point.  Why do they need to collect the data in the first place?

Actually, what you were saying about why at the airport they push the button male or female, that is a really good point, right?  From a security perspective, does your gender matter?  What matters is whether you have a gun hidden under your clothes or something.

So is it just so they know beforehand whether they are going to send you to a male or female officer if you need to be patted down?  But then they could just as well ask if you prefer to be patted down by a male or female officer.

It is important to not assume that the data needs to be collected in the first place.  And I think even with the public interest issue, we often have so little control over what is supposedly deemed in the public interest.

I actually think my civic sense is quite healthy, but I still want to make a decision myself about whether I want to contribute my data to projects for the public interest.  And at the moment I have very little control over that.

I think I'm probably going to repeat some of what I said.  But the points that Smita was making really make me think again of how in India we had a strong privacy judgment in 2017, which reconfirmed a fundamental right to privacy under the Constitution even though it is not written in the Constitution.

Since then, we have had a few positive judgments on privacy offline.  For example, it is finally no longer criminal to have gay sex in India.  But earlier this week we also had a transgender bill passed in Parliament under which, to get a transgender certificate, you need to go to a district magistrate who will decide whether or not you are the gender that you say you are.  None of us have to do that if we identify as male or female.  So why would the transgender community need a committee to decide what their gender is?

We struggle so much offline to get bodily autonomy, to be able to take decisions over our own lives.  What is happening now, and I guess this is why we think it is so important to put bodies in the debate, coming back to what Smita said, is that data is becoming a new frontier where we need to fight that battle.  And we don't have the language to fight it, because we talk about data as a resource as if it is disembodied, as if a disconnect is happening.  And all of these examples show that in practice that is not the experience of people.  That is definitely not the experience of vulnerable people.

And also, coming back to the algorithm: I know they pointed out that a lot of facial recognition doesn't recognize the faces of black people.  And there were a lot of black people who said that in this case that is a feature, not a bug.  You don't necessarily want to be recognized, right.

Because if it recognizes you better, those are the people who are going to be arrested more in the U.S.  No, but this is true.  Not being in the data is not by definition a bad thing.  Sometimes it is good to fly under the radar.  Sometimes that is exactly what you want, right?

Again, it comes back to the question of boundary management.  The control we have.  To what extent do we still have bodily autonomy?  To what extent is the digital undermining this further?

>> SMITA VANNIYAR: Sorry, we have to wrap up quickly.  Thank you very much, Anja, Chenai, and KS, and thank you everyone here for joining us this morning and for a very interesting and fruitful discussion.  Thank you very much.

>> ANJA KOVACS: Thank you.

(Applause)