
IGF 2019 - Day 3 - Estrel Saal B - DC Gender and Internet Governance - RAW

The following are the outputs of the real-time captioning taken during the Fourteenth Annual Meeting of the Internet Governance Forum (IGF) in Berlin, Germany, from 25 to 29 November 2019. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

>> SMITA VANNIYAR: Hi, good morning.  Can I request those sitting a little further away to move closer?  We are hoping that the session will be participatory, so it will be helpful if it is okay for you to move first thing in the morning.  Thank you.

Good morning.  Thank you for coming to this session.  I understand it is early in the morning and the third day, but I really appreciate that all of you are here.  This is the main session for the gender Dynamic Coalition, the Dynamic Coalition on Gender and Internet Governance.  The Dynamic Coalitions are free to structure their sessions in ways that they think are most beneficial.  We thought it would be interesting to make this a sharing and learning space, and not to share from the Dynamic Coalition's work alone.

I have Anja Kovacs here, who will be presenting research on the body as data, and four discussants with me.  I also have my colleagues here who are helping me moderate the session.  So the format we have decided on is that for about 20 to 30 minutes Anja will present the research, and then we will have a discussion and conversation on the topic.  A short introduction on why this particular research and why it is relevant to gender and Internet governance: one of the main themes of this Forum is data governance.  The aim of that track is basically to discuss the fundamental challenges which come in the way of using data for the benefit of the people.

But the truth is also that data collection, data generation, and all of this is not done in a manner which keeps the people at the center of it, right?  So when you are talking about data governance, it is important to think about the bodies behind data, and that is why this session is important.  As with marginalized communities offline, online as well, when bodies are removed from the picture the first groups affected are marginalized people: women, queer people, and people from non-English-speaking backgrounds.  So that is just a brief introduction.  I will let Anja take over now.

>> ANJA KOVACS: Thanks so much, Smita.  This is an amazing opportunity to share these ideas with people like you who are already experts in the field.  My name is Anja and I work with the Internet Democracy Project in India.  We started our work around these issues looking at surveillance from a gender perspective specifically, and the aim of that was to bring out the more structural aspects of the harms of surveillance and to go beyond the debate on privacy alone as the solution to those harms.

As we did that work, questions around data became stronger and stronger.  What I want to talk about is different from the traditional approaches to the body and data.  We have seen work on menstruation apps, for example.  We are arguing that the way data is connected to our bodies today is leading to a fundamental reconceptualization of what bodies are.  And if we do not bring that into the debate on data governance, we are not going to find the right protections for human rights in the digital age.

It is not a gender approach; it is a feminist approach to data, where we draw on research done on the body, but it is not only about gender.  What I present is different from the usual presentation in that there will be a bunch of social theory to start with, but I will come to the policy implications at the end.  I'm quite sure that some of this will be familiar to many of you, but I want to run through it because the way we stitch these things together matters.

To come to the point and to see the validity of the perspective, it is important to run through this.  In between, if you want to stop me and have any questions or comments, please do.  As Smita said, the idea is to try to make it more interactive, but I guess I'm going to talk for a bit.  Okay.

I think I still want to start from the idea that the intermixing of technology and human beings has the potential for empowerment, even if we are not living in a techno paradise yet.  Cybernetics emerged after the Second World War, and it has influenced how we look at the world today.

And it led to two particular shifts.  The first is in how we think about data and speak about it metaphorically.  Cybernetics is the source of the big metaphor used today, that data is oil; we speak about data as a resource, for example, right.  The idea in cybernetics, where these ideas come from, is that information informs everything.  It is a layer that sits in everything.  But in a way, it is also independent from the medium that generates it.  You can take the data out of the thing and it is not going to change the data or the thing.

In that way, data becomes something that is out there, that you can mine, that you can grab.  Because of that conceptualization we also give data enormous power.  We say data is the truth, right, because it is the layer that reflects reality.  You see these ideas when computer scientists talk about one day hoping to be able to download a human brain into a computer.  The body in which that brain was housed supposedly doesn't matter then to what the brain would be like.

Now, I think as feminists we know that that is not correct in practice.  There is a very close interlinking between data and power relations, and these two are intertwined in many different ways.  The easy examples to see this are cases of context or use discrimination.  Think of, for example, a non-consensually shared image of breasts or a dick pic.  There are other forms in which power imbalances play out, but I want to focus specifically on the ones that are most closely tied to data.

What we experience today is more and more discrimination by abstraction, right?  We are actually made into data points, and we often have very little control, first of all, over the data as such.  That is problematic because some people have always been surveilled much more than others, so there might be overrepresentation.  In other cases we have underrepresentation.  And this is particularly problematic in the age of big data, because we rely on the past to predict the future.

So the data that we have or do not have matters.  I want to give an example here specifically from the context of India, where in the police manual of a state in the south of India, for four years from 2012 onwards, there was a provision that actually allowed the police to arrest transgender people on the suspicion that they might become a disturbance to public order and peace.  This provision was broadly worded and also gave the police the right to keep a register of transgender people, so they would know where to go and find them when anything happened in the neighborhood.

If you have a provision that focuses so much on a particular social group, obviously that group is going to be overrepresented in the crime statistics of that state, simply because whenever something happens and the police have to show they are doing something, they are easy people to turn to.

We have a few other examples like that in India, including the groups that in colonial times used to be called criminal tribes.  These groups are now not officially called criminal tribes, but through regulations and police manuals they are still under heavier surveillance just because of their backgrounds.  And when you build a system on that data, if you don't correct for the biases, these groups will continue to be targeted disproportionately.  We have little control over the categories we are sorted into as well as how those categories are formed.

I used this image of a routine we all know from airports.  Now, as feminists we have been arguing for decades that gender is not a binary; it is a spectrum.  When this happens at the airport, there is no space for the spectrum.  The officer who sees you has to decide whether you are male or female, and if they are not sure where you fit, or if they make the wrong choice, you get pulled out of the queue.  What if you identify as a woman and that is not how the officer reads you, and you get patted down as a man because that is how the officer did see you, right?  These kinds of choices, where data basically decides who we are, or somebody else uses data to decide who we are, can put people in really vulnerable situations.

I don't think you can see this very well, but in any case, there are ample examples like this where even our representation is out of our hands.  This was a controversy in India a few years ago: if you Googled North Indian masala, you get images like this, which are overwhelmingly of food.  And if you Google South Indian masala, you get images like these, which are mostly celebrities and models in various poses.  And so, again, the women in these images did not necessarily choose to be represented like this, right?  But the control we have over how that happens is becoming increasingly weaker.

The problem here is really the absence of context, right?  We just become data points, and then those data points are used to make decisions over which we have no control, which can lead to problems of discrimination, representation and social justice.

Why I wanted to outline this in some detail is that I think data as oil makes it sound as if gathering and analyzing data is by default always a good thing and should be allowed, and that since the data exists we should just use it.  But talking about bringing back context and bringing back people really reminds us that actually perhaps that shouldn't always be done, even if it is possible.

And that not all algorithms have equal value.  Right.  Any comments or questions so far?  Okay.

The second big shift related to the construction of data as a resource is in the way that surveillance works.  Bodies have always been central to surveillance.  But they are not anymore in the same way as in the surveillance we used to know.

The way we talk about surveillance, in many ways, I think at least until the Snowden revelations, was still very much the image on the right, big brother is watching you, which is surveillance as monitoring: watching what you have done or are currently doing.  When we used to talk about surveillance and security, that was often the approach, and a lot of the laws in many of our countries actually still cater to that more traditional form of surveillance; we had a fairly stable balance, I think a consensus, in many countries about how that should work in the 1990s.  By then, already, surveillance had started to shift.  The other image is of a panopticon, which was designed in the late 18th century as a jail where the guard would stand in the middle and the prisoners would be all around, and the prisoners would never know whether the guard could actually see them or not, the idea being that you would adjust your behavior.  This model was made popular again thanks to Foucault, who saw it as central to all kinds of institutions, hospitals and schools, where the idea that you are being watched actually incentivizes some kinds of behavior and discourages others.

And so then surveillance became both preemptive, in that it kind of prevents us from doing things in the first place, so you don't have to go and look at what people did, but also productive, because it makes us do other things; it influences our choices.

And we see now that these possibilities have become even more extensive with big data.  I guess the most controversial example is around elections and how data is actually used to influence voter behavior.  But there are also examples of social sorting, the categories that you are put in, for example, to decide credit scores or for insurance purposes, et cetera, right.  All of these are related to this.

We also know the disciplinary power of that gaze in a much more personal way.  Many of us have at some point thought, why did you put up that picture on Facebook, I look so terrible in it.  It is actually the same logic at work.

But surveillance isn't just that anymore.  It is really fundamentally changing, into what two scholars, Haggerty and Ericson, have called the surveillant assemblage.  Does anybody know what that is?

(Off mic).

>> ANJA KOVACS: It is.  It is like weeds, where on the top you can see them separately, but underneath the roots are all connected.

What Haggerty and Ericson argued is that this is really the model of surveillance today, and that is the fundamental change.  What we have now is not like earlier, where you had the guard in the middle or the government watching what you were doing; surveillance is actually a system that always has multiple actors working together.  If, for example, you use a Fitbit or apps that track things, you are contributing to that data yourself.  You play a role in that.

So you always have multiple actors.  What has become important now is the data flows, and who is able to capture those data flows, and what they are able to do with them.

Now, that is also fluid.  Not every actor will have the same power to capture flows, or the same interests, at different points in time.  But we can see that slowly there are nodes emerging that are becoming more institutionalized and have the power to capture a lot of that flow, and then use that flow, or the data they have captured in that flow, to direct or govern the behavior of others.

How is this different from the surveillance that we had before?  First of all, it is everywhere, which we are conscious of, right, and that is one of the reasons we have an IGF focusing on data.  What is really different, though, is that it doesn't need to see us in our totality anymore.  The idea of total body vision doesn't matter anymore.  There is a specific purpose why a specific actor wants to be able to surveil us, and they only need to get hold of that bit of us.  The rest of the context of our life is left out.

What matters are flows, and the actors that manage to grab those flows in between.  The earlier forms of resistance to surveillance can't work in the same way anymore.  This is a different paradigm, and we need to recognize that if we want to be effective in resisting it.  If you look at this from a feminist perspective, the difference is that our body is broken down into information, right?  That reassembling is really what is different now.  And that is done to make us more comparable and to use those bits to make decisions about our lives, about who we are.

That means that surveillance is much more dispersed, but also fundamentally distributed unevenly.  We are not subject to the same gaze anymore; in the earlier two forms of surveillance there was still much more similarity.  Now even your income can determine what you see on Google as search results, for example, right?  There are fundamental differences we aren't conscious of as societies because it is so hard to see this.

The opaqueness of all of this is a big problem in terms of how algorithms work and how surveillance works.  Right.  So that is all really depressing.

But maybe there is hope.  And so what we are arguing is that the way out of this conundrum really is, at least in part, to put bodies back at the center.

And why are we saying that?  There is a Dutch scholar who has been arguing that actually the line between our physical body and our virtual body is becoming irrelevant.  And that is why at the Internet Democracy Project we don't talk about data doubles; we talk about data as being embodied, which is a different perspective, a different emphasis.

Where does our embodied experience of data really matter?  I will give you a few extreme examples.  The first one is a story: a few years ago there was an American academic who did an experiment with a transgender friend who had just had a sex change operation from male to female and now had breasts.  They were in the state of Florida, where the law says that if you are a man you can drive around topless in a car, but if you are a woman you cannot.

Now the person's driving license still said that they were male, even though they had breasts.  So they decided to do an experiment and see whether the law would pull them up or not.  They were stopped by a police officer but they were let go because the driving license said the person was male.

This is one example where clearly it is the virtual rather than the physical that is really determining who we are, even when you are standing in front of the person as a physical person.

Another case, also from Florida, which has been great at giving us examples, it seems.  This is a really sad case, though, and it was a bit controversial how it played out.  But the bit I wanted to get at here is that there was a woman who filed a rape case against a man.  The rape had happened at night when they were together in a house, and her Fitbit data was used to undermine her testimony that she was asleep at the time when she claimed that the rape happened.  In the end, like I said, it was a messy case and it was thrown out, but again this is a fundamental change in how we deal with these things, right?  Because earlier this would not have been possible; as an individual in that situation, at that point of the day, there would not have been data to check the veracity of what you were saying.

So again, her story about the physical was discounted and the data was what was believed.

And then an extreme case from India, an extreme example perhaps.  We have in India a unique identification system called Aadhaar.  You need the number to get benefits from the government that people have a right to under the law.

And specifically in ration shops, what has happened, and this is an image of such a shop, is that people who have a right to these rations go to a shop where the owner might have known them for 30 years and knows they are who they say they are, but because the biometric identification fails and the system does not recognize who they are, they do not get their rations.  The absolute numbers are not very large, but half of the starvation deaths in India since 2015 have been blamed on biometric identification failures and people who did not get their rations as a consequence.  Again, we are living in a world where increasingly data is used to determine our bodily experiences, to such an extent that we cannot continue to talk about data as something out there, independent from us.

And this is really a fundamental paradigm shift.  I think that is really important to understand.  It is difficult to see these things while you are in the middle of them, but the more stories like that we unearth, the more you can sense it.  If you look at middle-age Europe, for example, before and after scientists started to cut open people's bodies and look at what was going on inside, the way we talked about bodies, the way we imagined and conceptualized them, was completely different.  The ways in which we talk about PMS today, and hormones, and all of these things which we use to explain and translate our experiences, were not possible until science started to cut open people.  The way we talked about it before that was much more through mystical and religious metaphors, et cetera.  It didn't just determine how we saw the body.  It really determined our experience of the world in a wider sense as well.

Of course, Europe was a bit late; in the Arab world, for example, they had been cutting people open much earlier.  I'm using that example because it is quite well known.

Yes.  So really a paradigm shift.  And that is also why I said in the beginning that I think we need to start integrating this; it is early days, so how that plays out in practice is not that easy to see yet.  But at the Internet Democracy Project we are trying to start to think through what it means.  It is not as if we are going to radically start thinking differently about protections, but I do think we can give more depth and in some cases ask questions that at the moment do not seem to be on the table.

For us, the body becomes a starting point to read that surveillant assemblage, and to ask what, as our embodied experience of data teaches us, we need to change in how that surveillant assemblage works to make sure that as human beings we feel respected.  We need to start looking at different protections, right, because data protection legislation is not going to solve all of the examples that I gave.  On its own, it is never going to be enough.  Some of these are questions of bodily integrity, for example.

Another case in which you can see this really starkly is the way we deal in most countries with non-consensual sharing of sexual images, as a privacy or data protection violation.  If you speak to women to whom this has happened, they speak of it in terms of sexual assault.  That harm of assault is never addressed in our laws, right?  It is a physical, direct experience that people have of this.  And so, again, at the moment, our laws do not allow for that to be addressed because we do not make this connection.

To end, I want to briefly run you through the thinking we have started to do around consent and the body of data, and consent in the way it is treated in data governance legislation at the moment, to give you a teaser of where we hope this will go.  I have always found it really interesting that you can sign consent away by ticking a box.  There is no other fundamental right that we can sign away by ticking a box.  You can't sign away your freedom of expression by ticking a box, right?

The reason we can do that is that we treat consent in data protection as we do in contract law.  It is about modulating more narrowly the flows of information and having better-informed input from consumers, but really it is about preventing liability toward consumers.  It is not fundamentally about the individuals involved.

What if we start to think of consent the way we do around bodies, around sexual relations for example, right?  In India, at some point in the debate around the draft data protection legislation, one scholar suggested that we do away with consent altogether and have only an accountability framework.

People were outraged about this.  But I was the one who asked: on what ground are you so outraged?  Because if you treat data as a resource, in that disembodied way, that is a completely acceptable approach.  Philosophically you don't have a ground to stand on to say this is not acceptable.  For us, bringing the body back in makes it clear immediately why consent in the digital age, in the age of datafication, remains so important.

And so if you think of consent, and this is from Planned Parenthood, not exactly how we would frame it in the context of data, but still: consent has to be negotiable to be meaningful, if we think about consent in a deeper way.  You need to have all of the information you need to give informed consent.

It is best understood when it expands only incrementally, and when you build a relationship of trust, right?  In sexual relationships, this is always part of the debate.  It is not because you kissed somebody that you want to sleep with them.  This goes step by step.  But somehow on the internet, the idea is that you have to sign everything away from the outset and then just accept.

And also, that consent is always based on a constant flow of information and transparency, which goes back and forth, and you have moments to re-evaluate, which again on the internet at the moment we often do not have.

If you look, for example, at the GDPR, there are already attempts to start addressing this.  In India, for example, the draft bill under discussion right now does not allow you to ask for a full copy of the data that a data controller holds about you.  I know that under the GDPR, for example, that is possible.  I think some of these rights are ways to try and implement consent in a more meaningful way.

But I think putting the body back into the debate also really encourages us to take a more structural approach to these questions, right?  The problem with how we talk about consent now is also that it really individualizes the issue.  The way we treat it now in data protection, consent puts the burden completely on you as an individual.

Again, with other human rights that doesn't necessarily happen.  One interesting, or important, example is that you can't sign away your right to life even if you want to.  In most democratic societies, all that I know of, you can't sign yourself into slavery or bonded labor.  That is not something you can consent to.

The idea is that there is an essence to human dignity that one needs to preserve, and that even though we are aware that conditions might put people in a position to agree to such things, as societies we have said we still need to try and work against this.  And in India, for example, bonded labor still happens in practice, but it is not legal.

I think this is a way to start thinking about consent as well.  Are there parts where that modulation of information flows should not even be allowed to happen?  Are there questions that should not be allowed to be asked?  Is there data that should not be allowed to be gathered?  One, for me, really striking example, and that is also why this image is up there: the image is from the 18th or 19th century, when there was a whole set of disciplines, driven in part by colonialism, that measured people's heads to claim whether they would be criminal, whether they would be, in this case, reliable mothers, et cetera.

There were two scientists, one of whom was at Stanford, who developed an algorithm to predict people's sexual orientation based on their face.  This was an academic paper published a few years ago.  How on earth is it possible that that question can even be asked in the 21st century?  It is a repeat of the same thing; when we talk about colonialism we are all outraged that these things could happen at that time, right?  That you could measure people's faces and then classify them in categories like that.

And with facial recognition a lot of these exercises are happening all over again.  I think what we really need to start thinking about is what are some of the things that should not be allowed to be asked or to be done, et cetera.  And again, I think if we put bodies, and the embodied experience of data of different people, back at the center of the debate, we are going to find it easier to move in a direction where we can actually find answers to these questions.

Thank you.

[APPLAUSE]

>> SMITA VANNIYAR: Thank you very much.  That was fantastic.  We will start with questions from the discussants.

>> ANJA KOVACS: Or comments.

>> SMITA VANNIYAR: Or comments.  Yeah.

>> I'm KS Park, a professor from Korea and director of Open Net Korea, which is a digital rights organization.

Boy, this again confirms my fear that you can never fully understand French philosophers.  Very hard to penetrate.

Now, there are so many questions, I don't know where to begin, because it was very profound and all-encompassing.

I was given the idea that you would talk about body and data, but I received a text on data sovereignty, as in the sovereignty of a nation.

So I was thinking about that, but the presentation and research are about, again, body and data.  So I will ask questions about body and data, although all of this has implications for the data sovereignty talk that the Indian government has apparently been pushing.

I guess the first question is about strengthening the link between bodies and data, or putting data back into bodies.  I think that it carries the risk of actually strengthening the Indian government's propaganda for data sovereignty and the push for data localization.

Which has actually been used by different authoritarian governments to facilitate surveillance of their own people and censorship of their own people.  Because if the server hosting social media comments is located domestically, then it is much more easily subject to censorship orders from the local government.

So there is that.  So now I have made the connection between data sovereignty and body and data.  Now, coming back to the main theme of body and data.  Data is an extension of the body, and I think that can be exemplified in the case of facial recognition.  It is not just a biometric card.  Now they don't even need a card.  They have extracted enough data points that just showing up somewhere brings with you all of the data about you that is linked to the facial datasets.

So in that sense, I think that your demand for better protection, or some protection outside the data protection law, does make sense.  Having said that, if you really think about it, facial recognition is something that we have done even since prehistoric times.  When a human being meets another human being in the forest, you know, fully naked, without any tools, the first thing human beings have evolved to learn was to recognize one another.

So, you know, all human beings, and vertebrate animals, have learned to recognize one another by looking between the eyes.  That is where you recognize a person.

So facial recognition.  Modern facial recognition is different, but only in terms of scale, basically the number of faces that one can recognize.  Probably I can recognize fewer than 20,000 faces, but with a facial recognition tool, where we aggregate the memories of the faces of several hundred or several million people, now I can recognize probably a million people, and that is what the Chinese government is doing to recognize its own billion people, pooling facial datasets.

My question then is: do we need something higher, some other qualitative protection beyond data protection law, when we look at, for instance, facial recognition?  If you really disassemble the problem of facial recognition, there are two problems.  One is that faces have a unique identification power that goes beyond any other national identifier.

At the same time, what is unique about faces is that even with such identifying power, you wear them out in the open; you make them available publicly when you walk around.  See, you don't walk around showing your social security number.  You don't walk around showing your national ID number, but your face you wear out in the open.  Actually, you cover every part of your body except your face, even though you know that the sunlight will make you look older through UV rays, but you choose to make only your face available for other people's inspection.  Why do we do that, right?  There is a reason for doing that.

So how do we deal with this problem of wearing out in the open, wearing on your sleeve, the data that has such strong identifying power?  We can learn a little bit from the literature and studies on license plate numbers, because that is another area where data that has strong identifying power is made available publicly to visual inspection.  And also from a subset of data protection jurisprudence: the more identifiable, or the more identifying power data has, the fewer functions should be built upon it.

Right here in Germany, you need a separate number for medical services, a separate number for the library, a separate number for education.  Back in Korea, when you borrow a book from the library, you need the registration number, and the same number gets you medical services and education services, all of that.  So Korea is a really bad example of not having prevented function creep.

What I'm trying to say is that maybe we don't need to be so apocalyptic about how data has been disembodied.  There is enough literature and enough movement afoot about how to reduce function creep and what to do with the public exposure of strongly identifying data.

Now, my final point is this: you talked about doing away with consent.  But the real power of data protection law is not really consent.  The real power of the data law is what the default position is if there is no consent.

The idea, at least the fiction that the law wanted to put on the surface, is that we own our data, which means that if there is no consent, then that data cannot be used by anybody else.  It is the default position that is important, not the consent.  If the consent were what mattered, then we wouldn't have had to push for the mantra that we own data about ourselves, or we own our own data.

But the reason we pushed for data ownership as the metaphor for data governance was that when you find something, when you find these glasses lying on the ground, you will think that they are probably owned by somebody, so to pick them up you need consent from somebody else.  So, there is a lot of talk about consent not being an efficient governance tool.  But when you throw away consent, maybe you are throwing the baby out with the bathwater, because you need that default position: the default position that you cannot use data unless you get consent from the owner.

Now, my personal view is that data ownership is only a metaphor, and I think it has to be moderated.  If you mechanistically apply the idea of data ownership, you end up with paradoxical results like the right to be forgotten, or restrictions on data flow that actually work against vulnerable people.

We in Korea have something stronger than the right to be forgotten, which is truth defamation, and it has been used to silence #MeToo revelations; many women have been threatened with defamation lawsuits.  And there the idea is that, you know, everyone owns data about him or herself, so even truthful, non-privacy-injuring facts can be suppressed.  That extension of the data ownership idea has to be somehow contained.

So on that I wanted to seek your opinion.  Okay.  Thank you.

>> SMITA VANNIYAR: More queries?  Comments?

(Off mic)

>> SMITA VANNIYAR: Okay.

>> Thank you so much.  I'm Lisa from the Action Network.  The subject of data governance is one I am passionate about.  I'm doing research on lending, on the concept of collecting data from your phone to measure what amount of loan you might be able to get just from the data points on your phone.

So I think we should strike a balance between the use of data to our advantage and the dangers of using the same data.  And speaking from the point that the body, with the different genders one might be displaying, offers various data points, which are also connected to society and the cultural norms that are out there.

And if they come out, they have the potential of stigmatizing or causing harm to the individuals who own this data.

And so I see this subject as a really critical one, to see where we can strike a balance between the harms of data and at the same time allowing the different genders to learn about their bodies and actually take advantage of these data points and the information they have, regardless of which applications they use, the Fitbit, the menstrual cycle apps, and the different data.

Going through the research we see the different data principles: principles of minimization, purpose specification, limits in time, and consent.  Perhaps you could relate your research to these principles and how they can practically be implemented within the various applications that we are using here.

>> SMITA VANNIYAR: Thank you.  We will take one more question from one of the discussants, and then maybe Anja could answer the three of them.  Chenai.

>> CHENAI CHAIR: My name is Chenai Chair, for the transcript.  Lisa mentioned context and cultural norms, and you talked about how context is taken out when you think about data points.  The question for me, while we are here thinking about it, personally speaking, from the feminist and end user perspective, is also about the actual designers of the systems that require the data.

Have you had the opportunity to actually engage with them, who may not be feminists, to see how they think about what they would say is the resource that they need as data?  I know there are a lot of expectations around AI for good, AI and algorithms to respond to social issues.  But I think the question is, from the developers' perspective, do they see the body, or do they see the body as a distraction from what they really want to get, which is data to do good in society?

>> SMITA VANNIYAR: Anja.

>> ANJA KOVACS: Let me start with something that speaks in a way to all of the questions and even the comments, because I think yours was more a comment than a question, right?  I completely agree with you.  If we think about privacy also, I don't think we are always clear in data governance what we are really talking about, and I think you see that still in the courts as well: the specific definition of privacy is actually very vague.

And the definition we use at the Internet Democracy Project is that privacy is boundary management.  What that means is that what we need, to preserve the human dignity, autonomy, et cetera, that privacy is supposed to protect, is the ability to actually have some control over our boundaries, over what we share with whom at what point in time.  That is essential to having the space where you can develop.  If you live in a society where the norms are such that, for example, your gender identity is not something you can develop freely, you need to have an interior space where you can be on your own, where you are not under the direct influence of those forces, to start developing your own perspectives, et cetera.  That is really what boundary management, or privacy, is about for us.

But it is also always dynamic.  You develop relationships with people over time and then you give them more information as you trust them.  Coming to KS's point about facial recognition: there has always been recognition by faces, but what we see now is qualitatively completely different from what existed before.  I would come out on the street and you could see my face, but you could not come up and say, hey, what is your name, and have me give you that information, right?  That was just not happening.

In India, many people know of Gandhi, who during the freedom struggle would call on Indian people to go back to the villages because that was supposedly the more authentic way of living an Indian way of life.  At the same time, there was a great public intellectual, who is also the father of the Indian Constitution and was a leader and a person from the lowest caste in the Hindu system.  He called on people to go and live in cities, because his argument was that you would never be able to escape if you stayed in the villages, precisely because you would always already be known.  You would appear in public and people would have all this other data about you.  And his argument was that in cities, even though you are out in the street and publicly visible, because that entire ecosystem of data around you is not as easily accessible in a city, there is a way to escape the discrimination, the oppression, that comes with caste.  This is something we are fundamentally undermining: the anonymity that big cities at the end of the day give many people.  Despite all of the shortcomings of big cities, why do many people prefer to live there?  It has to do with the anonymity that we perceived was accessible to us as well.

This is fundamentally different.  On the question of facial recognition, it is also interesting to see how we see it emerge in different societies.

In the United States there are efforts to severely restrict at least the use of facial recognition.  In India, after the famous gang rape that happened in 2012, which many people might have heard of, a massive fund was set up by the Indian government for projects on women's safety.  The overwhelming share of those funds is being used to install CCTV cameras and facial recognition cameras in Indian cities, all in the name of women's safety.  But that most definitely is not going to be used just for women's safety.  The police in one state have been walking up to workers, laborers standing in the street waiting to get work for the day, and recording their facial data to insert into databases.  Not on any legal ground, but also not illegally; there is nothing that stops them from doing this, right?

Again, I think that we see the legitimacy of this being questioned in different countries in different ways.  And the way it is implemented and rolled out is different in different countries.

There is still space to intervene, but for me this is fundamentally different, qualitatively and not just quantitatively, from what went before.

Yes, so on the culture issue, that is where the question of boundary management comes in for me really importantly.  We used to think, and I mean that was never perhaps fully correct, but in many ways you think of the boundary of your body as where your skin touches the air.  But I don't think of the boundary of my body like that anymore.  Now I am aware of the fact that there is all this data out there.

And if you start to think in terms of boundary management and put bodies back into the debate, I don't think we will ever get strict lines.  Even if we imagine it like that, that border is never going to be so neat; it never was, and in the digital age it definitely isn't going to be.  Then the question is, how do you build systems where, despite the fuzziness, as an individual you still have a say in how that expands, where that goes, who you share what with at what point in time.  The problem at the moment is that many of us don't.

I also think it is really problematic that we focus so much, for example, on sensitive data versus other personal data.  Somebody from Facebook told me that they actually often can see that somebody is going to become depressed, heavily depressed, several months in advance.  That is not because you are writing down that you are depressed, right?

So if you are only going to protect sensitive data, the data they use to draw that conclusion is not protected, because that conclusion is based on data that will not be sensitive enough to get that protection.  These are ways in which, at the moment, we think we have protections which we actually don't.

There was one more comment I wanted to make.  I'm not going to address the sovereignty question in full because it is a big debate and we have written a paper about it, which KS has read and you have not, so I think it would be a bit tedious.

We never use data ownership as a metaphor at the Internet Democracy Project.  It is deeply flawed and extremely limiting.  What matters to us, linked to the idea of privacy as boundary management, is to have a certain level of control and to be able to have trust in the systems that you find yourself in.  But ownership as such just feeds into the very extractive logic of data capitalism today that creates some of these problems in the first place.  Chenai, I'm not sure I addressed your question.

Developers.  What has been interesting for me is that in India it took conversations, but some of the people who have been most vocally supportive of our work are very privileged men, who took some time but then saw the point and thought, we need to do something with this.  We need to start making space for this somehow.

So yes, not a definitive answer.

(Off mic).

>> I will make it short.  Thank you very much, and thank you for the interesting session.  I have a question.  Are there any suggestions or thinking around recourse as well?  We are talking about protecting and preventing, which is, of course, incredibly important.  We know about the right to be forgotten.  But are there other reflections on this?  I will give a really quick example.  For six years I have been trying to get rid of a Twitter handle that is tied to an account I can't access anymore.  Every answer has been stupid; it is coming from a prison in Alabama, who knows.  There is not much at stake there.  But if you are depending on welfare benefits there is a lot at stake, as we saw in the India case.  What are the reflections in terms of downstream: once data is collected, what measures are in place or are being thought about?

>> SMITA VANNIYAR: Thank you.

>> My name is -- please note that my comments are based on Dr. Anja's thesis.  Korea is not too different from the situation of India.

Because internet-based technology there also developed through a government-led industrialization process.  So your thesis was very interesting, and I have two questions.  I think my first question is a bit fundamental.

Both of the questions are from the perspective of feminism.  At the beginning of the article you lay out a feminist politics of data as a theoretical framework.  However, this theoretical framework is mentioned but not elaborated in the text.  Can you elaborate more specifically on the framework of feminist politics?  Where do you want to take this theoretical framework, and what is the benefit of choosing feminist politics as a theoretical framework?  I think this is very important.

As far as I understand the paper, citizens who are producing data are (?) from the use of data by government and technology.  In the middle of the paper you gave an example that this arrangement of (?) can cause HIV patients to give up treatment.  I have another question.  Have you found any interesting or pessimistic examples involving women in data prediction and alienation, like the HIV cases in India?

I was going to ask you that, but I think I already got answers through your presentation at the beginning, about the feminist perspective.

And, yeah, okay, that's it.  Thank you very much.

>> SMITA VANNIYAR: Thank you.

>> BALDEEP GREWAL: Hello, my name is Baldeep.  I am a researcher in critical media studies and a historian of science.

I want to quickly add nuance to the things you said, and I have a question for Anja.  So you talked about facial recognition, and there is some historical context here that I think would be very useful for the discussion.

First of all, you talked about primal facial recognition, how we recognize each other as humans.  I personally think it is a little bit of an ableist position to take.  I'm on the spectrum and I read faces differently than you do; it is a different data point for me.  I would not essentialize that.  You said faces are always exposed.  So are hands, and the history of the relationship between fingerprinting and facial recognition is super interesting, as in the case of India, for example: when they started opium plantations in Bengal, the white officials could not differentiate between Indians because they couldn't read their faces.

And that is the beginning of the British empire's engagement with fingerprinting.  They would take entire prints of the hand, because what the Indian natives would do is, if a police officer wanted to arrest someone, they would send somebody else, because the officials were not going to know the difference.

And so fingerprinting became really crucial for the empire to regulate bodies in that sense.  I feel like it doesn't work if we conflate them, if we treat fingerprinting and facial recognition as the same thing, because there is the colonial context, and we have to recognize that certain bodies were fingerprinted before others.  The case of the opium plantations is taken in the history of fingerprinting to be the first ever case of biometric identification.

So this is before digital.  This is 300 years ago, very analog.  And the question that I had for Anja comes from my obsession with thinking about media and how it works as an interface between bodies and data.  What I was thinking about is, of course, I agree with everything that you said, but I also have to think about how data has its own body, too.  It is stored on something, and bodies are datafied via something.  And when you were talking about Haggerty and Ericson and Deleuze, I also thought about Mark Fisher, because he talks about, in the university context, how the cell phone screen is a disciplinary mechanism: students in his class were always looking at their cell phones even though they didn't know why they were looking at them.

So the screen itself is disciplining them.  It is interesting, because recently in Silicon Valley, among the super privileged families of tech people, there has been a social move to wean their kids off screens.  The tools that datafy us, or the tools that we consent, or don't consent, to being datafied by, also reflect the amount of privilege we have.  In India, people are obsessed with screens, right, because screens are taking them somewhere.

So I was wondering what you would have to say about that: how do we think critically about datafying tools and mediums, and how do they sit within the entire relationship between body and data?

>> SMITA VANNIYAR: Thank you, Baldeep.  Do you also want to add comments and questions and then Anja can take them together?

>> Hi.  As a political scientist and someone who works in research and policy, I tend to think in kind of a macro way about this contestation over ideas.

And it kind of picks up on some of the things that people have asked, so just bear with me.  Each of the bodies at the center that we are talking about actually has their own opinions about how they should live.  They are entrepreneurs, policemen, government ministers.  For instance, bear with me as I say this, but this idea of data as a resource, or data as the new oil, really emerged out of a neoliberal discourse which held that data is feeding into a lot of innovation ecosystems and actually helping people.  I'm not necessarily in agreement with this, but it is feeding a lot of potential innovation in a positive way and bringing money to the people, the entrepreneurs, that are using it.

So I see that even in our legal systems we have a contestation over ideas, and how do we, as a community of people all over the world, seven billion people, come to some agreement?  A feminist activist might say that the CCTV cameras are useful for preventing harms.

There is a security narrative, an innovation narrative, and a digital rights or human rights narrative, and these do not always intersect.  How do we come to some kind of policy and legal systems that balance the narratives, when in fact many people would say that neoliberalism has lifted people out of poverty and done amazing things?  We might not agree with that, but we have to exist, develop policies and do research in that kind of framework, where we have to debate and come to the things that matter.  If security on the internet matters to you, then you care about those issues; if digital rights matter to you, you care about those issues.  If you are an entrepreneur, you want access to the data.  How do we balance the competing narratives?  That would be my main thing.

With so many things that we do, we stumble into unintended consequences of the laws we pass, and we can see that with the CCTV example and the way it becomes a surveillance tool when it is meant to be a security measure to prevent harms.  So there are unintended consequences to everything we do, and data governance challenges.  For instance, when I'm thinking of data as embodiment: we have been talking a lot about data trusts as a tool for better data governance.  Maybe you might not know or have an answer to this, but I'm wondering how the idea of data as embodiment could feed in or be leveraged in a data trust context, as an institutional tool.  Anyway, I'm throwing a bunch of things at you.  Thank you.

>> SMITA VANNIYAR: Thank you.

>> ANJA KOVACS: So to some of this, I have no answers.  And that is also because we have been working on this -- I mean we started thinking about it a year and a half ago.  We started properly working on it about six months ago so it is very new and that is also why for me these inputs are really useful because there are a lot of ideas.

On Baldeep's point, that is a good thing to think about, so we should chat more.  It made me think also about how, for example, if people use apps or Fitbits or things like that, they say that it changes their behavior and often even changes their intuition.  Even if you feel fine, if the data says you haven't slept enough, you feel tired.  You check in with the data before you check in with how you actually feel.  There is interesting qualitative research where people said, I stopped using it because it was freaking me out that I was actually privileging the data to such an extent.  This is what we do as a society in general, right?  We privilege data.  At an individual level obviously that is going to happen as well.

That is something I have to think about.  I know that is not exactly what you are saying; that is still different.  And the same with the recourse question: you make me realize we have been focusing on protections and haven't talked about what happens when it goes wrong, what next.  So we will do that.  If you have ideas, please let me know; I would love to talk more about that.

On the competing narratives, so I do think we still have those competing narratives because this is still a space in flux, right?  It is not that it is ever going to go away but I hope there is still space to change some of this.

And it comes back a little bit to the question Chenai was asking as well.  Among the things that we are starting to look at right now, just to give you a sense, because I think that is where part of the pushback will come from, is data colonialism.  There are scholars who have drawn parallels between historical colonialism and data colonialism, not at the metaphorical level but in how they actually happen.  One thing is how surveillance capitalism is built on the extraction of resources, in this case data, in a very intense way, and how surveillance capitalism has access to parts of our lives that it didn't have before, because our social relationships, our most intimate thoughts, the things we worry about, all of that is commodified as well; how that data is up for grabs; and how in historical colonialism, land where people had been living for millennia was described as (?) because either the people didn't claim ownership in the right way or they were wiped out by diseases in any case.

(Audio echo).

>> ANJA KOVACS: And there are cases where they read out, in Spanish, announcements of what they had set out to do, and there was no response, obviously, because people did not understand Spanish.  And then those lands were taken over; and scholars are drawing parallels with the terms and conditions we are signing today.  Most of the equivalence is between land and extraction and data.  What we are starting to look into is whether we can also make a connection between slavery, the physical extraction of bodies, and data today.  That is a very complex debate, but historically there was that kind of commodification of bodies around organ transplants, for example.  In the 18th century grave robberies were quite common, because you could get money for organs and there was no regulation yet; there were entrepreneurs who thought this was a great idea because there wasn't anything stopping them, and then there was a whole debate about why this is an issue and how it affects the dignity of people.  And you have seen that continue also with organ donation and bioethics, and in India the debate around surrogacy is very strong.  What are the lessons we can learn there, because in a way datafication is a different form of commodification, some forms of which we happily partake in, and some of which, to some extent, we feel might undermine our human dignity at a very profound level?  Where does that shift, and what do we need to have in place to shift it?  I think, for me, when you talk about innovation et cetera, I don't want to stop innovation.  I just want to stop innovation that comes at the expense of people's dignity, right?

And I think that question is just not in the debate enough right now and it is hard to put it there because with the language of data as a resource, you just don't see the challenges clearly enough.

(Off mic).

>> ANJA KOVACS: We are saying they are winning that debate.  Well, that is all -- I mean -- there are quite a few people in the room, which is great.

And then on the point about the data sovereignty issues.  Basically we are trying to take the framework and apply it in different areas and then see what it would mean for policy, how it would shift things.  And one place we looked at was data sovereignty in the context of India, where it is used as an example -- as a framework to push for data localization.

What we wanted to -- what we tried to say in the paper, and I'm sorry if it didn't come out clearly enough, is really that the starting point for the Indian government to make the claims it does is the idea of data as a resource.  And I will come back to this again and again: data protection legislation in India is really weak.  Actually, consent is a ground for processing in very few cases; tons of other grounds take up much more space.  And if you ask why India is going to pass a probably fairly sloppy data protection law, it is because we talk about data as a resource.  It might make sense in Europe to have much stronger protections, but if you look at data as a resource, and that is the narrative in the context of a country where India can actually foster growth by building a data ecosystem in the country, building industry on top of that, and having as few barriers to that as possible, that's true.  In a way, I think what the Indian government is doing is taking the debate on data as a resource to its logical conclusion.  And because there is no other language available, in this specific context, what arguments will you have to stand on?  I don't know.  I think we don't have any.  So in a way, on the data sovereignty issue, that really is the point: the language they use is that of a resource.  And then you get a narrative around data sovereignty that doesn't question, for example, the very extractive logic of data colonialism.  And if you don't question the extractive logic of data colonialism or surveillance capitalism, whatever you want to call it, if you don't start questioning that, then bodily integrity is not going to be put back on the table, because that is part of the problem: the idea that we can commodify every part of our life and that there is nothing that should be preserved from being captured in the information flows I was talking about earlier.  In that way, we tried to make the link to say we need to put bodies back in, and the starting point is seeing what happens when you don't.

>> SMITA VANNIYAR: Thank you, Anja.  Just a second.  I have a quick comment and then I can come to you.  I am talking about the body also because the question about CCTV cameras for safety came up, and I think I was talking about it yesterday.

When we talk about CCTV cameras and technology as of now, it is largely in binaries, and that raises the question of whether binary code can really hold non-binary identities, right?  Because when we talk about CCTV cameras, the point of surveillance is that they are surveilling certain bodies, and the assumption is that bodies come in two genders, male and female.  What happens when someone doesn't fit in those categories, and what about the people left in the gap?  A small example: a couple of years ago I was taking a cab booked through one of these apps and I faced an assault while I was in the cab.  When I tried to file a complaint with the company, they took the complaint after much annoyance.  After speaking to them for about an hour and a half, they said they would come back to me with more questions.  At that time I was presenting the way I am right now.  In the meantime they Googled me, saw photos of me with short hair, called back, and the first question was: are you even sure this is a woman?  And this was a question in spite of my having ID cards that say female.  The way I identify doesn't seem to matter, because the protection only comes in if you are read as a woman.

The problem with the assumption that certain genders cause harm to certain others is not that it is untrue, it is statistically true, but it is also important to question, when we use technology for sifting, who we are leaving out and who falls through the gaps, right?

On facial recognition, there was actually a university in the U.S. which was experimenting with cameras outside the women's bathrooms, and the cameras have facial recognition, so if anyone the camera reads as male comes to the bathroom, it sounds an alarm inside the bathroom and at the security guard center.  Who does this leave out, right?  Same thing with the body scanners: they say they are gender neutral, but they're not.  The operator selects male or female, and that is how they scan the body, right there.  I don't know if women can hide things better.  Is that why you need it?  I don't know if women's and men's bodies can hide suspicious material better or differently.  I don't understand what the point of it is.  But that is how it is.

And I think these are certain things which we need to think about, in terms of who we are leaving out and what gaps come up in between.  Especially with facial recognition, because at the end of the day, like we spoke about, there may be many gender options on the front end, but on the back end it is still male and female.  How do they decide?  Especially when someone creates an account.  Do they scan the millions of photos and then decide?  What happens there?  Even silly things like the face app which makes your face older: based on how it reads you, it shows you only the photos for that gender and not the other gender.

I was saying this because, in particular, the discourse around CCTV cameras that the police put out is only about (?) alone and tech for the safety of women.  So who is left behind and who is left out?  Who is harmed more in the process of it?  Just to highlight that a little.  Someone had a question.

(Off mic).

>> There is actually a recent

(Off mic)

Focusing on racial -- but also the focus on gender.  This is something I just want to put out there, because they are trying to work with developers and trying to highlight a number of biases, especially in facial recognition, but --

(Off mic)

>> SMITA VANNIYAR: Thank you, Dalia.  Any other questions?  There was someone sitting here who doesn't seem to be here anymore.  Any other questions or comments?  We have another five minutes.  Chenai?

>> CHENAI CHAIR: I think in all of this as we were talking about centering people in the conversation and then we went and looked at frameworks and we went and looked at (echo in the audio).

>> I was in a panel with someone talking about artificial intelligence, and they stressed it as being something that their grandmother can understand and someone not privileged to be in these spaces can engage with.  I think that is the kind of work I would like to see you -- see this project carry on with.  I posed this question to you before, in thinking about consent and the efficiencies: kind of like, I'm going to give away my data if I'm going to get something better.  So the conversation becomes, as we are centering the body: if we were to say you are giving away your body, the whole idea of people getting compensated for the data they give away, how many people are going to be morally outraged by the idea of selling their body?  And I think that is a way you can then have the conversation going, because the people we are trying to engage with, us and our families and relatives and friends, is where we can mobilize the discussion a little more, because the data protection frameworks seem to be (?).  And how do we switch back to ensure that, if we are speaking of the body, we speak in language such that even my mother, who is engaging with algorithms, is actually able to say, this algorithm is very problematic?

>> SMITA VANNIYAR: Thank you.  You had a question?

>> KS PARK: I wanted to respond to your point, Smita, about how technology leaves out some while including others.  Moving on from ontology to pragmatics, what do we do to solve that problem?  Do we include more people's data in the database, so that facial recognition makes more granular determinations about people?  That means more data collection.  That means more data analysis about people.  That means less privacy for all of us.  Everything has a cost.  So another way of creating fairer AI is not including more people's data in the database.  Instead of doing that, maybe we can add some sort of counterweighing factor that works against whatever biases we may have detected.  I will find out what the Algorithmic Justice League is doing in Cambridge, working on post-processing and preprocessing solutions.  And I think there is a vacuum in that discussion.
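To make the post-processing idea mentioned here concrete, the following is a minimal sketch, assuming a hypothetical already-trained classifier that outputs scores; the function and variable names are illustrative only and are not drawn from the Algorithmic Justice League's work or any particular toolkit.  It chooses a separate decision threshold per group after training, so that groups reach a similar true positive rate, a counterweighing adjustment that does not require collecting more data about anyone.

```python
# Illustrative sketch of a post-processing fairness correction
# (equal-opportunity style per-group thresholds). All names are hypothetical.

import numpy as np

def pick_threshold(scores, labels, target_tpr):
    """Return the highest score threshold whose true positive rate
    reaches target_tpr on this group's validation data."""
    candidates = np.sort(np.unique(scores))[::-1]  # high to low
    for t in candidates:
        preds = scores >= t
        positives = labels == 1
        if positives.sum() == 0:
            break
        tpr = (preds & positives).sum() / positives.sum()
        if tpr >= target_tpr:
            return t
    return candidates[-1]

def group_thresholds(scores, labels, groups, target_tpr=0.90):
    """Compute one threshold per group so all groups reach roughly the
    same true positive rate, applied after the model is trained."""
    return {
        g: pick_threshold(scores[groups == g], labels[groups == g], target_tpr)
        for g in np.unique(groups)
    }

# Hypothetical usage with a model's validation outputs:
# thresholds = group_thresholds(val_scores, val_labels, val_groups)
# decision = score >= thresholds[group_of_this_person]
```

The design choice in this sketch is to correct at decision time rather than at data-collection time, which is the trade-off being discussed: the bias is counterweighed without expanding the database of people's faces or other personal data.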

How do we make the technology more inclusive without costing people their privacy?  I mean, Amazon shut down its AI hiring system after realizing that they didn't have enough data on women who are successful in their careers.  So they could not -- they could not hire -- so does it mean include more?  Should Amazon go out and collect more information about women?  Second point, just one more.

To Anja, I mean, don't get me wrong, I think it is a great paper.  It is great because it is risk-taking.  Yes, data as a resource is -- I mean, it's a bad concept.  But we should also distinguish between data as libertarian or capitalistic commodities and data as public resources.

We are representing this feminist, kind of radical feminist, website operator; the website is called Womad.  It has used a mirroring strategy to show men and the public what verbal abuse women take, by turning the narrative around.  There is a lot of violent talk about cutting off dicks and all of that.  We see Womad, the website, the flow of data around Womad, as a public resource to educate the public about feminism and other gender issues.

Same thing with, you know, revelations.  Revelations, that is data flow.  We are getting more information about what men in power do, you know, at higher echelons, that we did not have access to before.  Data as a public resource is something we should take a nuanced approach to, instead of just pushing back on any talk of data as a resource.

>> I don't think that is the answer.  I do think --

>> SMITA VANNIYAR: I do think my issue is not with the machine reading me wrong.  My issue is what happens after that, right?  A machine can see me as a man and say that I shouldn't go into the bathroom, but the issue is what happens after that: if someone comes and drags me out, that is my problem.  What are you doing with the data you are collecting?  That is the issue, because data is not absent of human intervention after that either.  And similarly with Amazon: the hiring tool was problematic and was shut down, and the facial recognition read a Black Congressman as a criminal as well, yet it was being used more and more by police.

When you rely on data more than on people, and rely on data alone, I think that is where the problem is.  And interventions may not be needed in terms of collecting more data, because people did that as well: they illegally, without consent, took videos that trans persons had uploaded, took them off YouTube, and tried to teach an algorithm how transitioning works.  It is not a linear process in all bodies.  I'm happy being off the grid.  But I do not want the consequences of my data being misinterpreted and misread falling on me and others like me.

>> ANJA KOVACS: I think that is a really important point.  Why do they need to collect the data in the first place?  What you were saying about why, in the airport, they push the button male or female, that is a really good point, right?  From a security perspective, does your gender matter?  What matters is whether you have a gun hidden under your clothes or something.  Is it just so they know beforehand whether to send you to a male or female officer?  They could just as well ask whether you prefer to be patted down by a male or female officer.  It is important not to assume that the data needs to be collected in the first place.  I think even with the public interest issue, we often have so little control over what is supposedly deemed to be in the public interest.  I actually think my civic sense is quite healthy, but I want to make the decision myself about when I contribute my data to projects in the public interest.  And at the moment I have very little control over that.

I think I'm probably going to repeat some of what I said, but the points that Smita was making really make me think again of how in India we had a strong privacy judgment in 2017, which confirmed a fundamental right to privacy under the constitution even though it is not written in the constitution.  Since then, there have been a few positive judgments on privacy offline.  For example, it is finally not criminal anymore to have gay sex in India.  But earlier this week we also had a transgender bill passed in parliament: to get a transgender certificate you need to go to the magistrate, and they will decide whether or not you are the gender that you say you are.  None of us have to do that if we identify as male or female.  So why would they need a committee to decide what someone's gender is?  We struggle offline so much to get bodily autonomy, to be able to take decisions over our own lives.  What is happening now, and I guess this is why we think it is so important to put bodies into the debate, coming back to what Smita said, is that data is becoming a new frontier where we need to fight that battle, and we don't have the language to fight it, because we talk about data as a resource as if it is disembodied, and there is a disconnect happening.  And all of the examples showed that in practice that is not the experience of people.  That is definitely not the experience of vulnerable people.

And coming back to the algorithm: I know people have pointed out that facial recognition often doesn't recognize the faces of Black people, and there were a lot of Black people who said that in this case that is a feature, you don't necessarily want to be recognized, because if it recognizes you better, those are the people who are going to be arrested more in the U.S.  This is true.  Not being in the data is not by definition a bad thing.  Sometimes it is good to fly under the radar.  Sometimes that is exactly what you want.  It comes back to the question of boundary management, the control we have.  To what extent do we still have bodily autonomy, and to what extent are these developments undermining it further?

>> SMITA VANNIYAR: Sorry, we have to wrap up quickly.  Thank you very much, Anja, Chenai and KS, and thank you everyone here for joining us this morning and for an interesting and fruitful discussion.  Thank you very much.

>> ANJA KOVACS: Thank you.

[APPLAUSE]
