
IGF 2017 - Day 2 - Room XXIV - WS152 Online Freedom for All=No Unfreedom for Women. How Do We Solve This Equation?

 

The following are the outputs of the real-time captioning taken during the Twelfth Annual Meeting of the Internet Governance Forum (IGF) in Geneva, Switzerland, from 17 to 21 December 2017. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

>> So once again, good evening, good afternoon.  If you are still jet lagged, good morning.  I know it's been a long day and I hope you have been having a good IGF so far.  Welcome to Online Freedom for All.  How Do We Solve This Equation?  My name is Nanjeeria and I'll be this session's moderator.  I'm pleased to be joined by a fantastic panel of women, no apologies.  It should be normal to see all women speaking to various issues, and I know that some of you have been involved in this discussion on technology-mediated violence.  It's never an easy topic.  It drains the soul many times, but we also want to reflect on what's been working so far, what the role of each actor is, and to see how we can learn from one another, take what has proven possible, and make sure these spaces are free, secure and safe for women and girls, and indeed marginalized communities, to actually occupy them.

I will start with a quick overview.  I think there's already some great recognition that something has to be done about the digital gender divide, that is, about meaningful use of the internet.  But we are zeroing down on what has to be done about sexism, homophobia and racism making online freedoms elusive, particularly for women.  Understanding sexism and all its unholy variants is vital to considerations of women's rights online, including freedom from violence, whether through legal measures, private complaint mechanisms on the platforms that we use, or Civil Society campaigns and best practices.  We will try to explore the role of governments, Civil Society and platforms, or intermediaries more broadly, how their efforts can be improved upon, whether anyone is snoozing on the job, and also reflect on what's been working, because I think we should acknowledge that some efforts have been put in place to work on these issues.  We will try to see what the place of legal frameworks is in addressing this issue and in balancing the tensions between freedom of expression and tackling technology-mediated violence.  And also, what more can Civil Society do?  It has been doing a lot, even before we started talking about gender-based violence online.  So we will assess what more can be done, without necessarily adding more work.  We already have 24 hours a day; we're working 27 of them trying to make this thing work.

With that, I will invite my panelists to introduce themselves and give reflections from various country perspectives on what's working and what has not been working.  Hopefully there are people in the audience working on these issues who, as you all may know, couldn't get onto the panel, but maybe they'll feel this is a safe space for them to join the discussion.  Last but not least, if you will be sharing your perspectives, please feel free to use the women's rights online hashtag and continue this discussion once we leave this room.  We will move to Anthony.

>> Anthony:  Thank you, Angie.  My name is Manandy, from an organization in India called IT for Change.  I would like to share with you broadly the key problems.  As in many other contexts, technology-mediated violence is pervasive, and it has a chilling effect on women (inaudible) and expressing their opinions.  It exposes them to harassment and physical violence, and women from socioeconomic groups that have been historically marginalized, and communities that are particularly vulnerable, bear the brunt of it.  In recent times, we are seeing organized (?), and what is most (?) is that even the online citizen consultation on the national policy for the empowerment of women, and the social media accounts of the Ministry of Women and Child Development, have been (?), and the minister has expressed helplessness and outrage, demanding that trolling be recognized as a criminal act.  This is the context I come from and work in.  If you look at how our legal institutions deal with this, I would like to focus on three key issues, which may also be similar in other contexts elsewhere.  One, we don't have any dedicated (?), and these acts are booked under a range of pre-existing legislation.  One of them is the Indian Penal Code, a Victorian-era law that comes from a highly moralistic framework; at best it is protectionist, and at worst it treats violence as something the woman invited by not staying safe.  It's that kind of thinking that frames it.  The other legislation under which the police book acts of violence is the Information Technology Act, but that is framed as a piece of commercial law, to make the online space secure for business transactions.  It does have a couple of provisions that talk about the transmission of sexually explicit content; there are obscenity provisions and privacy-related provisions.  It straddles old-style obscenity and new-age notions of privacy and consented disclosure, but more often than not, the police will book the individual under obscenity.
Homosexuality is outlawed, and anybody with a non-normative sexual orientation risks being booked under these provisions.  So there is a problem with the logic of these provisions.  The police also don't understand multi-layered consent in its full dimension.  They think that if someone shared a picture once and the other person put it up, these things cannot be booked under privacy; they can only use the obscenity provisions.

Finally, there is no provision that deals with sexist speech that is not sexually explicit.  Most trolling is in the nature of gendered abuse, and it is very difficult to apply pre-existing provisions to it.  So this is one of the problems that we have.  Also, India recently saw a very unique Supreme Court order.  There was a provision that used to be invoked earlier, but predictably, it was used by the political elite against the opposition, and it became something seen as a violation of free speech.  Because it was seen as an arbitrary, excessive and unconstitutional restriction on free speech, the court struck it down.  While striking it down, the court also said that the earlier regime of intermediary liability, the notice-and-takedown approach, would not hold anymore; now you have to wait for an executable judicial order before anything is taken down.  This leaves women who are facing very grave problems in a bind.  If a woman is raped and the video of the same is uploaded or sold, and this is a growing and very serious problem, there is little she can do to ensure the video is taken down.  Finally, I want to say two things, because we're trying to think here about what kind of an approach we want to take.  The right to privacy has to be understood as consisting of bodily integrity and the privacy of choice.  We need laws that are grounded in such an approach, and then law enforcement officials must be made aware; the language they use has to change, and we need social support that ensures that victims are not stigmatized when they come out with their story.  And we need to think about an approach to intermediary liability, because before prioritizing censorship, there is a first step of balancing victims' rights and free speech for all.  I think this really is the crux of how we go towards this, and it is something I would like to ask everyone in the room.  Thank you.

>> Thank you so much.  We're already taking copious notes.  I will hand it over to Titey.  We were discussing that almost everything is happening at once; as everybody gets online, new cultures are being cultivated.  What are you seeing happening legally?

>> Yes.  Thank you for letting me be on this panel.  I would like to share the two most important problems regarding Myanmar.  Myanmar is a greenfield for mobile and Internet.  Just two or three years ago, we had less than 10% of people connected to the Internet, but right now, more than 90% of the country's population has coverage, and we have more SIM cards than people in the country.  So accessibility is definitely growing, but the question is: do women have the same accessibility as everyone else?  I would like to reflect on one piece of research we did in 2015 and 2016 with one of our partners in Asia.  It is a survey we collected from 12,000 people.  What we see is that there is a digital divide in mobile phone ownership: it has been around 28% in both 2015 and 2016.  So although the number of people coming online is increasing, the digital divide between men and women is still there.  Also, compared to countries like India, Bangladesh and Pakistan, women in Myanmar are culturally known to have much more decision-making power at home.  In our surveys, we also found that women are given a strong place to make financial decisions in the family.  So when it comes to the Internet and mobile, when we asked whether a woman would like to spend her money on groceries or on top-ups for mobile phones, a lot of women said their money would go to groceries rather than buying phones and top-ups.  On the other hand, women face barriers of access in terms of both income and the culture of the country.  So that's the first problem: accessibility for women is not actually rising as fast as for men.  Number two: that was the quantitative survey that we did.
The other is a qualitative survey on digital rights that we conducted with more than 100 people.  In the focus groups, there were men, women and transgender people.  Women face different types of problems, and they have to behave differently from men because of those problems.  Women are harassed online.  In the interviews, a lot of women say they are harassed online.  They receive text messages with comments like "can you lift up your skirt and send me a picture" from strangers, and women see how images of women are photoshopped into nudes and their bodies edited.  Many women say they don't post pictures of themselves and only post photos of others.  So women have adapted their behavior.  Some women have used a male identity when signing up for an account; they tick the box as male.  They felt that when they are online as a male, they will not get harassed the way they would as a woman.  This is very interesting, because men never mentioned doing this, taking on the persona of a woman; women do it because of the fear of harassment.  Also, when women are making a Facebook profile, they often put a photo of their husband and the kids; they put family photos, because this is a way of signaling to the world: I am a woman, I am married and I have a child too.  Some women also don't open accounts themselves; they just use their husbands', brothers' or sons' Facebook accounts to access and watch more information.  So, these are the two biggest concerns in the country.  Number one, accessibility for women is not actually increasing, and there is a very big digital divide between men and women.  Number two, harassment and violence have been happening to women, but not to the other gender.  So that's what I would like to share.

>> Thank you so much.  That just reminds us that some of this is an extension of norms that have existed offline, and how do we use these spaces when they reproduce those inequalities?  I will hand it over to Amalia.  Thanks.

>> Amalia:  My name is Amalia.

We are a digital rights organization.  For a few years I have been working on gender issues.  We started working on online gender-based violence, and we have written a bit more to bring a more positive view to the issue.  Now we are working on women's rights online with the Web Foundation, and we have done several projects; if you want to know about them, you can come to me.  Now I will focus on the context of Colombia regarding online gender-based violence.  This is something we have done for the call for submissions by the Special Rapporteur on violence against women, and we have also done something more regional with other similar organizations.  We can say Colombia shows significant progress in terms of recognizing women's rights and overcoming discrimination and violence in different areas of life.  It has also shown significant progress in developing norms for gender equality and safeguarding women's rights.  Even the judiciary has jurisprudence recognizing historical and (inaudible) discrimination and the need for effective equality.  But despite this legal development, there is still a significant gender gap in practice.  Many factors can explain this gap between standards and practice; however, we can highlight as one of the main causes the persistence of social and cultural patterns of unequal power between men and women.  The rates of violence against women show that the problem demands an effective response from the state.  Without going into the details and figures, I can assure you that these rates make it clear that gender inequality persists and that violence against women is a serious problem in the country.  But if we try to investigate digital violence against women, we find a significant and even greater gap: the state has not defined digital approaches and strategies, and this is a major challenge, because ICT, and especially the internet, is a field of tension for the enjoyment of women's rights.
To try to understand and provide evidence about the consequences of the digital violence that women suffer, we made a preliminary diagnosis.  Digital violence is often underestimated; it tends to be taken seriously only when the attacks materialize in the physical world.  We documented how digital violence changes journalistic practice.  In many cases, we established that the consequence was self-censorship: women stopped expressing themselves online to avoid the wave of violence.  Through this, we identified that gender-based violence has very real consequences for these women, even if it is rarely recognized by their close environment, the authorities or the victims themselves.  And victims lack effective responses, remedies and solutions.  There are some types of criminal offenses defined in the Colombian legal framework that could combat digital violence against women, such as, like I mentioned, abusive access to a computer system, personal data infringements and sexual harassment, but these are framed from a more technical or corporate perspective.  On top of that, the reality shows there is a high rate of impunity and a lot of distrust among women, whether toward platforms or authorities.  Women often do not report, for fear of personal and social repercussions, or because they are unaware of their rights.  And when there is a complaint, there is a high probability that the victim will be ignored or revictimized.  There is no system of case records to ensure an effective sanction against attackers, even when they are already identified or can be identified.  On the other hand, digital platforms, where much of this violence occurs, are also very inconsistent in the way they act on claims or reports of digital violence.  As I have shown, the responses in the Colombian case may be considered ineffective.  We hope this helps find more sensitive and effective solutions.
On the other hand, it's true that finding a solution and an effective remedy is not an easy task.  We have a lot to discuss.

>> Thank you so much, Amalia.  Having the frameworks is one thing, but breathing life into them is a whole other matter.  I will pass it over to Joanna.

>> Joanna:  So if we could put up the first slide, yeah.  So in Brazil, we have a group of organizations and people doing support work: feminist lawyers and activists who have been targeted by online hate, and many feminist organizations that were not connected to or discussing digital rights but had a lot of input to bring to the conversation.  In this report, we managed at the beginning to do a mapping and typology of different cases of attacks.  So if you can pass to the next tab, the second tab.  Next one.  Next one.  Thank you.  If you could click on the graph.  Go.  We were trying to understand how each kind of attack could map to many legal aspects of crimes and violations, and first also to map out the attacks, because, just as with other kinds of violence against women, there is a need for these actions to be recognized as violence.  Of course, there are different degrees of damage, but we wanted to collect all of the attacks and manifestations of violence that those people were suffering.  We also did another report for Latin America.  In both the Brazilian report and the Latin America report, even though the cases and their diversity were very different, in Brazil the issue of race was very prominent in many of the cases we collected, and other countries in the region also had the issue of race.  And of course, online violence is a continuation of offline violence.  These issues were very present across the region.  There were other things we collected to do the map; this is also in the Latin American report.  So, in those reports, we have the cases documented and the problems we found in solving them, analyzing two sets of responses: one from the public sector and one from the Private Sector.
We found issues of access to justice, and a continuation of the problems we have with offline gender-based violence: you go to the police and they have had no training on digital rights, nor on dealing with gender-based violence.  For instance, there was one case of a person who was suffering from so-called revenge porn.  I don't like that term, because it's not porn and it's not revenge.  The girl was being (inaudible), and the police said, give me the photos.  But I don't want to give you the photos, because then I will be exposed to you.  Revictimizing is also a trend.  So we find the legal solutions at hand very limited, just as with many other issues related to gender violence.  We also looked at legislation focused on gender and digital violence, exactly on gender-based violence, and we have classified those laws: they are mostly bad, and they mostly see criminalizing conduct as the way forward.  In the end, who is going to jail?  Again, the poor.  Even women from low-income communities would say, this is not the way; we don't want to put our boys in jail.  So that was also an interesting remark.  Then, moving to the responses of the private sector, and to wrap up my intervention: we saw there was a massive feeling of impunity on one hand, and on the other hand a massive feeling of censorship.  Many feminists and collectives were penalized; the feminist way of communicating also uses the body a lot, so there is a conflict with the morality standards of the platforms that they use.  The First Amendment approach to freedom of speech is not balanced the way it is in Brazil.  We had one old case, a picture from the Ministry of Culture that was blocked because it showed an indigenous person not wearing clothes, though it was a piece of art.  So it's not balanced: there are morality standards on one hand, but on the other hand they are not balanced against hate speech, because the First Amendment is so supreme.
So, then I pass to the recommendations.  The first recommendation is to contextualize.  Next is to take an intersectional approach to gender that considers race; the teams analyzing these cases should be aware of cultural differences and capable of working on cases of violence against women, and not only women, but gendered violence broadly.  There is a need to balance, always.  So the platforms need to be committed to addressing the issue, but not at the expense of anonymity and the protection of privacy.  In terms of redress mechanisms, the tools to report should be easy to find.  There was a project that analyzed a lot of the terms of reference and the paths the platforms give people to address attacks.  They found that reporting "revenge porn", in brackets, on many platforms can take several paths: you can report it under privacy violation, porn or copyright.  It's confusing, and it doesn't properly name what is going on, which is non-consensual sharing of images.  That makes it hard to find a solution.  Then due process, meaning that people need the right to appeal if their content has been taken down.  We have due process in legal procedures when engaging with governments, but with the platforms we don't.  So if your content has been censored, you should know why, and maybe appeal.  In many cases an appeal happened, but only because people reached us or other organizations in Brazil that have access to a person at a particular platform, and that's not the right way to solve it.  Also concerning due process, there is a need for clear information about the processes, the criteria for decisions, and the timeframes.  So we're trying to address this feeling of impunity: why has my content been removed, while the person making racist attacks on me stays up?  We need more clarity on that.

And then transparency.  It would be great: the platforms already have transparency reports, but only on requests from governments to take down content.  We would like to know how many cases of gender-based violence are leading to content being taken down, or how many cases of feminist content are being taken down over morality allegations.  And then consultation: it would be nice for the platforms to engage more on the solutions they're adopting.  I know they implemented hashing of photos that have nudity, but then you need to give your nude pictures to Facebook, so it's weird.  Maybe if they had consulted more, that weirdness would have shown up earlier.  It also helps to campaign against gender-based violence, all while remembering there is no one-size-fits-all: we cannot do a universal campaign, because the forms of attack vary a lot according to the context.  So thanks.

>> Thank you so much, Joanna.  You have heard a lot from various perspectives here, and many more at IGF and elsewhere.  Platforms can be a primary site of these attacks.  How are you going about addressing these issues?

>> Thank you so much for having me as part of this discussion.  That's a really good segue for me to come on board, so thank you for setting the stage.  I manage global safety policy programs at Facebook.  I also speak really fast because I'm from India.  So if you need me to slow down, just let me know and I will slow down.  You might have to remind me again, but it's an ongoing process.

[Laughter]

Let me talk broadly.  I apologize to people who might have heard me say this before.  How do we think about safety, about keeping the community a safe place on Facebook?  We try to take a five-point approach.  First, we want to make sure we have policies in place that clearly define what people can and cannot share on the platform.  Now, the people coming to Facebook are from many different cultures, speaking many, many different languages.  What my sense of acceptable nudity is may vary from what yours is, so how do we balance this and make a place where anybody feels they can come and connect?  It's a huge challenge.  We have taken feedback into account.  So now we would allow that photograph that you talked about, which we took down many years back.  It is a conversation that society is having culturally.  We allow photographs of women breastfeeding, but we don't allow complete nudity on the platform, because we have people from different cultures.  Some people tell me this policy isn't in the right place, that it is propagating many (?).  It is a conversation that we have.  Right now, for various reasons, we don't allow complete nudity, but this is something we're talking about, constantly trying to evaluate where the right place for our policy to sit is.  As the discussion keeps going, we might amend these policies.  What I am saying is this is an ongoing conversation to help us understand where the policy should sit.  Second, our philosophy is making sure we have the tools in place.  This includes back-end tools with which we try to keep the platform safe and keep out certain content, and tools to give you control over your experience.  Who do you want to connect with?  Who do you want to share with?  What are you seeing in your news feed?  We want to make the tools as easy as possible for people to understand.  We have got a lot of feedback that the tools got confusing, especially on the security side.
So if you go to our safety center, Facebook.com slash safety, on the left-hand panel you will be able to run a check-up: go back and check who you have been chatting with and what you have been sharing, and change the settings, because you may not want to share those things with the whole wide world anymore.  Over the years, we have changed our tools as we have become more sophisticated, and now we have other tools built in to give you more security, to make it stronger.

Now, I could go into more detail about these things and keep talking, but I want to talk about the third point: resources.  A lot of times people don't know there are certain tools and ways they can approach us for help and support.  If someone has shared an intimate image without your consent, you should be able to go somewhere and get that information.  Why does it work differently on Facebook versus other platforms?  We have a help center: if you go to Facebook.com slash help, you will be able to get resources.  Last fall we redesigned our safety center because we got feedback from people on what information they needed.  We made it available in over 55 languages, and it has a bunch of videos which will walk you step by step through various tools and features and policies.  We have worked with local organizations around the world, developed guides with partners, and listed all these things.  So if you need more hyper-localized resources, you should be able to get them from the safety center.  Parents told us they didn't know how to have a conversation with their kids about online safety, so we developed a parents portal, which is a basic 101 on how you stay safe online.  That covers resources, my third point.  The fourth and fifth are super, super critical and, according to me, cut across everything that I just talked about: partnerships and feedback, making sure that we're working with the people in the field who specialize in these issues.  We may have built the platform, but we don't know everything that is going on.  We don't know what the special needs of domestic violence victims are.  We work with the National Center for Social Research and a whole bunch of organizations that keep us honest, are our ears on the ground, and tell us what we should be doing and where we need to strengthen our tools and policies.  So, this is really fundamental for us getting it right.

A couple of years back, when I took on my role on global safety programs, we went out and traveled and spoke to organizations and people working on women's issues, not just women's safety online, but women's issues.  We spoke to 150 organizations in that one year.  That helped us prioritize and think of things where we can make a difference and make this a safe space for women to have a voice.  I want to talk about a few of the tools that we launched as a result of those discussions and conversations.  The first one I want to talk about is our databank for non-consensual sharing of intimate images.  Other people know it as revenge porn; we don't like that term.  One of the big pieces of feedback, and there was no country where we didn't hear this, concerned these images.  When such an image is reported and removed, we use photo-matching technologies so that no one can share it on any of our platforms, Facebook or Messenger, ever again.  But we thought that was only a first step.  Why should you have to go through the experience of someone sharing this on the platform before you can come and report it to us?  Can we do something at a more preventative level?  Can we work with you to get your images, add them to the databank and make sure they are never shared on any of the platforms?  That walked us into a lot of questions.  When you put "Facebook" and this idea together in a headline, I will let you use your imagination.  One of the things we did when we were developing and thinking through this pilot, because it is a pilot, was make sure it will get us a lot of feedback to inform our work going forward.  We need to know what the experience is like, and how we can build this out and scale.  We worked with partners in the pilot countries to try and take this first step, and this is just a first step.  We're hoping that people are not turned off by the headlines, that they still try and use this and help us figure out the best way we can prevent the initial sharing of these images on our platforms.
There has to be a better way of doing this.  I will stop talking about this; if you have questions, I will be around.  One of the big pieces of feedback that I used to get when I would go out and talk to people was that people didn't like sharing profile photos.  They didn't want people misusing their image and creating more issues.  So we built what we call a profile picture guard.  You can use this to add security to your profile picture, but it is only available in India right now.  What it does is basically give you more control over who can download and comment on it.  Everybody can see it, but you control who should be able to get access to it and download it easily.  So that's an example: we heard feedback in one specific country, and we are now trying tools specific to that country's needs.  Just a few minutes back, 29 minutes back, we announced a couple of new features.  One of them is based on feedback we received from a lot of women that sometimes people create fake accounts to reach out and harass you.  You report it and we take it down; then there's another account, and another account.  So we've been doing a lot of work on the back end to see how we can stop this, and today we announced new tools to help stop this person from reaching out again and again by setting up fake accounts.  The other thing is on Messenger.  This was based on feedback from organizations working with domestic violence victims.  Victims didn't want to block the person, because that might aggravate the situation, but blocking also meant they would not be able to see what this person was doing in other places on Facebook.  So now you don't need to block them; you can just ignore.  The messages go to your other messages folder, so they are not as disturbing to you, but if you plan to go down the law enforcement path and want to document the abuse coming at you, you can go back to them.
There are other things that we launched, but that gives you a flavor of how we went about taking feedback and building tools that were scalable, that we could launch globally or in specific markets.  But we want to layer this with programs, education and awareness.  People don't know that they can report, that reporting is anonymous, what happens when they click report, or how reports get reviewed.  I don't know how many of you have seen our videos which talk about how reporting works on Facebook.  We have been putting education units in your news feed to remind you to take a security check-up.  So we're trying to do a lot more in the platform, so people have more awareness of the tools and policies for safety and security.  We're also doing great partnerships, and I am super proud of the things that are happening; partners are going out and having these conversations.  I will share one example, because I know I need to stop talking.  In India, there is an organization called the Centre for Social Research.  It goes to colleges and does a program called Social Surfing, where they talk to young people about using social media to give a voice to women's issues, and how to stay safe when doing that.  It's an absolutely amazing program.  They have done it in like 75 colleges around the country, and India is quite big.  So it's been fascinating to see them go into classrooms and engage people and give them agency and ownership of this space.  All of this is the tip of the iceberg.  We have a poster that says the journey is 1% done.  This journey is literally 1% done; we have only started it.

>> Thank you so much for those perspectives, and for answering the question about intermediary responsibility.  Now, I promised you would have plenty of time to discuss, but I want to first give the panel the right to question one another before opening it up to all of us.  What else are we missing?  Many of you have probably been engaged on this.  Are there any strategies for engaging across the board, from Civil Society to government, that we should learn about before we leave here?  Anyone want to react or ask one another questions?  Yes?  No?  Perfect.

Just a very short thing.  You were mentioning that you have talked with organizations working on women's issues; just to make a clarification, you have not done it with Latin America, and I'm sure of that because I was invited to a meeting with Facebook, and it was a meeting in Washington with people from the United States and from Canada.  We actually already spoke, and we are in touch with the Latin America team, but you did not have one meeting with Latin America.

>> We made amends?

>> I think it's still on.  We just made friends.  I was in Mexico last month, and we met with a whole bunch of organizations from the region.  I want to be super clear: we went down to Mexico two weeks back and met with 30 organizations in the room working on a range of issues, including women's safety organizations, child safety organizations, and suicide prevention organizations, to hear from them and continue the conversation.  So it has started.

>> That's not the meeting we were ‑‑ we don't want to focus on children, for example.  So that's not exactly the meeting we are asking for.  It was not Latin American.  It was Mexican, with a few other organizations.

>> Should we get that on the calendar?

>> Just a reminder that multi-stakeholderism is tricky.  I have been to so many sessions so far, and the hardest part is getting people in the room.  Sometimes it works out, and it is also a constant struggle.  But when you do it, it's not just a tick-box measure.  For one person ‑‑ I can't sit here and ask anyone to speak from a perspective other than the one they're comfortable with.  It is a challenge how we ensure diversity, and I guess this is a great segue, unless anyone else wants to ask a question of the panel.  Any strategies you have seen that hold promise, or new challenges that we need to be aware of, so that we all leave here with a bit more insight?  I don't know if we have anyone from the law enforcement side of things, because it is always very tricky to get their perspective and their contextualization of this.  I would love to hear what else you have to add to this discussion.  By show of hands, we can get this show on the road.

>> Hi.  I have one question for those who discussed the legal framework and intermediary liability, I guess.  My main concern in terms of addressing some of these problems ‑‑ and you are absolutely right to talk about existing laws that tend to take a (inaudible) and say we have to protect the morality of women.  I think that is something the private companies need to look at.  What happens according to the laws of the country where they are is very problematic in the rest of the world, so to speak.  So maybe just a quick question about how proactive the companies are, and whether Civil Society groups have been able to make the links and say it's not enough to go according to the laws of the countries, because that is hugely problematic.  I'm sorry I don't have a strategy, but this is where we find the biggest challenge: to make recommendations that would work for individuals and communities, but also challenge the existing structures at the same time.  Thank you.

>> I can speak for Facebook.  We actually have a set of global community standards that apply globally, regardless of the locale.  We do respect local law and we do work with local law requirements, but our community standards are global and apply across the platform.  We have been working across countries.  There is a lot more we could be doing, but we have started working in the U.S., because that is where I am based, on a lot of the legislation around NCII.  I don't have the details, but this is something we continuously look at and work on.  We work with local organizations that come to us and say, hey, can you work with us on advocating for this.  So yes.

>> My name is Leticia and I'm from Brazil too.  The technical area is not my field, but I want to understand.  You talked about databases, so that on Facebook, if an intimate image of yours was shared ‑‑ can this be applied to (?)?  Because once it is on the net ‑‑ I have seen many cases where a person posted a photo, it was shared by someone, and you can take it off, but other people have already (?).  This is a huge problem.  I want to know if the databases can cover this.  I also want to make a suggestion that I think is very important: the terms and (?) of social media can be changed, opt in and opt out.  With some of these apps or digital pages, you download them and you have already agreed with all the terms.  It shouldn't be like this.  This is one way to overcome this issue.

>> Both good questions.  Let me take the first question, about the data bank for the matching technologies.  Right now there are a few of these on the safety side.  One of them is PhotoDNA, which we use on the child safety side for known images of child sexual abuse.  Any time a photograph is uploaded, it is checked against a known data bank.  Think of the entries as thumb prints: each is a bunch of code.  The image is scanned, and if it is found to be known child exploitation imagery, it is taken down and we have to report it to the National Center for Missing and Exploited Children.  We have to make sure we inform them that it is being shared, so they can go after the person who is sharing it and bring them to justice.  Now we're trying this out on the photo matching side for intimate images.  It is still early and it is quite complex: the bigger the data banks get, the slower the service gets, and so on.  As we start to look at intimate images, we just store the template of that image, and that's how it works on the back end.  Right now we are (?) screenshots.  I can see this conversation growing in the years to come.  When it comes to opt in and opt out, we learn along the way.  We used to launch new features and keep them open by default.  We learned that the best way is to make them opt in rather than open by default.  Now when you sign up for Facebook and create an account, your sharing is set to friends only.  A lot of these tools started on the safety side: we began building things to keep children safe and then realized everybody needs this by default.  So we learned, and started making more and more things safe by default, with the option to opt out of them.  I can speak for Facebook; that is our policy.
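At a high level, the matching the panelist describes is a membership check of an uploaded image's fingerprint against a data bank of fingerprints of previously reported images.  The sketch below is purely illustrative and not Facebook's implementation: the function and data names are invented, and it uses a cryptographic hash (SHA-256) as a stand-in, which only matches byte-identical files, whereas systems like PhotoDNA use perceptual hashes designed to survive resizing and re-encoding.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in 'thumbprint': a fixed-length digest of the image data.
    Real matching systems use perceptual hashes, not cryptographic ones."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical data bank of fingerprints of previously reported images.
known_fingerprints = {fingerprint(b"previously-reported-image")}

def screen_upload(image_bytes: bytes) -> str:
    """Check a new upload against the data bank before it is published."""
    if fingerprint(image_bytes) in known_fingerprints:
        # Match found: block the upload and queue it for human review.
        return "blocked"
    return "allowed"

print(screen_upload(b"previously-reported-image"))  # blocked
print(screen_upload(b"a-new-photo"))                # allowed
```

Note the scaling trade-off the panelist mentions: with an exact-match set the lookup stays fast, but perceptual matching requires near-neighbor search rather than exact lookup, which is part of what makes growing data banks slower to query.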

>> So I think that laws basically have to reflect what seems acceptable and not acceptable at any point in time.  It is a constantly moving target, and the law is like a slow animal: it takes time to move, and we have to engage with that.  But because we get frustrated with that, is the option really to give up on it and work on other means?  I'm not so sure.  And I also think it shows that platforms, be it Facebook or Twitter or anything else, have terms of service and things that users agree to and do not opt in to.  That's one part of the equation.  There are also things like public (?).  The state has a particular way of approaching or defining public comment and moving the debate.  I think even in the area of gender-based violence, it becomes important to engage with national laws and try to change them.  I think that is equally important.

>> Anyone else?  Okay.  Perfect.

>> My name is Reester.  I want to thank you all for the work you're doing and the feedback you're giving us.  I had a question: she mentioned that in Brazil they have legislation about women's rights online.  Now I want to refer to my country, Zambia.  I feel it is very difficult for grassroots organizations like mine, which works on safety, to have a say in policies regarding online rights for women.  It is really very difficult to get to the government level and propose these policies.  So what can we do when there is no transparency or openness for organizations and youth to propose policies and be part of the process?  Thank you.

>> What an excellent question, and I think it is open to anybody on the panel and on the floor to answer, based on people's past experience as well.  Thank you.  Go ahead.

>> Thank you.  My name is Sandra.  Just to say, we had a very similar conversation on this issue just a few minutes ago.  The first thing that strikes me is that again we're having these conversations in silos.  Is there somebody, or some way, to join us all up?  Because I think two things: first of all, work at the national level, and work at the international level and advocacy with the Internet platforms, would benefit from a more coherent set of recommendations particularly.  And I think there is some way in which those with, if you like, less power and agency can get supported by those with much more.  Amalia ‑‑ I love her because she doesn't let anything slide; she will go after you until you have answered her question ‑‑ perhaps you can do that with more support.  How do we join this up in ways that go forward?  I think the other bit of the equation is around the whole issue of how we protect and really meet the ambition of an open and inclusive Internet.  So it's not just, if you like, in the wheelhouse of the individual rights piece: how important is this space for some of the other values that we all hold dear?  I come from a democracy organization.  So how important is that space for conversations and political discourse around some of the values that we want to promote and support, and for our political action as well?

>> I'll step in there, speaking more from global Civil Society: the power of networks and where they can be formulated.  All of us here, not to the exclusion of Facebook but in conjunction with organizations like Facebook, try to come together and have a voice.  As the Women's Rights Online network, we won't be able to be fully representative per se, but what we try to show is that you're not one crazy individual on the ground whom policy makers can shut down.  Sharing lessons is everything, as you were saying.  With limited resources, we have tried our best to share the approaches we have used, so you can borrow from one another across various aspects of ICT.  We are also trying to find access to the policy makers, as you say.  It is ironic that for many of us, especially from some parts of the world, you have to come to Geneva because you will not get airtime in the office at home.  As to your reflection on who brings us all together on this issue: it will be very important, and I hope many of us here have contributed to it.  The UN system and all of us have an opportunity to share what has been working and what hasn't; if that is packaged and put back out, it will help us see what is working and where strategies can go forward.  I will be happy to touch base with you and talk about how we have been working, and really figure out whom to partner with to bring them into the room and have them engage proactively.  We see it is a difficult topic to get any uptake on: they send you to the women's department.  But this is a societal issue.

>> Just one very small comeback.  One of the practical things we could all do is make sure that we read the transcripts from the other panels, because there will be something in them; I've been to two today where I learned things.  So that's one thing.  And the second thing: more of you have got to go into politics.  We need you in the political space, because that's the way we know we get positive policy outcomes for women.  We need women in Civil Society; yes, we need women in the women's commissions, but we also need women representatives.  That's my call to you, because I'm old and gray now.  I expect you to move forward and take that space and really make it your own.

>> Thank you for that.  We'll take ‑‑ yeah, your question, and then Joanna will intervene.

>> Not a question, just sharing some thoughts.  I'm from Brazil and I'm not from Civil Society; I am from academia, and we are running a project to help protect women from violence.  The project is on schools facing violence against young women.  We work at schools with young girls between 15 and 17 years old, teaching them how to program applications and bringing digital literacy there, because it is very important.  During the last two years, we have reached 130 girls, just because of this situation, and it is terrible.  But digital literacy is also a huge problem.  Most young girls that we talk to and interview ‑‑ we are doing focus groups across the whole of Brazil ‑‑ think that the Internet is Facebook.  It's not a problem to use Facebook; it's a huge problem to think that the Internet is only Facebook.  The idea that these girls have a lot of opportunities on the Internet is a lie: they don't have them.  They don't play games; they don't go to other sites; just Facebook and WhatsApp.  So we think it's very important to get closer to government.  We have prepared something like a plan to develop with government; we are trying to make this a public policy.  It is very difficult and complicated, but we have made some progress, because now we run this project in five cities in Brazil.  Let's see if the government makes this a public policy.

>> Just a few comments.  First, to agree on the issue of involving the community: the two reports that we produced for this session were done as a community effort.  We know it's an issue, and a hard one; it needs a multi-stakeholder effort to solve it, but we also needed to understand it better so we can hack it.  Even if the governments or the platforms are not going to solve it completely, the hacking will also come from building a community, and building a community can also be our way to produce counter-discourse together and counter-attack, not only depending on the big players.  Thank you for all the clarifications about Facebook.  I see the measures are pretty much focused on increasing security, but it is still very locked inside Facebook; we don't know exactly what's going on.  So I have trouble with two things.  One is due process: we still don't understand when a piece of content is taken down or not, and we do not have a way to appeal.  As an organization in Brazil that talks about gender rights, people who are being attacked come to us, and it's always a mess because it's not our core thing.  We do give them digital security training, but calling Facebook to say can you take down this, can you put this back up ‑‑ it's not my role.  So we need due process, and transparency on what is going on, on both sides.  About diversity, and the meetings that Amalia mentioned: if those kinds of meetings are done in Latin America, when they are, they should include not just us, the digital rights people.  In the end, we're like intermediaries in the conversation.  Luckily, so far I haven't been attacked so much, but I know there are people out there.  There are other points to make.  Thanks.

>> Thank you so much for that.  Just to talk about a couple of things we're doing on our side to help close the loop and give more information to people on when we take down content and why, let me take a step back: how does reporting work on Facebook?  It's quite frustrating, because people don't have much information about it.  When you click on the report function on Facebook, we ask a series of questions and you give us some information: is it embarrassing, is it harassment, is somebody thinking about committing suicide, and so on.  We have teams of trained specialists who will look at the reports based on the theme that you selected, but these people are also trained across our community standards.  So even if, say, my subject matter expertise is NCII, when a report comes to me, if it's a violation of some other policy, I should be able to take the correct action and take down that piece of content.  One of the things we have been working on is making sure we have native language review.  We support over 40 languages, and we know that just because I speak some Spanish does not mean I should be reviewing content in Spanish, because I may lack the local nuances and local context.  We want to make sure we have native language speakers reviewing the reports that come in.  Our international headquarters are in Dublin, and every Monday there is an induction for new team members; we're trying to get more and more languages and more and more representation in terms of reviewers.  The other case is that you clicked report and Facebook has not taken down the content.  There are reasons why these errors happen.  When you clicked report, what exactly did you report to us?  Was it the post our reviewers should have been looking at?  Where does the problem lie?  It can be like looking for a needle in a haystack.  
The second reason we make mistakes is that many times we don't have the context on what's going on.  Say you reported something because someone posted a pink rose on your page; we don't know why it shouldn't be on your profile.  But I work with domestic violence organizations, and advocates have taught us that this can be a way of intimidating a domestic violence victim.  It is like saying, I've got my eye on you.  Without that context, we can't take the correct action.  So we want to make sure we're building tooling to complement those experiences, giving you the ability to ignore messages, block them, delete them, or report them if it's a fake account.  That is the second reason why we might make an error: we don't have the additional context.  Sometimes the behavior is also aggregated, like trolling behavior, and we don't know what is going on behind the scenes, so we don't take the action you think we should have taken.  This is feedback we're looking at: what more could we be doing there?  We think the tools can help in giving people control over who can reach out to them and what they can do on the platform.  Then, to respond to the question of who is reviewing these: one of the pieces of feedback is, has Facebook even looked at my report, and why did you decide to take that action?  We're trying to give you messages when you have submitted a report: you can go to your support inbox, see the status of the report and what action we took, find out more about our bullying policies, click here and go there.  In some instances we're trialing a whole appeals process: if you think Facebook has made a mistake, go here and let us know.  It's a work in progress because it's pretty complex, but that is something we're committed to doing more and more of.  The other part of the inbox experience is giving people more local, real-time support.  Say you reported an intimate image; it is probably being shared in other places on the web.  
The minute you report this to us, can we give you a response back saying, hey, you reported an intimate image; meanwhile, here are local organizations you could reach out to for help and support?  Could we be doing more with that?  It is complicated, because sometimes people report content as bullying, but when you look at it, there are other things going on around it.  So it's hard to get the messaging absolutely accurate and not aggravate the person more, and then we are doing it at scale, and that's something.  For people who are on the ground taking those important phone calls and telling us Facebook hasn't taken the right action, we can check whether or not we are taking the right action.  Is there something that we should be fixing at the back end?  Maybe the policy is in place, but the enforcement is not.  There is a team looking at those reports and at what we need to fix on our side, whether we need to be doing more on policy or on enforcement.  So thank you for that.  I will stop there.
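The routing the speaker describes, where reports go to reviewers based on the category the reporter selected and the language of the content, with a fallback to generalists trained across all the community standards, can be sketched roughly as follows.  The queue names, categories, and languages here are all hypothetical, invented for illustration; they are not Facebook's actual internals.

```python
from dataclasses import dataclass

@dataclass
class Report:
    category: str   # category the reporter selected, e.g. "bullying"
    language: str   # language of the reported content, e.g. "es"

# Hypothetical mapping from (category, language) to a native-language
# specialist queue; anything unmatched falls back to a general queue
# staffed by reviewers trained across all the community standards.
REVIEW_QUEUES = {
    ("bullying", "es"): "es-bullying-queue",
    ("intimate_image", "en"): "en-ncii-queue",
}

def route(report: Report) -> str:
    """Prefer a native-language specialist queue; otherwise fall back."""
    return REVIEW_QUEUES.get((report.category, report.language),
                             "general-queue")

print(route(Report("bullying", "es")))   # es-bullying-queue
print(route(Report("bullying", "sw")))   # general-queue
```

The fallback is the point made above: a mis-categorized report is not lost, because the generalist queue can still act on any policy violation it finds.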

>> We have time for one last intervention before we wrap up.

Just two very quick questions to build on the points on transparency.  I am Valerie, from the OHCHR.  On what principles or basis were the community standards built, and what was the process of building them?  And a second question, on the consultations you held with organizations: are there outputs from the meetings that could be circulated?  They have the potential to inform further conversations or processes that other organizations are conducting.  I would personally like to see what came out of these consultations.

>> To answer the last question: do we have outputs from those consultations?  The way we try to communicate them is to put up newsroom posts.  Facebook has a newsroom where you should be able to see all the things we're announcing.  Sifting through those to find the ones most relevant to you can be quite a task, but there we say: here are things we worked on and developed; here is where we have really taken the feedback and worked on it.  So with the organizations that we worked with, we're trying to close the loop.  Your first question was about community standards, how we came up with them and the thinking behind them.  I don't know if you have had a chance to see them on Facebook.com, where we lay out the principles behind the community standards.  People sometimes game the system after getting to know a lot of details, but we want to give you the policy and what it means.  We're doing work to see how much more we can share.  One way we are doing that is something called Hard Questions at Facebook; I don't know if you have seen this.  We started publishing on topics where there were super, super difficult decisions we had to make on community standards, and we wanted feedback from people on whether they think we landed in the right place or the wrong place.  We released one which was a super hard one: when somebody dies, what should happen with their account?  Their legacy content is stored with us, so what should we be doing in that instance?  There are a lot of questions put out there, and we are taking feedback on them.  We give an e-mail address so you can e-mail us and tell us where you think we're dropping the ball or where you think we could be doing it differently.  That's one way to open up the dialogue, and hopefully there will be a lot more ways to open it up.

>> Thank you so very much.  We're coming to the end of our time.  I feel I must thank you for being here; you could have been at the EQUALS party that is going on, but this was the slot we were allotted.  I just want to conclude by saying there is still work to do on policy and with policy makers.  All of us are a part of the Women's Rights Online network, which we coordinate at the Web Foundation.  If you have any ideas and strategies ‑‑ Coding Rights, and others ‑‑ we want to compile those and amplify best practices, as we try to do advocacy work on the ground and engage with policy makers.  We're trying as much as possible to share stories about what's working.  There's a toolkit and strategies, and it's long-term.  But thank you so much for being here, and we will continue this discussion on women's rights online.  Thank you very much.  

 


 
