IGF 2022 Day 4 BPF Gender & Digital Rights: Regulatory Practices - A Friend or Foe to Gender and Digital Rights?

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> MODERATOR: Thanks for joining.  I know it's Friday after lunch, so thanks for joining us today.  We're holding a session on gender and digital rights.  The Best Practice Forum (BPF) on Gender and Digital Rights is a forum within the IGF that aims to discuss gender issues within this space.  In this year's thematic session we are tapping into regulatory practices and where they stand in terms of gender and digital rights.  So are they friend or foe to gender and digital rights?

I'm glad to be here with a stellar panel of speakers from across different sectors, and it will be a chill conversation on how we relate to regulation, especially Internet regulation as to how it affects freedom of expression, privacy and the work of human rights defenders and journalists among other cross‑cutting issues.

Before we go there, I want to go to Bruna.

>> MARWA AZELMAT:  Thank you.  My name is Marwa Azelmat.  Hi everyone, on and offline.  Glad to see you here.  So this year at the BPF Gender and Digital Rights, as was mentioned, we conducted a study and came up with a report that analyzes it across three thematic areas.  The first area is privacy and surveillance, with a focus on reproductive privacy; the second is freedom of expression and gendered disinformation; and the third is freedom of association and religion.

So throughout this year, the BPF process was to discuss how these areas affect gender and human rights, and we built the report on case studies.  The community would bring the case studies to us, we would analyze them, and they would then be compiled into the report.

  The report is still open for review.  

>> This is part of the work of the IGF; the BPF has been going since 2015 and has worked on topics surrounding gender.  We have addressed violence against women, female empowerment, diversity and participation at the IGF and, more recently, gendered disinformation.  So we are happy to have this session today, and for anyone interested in this work, if the BPF continues to exist, we hold open meetings, so if you would like to join or want more information, let us know and we can direct you to the website.  Thanks.

>> A request, please, speak slowly.  Sometimes the interpreters can’t catch up.

>> MODERATOR: Just to echo what Bruna and Daphnee said, the work we do at the BPF is community driven, so we welcome you all to submit inputs or comments on the report.  We haven't finalized it yet.

If any of this is relatable or resonates with your areas of work, please feel free to weigh in on the website or just shoot us an email; you can find the details on the IGF website.

>> MODERATOR: Now, I guess we can kick off the session.  I would like to start with remarks from the speakers.  We have people from across different layers of vulnerabilities and identities.  I will start with Onica Makwakwa.  I know you have very extensive experience managing and pioneering various national and international campaigns and policy change processes for women's rights, civil rights, media, and digital transformation initiatives.

And as we are here today in Ethiopia holding the IGF, I would really love for you to weigh in on the state of regulation across the African continent and how it relates to the work that you do, especially for different communities.

>> ONICA MAKWAKWA: Thank you.  I hope I live up to that expectation.  Again, the name is Onica Makwakwa.  I'm the Head of Africa at the Global Digital Inclusion Partnership.  There is some background noise I'm hearing from this side of the room, so perhaps someone who is listening to something else could put on some headphones so that it's not ‑‑ it's the interpreter, okay.  It's really echoing.  Okay.  Let's move on.

So in the work that I have done in this area, I have been looking more specifically at regulation and attempting to mainstream the issues that predominantly affect women, on the African continent more specifically.

On the opportunities in that space: there has been a lot of pushback using defamation of character as a way to silence women online who are speaking up on these issues.  I'm pleased to say there has been a recent outcome in the Western Cape court in South Africa, where young women spoke up about abuses online and were subsequently sued.  We were able to take those cases up, under an organisation called Wise for Africa, and got a really great judgment that sets a precedent: the judge actually says that these defamation lawsuits, brought by people who are accused of abusing women in order to silence them online, are themselves a form of silencing.  That is not what was expected when the laws of defamation were adopted.

So that gives us a glimpse of hope that going the route of public interest litigation could actually help us develop feminist jurisprudence that begins to push back on some of these attempts to shrink women's voices, especially online.  I have to say that it's extremely expensive, because these are usually high court cases where you need to hire an advocate, and you are fighting both in court and in the court of public opinion.

I think it's really important for us to push for implementation of the regulations that exist, because we have found that in pushing on feminist jurisprudence, we are actually using laws that already exist to protect and defend women against some of these unfair cases.  And what it means is that even though the laws might claim to be equal for all genders, we have not challenged them and demanded and required that they include not just women, but people of all gender identities.

That's one of the contributions I want to make to the report; I do apologize that I had not done that before now.  We are currently defending a Twitter account that had already been taken down.  I suspect that this use of the courts to silence women's voices online is something we will start to see mushrooming in other jurisdictions as well, and that we need to document it and be able to utilize the gains we are getting in other areas, because we find that the regulations, even when they are there, are not enough on their own.

>> MODERATOR: Fantastic, thanks.  What resonates with me is the glimpse of hope and how we need to mobilize the resources that we have.  Perhaps now I will go online to Mahima Kaul.  Mahima is the public policy lead at Bumble.  It's nice to have someone from Bumble with us, and I know that your story is very personal, as it started with a lot of trolls and online harassment.  I would love it if you could share more with us before we really get into the conversation.  Thank you.

>> MAHIMA KAUL: Thank you so much for having me.  I hope you can hear me.

>> MODERATOR: We can't hear you.

>> MAHIMA KAUL: You can't?

>> MODERATOR: Hello?  We are dealing with a glitch.  That's very relatable.  Can you try now?  I think it's still muted.  Okay, we will come back to you.  I'll go to you, Mariana Valente.  Mariana Valente is a Professor at the University of St. Gallen, and she is a founder of Internet Lab in Brazil.  I know you have tackled a lot of feminist issues within the Internet regulation sphere.  So can you please address some of those really big building blocks?

>> MARIANA VALENTE: I want to make a short introduction to how we think of regulation in the gender and digital rights field, based on research experience, especially at Internet Lab.  That research has always been collaborative, with talented researchers, some of whom are here.

So what I wanted to really focus on in these first minutes is that when we are addressing the legal arena, we frequently focus on laws, on the process of approving them, and on what's stated in them.  So, for example, what the framework is in each country, and what kind of framework women, LGBT people, and other marginalized communities can rely on.  But regulation is a cycle, and any struggle that has the legal horizon in sight is not fulfilled once a law or regulation is approved, because the regulation then has to be enforced.

And it can be very surprising how many elements can play a role in the success of a law or its intended results.  I'm really happy to speak after Onica, since she mentioned public interest litigation, which I appreciate a lot, but I wanted to mention three of these elements just to start the conversation.  The first is that when we have a regulation approved, we might face unexpected interpretations of legal terms and concepts.

If you use defamation laws to protect women from abuse, from an abuse that has a sexuality component, you might see judges using, for example, their own morals to judge, because the protected legal interest in defamation law is usually honour.

To tackle this we need to understand the logic of the legal system, and I think interdisciplinary research is something that can highlight that.  The second point I would like to highlight is legal procedural matters.  Very little attention is given to these procedural matters, and they might really hinder access to justice.

So, for example, suppose a procedural law requires that the party itself, the victim or the target, prosecute instead of the state; that is a private prosecution process.  This might hinder access to justice for poor women, for poor LGBT people, for example.  The third thing relates to other law enforcement barriers.  We could mention many obstacles facing police and other law enforcement officers, but take software: if the police officer's system doesn't display a category encompassing the specific offence being reported, this can hinder even starting an investigative process.

So I think these issues and barriers must also be considered when we speak of how to regulate, and knowing how the judiciary works makes for smarter proposed regulations, so we go full circle.  These are things that are highlighted in research, and research must be connected to the public debate, which requires different methods as well: analyzing case law, doing empirical research, and being multidisciplinary.  And I'm happy to speak later about some results and understandings that we reached at Internet Lab by looking at some of those laws.  Thank you.  Very happy to be here.  Thank you all for attending.

>> MODERATOR: I think that sets the tone for how we view the legal system in order to advance feminist issues from within that system.  I hope now, Mahima, you can unmute and speak.

>> MAHIMA KAUL: Yes, can you hear me?

>> MODERATOR: Yes, we can hear you.

>> MAHIMA KAUL: Thank you so much for having me on this panel.  It's been great to listen to the views before me.  So, yes, let me give you a little background about myself, and then I will talk a little bit about what Bumble is doing and the current view of the Internet as we see it.  I, like many women across the world, had the unfortunate experience of trolling and many different kinds of threats, whether on social media, via email, or in other forms.

Where it was different for me is that I was working in a tech company, so I had an interesting view into the tools that are available on social media platforms, but also into what legal infrastructure was available to me, and whether, as somebody who was experiencing this harm, I really wanted to go to court.

There were reasons I would maybe go to the police or not go to the police if I had a case, and as we know it's always really complicated, never as simple as: some harm has been done to me, so I will therefore take option X or Y.  The idea of having a safe space online for women has been really important to me personally, but also professionally.  I used to work at Twitter, and Twitter was, of course, a battleground for activists, journalists, and women in the public domain to stand there and make their voices heard.

Let me slow down.  There are some platforms that are really important for women public figures to make their voices heard, and many different reports from many different organisations have captured how these public figures are actually trolled, threatened, and silenced.  There is also the case of everyday women in different areas of the Internet, trying to live their private lives on different websites and apps, and there as well they can be silenced, made to self‑censor, or simply scared into leaving a platform because of the various things that are done to them.

So part of what we are looking at around the world is that there are obviously Governments that say they want to make the Internet safer for women.  Depending on which country you are in, they are taking different approaches.  In some countries there is a dedicated regulator; Australia, for example, has set up the eSafety Commissioner, which is specifically looking at holding industry to a standard across different apps, saying: this is a minimum standard of safety that you must ensure your platform has.

And Australia passed a bill recently, and they are also developing industry codes to make sure that companies are able to provide that level of care.  In other places, regulators may not be doing exactly the same thing, but there are laws that tactically try to address some harms that are taking place, whether video, images, or other activities.  But where the work really lies, and I think some of you would be doing it, is seeing whether these laws are having the impact they are supposed to, which is the safety of women, or whether they actually double down on surveillance and do not serve the purpose they are meant to.

As introductory remarks: first, we need a much more evidence‑based approach to what is happening in the regulatory sphere across the world, because a lot of Governments and regulators say they are looking to protect women, and we must make sure that what they are trying to do is based on evidence of the harms that exist today.  Second, while we discuss practices, we must be aware that these things change as technology changes, so the dominant harm that women are experiencing on the Internet today may not be the one tomorrow.

And thirdly, there should be a method or mechanism to assess the impact of regulations or laws passed in different geographies, to make sure they are actually having the desired effect.  I know Japan passed a law about cyberbullying which was hotly contested because it crosses into speech issues, and they said they will review it after a period of time to see whether it's actually helping cases of cyberbullying or silencing the larger Internet.

So there are different approaches being taken, and that's where I would like to end my introductory remarks.

>> MODERATOR: Thank you for these very extensive remarks; I will definitely come back to you regarding the bills that you mentioned.  I think it's very important to honour the issue of online gender‑based violence during the 16 Days of Activism against violence against women and girls across the world.  But since we are really talking about big tech and the private sector, I will stay here and turn to you, Theo.  Theo is the Director of Business and Human Rights at Ericsson, and I think Ericsson is a rather unconventional actor to have here at the BPF on Gender and Digital Rights.  So perhaps in your introductory remarks you can tell us about your work and Ericsson's stance on the issues we are tackling today at the BPF.  Thank you.

>> THEO JAEKEL: We have heard other speakers talk about specific bills and legislation that address the issues we are talking about today, but as one of the representatives of the private sector and corporations, I thought I would take a step back in my introductory remarks and talk about the current trend of regulating corporate conduct and human rights in general, and specifically the trends we are seeing in regulating and requiring companies to implement human rights due diligence measures.

I will come back to how this is linked to the issues we are talking about today.  The trends are currently very much being pushed by the EU and the U.S.  We have seen, of course, legislation on specific human rights topics such as child rights and forced labor, but now there is an overarching push to transform international standards, such as the UN Guiding Principles on Business and Human Rights or the OECD Guidelines, into legislation and actually enforce human rights due diligence principles.

As a starting point, we as a company definitely support this trend.  It is sometimes viewed as going beyond the international standards, but it's important to remember that the UN Guiding Principles themselves recognize that the state duty to protect human rights includes regulation of corporate conduct.  So this is a natural evolution and a natural step.

There are two issues that I think are important to the topic we are discussing now and that we are trying to push in the discussion, for example, at the EU level with the Corporate Sustainability Due Diligence Directive.  One is to ensure that the scope of the due diligence requirements takes a full value chain approach.

There are some interests within the EU that are pushing to limit this to only supply chain risks, and that would, of course, exclude all of the issues and risks we are talking about today, and limit corporate responsibility only to, for example, labor rights issues in supply chains.

So that is important especially from the perspective of a tech company, of course, to ensure that our responsibility also includes downstream due diligence and addresses how we develop technology, who we sell technology to, how we are engaged in policy discussions on specific bills and regulations such as on defamation, privacy, surveillance, so on.

That, I would say is one of the key issues to push through in the directive to make sure that this is dealt with in the right way.  The other issue, building on the comment on litigation is the provisions on civil liability in the directive.  And that, of course, ties back to the whole issue of value chain due diligence.  If we exclude downstream due diligence, that means that civil liability provisions also exclude downstream due diligence and issues of privacy and online harassment, for example.

If we make sure that a full value chain approach is ensured in the directive, then hopefully that also gives new tools to both civil society actors, but also, of course, advocates and litigators to ensure corporate conduct through those civil liability provisions.

So, of course, this directive is more general, not just talking about these issues specifically, but we have always pushed for using the international standards such as the UN guiding principles as a basis for the legislation, and why that's also important is that the UN guiding principles, of course, mention gender and women's rights specifically as issues that corporations need to take into account in their due diligence.

>> MODERATOR: I think that is a nice segue on corporate conduct and how it relates to gender and women's issues, and perhaps we will get back to some of the issues we mentioned.  I will go to you, Julia Haas; she is currently serving at the Office of the OSCE Representative on Freedom of the Media.  I think maybe today, Julia, you will be more concerned with media regulation.

I know that the OSCE has worked extensively on this, and maybe you can give us more insights into it.

>> JULIA HAAS: Thanks for having me.  It's been great listening to the previous speakers and having this kind of interdisciplinary approach.  If you allow, for the introduction I would rather stay a little bit at the abstract level, as the previous speakers did, and I can provide more information later.

I also want to acknowledge the 16 Days campaign against violence against women that you mentioned, and we have already heard from previous speakers about really targeted hatred to silence women, gendered disinformation to discredit women, and surveillance with the same effect.

I think it's great that in this panel we have the opportunity to address gender and digital rights a little more holistically than only looking at violence.  While violence needs to be addressed, and is one of the biggest obstacles when we look at women's participation in digital spaces or in public spaces generally, it's also important to look at the various other barriers and obstacles that keep women, and others who do not represent the majority in society or have been marginalized, from speaking out and exercising their human rights in the public and digital space.

So I think we also have to look at the design, development, and deployment of technology, and at the biases that are often introduced or built into it.  When these are deployed in the digital setting, we have not only a reproduction of the asymmetric power structures we have in the offline world; all of those biases and challenges are reinforced and amplified in the digital setting, which then impacts back on society.

So I think the Best Practice Forum, and also this panel, is a great opportunity to address this from a more holistic approach.  There are two more things I would like to say at this starting point.  The first, also linked a little to the holistic approach, and I think crucially important, is that we see barriers, obstacles, and challenges at every step.

There is the question of women being able to speak out and participate, but also the way they are targeted when they do so, which we all saw this week in the very first session, the feminist web session, where there were really very misogynist, sexual, graphic attacks on Zoom.  This means that when we speak about specific things, we as a society become targets.  And then there is the impact it has later on.  I'm sure we will dive deeper into this in the second round, but I want to mention it already now: this really impacts, of course, the individual woman who wants to speak out, or the marginalized person, but it also impacts society.

It impacts our democratic governance, our possibility to really create the peaceful and democratic common community and world we want to live in.  The last point I want to make at this stage, because I'm representing an intergovernmental organisation, is that we need, of course, a whole‑of‑society approach.

So I think it's important to have different perspectives, but I want to talk about the positive human rights obligations of states.  This is something Theo touched on, and I'm grateful for that: we really always have to consider and remember that states have positive human rights obligations to protect, fulfill, and realize our human rights, online and offline.

We have plenty of human rights, including freedom of expression and media freedom, which I am working on, but also specific women's rights.  Where there is an obligation, this means there should be regulation.  So, linking back to the title of today's session: regulatory responses are important, of course, but all of us know, and I think we have heard plenty of examples throughout the week, that unfortunately legislation has been misused or adopted under false pretexts, or it simply takes place in a specific context.  Who is the legislator?  Whose concerns are being addressed?  Who sits at the table?

It's never a silo, and I think this is important to consider.  So the question of friend or foe is not easy to answer, which, of course, is also the intention of the panel.  I think it is important to have regulation, but to ensure that it's properly done.  Changes are necessary, but we have to be cautious that they are developed in a proper way, and throughout the entire regulatory cycle that we heard about before; the implementation and enforcement are crucially important as well.

>> MODERATOR: Thank you, Julia.  Last but not least we have Bia Barbosa, who is a journalist and a human rights specialist.  She is also a civil society representative on the Brazilian Internet Steering Committee.  Thank you for joining us today.  Can you give us your introductory remarks and the state of activism, I guess, in Brazil or Latin America?

>> BIA BARBOSA: Thank you very much for the invitation, and also for the amazing job that the BPF on Gender and Digital Rights is doing, and congrats as well on the report that is ongoing.  I have read it, and I noted parts that I will comment on in the second round.  Thank you very much for this space.  I speak as a journalist who has closely followed the problem of women journalists being attacked online over the last years.  If we look at the data on journalists killed in the world, and especially in Latin America, where 90% of the journalists killed are men, we may draw a wrong conclusion about violence: if 90% of the journalists killed are men, there is no problem for women.  But it's the opposite.

The digital environment has been a very difficult place for women journalists to work, and the violent campaigns against women journalists that have developed online everywhere in the world have been very effective in silencing them.

For example, this year I worked for the Reporters Without Borders office in Latin America, and during the elections in Brazil we followed the accounts of 120 journalists.  From August to October, in this period alone, we registered around 3 million instant messages and posts that sought to discredit the work of the press in general, and the report we published on the RSF website showed that every week four women occupied the list of the five most attacked journalists in Brazil.

Most of this took place on Twitter, and the volume of the aggression is frightening.  These were campaigns coordinated by extreme right‑wing groups, supporters of the current President of Brazil, who lost the elections, and they carried highly misogynistic connotations.

When black women journalists were targeted, the messages were also racist.  The impact has been measured worldwide, not only in Brazil.

In Brazil, research we conducted with Gênero e Número showed that at least 15% of women and LGBT journalists who have experienced online aggression have developed mental health problems.  So in fact it goes beyond the idea of silencing these women; they have developed mental health problems.

In this scenario, the response from the digital platforms was practically nil.  Journalists are not recognized by these companies, at least not in their policies and terms of use, as a group that needs to be protected so that the population's right to access accurate information is guaranteed.  And despite the research and evidence all over the world, social networks continue to be a permissive space for constant and organized violence against the press.  This is not exclusive to journalists.

We are talking about women, LGBT people, and minority groups.  On an ongoing basis, it seems to me that we have a regulatory debate to establish relating to violence against journalists as well, because, as in other aspects, the companies have not responded to the challenge we have been observing.  I agree with the points brought up about different aspects of regulation, and it's very positive that the feminist and women's movements have already advanced this agenda.  But from the journalists' perspective, this topic is still quite a taboo; it's quite distant from the community of journalists in general, mainly because there is a justified resistance among journalists to regulations related to freedom of expression, since most of the time these regulations are used to silence journalists everywhere.  But I believe that in this case, the level of violence we have reached means we can no longer avoid the debate, because it is the absence of regulation of social media and platforms that is now silencing women journalists.  So I think we have to move forward with this debate, putting all of the human rights standards on the table; it can no longer be a taboo for us journalists.

>> MODERATOR: Thank you.  I think you highlighted an important issue, which is the online‑offline dichotomy, where we see a lot of incidents online escalate into killings offline.  I think it's important to bring this to the fore when it comes to the regulation debate.  Following up on this, I will go back to Mahima.  Can you please tell us more about how Bumble is trying to approach these issues and how you make sure that the dating app is safe for women users and people of diverse genders and sexualities?

>> MAHIMA KAUL: Happy to share what Bumble is doing.  It's a women‑first dating app, and the whole goal is to have healthy and equitable relationships, which is a challenge online but offline as well, so it's a tall mandate.  We have done some research to understand the points of challenge for women when they come online and when they spend time on an app like Bumble.

So, for example, unwanted contact, which, of course, we have been talking about in the context of activists and in much more violent contexts, is an area of concern for a lot of women, and one of the reasons they don't like to be online.  By design, women make the first move on Bumble.  That introduces a point of friction into the entire experience of dating online, so a woman is not inundated with messages.  She is more in control of who is able to message her, because she makes the first contact.

Similarly, unwanted images, the lewd pictures or "dick pics" that people refer to, are jarring and would keep people offline or off one platform or another.  We use proprietary technology to blur these images within the Bumble app, and users can choose whether they want to see the image, because it is an app that hosts consensual conversations as well.

The image is blurred so people don't have to view it, and they can report it directly.  What Bumble has done is open source this technology, so now, if others online want to adopt it on their platforms and in their spaces, they can do that.

Similarly, what Bumble did on the policy side is identify markets and areas in which some of these behaviors were crimes offline but were not captured online.  Bumble has actually worked in six states in the U.S., and also in the U.K. when the Online Safety Bill was being discussed, to highlight that cyberflashing is something that needs to be captured within the law, so that when women go to the police and say this happened to me, they are not told: we know it's wrong, but it's not illegal.

And one last point: we are making efforts to make the Internet ecosystem, not just on Bumble but beyond it, safer for women, because we believe it will then become safer for everyone.  To pick up on the mental health aspect that you mentioned: when someone is the victim of online abuse, and certainly of sexual assault or violence, one remedy online is taking down the account, and that seems like a victory.  But we at Bumble are also aware that there could be a need to talk to a counselor or a therapist.

So Bumble has partnered with an NGO called Chayn, and the program we offer is called Bloom.  Our users can access Bloom, meaning they can access a therapist via chat or in‑person sessions if they have been the victim of such an incident online, or offline if they met somebody through our app.

So we are trying to take a holistic approach to see how we can make Bumble safer, certainly, but can the effects benefit the rest of the online ecosystem?

>> MODERATOR: Thank you so much for this.  Following up on the mental health issue, I will go to you, Onica.  I know you show a strong sensitivity to a lot of issues and dynamics affecting women and other disadvantaged populations.  To what extent do you see the issues around mental health and trauma‑informed spaces echoing within the regulatory space, especially in Africa, where there are many conflicts and people need counseling and support when it comes to trauma and violent images, images of rape, and really shocking graphics online?

>> ONICA MAKWAKWA: We can thank the COVID experience for one thing I'm seeing: we are now finally open to having these discussions and even updating some of the policies and laws.  So, for example, South Africa has, I think, one of the latest and most up‑to‑date cybercrime bills that we have so far in the region, yet we are still faced with the challenge of having a bill that's considered progressive on these issues but a law enforcement regime that's not knowledgeable and educated about how to enforce the laws that exist.

So we sort of have this gap between policy and implementation, which is something that plagues this region a lot.  We are really good at adopting Conventions and treaties at the international level.  We are also great at updating our national laws, but implementation just doesn't seem to happen as fast.  I think we have an opportunity to recognize that there is a capacity deficit in the current enforcement agencies with respect to the laws that we have adopted or are asking them to adopt.

So we have to make sure that we do this capacity development for these agencies so that they understand these issues.  The other thing that I think gets us into a bit of a conundrum is the issue of language.

Sometimes the language itself is violent, and it takes a while for us to recognize that, because the fact that we are here speaking English is actually not a natural thing, even for me.  The platforms have a lot of experience with this when it comes to moderating speech online, because our biggest complaint has been that this moderation of speech is not localized.  When I report something ‑‑ I will just give you an example, and I apologize that it's really graphic ‑‑ a gentleman, well, I don't know if we can call him a gentleman, but this particular male person posted on one social media platform that the vagina of a young girl is nice but society has a problem with it; and he posted it in his vernacular language.

So for English‑speaking moderators this is meaningless, and even when we attempted to translate it for them, report it, and consistently say this is violent, it rings all of the bells, we still got the response: this does not violate our community standards.

So the question arises: who sets community standards?  Who is the community?  We are sitting here in this region while our speech moderators are sitting on another continent and may or may not understand our language.  Yes, there is automatic translation online, but the context requires some level of nuance that goes beyond simple translation.

We have finally recognized the issues; I think we are conversant about what is happening now and what we need to adopt in terms of how to fix it, but implementation is really hitting us badly.

>> MODERATOR: Thank you, Onica.  That is music to my ears: whose community standards are those?  And perhaps this is also a note to my BPF team to add this to the report.  So I will go back to you, Bia Barbosa; I know you had comments on the report, which you are welcome to present to us.  Thank you.

>> BIA BARBOSA: Thank you.  I would just like to take advantage of one case you mentioned in the report to stress that we really need to be careful about regulation, even though I support forms of regulation regarding gender‑based violence and violence against journalists.

I would like to share an example of what happened last year in Brazil related to that; it is one of the cases mentioned in your report.  In 2021, the President issued a provisional executive order that intended to forbid the practice of content moderation except in justified cases, under the pretense of allowing ample freedom of expression, communication, and thought in Brazil.

Although the provisional measure established exemptions, by prohibiting content removal as a general rule, with sanctions for non‑compliance, the regulation would create an online environment prone to the circulation of harmful speech, particularly political gender violence, hate speech, and misinformation targeting women and LGBT people.

Among the exceptions in the text, content authorized to be removed or blocked by the platforms would be content related to incitement of acts or threats of violence, including for reasons of discrimination on the basis of race, color, sex, ethnicity, and sexual orientation.

There was no mention of gender‑based violence among the listed possibilities, which would open a gap for the maintenance of this type of discourse on social networks and denied from the very beginning the existence of such forms of discrimination.

The President of the Senate summarily rejected the executive order, but there are still bills in the Brazilian Parliament that propose this kind of limitation on the operation of social networks.  I think Julia will be able to go further into the joint declaration, but I would just like to comment on two recommendations of the Joint Declaration on Freedom of Expression and Gender Justice.  One says that, in consultation with media organisations and representatives of women journalists, states should develop and implement prevention, protection, monitoring, and response mechanisms to ensure the safety of women journalists.

Here we have a big problem, because many journalist protection policies in different parts of the world are not developed with a gender perspective, and they totally ignore digital security protection measures.  When you look at the Latin American protection mechanisms for journalists, at the beginning they were able to look for threats on social networks, but they don't have specific digital measures to protect journalists after that.

Another recommendation is that Internet intermediaries should be particularly mindful of the ways in which their automated or algorithmic processes and business practices, used to increase user engagement, target advertising, or engage in profiling, may amplify gender stereotypes, bias, and gender‑based violence.

And that companies should ensure that their content moderation, in policy and practice, does not discriminate on the basis of gender.  We are very, very far from this reality.  I believe that as activists we should take advantage of the broad debate going on about regulation in different parts of the world to address this issue, especially violence against women journalists.  UNESCO, for example, is expected to launch in February a global framework for the regulation of platforms.

It will also be a document of recommendations, and we already have many recommendation documents everywhere, so the problem is not that we don't know what to do; the recommendations are there.  But I think it is a multilateral tool that can be used in advocacy strategies in different countries to ensure the adoption of more directive legislation.

The idea of risk assessment, for example, which is in the Digital Services Act in Europe, is a topic that could be interesting for a more protective approach for journalists, and I believe the UNESCO framework could go in this direction.  The document is still open for contributions, so I think we could take advantage of this space to reinforce the recommendations that the joint declaration brought us.

Finally, I would like to share with you an initiative that we are developing at the Brazilian Internet Steering Committee, which we also hope can contribute to the adoption of policies and regulations that promote diversity in the digital environment and the ICT sector.

It's called the Gender and Diversity Agenda, a project that has been implemented since 2021 by the Brazilian Internet Steering Committee, the CGI.  The agenda sets out challenges and more than 240 proposed actions to face these challenges.  Like our Committee, CGI, the agenda is a multistakeholder document; it is not published yet.

It's being built through consultations inside and outside of Brazil, and we organized a workshop here at the IGF to share this experience.  I invite you to get to know this initiative, which we hope can foster more protective regulation, but also regulation based on human rights standards.  Thank you very much.  I'm really sorry that I need to leave early; I'll stay five more minutes, but I have a panel that starts at 3:00.

>> MODERATOR: Thank you so much.  Staying within the Brazilian space, I will go back to you, Mariana Valente.  I know that you have worked on bodily issues, and that regulation is not on good terms with women's bodies, so I know you will want to address these issues.  You have slides to present, right?

>> MARIANA VALENTE: If we can screen them, technical team, thank you very much.  It's interesting to hear you all.

Julia remarked on how regulation is one of the strategies for addressing states' positive obligations, and Bia, who has been a very important promoter of gender and digital rights in Brazil, spoke a little bit about regulations gone wrong, let's say.

But I wanted to speak about the general framework laws that have been approved in Brazil in the past ten years to address, especially, gender‑based violence and its different aspects.

Just as an introduction, I think there are two things that are important to say.  The first is that legislation and regulations have different roles, and we feminists know very well that one of them is symbolic.  So, for example, when the Supreme Court decided in 2018 that speech against LGBT people was encompassed by the antidiscrimination law in Brazil, it really became a narrative strategy as well.  It becomes part of the strategy of the movement to say: look, this is illegal.  This is something not allowed.

So we know that symbols are important, but I will address the practical effects of the laws here.  The second thing, which I think is also a general comment, is that there is a lot of discussion in Brazil, and a lot of criticism, of the role of criminalization for our movements, especially because we are in a very unequal country.  We know that criminal law affects different groups differently, and that black people in particular are disproportionately affected by the criminal system.

So this is something that shapes a general approach to the issue, because it's a backfire in itself; when we approve laws that criminalize, that always has to be taken into consideration as well.  That said, I really just wanted to show this quick framework that I have been working on.  I don't think it encompasses everything, but I think these are the most important laws that have been approved in the past ten years to address gender‑based violence.

I wanted to focus on two of them, which I think we can refer to as having backfired, let's say.  The first one is called the Carolina Dieckmann law; many of these laws carry the names of women because they followed controversial cases that were heavily mediatized.  This law was approved in 2012 after this woman, who is an actress, had her pictures disseminated online, and the case became well known.  It was one of the first very well‑known cases of intimate images being disseminated, and it generated a lot of public discussion; right after it happened, this law was approved.

The law criminalizes hacking into another person's computing device.  If you can show the next slide, please.  What I wanted to show is that although this law was supposedly approved to address this case, and carries her name, it doesn't really address her story, because her device wasn't really hacked.

That's not really what happened.  What happened in her story was that her email was invaded, and this law doesn't really address that specific situation.  So there was a big narrative in the media about how there were now laws in place to protect against situations like Carolina's, and it wasn't so.  One of the paragraphs of the law establishes that the penalty is increased if the crime is committed against the President of the Republic, the President of the Supreme Court, and so forth, showing that perhaps protecting women wasn't exactly the reason this law was approved in the first place.  Sometimes there is a lot of narrative about something being protective of women's rights online, and when you look into it, that's not the case.

Then I would like to address another one that was approved more recently: the stalking law.  This was also considered somewhat of a victory for the feminist movement, which had been pointing out that these acts of stalking, of persecuting someone repeatedly, didn't find expression in the legal framework, because we are speaking of very small actions that per se are not illegal.

So, for example, sending someone a message is not illegal.  Adding someone through another profile is not illegal.  When you look at them systematically, you will see that there's a situation there causing a lot of distress.

So this law was finally approved to address this, but digital rights activists at the time were already warning that the definition was a bit too broad and could be used against activists, and that's precisely what happened.

So perhaps you have heard of Sleeping Giants.  Sleeping Giants is not really an organisation; it's different groups in different countries, a strategy, let's say, that has been used against certain profiles, certain people, certain companies, to make it very visible when they are, for example, acting against democracy or against human rights.  What Sleeping Giants does is run campaigns against certain actors.  In the case I wanted to show you, and I'm sorry for the bad language, this TV presenter said live on TV that gays were a disgraced race, and Sleeping Giants started a campaign against him, basically asking advertisers to stop advertising on his program.  He sued them for stalking, because what he said was that he was being persecuted repeatedly by this online campaign, and that it was harming his psychological integrity and invading or disturbing his freedom or privacy.  And he actually got an injunction against Sleeping Giants which obliged them to stop the campaign.

So this is just another example of how carefully we have to craft these laws.  The intention was clearly good, but how do you craft a law so that it doesn't backfire?

There would be many different things for me to address in these different laws as well, but I think for the sake of time, I will stop here, and then perhaps in the discussion, I can address other situations in which these regulations have gone wrong as well.

>> MODERATOR: Thank you.  I think it's essential to have a bigger picture of what's going on, and it's good to have you here portraying the Brazilian landscape.  We have been diving deep into gender and women's rights issues; I would like to go back to corporate conduct.

Theo, you spoke about downstream due diligence.  Since we have an interdisciplinary audience, can you tell us more about what this means, and how we can really conduct due diligence against the backdrop of a fragmented regulatory framework that overlooks the issues we have been speaking about?

Thank you.

>> THEO JAEKEL: To start, maybe it would be helpful to explain a little more about Ericsson.  We are a vendor, not an operator, ISP, or platform, so we are not facing those types of issues of, for example, Government requests and so on.  What we do is design and develop the solutions, the technology, that enable operators, platforms, and so on.

So, of course, we have ongoing downstream due diligence in our sales process, where we always evaluate what type of technology we are selling, in what context, to what customer, how it can be misused, and whether we can build in certain safeguards, from a technical perspective but also contractually with the customer, for example by limiting the use of the technology.

But some additional examples of how we can proactively address these issues, tying into the topic of women's rights and tech specifically: last year we published a human rights assessment of 5G technology.  As a network vendor, we are, of course, a key player in developing and enabling 5G, the new generation of mobile networks.  We wanted, at an early stage, to address potential issues of misuse, and how already known risks to human rights and women's rights can materialize in different or new ways with 5G technology, or potentially be exacerbated.

I will give you a few examples; the full report is available on our website, and you are more than welcome to read it.  One example we looked into is the increased use of IoT devices.  With existing issues of, for example, domestic surveillance, the increased use of IoT devices potentially opens up other types of surveillance, or surveillance through new products, which of course increases the risk for already marginalized and at‑risk groups.

An additional issue is that the third parties that create these IoT devices do not always have the same kind of strong encryption as, for example, handheld devices, and are not as used to dealing with issues of privacy and freedom of expression.  So there is a role in raising awareness with these third parties on how to design their products with that in mind.

Another issue that is definitely not new to this space is network shutdowns, but from the perspective of 5G, with a denser deployment of cell towers, or cells, the technology potentially enables more precise network shutdowns.  While today you may need to shut down an entire region or a block in a city, with 5G a shutdown could target a building or a specific person, for example.

And the third one is network segmentation.  The legitimate use of that is to optimize how the network functions and to make sure that the capacity of the network is directed to prioritized areas, such as healthcare.  But, again, misuse could lead to, for example, throttling, or to more precise limitations on who has access to the network and to services such as digital healthcare, education, and so on.

So these are some of the, just some of the few issues that we addressed in the report.

>> MODERATOR: Can we please mute the Zoom?

>> THEO JAEKEL: We have talked about legislation, but what is also important in our industry particularly, in addressing these issues, is standardization, because that is the early stage where we agree on how products are developed and what kind of safeguards need to be in place.  So an additional way we are working to address these issues is in standardization bodies.  Thank you.

>> MODERATOR: Thank you, Theo.  I think that gives a very nice picture of how corporate conduct and different risk assessments can make sure a lot of human rights issues are addressed.  And speaking of Internet shutdowns and free speech, I will go back to you, Julia, and I guess you will want to tackle the declaration.

It would also be nice if you could talk us through how you see media regulation against the background of crackdowns on journalists and the shrinking of civic space, and how we can make sure that, yes, regulation is necessary, but at the same time we do not create a chilling effect on freedom of expression.  Thank you.

>> JULIA HAAS: Thanks so much for the pertinent question, and I don't think we have enough time to speak about all the excellent points we heard in the course of the last hour.  I wanted to start by following up on something that Bia said before, which really links to your question: limiting speech to protect speech.

I think we really need to acknowledge that the notion of limitless freedom of expression for all means that there is a de facto limitation of speech through violence, through business practices, through all of these different things we have been discussing.  So regulation needs to take on the role of ensuring that free speech is available to everyone, and that participatory and inclusive freedom of expression is possible.

So I wanted to start with this point, but also recognize that all of the things we have been discussing throughout the week have this gender perspective, and one of the points is really the lack of rule of law in the digital context.

I want to say two more things before I go into the joint declaration, thanks for mentioning it, and then come to your question about media regulation.  It is really about trying to consider the entire lifeline of speech, if we take the example of freedom of expression.

I mentioned before that we have to recognize every step.  So it's really about access to technology, access to means of expression, digital divides; it's the question of literacy and access to education.  These points come even before we can look at who is able to participate in the public sphere and in freedom of expression.

Then we have speaking out, and here there is a clear link to what Bia said before: speaking out publicly as a woman journalist, or as a woman politician, which was something addressed at this IGF, puts the person at an additional risk of being targeted and silenced.

We also see there are specific restrictions of gender expression, and, for example, the reproductive health issue.

And the problems and challenges we see are amplified in a digital context where platforms really prioritize user engagement in view of advertising revenues, even if it's at odds with the public good, with diversity, with accuracy, which brings along so many additional challenges.  But I'm trying to move away from challenges to what we can do.

Indeed, the Representative on Freedom of the Media, together with the other free speech mandate holders, has issued a joint declaration that touches upon this horizontal issue of gender and gender justice in freedom of expression.  It puts forward recommendations for stakeholders, including media outlets and platforms, when we speak about due diligence, many of the things Theo touched upon, but also looking at states and what needs to be done to remove the structural and systematic inequalities, barriers, and discrimination we see reinforced in the digital context.

Two points in this context that we heard before, when the morals were mentioned: it's a question of who is enforcing, who holds the decision‑making power at the different steps.  Very often it is public morals as understood within a very patriarchal system.

That is very often used as an excuse, a paternalistic approach to protecting women online by restricting their expression and their possibility to speak out.  This links back to the point that freedom always has to be inclusive, because otherwise it's a privilege of some, which means it's not freedom and it's not a right for everybody.

I also brought a few copies of the joint declaration in case you want to grab one.  For the sake of time, just two more small things, because I think it's important to touch upon the safety of women journalists online, which also links to the question about media regulation specifically, where the OSCE has worked for quite a while.  I also brought a guide, because what we tried is moving away from recommendations.  We heard that there are plenty of recommendations and documents out there on what should be done; we are trying to look at how it should be done.

How was it successful in a specific context?  What are the good practice examples we can learn from?  And what can the different stakeholders do, be it the legislative, the executive, the judiciary, but also law enforcement, which is the first step of institutionalized impunity when we speak of online violence?

So there are all of these different actors, and we need this whole-of-society approach, but realizing that safety does not only mean not being killed, as was pointed out by Bia, but also legal safety in all of these different aspects.

In this context, it's also relevant to mention the monitoring of online attacks.  If we look at the cases where violence against women journalists escalated to horrific online attacks, offline violence or even assassinations of women journalists, you can really trace the history of attacks online beforehand.

So I think it's crucially important, and this is something we are working on now together with ICFG, trying to identify indicators for escalation so we can move from denouncing something after it has happened to a genuinely preventive approach, recognizing that online violence is first of all an indication of potential offline violence later and, in the worst-case scenario, impunity.

The second point is platform governance.  It was touched upon a little bit by Theo, but in the online context most of the power and the gatekeeping to information and information spaces is really held by a handful of corporations that are all based in Silicon Valley.  So there are other issues there as well, but there are also questions of transparency, accountability and inclusiveness in the processes that are being developed.

Here I want to point out that if gender and human rights are not considered from the very beginning, really from the design and development of technologies and the setting of the rules for the spaces where people can speak out, it's very difficult to bring them in later, and you will have all of these gendered impacts and effects.

So human rights due diligence is crucially important, and it has to include this gender perspective, but it must be clearly linked to mitigation measures that also have this gender perspective.  This is where regulation comes in: states have to ensure that this framework for online speech is set, but without focusing on content.  I think this was made clear before: the focus should be on processes, on the reach of speech and not the speech itself.

I'll close with that, but we also work on platform governance and I brought a few publications on this.

>> MODERATOR: Thank you, Julia.  I think what you brought up is an excellent summary that paves the way to the closing remarks as we wrap up the session.  Going back to the title of the session, are regulatory practices really friend or foe to gender and digital rights?  I would be interested to hear what you think, in 30 seconds.  I'll go with Mahima Kaul first.

>> MAHIMA KAUL: To be very brief, I'm based in India, and this country can be very patriarchal in its view of women and of the right that women have to be online.  I do believe we need to protect women on the Internet and not from the Internet, which is sometimes the view that a very paternalistic government can take.  I will leave it there.  Thank you for having me.

>> MODERATOR: Onica?

>> ONICA MAKWAKWA: I will briefly say that it's important that we continue to do this work and consistently recognize that online violence has offline impact and vice versa.  So it's really important that we not see online violence in isolation from what is actually happening with gender inequality in our society in general.

>> MODERATOR: Thank you.  Theo.

>> THEO JAEKEL: Thank you.  I will just reiterate the point of making sure that any kind of future due diligence legislation really captures these issues.  The other point is, of course, that any due diligence effort must include the affected and impacted stakeholders in the process.  Thank you.

>> MODERATOR: Thank you.

>> MARIANA VALENTE: My answer is: if gender lenses are considered, and if there is a good understanding of the general context, social and legal, then definitely a friend.  But it can be an enemy too.

>> MODERATOR: Thank you.  Julia.

>> JULIA HAAS: Yes, and it needs to be built on collaborative approaches so that regulation is really a preventive, protective and, in the end, empowering tool.  One additional sentence: women's rights are always an early warning sign.  I think we have to link this to the general authoritarian trends we see across the globe and the backsliding of human rights.  So this is an aspect where we need to fight for human rights, linked to the broader discussion of human rights and democracy.

>> MODERATOR: Collaborative approaches, dismantling institutionalized impunity, taking a full value-chain approach, and protecting women on the Internet and not from the Internet.  I think all of those are words that have echoed during the session.  So now we are going to take a few questions from the floor, or from the online chat if there are any.

>> AUDIENCE: Thank you for this panel and for platforming this conversation.  You spoke about the U.K. and the U.S.  I'm wondering what your interactions are with regulatory frameworks that are perhaps hostile to products like Bumble; I'm thinking of Pakistan or other countries in Asia or in the non‑West.

And secondly, how are you approaching protecting queer communities in countries or contexts where same-sex relationships might either be heavily discriminated against or even illegal?  Thank you.

>> MODERATOR: We will take one more question.

>> AUDIENCE: Thank you very much for this great panel.  I am from Jordan, where the culture of honour is very present in everyday life.  Even if it doesn't kill you, you live in the fear that it will at least harm you greatly.

And there seems to be a lot of encouragement for us to speak up online, which is really good, and I see increased funding for lawyers for some high-profile cases, which is also good.  I was wondering what online communities can do during the time that women are waiting for the results of their trials, as well as in their lives afterward, because those two phases seem to be gaps in the programmes, and they are probably the periods of highest risk, because this is when women face the backlash.  Thank you.

>> MODERATOR: Thank you.  So I guess I will go to Mahima for the first question.

>> MAHIMA KAUL: Thank you for the questions.  We are starting to engage with other countries, though we haven't engaged with all of them yet, and it depends on where we find the conversation.  I'll give the example of Australia, because we have actually been involved with all of the new processes regarding how the sector is going to be regulated, so there it's quite drilled down in terms of detail.

In India, for example, you will find laws that have been passed that capture some of the tactics used against women online.  We have also hosted a number of conversations, which we hope will lead to more formal reports, about whether the development of tech policy has had a gendered view in mind and whether the impact on gender has been measured, and there are researchers who have done good work on this in India.

I'm sure some of the panelists are familiar with their work.  We are drawing on their papers and observations and holding these multistakeholder discussions with great participation from some of the think tanks in India.

As for making sure that we cater to all communities, as you know, Bumble started with more of a binary in mind.

The user base has grown, so we worked with GLAAD in the U.S. to make sure users can self‑identify, and that translated into looking at how matching is done so that women make the first move.

Similarly, when it comes to protecting different communities, we are working with groups to understand what further protections we can offer.  To give an example: women in India wanted to show only their initials in their profiles rather than their full name, until they chose to share it, because matches might find them in other places on the Internet.  That was a protection we introduced in India first.

So it's a work in progress.

>> MODERATOR: We'll take another question.

>> AUDIENCE: My organisation has been focusing on online gender‑based violence for a few years, and we have found a pattern that has already been talked about here.  I want to raise an issue we face when trying to get online gender‑based violence reported to law enforcement: there is a lack of digital evidence that we can get from the digital platforms, and there is also the issue of law enforcement asking the victim to surrender their device, which poses a privacy issue.

So are there any best practices from the digital platforms for an evidence-preservation mechanism that can be easily accessed by the victim, so they can provide evidence to the police?  And from the technical side, like the design of the device, how can we get through digital forensics without having to surrender our own privacy when we give the police our phone, for example?

>> MODERATOR: Thank you so much.  Who would like to take Jenna's question?

>> I can start.  I think we have been seeing a lot of trials involving violence against women and domestic violence that got heavily mediatized around the world, and one thing I think is important to say, since you were referring to online communities and what could be done in between: it's really important that online communities and feminist movements provide support online to the women who are going through this, especially because we have seen from very mediatized cases in the U.S. and Europe that campaigns and public opinion can start to shift in a very sexist direction.

And we have seen cases from the Global South of trials that got very mediatized in which the feminist movement's support for the woman who was on trial was very important, be it for public opinion, be it for direct support.  One could argue that the cases themselves end up being influenced by this public opinion, though that's a harder claim to make.

Anyway, I think it's important that these discourses get mobilized.  And just to quickly address the part about evidence: I don't think I can answer all of that, but for sure the judiciary should have guidelines for accessing evidence and keeping it safe.  That's a problem we have been seeing in many countries: sometimes people won't report something because they know there are no really secure practices, and perhaps something they just want to keep private will get disseminated even more widely.

That's not the full answer, since you were referring to platforms, but just adding to that.

>> MODERATOR: Thank you.

>> ONICA MAKWAKWA: I think it's important that we continue the work.  There is a piece of work going on right now under Generation Equality, in the technology action coalition, that is looking at platforms themselves including some of these features in the design of their platforms, and I think it's important for us to connect the discussions here with what's happening in that work stream.

In the case I talked about in the opening, we actually had great difficulty defending the young women in court who were being sued for defamation of character by the accused perpetrators, because the particular platform they had used to out these perpetrators had closed the account and taken down the post.  So based on some of what we have been watching, there is a role for platforms to play, but it also goes to the issue of not overburdening victims of violence with having to defend themselves as well.  It is a form of secondary victimization where they must continuously tell their story at every point and prove that this actually happened.

And fundamentally, I think it's important for us to connect this to what's happening in societal discourse.  It all stems from a narrative of not believing women.

Therefore, the platforms have to play some role, because they perpetuate this notion that women could be lying about it, and so they have to step up around issues of making sure evidence is available.

>> MODERATOR: Thank you, Onica.

>> MAHIMA KAUL: I think design is important.  There was a case in Australia which made it clear that if the alleged perpetrator unmatched, the victim was not able to access the chat and so was not able to report.  At least from a Bumble perspective, when we saw this in the news, we realized a design correction needed to be made: if someone has done something to you and they block you or unmatch you so that you cannot find them to report them, that option should still be available.

So, yes, I agree.  I think platforms should make it easier to give people what they need to go get the justice they want.

>> MODERATOR: Thank you, and thank you for joining us online.  Thanks, everyone here today for joining us on a Friday afternoon.  That's a wrap.  Thank you.