IGF 2019 WS #141 Best practices for child protection and sexual speech online

Organizer 1: Jyoti Panday, Electronic Frontier Foundation
Organizer 2: Ingerman Meagan, Prostasia Foundation

Speaker 1: Malcolm Jeremy, Civil Society, Western European and Others Group (WEOG)
Speaker 2: Catherine Gellis, Private Sector, Western European and Others Group (WEOG)
Speaker 3: Jillian York, Civil Society, Western European and Others Group (WEOG)
Speaker 4: Fauzia Idrees Abro, Technical Community, Asia-Pacific Group
Speaker 5: Takashi Yamaguchi, Private Sector, Asia-Pacific Group

Policy Question(s): 

How can children’s rights to participation, access to information, and freedom of speech be preserved and balanced with their right to be protected from violence, exploitation and sexual abuse in the online environment?

How can Internet platforms of all sizes take a nuanced and better-informed approach towards content moderation and censorship that does not over-censor legitimate sexual content such as art, fiction, sexual education material, and testimonials from survivors?

How can these platforms fulfill their obligations under the United Nations Guiding Principles on Business and Human Rights to conduct due diligence that identifies, addresses and accounts for actual and potential human rights impacts of their activities, when it comes to measures they take for the protection of children?

Relevance to Theme: The session contributes to the theme insofar as it addresses the management of risks to child safety online, while also taking a broader perspective to ensure that potential solutions do not infringe the human rights of other stakeholders, especially those who are stigmatized and marginalized, such as sex workers, adult entertainers, creators and fans of independent media, sex educators, the LGBTQ+ community, and others who have legitimate reasons for communicating sexual content online. It also concerns the need for trust and accountability of Internet platforms, which make decisions about the moderation of sexual content based on their internal policies and terms of service. In doing so, they frequently make use of resources such as hash lists and URL lists that are not made publicly available, raising questions about the accountability of actions taken using these tools.

Relevance to Internet Governance: Child Sexual Exploitation Material (CSEM) is almost exclusively distributed online, and a significant proportion of the sexual grooming of minors is also conducted online. Child protection laws such as FOSTA are directed specifically against Internet platforms, and in 2019 the UK government announced the introduction of a tough new regulatory regime requiring Internet platforms to assume a duty of care to keep children safe from online harms. Also in 2019 the United Nations Committee on the Rights of the Child released Draft Guidelines on the implementation of the Optional Protocol to the Convention on the Rights of the Child on the sale of children, child prostitution and child pornography, which contain a number of recommendations about the responsibilities of Internet platforms. As such, this session is deeply relevant to Internet governance.

Format: 

Round Table - U-shape - 90 Min

Description: Internet content platforms (such as search engines, social networks, chat applications, and cloud storage services) are frequently the first port of call for regulators seeking to find easy solutions to the problem of online child sexual abuse. But although platforms have made a vital contribution towards this effort and will continue to do so, there are at least three limitations of the approach that regulators are pushing platforms to take.

First, it tends to promote a “one size fits all” approach that overlooks the differences between platforms in terms of their financial resources and technical expertise. As the Internet Watch Foundation (IWF) has testified to the UK government, "There is a myth that the tech industry is a-wash with money and the brightest and the best brains, with the ability to solve all the world’s problems and whilst that may be true of some of the larger players, there is a need to recognise that much of the tech industry in the UK is made up of small start-ups that do not have access to the sorts of resources Government think they do."

Second, when platforms are pushed into over-blocking and over-censoring, this frequently results in infringements of the civil liberties of minorities such as sex workers, the LGBT community, and survivors of child sexual abuse. For example, the U.S. law FOSTA (Allow States and Victims to Fight Online Sex Trafficking Act), which was putatively aimed at making Internet platforms liable for the facilitation of sex trafficking, has in practice also resulted in the censorship of lawful speech, including sex education resources.

Third, an approach that pushes platforms into censoring any sexual content that they instinctively regard as “questionable” does not actually protect children, and could indeed harm them. Sometimes a platform's choice not to censor content is more likely to protect children from sexual abuse. For example, in response to FOSTA, threats of regulation, and public pressure, platforms have been found censoring child sexual abuse prevention materials and forums.

More broadly, United Nations Special Rapporteur David Kaye found in his 2018 report on the promotion and protection of the right to freedom of opinion and expression that the failure of platforms to apply human rights standards in their policies related to sexual content has resulted in the removal of resources for members of sexual minorities, and depictions of nudity with historical, cultural or educational value.

Currently, many platforms already have child protection policies as part of their content policies or community standards; however, these can be vague and unpredictable in their application even within a single platform, let alone between platforms. Smaller platforms may not have well-developed policies on this topic at all. Even at mid-size platforms, trust and safety teams are typically composed of members who deal with other forms of abusive content, such as spam and fraud, but lack dedicated expertise in child protection. Often, requests to block or restrict content are received from third parties but are not adequately reviewed internally before being actioned.

Platforms of all sizes need to be empowered to become more effective contributors to child sexual abuse prevention, through a more nuanced and better-informed approach towards content moderation and censorship.

Unfortunately, to date two obstacles have prevented this from becoming a reality. First, many of the largest mainstream child protection organizations that have promoted platform liability rules as a solution to child sexual abuse have a broader agenda to eliminate adult content online, and they exclude perspectives of those who don’t share that agenda, such as sex-positive therapists and researchers, LGBT people, sex workers, and the consensual kink community. As a result, there has been nobody to speak up when these communities become casualties of censorship such as over-blocking.

The second factor that has prevented platforms from taking a more nuanced and better informed approach towards content moderation and censorship as it relates to child protection is the powerful sexual stigma that affects all who work in this area. Although approaches based on the prevention of child sexual abuse are effective, stigma makes it difficult for this approach to make headway against the emotionally more resonant approach of identifying and prosecuting offenders. It also makes it difficult to suggest balances and safeguards for child protection laws and policies that are necessary in a free and democratic society.

Prostasia Foundation will be convening a multi-stakeholder seminar and roundtable discussion on the roles that Internet companies can take towards the prevention of online child sexual abuse, in a way that is consistent with human rights and Internet freedom. The first phase of this convening is a full-day expert-led seminar and discussion with Internet platforms, along with representatives from marginalized stakeholder groups, to be held in San Francisco in May 2019. Following this, a self-selected working group will form to work online to synthesize the learnings of the event into a draft, non-normative best practices document.

This best practices document will become the input for a roundtable workshop that is to be held at RightsCon 2019, at which we will hold a multi-stakeholder facilitated deliberation to further distill the draft best practices document into a series of normative recommendations. Finally, the best practices paper and the policy recommendations will be presented at the 2019 Internet Governance Forum with the aim of socializing them within a broader community of stakeholders, and assessing the degree of consensus that they have achieved. In addition, we will be presenting a new report on the transparency and accountability practices of major platforms, consultants and agencies involved in online child protection.

Expected Outcomes: The objective of this project is to enable industry participants to ensure that their child protection policies and practices are scientifically sound, and that they fulfill their obligations under the United Nations Guiding Principles on Business and Human Rights, which require companies to "Conduct due diligence that identifies, addresses and accounts for actual and potential human rights impacts of their activities, including through regular risk and impact assessments, meaningful consultation with potentially affected groups and other stakeholders, and appropriate follow-up action that mitigates or prevents these impacts."

By facilitating a dialogue with experts and stakeholders who are normally excluded from the development of child protection policies by Internet platforms, we aim to make these policies more evidence-informed, and more compliant with human rights standards. In concrete terms, this will be evidenced by improved accuracy in the moderation of sexual content. Specifically, participating platforms will remove more material that is harmful to children and has no protected expressive value, and remove less material such as lawful, accurate information on child sexual abuse prevention. The ultimate result of this will be that more children are saved from child sexual abuse.

In addition, four tangible outputs will be produced from this workshop and its preparatory events:

1. Best practices paper: The best practices document prepared in between the first and second face-to-face convenings will record the messages shared by experts, stakeholder representatives, and Internet platforms at the first convening in San Francisco. This document will include references to source materials and will guide participants at the second convening towards the development of key policy recommendations.

2. Policy recommendations: A set of policy recommendations will be finalized at the expert-facilitated follow-up event at RightsCon. Although the intention of this document is not to standardize terms of service related to child protection across the industry, it may include a set of model terms of service for Internet platforms with respect to child protection that smaller Internet platforms can easily adapt and use.

3. Transparency and accountability report: This inaugural report on the practices of Internet platforms, software vendors, and content rating agencies will become an ongoing resource for those who are affected by the child protection practices of these bodies, and provide an aspirational standard for improvements in their accountability and transparency.

4. Advisory network: The process will also result in the formation of a standing advisory network of stakeholders, with secretariat support from Prostasia Foundation, who can provide advice and feedback to Internet platforms on their child protection policies and their human rights impacts.

Onsite Moderator: 

Jyoti Panday, Civil Society, Asia-Pacific Group

Online Moderator: 

Ingerman Meagan, Civil Society, Western European and Others Group (WEOG)

Rapporteur: 

Ingerman Meagan, Civil Society, Western European and Others Group (WEOG)

Discussion Facilitation: 

The session will be divided into three parts of approximately equal duration. During the first part, the content of the best practices paper, the policy recommendations, and the transparency and accountability report will be outlined, and questions from the onsite and remote participants will be taken. During the second part, our diverse expert panel will react and provide their perspectives and further insights, and will invite an interactive discussion with the local and remote participants. The final 30 minutes of the session will then be devoted to intensive facilitated group deliberation on the policy recommendations, to assess the degree of consensus that they have achieved among the session participants. During this final part of the session, the best practice recommendations developed during the preparatory meetings will be discussed point by point in a roundtable format, facilitated by the onsite and online moderators, and with note-taking by the rapporteur.

Online Participation: 

Using the official online participation tool, our remote moderator will take questions and comments from remote participants. The remote moderator will be called upon in each round of questions taken from the floor, so that, to the greatest extent possible, remote participants receive parity of treatment with those who are present in person. We will also use the official online participation tool to provide links to presentation files that are being displayed in the session room, so that remote participants can load these on their own computers, rather than having to view them via the webcast video.

SDGs: 

GOAL 5: Gender Equality
GOAL 16: Peace, Justice and Strong Institutions