Organizer 1: Alexander Schaefer, German Federal Ministry of Justice and Consumer Protection
Speaker 1: Gerd Billen, Government, Western European and Others Group (WEOG)
Speaker 2: Chan-jo Jun, Civil Society, Western European and Others Group (WEOG)
Speaker 3: Ingrid Brodnig, Civil Society, Western European and Others Group (WEOG)
Speaker 4: David Kaye, Intergovernmental Organization, Western European and Others Group (WEOG)
Panel - Auditorium - 90 Min
What role should Internet platforms and governments play in defining the standards for acceptable content online in the light of freedom of speech? How can globally accepted standards be developed? Where is the middle ground between increasing demands for proactive content policing by digital platforms and the necessary neutrality and legal certainty for platforms?
GOAL 4: Quality Education
GOAL 8: Decent Work and Economic Growth
GOAL 9: Industry, Innovation and Infrastructure
GOAL 16: Peace, Justice and Strong Institutions
Description: Participants:
- Gerd Billen, State Secretary, German Federal Ministry of Justice and Consumer Protection (confirmed)
- David Kaye, UC Irvine School of Law, UN Special Rapporteur on the Right to Freedom of Opinion and Expression (t.b.c.)
- Karine Nahon, Associate Professor, The Information School at the University of Washington and the Interdisciplinary Center Herzliya (Israel) (t.b.c.)
- Chan-jo Jun, specialist lawyer for IT law (confirmed)
- Ingrid Brodnig, author, activist and journalist (confirmed)
The workshop will begin with a presentation by attorney Chan-jo Jun, who rose to prominence by supporting victims of online hate speech and initiating legal proceedings against Facebook. The other speakers will then have the opportunity to describe similar and/or differing situations from their own perspectives. The representative of the German government, Gerd Billen, will then give a short overview of the Network Enforcement Act and explain why the German Parliament passed the law, which introduced compliance obligations for social networks when handling complaints about illegal content online. The other participants will be asked to discuss further available instruments and strategies to fight harmful content. In particular, the discussion will focus on which safeguards should be applied to protect freedom of speech (esp. David Kaye). Finally, the panel will debate the prospects for developing internationally accepted standards on how to deal with harmful content. Throughout the workshop, the audience will have continuous opportunities to share their views and ask questions.
Expected Outcomes: A possible outcome could be the conclusion that it is a joint responsibility of all stakeholders to ensure a free and safe Internet from which harmful content is removed swiftly and effectively. Different opinions will nevertheless remain on which instrument is most appropriate for reaching this aim. It should, however, become clearer what is meant by harmful content and that certain limits on the removal of content are needed in order to preserve freedom of speech.
An onsite moderator (still to be designated) will guide the workshop and ensure that the audience can share their views and ask questions.
Relevance to Theme: When the World Wide Web was developed in the 1990s, hopes and expectations were high that it would be a space where people around the world could communicate freely and safely. In recent years, however, it has become apparent that social networks in particular are often misused to spread hate speech and serve as venues for harassment and bullying. As a consequence, trust in the Internet has been shaken. Although social networks initially denied accountability for harmful third-party content, governments and civil society urged them to remove such content from their platforms. In some cases social networks agreed to participate in codes of conduct; in others, legislators introduced a legal framework with which social networks must comply. In this context, two aspects need to be observed: first, measures must be effective in stopping harmful content; second, measures must strike a balance between the protection of human dignity and freedom of speech. If these principles are respected, trust in the Internet can be restored and the accountability of providers established.
Relevance to Internet Governance: At the centre of the debate is the question of what roles governments, the private sector and civil society should each play in addressing the challenge of hate speech online. Views differ on who should be responsible for setting the rules that keep the Internet free from harmful content. The possible instruments range from private standards and codes of conduct to legally binding rules.
Usage of IGF Tool