IGF 2021 Lightning Talk #35: Australia’s approach to regulating online harms – navigating the great balancing act of the digital age.

Tuesday, 7 December 2021, 08:00–08:30 UTC
Conference Room 6

eSafety Commissioner
- Commissioner Julie Inman Grant, eSafety Australia
- Dr Julia Fossi, Director of International Strategy and Futures, eSafety Australia
- Ella Serry, Manager, International Engagement and Capacity Building, eSafety Australia


- Commissioner Julie Inman Grant, eSafety Commissioner, Australia

Live presentation

As the world's first eSafety Commissioner, Julie Inman Grant will provide an overview of Australia’s approach to regulating online harms, including targeted content takedowns in conjunction with significant prevention and proactive change initiatives. Commissioner Inman Grant will outline how eSafety serves as an important “safety net” when harmful content falls through the cracks, examining the delicate balancing act of tackling online hate and misogyny that silences victims while preserving free speech, and delving into the challenges that encryption, unethical AI and decentralisation pose to the safety of us all online.

Content moderation is complex and multi-faceted. In addition to government regulation, it requires digital platforms and services to actively foster a safer online environment. The Commissioner will discuss how Safety by Design, eSafety’s world-leading initiative developed in conjunction with industry, can play a crucial role in assisting companies to do this. Finally, the Commissioner will highlight the need to navigate the powerful but necessary tensions between privacy, security and safety, outlining eSafety’s position on the often vexed issue of online anonymity. She will also discuss the importance of securing harmonisation across jurisdictions to avoid the emergence of a “regulatory splinternet” as governments develop online regulations which suit the unique environment of their jurisdiction, while also cooperating with each other to create a harmonised environment.
The session will focus on the following policy questions (based on the IGF 2021 issue areas and narratives):

• Emerging regulation: market structure, content, data and consumer/user rights regulation
  o Content moderation and human rights compliance: How to ensure that government regulation, self-regulation and co-regulation approaches to content moderation are compliant with human rights frameworks, are transparent and accountable, and enable a safe, united and inclusive Internet?
  o Protecting consumer rights: What regulatory approaches are/could be effective in upholding consumer rights, offering adequate remedies for rights violations, and eliminating unfair and deceptive practices on the part of Internet companies?
• Trust, security and stability
  o Ensuring a safe digital space: How should governments, Internet businesses and other stakeholders protect citizens, including vulnerable citizens, against online exploitation and abuse?
  o International standards: How should international standards address the different requirements and preferences of governments and citizens in different countries?

The session will also touch on the following issues (also based on the IGF issue areas and narratives): violent content, content moderation, hate speech, freedom of expression, interoperability, decentralisation.

The first component will be a presentation delivered online via a live stream with accompanying slides. The audience will be able to send in questions in a live chat section as the Commissioner is speaking. After the presentation, our online moderator will select questions asked in the chat for the Commissioner to answer. This will enable meaningful and real-time interaction with audiences.

Key Takeaways

• Globally, we’ve reached a tipping point. Governments are starting to wake up to the fact that what’s playing out online is not only a threat to individual citizens, but also an existential threat to democracy and civilised society.
• Our goal is to avoid a patchwork and fragmentation of online safety legislation, governance arrangements and national online safety measures.

Call to Action

Governments, industry and civil society need to find a way to balance fundamental human rights in the digital environment, and stop thinking of privacy, security and safety as mutually exclusive. A healthy system relies on all three existing in a natural symbiosis.

Session Report

Commissioner Inman Grant’s lightning talk explored the evolution of harms on the internet. She noted that today’s internet has brought humanity a myriad of benefits. The COVID-19 pandemic has seen the internet become an essential utility as the whole world turned to it to continue to work, learn, communicate and be entertained.

But along with the good, we also need to acknowledge the bad. Unfortunately, the internet has also become a highly enabling environment for many forms of abuse.

There is the relentless online bullying of children, targeted misogyny, hatred and racism, the unchecked spread of disinformation and misinformation, terrorist weaponisation of social media, and, most horrifying of all, the grooming, sexual exploitation and abuse of children.

Commissioner Inman Grant remarked that globally, we’ve reached a tipping point, and governments around the world are starting to wake up to the fact that what’s now playing out online is not only a threat to individual citizens, but also potentially looms as an existential threat to democracy and civilised society. 

The Australian Government recognised this challenge, and established the eSafety Commissioner in 2015. Commissioner Inman Grant took participants on a deep dive through Australia’s approach to regulating online harms.

The discussion then turned to operating in the complicated environment of content moderation. The Commissioner noted that content moderation discussions are often characterised by binary views: the need to uphold privacy and security on one side, or safety and protection on the other. It is vital that the technology industry and wider society start to reframe how we view our most pressing online problems.

The Commissioner implored participants to stop thinking of privacy, security and safety as mutually exclusive. A healthy system relies on all three existing in a natural symbiosis. People often talk about the absolute right to freedom of expression and freedom of speech, but what about when this free speech veers headlong into the realm of targeted abuse and online harassment? She asked the audience, shouldn’t those on the receiving end have an equal right to exist free from online violence? Absolute free speech often ends up taking away the free speech of others as they are driven offline by a barrage of harassment and abuse.

Digital rights protectors also have a moral responsibility to broaden this discussion and take a more nuanced approach that at least gives equal billing to the rights of those who are most at risk online alongside the rights of freedom of speech and privacy. Governments, industry and civil society all need to find a way to balance a set of fundamental human rights.

The conversation then shifted to emerging issues, tech trends and challenges. Because technological change will always outpace policy, it is imperative, to be effective, to stay a step ahead of tech trends and challenges and to ensure that a lens of ‘safety’ is applied to emerging issues.

She noted the balanced and nuanced view of emerging technologies and trends taken by the eSafety Commissioner: weighing up both the risks and benefits new innovations could have for the safety and wellbeing of the public, providing a critical lens on how these could be used to abuse, harass or harm individuals, and pointing to solutions where they exist.

Questions and answers focused on a range of key issues and themes, including the need for international collaboration to address online harms, how to regulate the metaverse, and the need to protect women from online harassment and abuse.