IGF 2021 – Day 1 – Lightning Talk #35 Australia’s approach to regulating online harms – navigating the great balancing act of the digital age.

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.



>> COMMISSIONER JULIE INMAN GRANT:  Hello, everybody.  Thank you so much for this opportunity to speak to you today.  As Australia's eSafety Commissioner, heading up the world's first online safety regulator, I am really excited to share with you Australia's approach to regulating online harms, at a time when there is global recognition of the importance of online safety regulation and of how it might be done in ways that preserve freedom of expression while still allowing innovation and technology to thrive.  During the presentation, please feel free to send any questions in the chat function and my moderators Julia and Ella will respond directly to you.  There will also be time at the end for questions.  Next slide.

It really wasn't all that long ago that the internet was still in its infancy and the world was still very much in an analog state.  In fact, while computers have been networked in one form or another since the 1960s, many of the online threats we are discussing today at the Internet Governance Forum simply did not exist then.  It wasn't really until the 1990s, when affordable personal computers met the 56K modem, that the Internet took off.  These game-changing devices finally brought the Internet to the masses.  Working in tech during the 1990s, it was a time of great promise, and also a time when any talk of taxation or regulation of this burgeoning industry was framed as an impediment to innovation and growth.  Tech companies were moving fast and breaking things and wanted only one thing from government: to stay out of their way.  Governments all around the world were happy to oblige.  The internet then was vastly different to the one that we know now.

Today's Internet has brought humanity a myriad of benefits.  The COVID-19 pandemic has seen the Internet become an essential utility as the world turned to it to continue to work, learn, communicate and be entertained.  But along with the good, we also need to acknowledge the bad.  Unfortunately, the Internet has also become a highly enabling environment for many forms of abuse.  There is the relentless online bullying of children, targeted misogyny, hatred and the terrorist weaponization of social media, and, most horrifying, the grooming, sexual exploitation and abuse of children.  I think there is no doubt we have reached a tipping point, and governments around the world are starting to wake up to the fact that what is playing out online is not only a threat to individual citizens, but also a threat to democracy and civil society.  The riots gave the world a small taste of what this future can look like.  Because polarization and multiple vectors of abuse have become (?) online, it can be difficult to know where to start.  So why then is Internet safety becoming such a popular idea and a growing focus amongst governments?  I think there is a really practical reason for that: the market's failure to self-regulate and address issues of user safety.  Right now the burden falls squarely on the shoulders of children and vulnerable communities.  Right behind them are the parents and others seeking to protect at-risk voices.  They are at the top of an inverted pyramid, while big tech is currently at the bottom.  We need to flip that pyramid so that responsibility sits at the top, with children and marginalized communities at the narrow end.

It is sad for me that the very companies we once saw as beacons of internet freedom decades ago are now seen as facilitators of an unsafe Internet that serves to silence diverse voices.  They have the smarts to be a major part of the solution to making the net safer and more inclusive.  They have the means.  Many of them just don't have the will or, frankly, the leadership to make the safety and wellbeing of their users a priority.  And while there is a burgeoning tech safety sector, there is no solution coming from the market.  So governments seeking to keep their citizens safe have had no option but to intervene.  But we need to ensure that blunt-force options that hinder the benefits of technology are not the go-to approach, and that we do not create a splinternet of regulatory requirements across the globe.

Australia has been ahead of the pack in this regard: in 2015, the government established eSafety, the world's first dedicated online safety regulator.

Earlier this year, our parliament passed the Online Safety Act, which not only strengthens our existing powers but also provides eSafety with some new tools in our arsenal to help us better protect more Australians online.  Building a citizen-focused agency like ours has not been easy.  We have had to write the playbook as we go along, and we are still seeking to fill in more pages.  In just six short years, we believe we have set up a workable, practical model.  Prevention, of course, involves developing best-practice resources and programs to prevent online harms from happening in the first place, and our research team is critical in informing this.  The next pillar is protection, and this covers the work of our investigations divisions, who run our regulatory schemes.  We can provide direct support through the application of a range of civil powers to take down harmful content, whether child sexual abuse content or the non-consensual sharing of intimate images and videos.  The third pillar, proactive and systemic change, involves leveling the online playing field by encouraging tech platforms to take greater responsibility for user safety, putting the protections in up front rather than retrofitting them after the damage has been done.  To us, this is about long-term cultural change, and it goes to the heart of our Safety by Design initiative: companies assessing the risks and building safety in, baking it in rather than bolting it on.  But I will discuss that later.  We also know technology will outpace policy, so we are actively shaping the future by understanding tech trends and challenges, looking at the positive use cases but also surfacing and mitigating risks for citizens.

There is no question that COVID-19 has affected all of our lives in profound ways.  This is also reflected in the data we have collected over the last two years.  As citizens across the globe began living more of their lives online, online harms became supercharged.  In Australia, we saw spikes across all of eSafety's reporting areas.  During 2020, our online content scheme received 21,000 reports, in a population of 26 million; that was a surge from 11,000 to 21,000 reports.  We also saw a 114% increase in reports of image-based abuse, which is not surprising given that people did turn to digital intimacy tools, but we also saw a range of extortion scams that Australians were falling prey to.  At one point, we had a 600% increase in reports to our office.  Reports of the cyberbullying of children increased by around 30%.  And sadly, the elevated levels of online abuse have shown no signs of abating in 2021.  We believe what we are seeing now represents an alarming new normal.  These statistics are not unique to Australia; many other countries are seeing the harm online abuse is causing their citizens.  Here are some of the voices of Australians we have helped.

>> I was harassed online all through school.

>> My ex used my phone to stalk me.

>> We were mean to another player online.

>> I wasn't sure if online classes were safe for my students.

>> Someone shared a nude photo of me online.

>> It made me feel exposed.

>> Ashamed.

>> Trapped.

>> It made me feel angry.

>> You are not alone.  If you're online, eSafety can help.

>> ESafety helped me respect others online.

>> ESafety helped me learn to protect myself online.  Helped me be a more confident teacher.  Helped me get my picture removed.

>> COMMISSIONER JULIE INMAN GRANT:  It seems that sometimes buffering isn't perfect either.  What sets Australia apart is what we are doing to combat those numbers, and this includes world-leading new online safety reforms passed by the Australian government in July.  When the Online Safety Act comes into effect, the new laws will provide enhanced powers to significantly boost online protections for Australians.  Our existing cyberbullying scheme will extend beyond social media to include dating sites and online gaming platforms.  The Act will also bring in a new world-first adult cyber abuse scheme.  With it, we are trying to balance freedom of expression and opinion on one side with the harmful, targeted harassment that is really meant to cause harm and silence voices on the other.  We will also strengthen our image-based abuse scheme by enabling us to remove that content more rapidly and expeditiously: the time sites have to respond will be reduced from 48 to 24 hours.  We get those images and videos taken down from sites all over the globe.

The Act also carries significant fines for offenders and makes provision for heftier penalties for platforms, so that abuse like this can't keep happening.  I am happy to answer more of your questions about the diverse tools that form part of this Act, but I want to turn now to Safety by Design.

Along with our new powers, there will be an underpinning regulatory regime that gives us a modern set of tools to address online harms.  But unless we are making the platforms and the internet infrastructure we are all using safer, we are never going to regulate our way out of this; we will end up playing a game of whack-a-mole.  At the core of this is our world-leading Safety by Design initiative to encourage the industry to design more mindfully, which also makes good business sense.  We sat down with more than 180 organizations and companies, along with civil society and others, to look at how we could come up with a set of principles that everyone could agree to.  The idea, as I said before, is to help build safety into products before they go to market rather than retrofitting them after the damage has been done.  In 2018, working with these organizations, we came up with a set of three principles.  But principles are only as good as their implementation; otherwise they are just principles.  We also sensed principles fatigue.  So, in speaking with some of the companies, we decided to take this a step further, because while there are lots of such tools in other areas, there was nothing in the safety space.  Over an 18-month period, we developed, coded and tested two sets of interactive assessment tools: one for start-ups, which is very educative and takes about an hour, and one for enterprise companies.  These tools take companies through a set of questions covering everything from leadership to internal policies to best-practice moderation and accountability measures, asking them what systems and processes are in place, and then produce a health check report.  Organizations in 32 countries have now accessed the tools and health checks.  Because the tools are educative, we don't want any company, particularly a start-up, to be able to say "we just didn't know how our platform would be weaponized."
The whole idea here is to lift safety standards, and hopefully companies will be better able to comply with laws and keep their users safer if they are utilizing these tools.

You may be wondering how, when and where we will choose to wield these new powers.  I am here to tell you that we apply a harms-based lens and take a graduated approach to everything we do.  We see this across our regulatory schemes: to give you an example, 90% of the youth cyberbullying cases that we have settled, in terms of removing content, have been resolved collaboratively without us having to issue a formal removal notice.  But at the same time, we won't hesitate to use the powers at our disposal if companies aren't playing along, aren't really tackling this harmful content, and aren't living up to the people, processes and technologies that they are evangelizing.  So we will continue to apply these powers in a fair, transparent and proportionate way.  Next slide.

So many of you joining today are really in the same boat as us, operating in a very complicated online environment.  Sadly, today the debate is often characterized by binary views that pit privacy and security against safety and protection.  It is my belief that we have to stop thinking of privacy, security and safety as mutually exclusive.  A healthy ecosystem relies on all three existing in a natural balance; without each other, the stool will fall over, and all three legs need to be strong and work in tandem to support an overall safe environment.  And yes, there is a natural tension between them, but a certain tension is healthy.  People often talk about freedom of speech, but when others are subjected to targeted abuse, shouldn't they have the right to exist online free from violence?  In my experience, such abuse, on Twitter or elsewhere, often ends up taking away the free speech of others.  People tell us they need protection, particularly from that kind of abuse targeting diverse communities.  I believe digital rights protectors also have a moral responsibility to broaden the discussion and take a more nuanced approach that at least gives equal billing to the rights of those most at risk online, alongside the rights of freedom of speech and privacy.  Governments, industry and civil society need to find a way to balance and recalibrate these fundamental sets of rights.  Next slide.

I would like to leave you with some final thoughts.  For us to be effective, it is imperative we all stay a step ahead not only of new tools and technologies, but also of the new paradigm shifts, whether the metaverse or decentralized Web 3.0.  That means not just surfacing risks but also the innovations that can benefit the public, providing a critical lens and then pointing to solutions where they exist.  Years ago we shared insights into how artificial intelligence creates deepfakes; nobody was interested.  Now you can't pick up a technology publication and not see something on deepfakes or encryption.  And in terms of paradigm shifts, what can possibly go wrong in the metaverse when you have (inaudible) built on headsets together?  We are looking at immersive technologies as well.

This outlines how we believe we need to be looking critically at governance models and building in safety from the outset.  We need to learn from the lessons of the past: if there is no responsibility or accountability, then there may be no way to remediate or redress online harms.  We need to think about how to shape the future so that everyone can have the best experience online.  You can read more about our multifaceted approach.  I look forward to taking your questions.

>> ELLA SERRY:  Thank you so much, Commissioner, for your presentation.  We do have a question in the chat.  It asks: as eSafety's remit has grown to encompass new duties and responsibilities, is there anything you have learned during this process that you would share with other regulators around the world?

>> COMMISSIONER JULIE INMAN GRANT:  Right.  That's a great question, and, um, I was in the UK last week meeting future regulators, and I'm going to Brussels tomorrow to have these precise discussions.  So I think it's going to be interesting; there will be different approaches taken.  I believe the value we provide is a citizen-based complaints service, whereas others are looking at a systems-and-processes or duty-of-care approach, where they won't be providing individual complaint schemes but will instead take big enforcement actions around systemic weaknesses and challenges.  On the one hand, you can end up playing a constant game of whack-a-mole, but working with citizens at the coalface gives us a richness of insight into how people are weaponizing platforms and where the systemic failures are.  Regulators taking the other approach can do a lot through open analytics, behavioral analysis and the like, but without seeing how the platforms are being weaponized, they won't have that valuable intelligence at their disposal.  At the same time, we can't keep playing a game of whack-a-mole and only be remediating harms after they happen.  There is also something called the Basic Online Safety Expectations, which will be part of our legislation.  It will allow us to compel transparency reports from the companies, whether periodic or on specific issues.  I think that transparency and accountability is really what's missing, and it is what the whistleblower has brought to everyone's attention: much more radical transparency than anyone expected.  Does that answer your question?

>> ELLA SERRY:  There's a comment from Nigel: your work is really important, and the legislation will benefit from the experience of others, which is great to hear.

>> COMMISSIONER JULIE INMAN GRANT:  Yeah.  Thank you, Nigel.  That was a great opportunity, and DCMS did an incredible job of organizing it.  We've had wonderful conversations with DCMS, and also with Dame Melanie Dawes, and by necessity there is going to have to be global collaboration.  We want to be on the same trajectory, but governments will take different approaches to all of this, and we need to make sure the regulation is workable and that companies can comply around the world.  Other industries have been regulated for decades and, um, you know, if we look back 50 years, cars weren't invented with seatbelts.  The car manufacturers pushed back then, but now you wouldn't get into a car without seatbelts and airbags.  Safety by design is a premium feature and something that car companies compete over, and of course it is underpinned by international standards.  We may end up going the same way with tech safety.  Even if you're a small cafe, you're not allowed to poison people or make them sick.  I think we have to take a proportionate approach with start-ups and the like, but I don't think it has to be ‑‑

>> ELLA SERRY:  Thanks, Commissioner.  There are a couple of other questions.  Thank you for the excellent and informative session.  One other question that has come through is on a policy matter.  Someone is interested to know about the safety risks associated with extended reality and what some of the key considerations might be in regulating this space.

>> COMMISSIONER JULIE INMAN GRANT:  Safety by Design is going to be really important with these technologies, and I would encourage you to look at our tech trends work.  We take an interesting approach: we look at research, you know, wherever we can find it, and then we compare that with our investigative data, so it ends up being a cross-disciplinary approach.  One of the things we're concerned about is the weaponization of these technologies for abuse.  We're talking about teledildonics and AR and VR: just as you can have a hyper-realistic, hyper-pleasurable sexual experience, you can have a very frightening and awful experience of sexual assault.  It is impossible to predict everything, but I find, in my 30 years working in this area, that human beings have a potential for misusing and weaponizing technology that we might not even contemplate.  Again, the message here is: let's proactively shape the future rather than letting the future happen to us and then having to clean up the mess afterwards.

>> ELLA SERRY:  Thanks, Commissioner.  We have one final question from the audience on site in Poland.  Could I please hand over to the IGF team for that question?

>> Thank you very much, Commissioner.  Very good presentation.  I have a question.  I myself am a female parliamentarian, and online abuse is being directed at women.  I wanted to find out: what is your experience in making the online world a safe space for women parliamentarians, and what could be done in partnership to make sure we have more women politicians online?  Abuse of this kind leads other women to choose not to be online, and that contributes to the already huge digital divide; by not having women legislators online, it defeats the purpose of that connectivity.  Thank you.

>> COMMISSIONER JULIE INMAN GRANT:  100%.  First of all, thank you for your bravery, and I'm so sorry you have experienced this.  I would say that is a universal experience of women in any kind of spotlight, whether they are politicians or journalists.  In fact, we will be putting out some research very shortly in which we spoke to women, and 35% said they self-censor so they won't cop abuse online; they think that's just the way the online world does business.  What a lot of men frankly don't understand is that women experience this in ways men don't: it is menacing and harassing, and I in fact believe it entrenches gender inequality, which is what it is designed to do.  In the study we are going to be releasing, 25% of women didn't take a promotion or a more senior leadership role because it would require them to have an online presence, and they just didn't want to deal with that.  We really need to tackle this.  We will be tackling it, of course, through our adult online abuse scheme, but we also have programs and cyber abuse guidance.  We have a program called Women in the Spotlight, which is social media self-defense training we will be delivering to parliamentarians.  I have to run to catch a train to Brussels, but Ella and Julia will remain on and can answer some questions, and I do recommend you go to our website.  We are trying to really shape the future, and I might add that it is not only women who are disproportionately targeted: those with what we call intersectional factors in Australia, such as identifying as LGBTQI, are more likely to receive online harassment.  It has to stop.  We know that humans are in the frame and the platform just provides the amplification, but if we allow that to continue unchecked, we will normalize that behavior, and we won't have the societal change that we really need to see.  Thank you so much, everyone.

>> ELLA SERRY:  Thank you so much, Commissioner.  It was great to hear from you today.  I think this session is due to finish now.  However, if people do have follow-up questions or would like more information, please do not hesitate to get in touch with the team.  The e-mail address is [email protected].  Thank you so much for your attendance, and enjoy the rest of the Internet Governance Forum.