IGF 2020 - Day 4 - OF18 Safety by Design - implementation and impact

The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

     >> MODERATOR: Good morning, good afternoon, good evening. A warm welcome to Session 18 at the Internet Governance Forum 2020, focused on Safety by Design, the work spearheaded by the Australian eSafety Commissioner. Before I have the pleasure of introducing you to the panel, I would like to run through some logistics and housekeeping points. Your cameras and your mics should be turned off, but just in case, please ensure that that is so. And please use the Q&A function at the bottom of your screens to post questions, which we will collate and put to the panelists at the end of the panel discussion.

     The session is due to last 60 minutes. But before we start the panel discussion and make the introductions, we have a short video for you from Plan International's Youth Advocates, which really brings to life the youth vision statement developed as part of our consultations with children and young people during Phase 1 of the Safety by Design initiative, articulating the expectations and visions young people have for the online services they use.

     (Captioned video)

     >> MODERATOR: On that note, I have the pleasure of introducing you to today's panel, which is chaired by Australia's eSafety Commissioner, Julie Inman Grant. She has spent three decades working in online safety and senior public policy roles in the tech industry, at Microsoft, Twitter, and Adobe.

We have Dieter Carstensen, who is Director and Head of Child Safety at Lego and oversees child safety governance, development, and innovation across the Lego Group's digital experiences.

And Stephen Collins, who is Director of Public Policy for EMEA at Snap, Inc. and has held a variety of policy and regulatory roles across government and academia. And Professor Amanda Third, who is at the Institute for Culture and Society and is co-director of the Young and Resilient Research Centre at Western Sydney University.

I now have the pleasure of passing over to Julie Inman Grant, who will walk you through the role and functions of eSafety and the Safety by Design initiative before chairing the discussion amongst the panelists. So over to you, Julie, thank you.

     >> JULIE INMAN GRANT: Thank you so much, Julia. And I want to thank Julia and the team for making this session possible and for pioneering Safety by Design as a world-first initiative.

     I thought I would start by explaining a little bit about eSafety, what we are and how we do what we do, because we are a unique organization, and we are working closely with our neighbors to build capacity and capability.

     We approach things in three different domains, if you will. We start with prevention, through research, education, awareness, and some deeper programs. These include eSafety Women, where we know that technology-facilitated abuse is an extension of the coercion and violence that women experience in domestic violence situations.

     We have a range of other programs and education initiatives to make sure that we are aiming at real behavioral change. We have to remember that online safety is fundamentally behavioral: it is the human interaction on the platforms around us, and we need to address the behaviors. We also need to help parents, carers, grandparents, and others role-model the right behavior and engage with children in their online lives. We know that 81% of Australian 4-year-olds have access to a digital device. It starts with us, the parents, and needs to continue throughout a child's educational journey.

     The second phase of what we do is what we refer to as protection. The protections are regulatory schemes under which we take down seriously harmful content. That ranges from abhorrent violent material, new powers we received in the wake of the tragic Christchurch shootings, to child sexual abuse material, where we work closely with partners in the INHOPE network and law enforcement to get those images taken down.

     We have the world's only legislated cyberbullying scheme targeted at children. So when a child experiences serious cyberbullying and reports it to the social media site, and the content isn't taken down, we can compel the social media site, and fine them if they don't comply, to remove content that is harassing, intimidating, humiliating, or threatening to a child. And we have a world-first scheme around image-based abuse. We don't call it revenge porn because we don't want it to lead to victim blaming and shaming. We have a 92% success rate in getting intimate videos and images taken off hundreds of websites, and this applies to everyone.

     And then the area we will focus on today is what we call proactive change. How do we minimize the threat surface for the future? We continue with prevention, trying to stop the bad things from happening in the first place, and with protection, taking down content and playing that game of whack-a-mole, or whack-a-troll, if you will, after the damage is done. But really what we want to do is make the online world a safer place. We do that in a few ways, and if you go to eSafety.gov.au you will see the information. One is through tracking investigative trends: what kinds of abuse and threat trends are we seeing? What kinds of new technologies are on the frontier? How can we harness those technologies for good, and how might they be misused to cause users harm? You will see in the international future section a range of briefs on everything from end-to-end encryption to coaxing to harmful youth-on-youth behavior, a range of issues that may be of interest to you.

     But primarily what we are here to talk about is Safety by Design. Safety by Design is something that I tried to bring to those I worked with in industry more than 10 years ago. It has been a pleasure to be in this role, to understand what makes industry tick, and also to understand what the limitations are and where they can do more. Fundamentally, we know that if we are going to get companies to change the ethos of how they develop technology, from moving fast and breaking things and profit at all costs to one that places human rights and ethics at the core of the design process, we need to bring companies along on the journey. It has to be meaningful cultural change, which starts at the top with leadership, and it is a journey rather than an end point: constant innovation, constant investment, and even creating competition around being the safest platform out there. We see car manufacturers actually competing on features around product safety, and I think the road analogies do hold a lot of promise when we are talking about Safety by Design.

     You know, when we get into a car these days, we expect that the brakes are going to work, the air bags are going to deploy, and the seatbelts will be effective. It has been 50 years now since seatbelts were first required to be built into cars. We wouldn't dream of buying a car without seatbelts today. And these are guided by international standards.

     Now, we don't have any of those same requirements for technology companies. Yet online casualties happen every day, because companies are not consistently building in safety protections; they are bolting them on instead.

     So what Safety by Design says is that companies need to assess the risks. We have known what the harms are for a long time. We need to understand how our platforms are being weaponized, engineer out the misuse, and build the protections in at the front end. And really, that makes good business sense, because a company can take a very big reputational, revenue, and even regulatory hit when it consistently fails to build in protections at the front end against harms that could have been anticipated.

     So, what were we hoping to achieve? We sat down with more than 60 organizations about two years ago, and this, of course, included young people themselves; in fact, the youth vision statement on our website is very, very powerful. Some of the insights young people brought to us were things that we as adults hadn't thought about. The way they inhabit their online worlds is very different from the way we conceptualize our online experiences, so those considerations are really, really important.

     We agreed with industry and others on three major areas of principles that Safety by Design would focus on. The first is service provider responsibility. The second is user empowerment in technology: we don't believe the onus for user safety should be put on the users themselves, and we want to give them the maximum control to be able to enjoy their online experiences. And finally, transparency. I think people would argue that transparency today is more selective than it is meaningful or even radical. We are encouraging transparency in terms of how safety features are used, what the uptake is, what their impact is, and what we can learn from the interventions, so we can replicate them and make them more broadly available to others in industry and ultimately to users around the globe.

     We went from the principles, and in speaking with the companies, they told us it would be very helpful if we could develop an assessment tool to help them work out where the risks are and to surface best practice. That is where we are today: we are coding two separate assessment tools.

     We agreed on the five modules with a range of companies, and a number of companies have trialed it as an audit tool. The tools should be available in the coming months: one for startups and the other for mid-tier, medium-sized, and larger companies.

     We are also working with the VC and investment community to use their influence and their interest in ethical investing, and in minimizing investment risk, by providing due diligence clauses and using those as a lever to encourage startups who may be thinking about their next round of funding, just getting their companies up and running, and stimulating user growth. We want to help prevent that tech wreck moment that so many companies have experienced, and to make sure there is a consciousness that if things can potentially go wrong and you build protections in at the front end, it is going to be better for the companies themselves and, most importantly, for the users. Again, Safety by Design is underpinned by a human rights and ethics framework and by the youth vision statement. I would like to stop there and start turning over to some questions, because I'm getting tired of hearing myself speak. You must be, too.

     (Laughter)

     >> MODERATOR: So why don't we open it up to all of our panelists. I would like to start with Amanda. Tell me why you think multistakeholder consultation is so important, and why, in particular, the voices of young people should play such a central role in policy development in the online space. They are consumers, but not in the way that adults are. So what can we learn from them, and how should we be engaging them?

     >> AMANDA THIRD: It is a fantastic question. I think it is important that we come together across government, across technology companies, and NGOs like Plan International, and bring children along with us, because we all bring different perspectives to the question. I think that is really important when we are thinking about children's safety in particular, and let's remember that children do have special protection needs.

     They are growing and evolving, and far too many children are exposed to risk all around the world. I think we need to bring children into those conversations because we all do bring different perspectives. Indeed, my own work around the world has shown that when you talk to children, they often have a very different sense of what the risks of harm might be online and what to do about them.

     Often they focus very much on the social risks of engaging with their peers and with family members online, and they don't necessarily think of the risks that normally spring to adults' minds. And I think if we are going to really respond to the conditions of children's lives, obviously we need to pull them in and talk to them about those experiences.

     But I think what we need is a child-centered approach as opposed to a child-led approach. Obviously, children can't change the world on their own. We need to work across sectors, and we need to draw on all of the wisdom available to us, to really achieve this end of keeping everyone safe online. What I really love about the Safety by Design initiative is that it bakes safety in at the beginning. And I would love to see us expand the idea of Safety by Design to lots of other aspects of digital life.

     Citizenship by design, for example, or resilience by design. I am really keen to take the idea of centering design to the heart of everything we do, because I really feel that if we work at the front end of the process, collaboratively and across sectors, we can make an enormous difference in children's lives.

     >> MODERATOR: Thank you. And do you think we can balance all of those different imperatives if we are engaging in child-centered design, and how do we prioritize?

     >> AMANDA THIRD: It is difficult. And yes, everyone comes to the table with different kinds of priorities, and I think what we have to be careful not to do is just pretend the differences don't exist. I think the differences are very productive, and if we channel them into our conversations, confronting those differences and grappling with them together can push conversations in new directions and lead to the sorts of innovations we really, really need to make in the sector.

     So, I know consensus is wonderful, but I think it is okay to have a little bit of conflict. We don't need to always smooth out the edges, because maybe sometimes the edges are really useful in pushing our thinking. But obviously, if you are going to work in that mode, that requires deep levels of trust, and I think we all have to work hard at building that trust across sectors and between adults and young people. And, incidentally, young people say to me all the time that they want nothing more than to feel trusted by adults of all kinds in their lives. But I think we have to build that trust between the adults in this game as well.

     And that takes commitment.  But I think, you know, we have got a common goal and we can get there.

     >> MODERATOR: Great.  That trust does take a long time and certainly you can't develop resilience and grit without some adversity, can you? 

I think I will turn over to Dieter now. So, Lego comes at this from a very different perspective. You have always been child-centered; that is the core of your business.

     But clearly there were different considerations at play when your company decided to go digital. I think the first blog I ever wrote around Safety by Design was about Lego's approach. I would love for you to talk us through your approach and your methodology. How did you bring the company along, as this was a big step, and what steps did you take to get there?

     >> DIETER CARSTENSEN: Thanks, Julia. The Lego Group is a family-owned company primarily rooted in the physical Lego brick. When we developed the digital portfolio, we wanted to replicate the safety, security, and high quality people experience with the physical brick, because when you look at Lego, physical or digital, it should be the same proposition and experience.

When we discussed this with the developers, they understood it was a necessity, and we went about looking at the risks. But I think the starting point from where we came is that our license to operate required compliance, a baseline understanding of what needed to be done to protect children when they are online. That did not really address the other part: engaging with them, the opportunity to inspire them to become good digital citizens and to leverage technology and the communities they are in to be positive forces for good and for change.

     When we looked at the Safety by Design principles, the question in the discussions we had with you in the past, and with other stakeholders, was really: how can we, on one side, create digital experiences designed to prevent incidents from occurring, while providing a platform for deeper engagement that enriches both our experience delivery and the experience children have on our services, and build trust with parents, so that when their children download or engage with a Lego digital experience, they feel comfortable with the proposition we are developing and delivering?

     In that sense, with all of those aspects in the discussion room, we looked clearly at what we wanted to develop and what we could develop, because there isn't an exact match between the two; some of it came with high risk. There we always needed to find a balance: what is the right risk appetite for this specific aspect, how do you mitigate the risks, and how do you empower children to become ambassadors and to take action if and where needed?

     So it is always a balancing act for us, and I don't think there is one size that fits every single experience. That is why I think the Safety by Design principles, which we helped shape some of the wording around, are a good tool for us: they indicate the principles we need to be looking at and the questions we need to address, so that when we enter discussions we have those questions at the table.

The way we then execute and develop in the end depends on who the audience is and what type of engagement we have. But, as some of the children said in the video statement, it has to be safe, it has to be transparent, and we have to be empowered to make the most of it. Those are the principles we go in with. And then, of course, using the self-assessment tool that was referred to is going to be important for us in taking the journey further.

     One point, which I think you mentioned in the introduction and which is really critical: Safety by Design is not a feature, it is a culture. If you do this through culture, you are two steps ahead of almost everybody. If you do it right, you do it from the beginning, and you always succeed. It is necessary. That is the key to opening up the right Safety by Design based solutions as we go forward.

     >> MODERATOR: How do you build that culture if you don't come from a -- that is probably not a fair question for you, because you come from a culture that is all about safety.

     But, you know, you may be working with other companies in your field that don't embrace safety the same way.  What is the secret sauce?

     >> DIETER CARSTENSEN: Access to information and to children's voices: understanding what the end users want from you, not from the regulatory perspective, but what consumers actually expect us to deliver.

It is a stronger proposition to engage around helping to make the most of our joint time together in the digital experience. If we are told by the regulator that we need to do A, B, and C, we will do that, but that is a base requirement and doesn't inspire you to innovate and go further. Listening more to the end users, and bringing that into the dialogue with external partners, is a strong starting point. If you have the Safety by Design principles as a shared language that we can all agree to, then I think the path is open to you.

     If the culture doesn't match at the beginning, well, maybe you are with the wrong partner.

     >> MODERATOR: That's right.  You mentioned principles versus prescriptiveness, if that is a word. 

So I thought I would move on to Stephen, because, you know, I have been really impressed with Snap: they have put a huge focus on this not just externally; internally they are actually living the values. Julia and I spent some time there. Not only Safety by Design, but privacy by design.

There tends to be a natural and necessary tension between security, privacy, and safety. I would love to hear, Stephen, how you navigate that, because you have to provide all of those things for your users, particularly the most vulnerable users.

     >> STEPHEN COLLINS: Thanks. There is a superficial tension between privacy and safety if we see them at opposite ends of the spectrum, one the dichotomy of the other. That is the obvious thing to do, but I don't think it is a very helpful approach. See them instead as two sides of the same coin, pretty much everything by design, as Amanda was saying.

     What we have done from the outset, and this comes from the founders who still run the company, not unlike Lego really in a way, is that the people who created the product initially are still there driving it, and they have an underlying philosophy of how they wanted Snap, and Snapchat as a product, to be. That runs through the product design process. We have privacy by design and Safety by Design principles alongside one another.

     They are not rigid; they are principles. This is what is great about the Australian Safety by Design code, if I could be a bit flattering, Julie: it allows different companies at different stages of development to adhere to those principles, and it allows different audiences to be addressed and made safe.

     The way that Lego addresses safety will be different from the way a platform primarily geared towards adults, for example, would address it. There is a kind of risk assessment that needs to be done on a constant basis.

     And I think we see this really well in the principles you have put together in Australia. So, what we are doing at Snap: first, we have the Safety by Design and privacy by design principles overseen by the P2 team, privacy and product. They are part of the legal department; they work with privacy and product engineers, sit down at the beginning of any new feature or product design, and go right through the product life cycle with the engineers.

     The idea is that it is much better to build in safety and privacy from the outset than to try to retrofit or bolt something on when it goes wrong once it is released. Of course, there are sometimes tensions around the safety and privacy piece, but the flexibility afforded by design principles makes that a lot easier.

     One thing I would say, I guess, is that I think there is going to be really increasing tension, where we see a clash between privacy and safety, around know-your-customer and identity verification procedures.

     It is becoming a very, very popular thing to call for, both on the government side and in the third sector as well as in industry. How we reconcile users' fundamental rights to safety and security, on the one hand, with their rights to anonymity and privacy online, like they enjoy in the offline world, on the other hand, is going to be, I think, one of the big discussions going forward. One way to approach that, going back to Amanda's point about everything by design, is that we are starting to think of ethics by design as a more cultural underpinning for safety and privacy by design. I'm not able to say that much about it now because we are still thinking about it internally, but that is something we want to do to create a broader foundation for how we develop other by-design principles. I will stop there.

     >> MODERATOR: I think that is really interesting. And I think you're right, we are talking about balancing a range of fundamental rights. You know, there was a lot of controversy when eSafety was established around the idea of censorship and taking down certain kinds of content, because it was seen as potentially undermining freedom of expression.

     Now, that is a culturally held value: so many of the platforms are American, where unbridled freedom of expression is deeply held. What I found working for Twitter, which was a platform for free expression, is that the targeted abuse I saw, specifically of women, of Aboriginal people, and of people of different ethnicities and religions, fell on people who were probably at more risk in the real world; they were at risk in the online world too.

And what was happening with the targeted vitriol is that it was effectively suppressing their voices. If you are promoting voices without protecting them, then you are actually undermining that particular value. And now I think we are, as you say, seeing a real conflict over the balance, which I think is a false dichotomy: privacy and safety can live in harmony; we need to make sure the balance is right. This is certainly what we are seeing with the EECC: if the ePrivacy directive goes through without a derogation, it will be illegal to scan for child sexual abuse material. And I have seen arguments that this is a slippery slope to mass surveillance, when really you are talking about detection tools that are privacy protective; they are looking for known images of children.

     You will have a much more invasive privacy experience just using a commercial e-mail or social media platform than you will through a scan for these images.

     So we do need to make sure we are having robust discussions to keep these in balance. And we do see a lot of companies that make quite a virtue of being private, but I think we can't give them a free pass when it comes to safety, because it isn't okay that Apple made only 205 reports when they have millions of users, and Amazon eight, compared to Facebook's 16 million. Facebook is being criticized, but they are using technologies in their systems to scan for child sexual abuse material, and it is getting picked up and reported.

     So yes, we really need to surface some of these imbalances to make sure we get this right.

     >> AMANDA THIRD: I think that is a really interesting point, but I think often what happens is that the debates we have get stuck in binaries: it is privacy or safety, it is this or it is that.

     What I really like about rights-based approaches, and I come from a child rights perspective myself, is that children, according to the Convention, have provision, protection, and participation rights, and no rights are any more important or given more value than others.

     And so the task the Convention sets out in relation to children is to balance the provision, protection, and participation rights, and to actually work through the tensions and the challenges and the difficulties.

     So I really feel that, obviously, there are great reasons to mobilize a rights-based approach that is about fulfilling the obligations ratified by states to deliver on children's rights, but it is also a conceptual tool, a really important and valuable one, that we can use to open up the conversations beyond the either/or and think through the complexities and tensions that arise.

     >> MODERATOR: Now, related to that, Dieter, if I can go back to you. You know, Safety by Design is not going to be a silver bullet. And I often say, not trying to undermine the work that we do, that with the issues we are seeing online, we are not going to arrest or regulate our way out of it either.

     So what do you see as both the potential but also the limitations of initiatives like Safety by Design?

     >> DIETER CARSTENSEN: I have used tools like this in the past, and the more we get strong, comprehensive, end-to-end thinking tools that are really rooted in human rights, children's rights, ethics, and safety, the better and more competent we become at really nailing the problems at the beginning of the design phases, which means it will be quicker for us to develop and publish experiences. Users will be much more willing to engage, test, and learn, and also to move between different brands that adhere to these kinds of principles as they navigate digital environments.

     Even connecting to legal systems becomes easier if everyone is talking the same language. The convergence is not about a single brand; it is about looking at ecosystems where you are only allowed in if you meet this kind of standard. And a strong side, and I like that this was mentioned, is the engagement with the venture capital and investment environment. That is a strong place to intervene, because I know there have been talks of developing new codes around how to combat child sexual abuse material and how companies address these challenges.

     But again, and I must go back to this point, we are always looking at, let's say, the harm approach, but we should also look at how we are building competencies, citizenship, and good use of technology, and how to empower children to become not just the builders of tomorrow but citizens of today, because that is what they are; they shouldn't have to wait six or ten years to become fully fledged participants on the internet. They are people in their own right, they have opportunities, and we should be delivering this to them. The balance between protecting and also promoting and providing the provision is, as Amanda said, really important.

     And the culture factor is fantastic. It needs to be there, and this tool is the language you can share with your partners to create the culture. The risk, if I may say, is that as a company we witness a lot of initiatives on a global scale trying to address this from different angles, and the map of what tools are available, and which tools you should be using, has to be more coordinated, with the tools linked to each other, so we can use the right tools for the right purposes and apply them more universally, instead of splitting them by sector, region, or language. A bit more coherence would be great. This is a strong tool and a world-leading project, so we are supporting it fully and want to take it further, absolutely.

     >> MODERATOR: Thank you. I mean, we are seeing a proliferation of principles, and, you know, I think that is a good thing in many ways, because there is a groundswell of support for this being principle-based rather than prescriptive.

I still see a real risk in the stovepiping of specific issues: we have voluntary principles around child sexual abuse material, and we have principles around transparency and violent extremist and terrorist content. But if you have a set of principles for each of these harms without looking holistically at the range of harms that any given platform can give rise to, you miss the point that, again, it comes down to people interacting in a social sphere. Just as things can go wrong in the town square or on a dark street with real-life crime, the same will happen in the online world. And it struck me when Zoom had its tech cautionary tale, its tech wreck COVID moment, when it scaled beautifully from 10 million daily users in December of last year to 300 million by April. But then the Zoom bombing, the meeting interruptions, happened, and some were quite egregious; they had to take things offline, and they did get some great privacy and security specialists working on the problems. I think they are really focused on them now.

     But then to read in the press that the CEO said, wow, I really hadn't thought about this online harassment problem until this happened. I thought, really? How can you build a technology like this? But taking him at his word, and understanding the way a founder thinks about their technology and their ability to change the world, it was additional justification for the view that sometimes the VCs or the investors are the adults in the room, thinking holistically about the success of any technology or platform. And hence this guidance: we can't take it for granted that this is what founders are going to be thinking about.

Maybe I can turn over to you, Stephen, because, you know, I don't know what the initial inspiration for Snap was. When did they have their eureka moment?

     >> STEPHEN COLLINS: Some of the features that were there originally are privacy and Safety by Design. Things like disappearing messages. Data minimization: we only collect the data we need to provide the service, and when we don't need it any longer, we get rid of it. Purpose limitation for data. Defaults set to off: user location, for example, is off unless the user wants to share it. Those kinds of things have been there from the start, and we have been fortunate that our founders were so farsighted on those things.

     Even then, it wasn't something you sit in a room and think about. It was a reaction to what was already in the market and an attempt to build something better, something they would like to use themselves.

     So that is where we have got to. The one thing I would say, though, is that it can't just be about design, about product design keeping people safe. We need to get away from the notion that users are essentially passive victims. What we need to do is provide them with agency so that they, too, can interact with safer products, of course, but also have the media literacy and the understanding to keep themselves safer on their side. That is much more difficult for younger children, of course, and it differs as people get older and their agency increases.

     But that goes back to the holistic approach we mentioned. I think here, too, we need a holistic approach which isn't just about designing good products. It is also about teaching people how to use them in not only a safe way but a responsible way. One of the things I think we have forgotten to do in the rush to embrace technology over the last 20 years or so is to teach kids digital citizenship in school. Everybody is taught how to behave offline and interact kindly with fellow citizens.

     That seems to have been lost in the rush to embrace technology. I would like to see governments in particular, and the education departments of governments, think a bit more deeply about how to instill a form of social contract, of digital citizenship, in the minds of younger people, so that we are not always reacting to harms.

     Maybe harms can be lessened through better behavior and greater respect for each other. That sounds simplistic, and maybe it is. But we are not going to succeed just by making things safer and safer and safer. Go back to the car analogy: there are still people driving cars unsafely and still having accidents they don't need to have. We still have a driving test to get people to a certain standard before they drive, and we expect them to obey the rules of the road. Something equivalent, a more holistic approach that gives people more agency than we are currently giving them, would be really helpful.

     >> AMANDA THIRD: And that is particularly important for children, because we know that children need to experience risk to really develop the skill set to manage the potential harms and maximize the opportunities. So yes, absolutely.

     >> MODERATOR: And we often talk about the four Rs of the digital age, Stephen: teaching respect, responsibility, digital resilience, and critical reasoning skills. And think about how hard it is today to discern fact from fiction amid disinformation and misinformation, to understand when someone is trying to socially engineer you, or, as we see among young people being bullied through impostor and impersonation accounts, to recognize what is happening. We have to give them the tools to understand, learn, and make good decisions.

     And that has to happen a little bit at home, but we need to integrate it more fully into the education system throughout the child's educational journey. So I'm 100% in agreement with you there. I have been told we should probably move to some of the questions from the floor. Catherine or Julia, will you be facilitating those?

     >> JULIA FOSSI: Thank you to those posing the questions. Warren Matthew asks: how important do you think cultural issues are for child safety, and can one approach be applied in a global context? Perhaps to Amanda for that one.

     >> AMANDA THIRD: Can I jump in, and thank you for that question. Cultural difference and cultural diversity are critical to the ways that children make sense of being online and make sense of safety more broadly.

And, in fact, some of the work that my team and I have done, working in over 70 countries now, has been to understand how children actually think about safety. And what we found is that safety means very different things in very different contexts.

     That will come as no surprise to any of us, I don't think. What it means is that there is a real challenge there for Safety by Design. I think what we need to remember is that when we implement Safety by Design, it is not a fixed recipe that you just roll out. It is a process that you go through, or a culture, as I think Dieter was saying earlier on,

     that you create around a design initiative. And this is actually another reason why a multistakeholder approach is really critical: for the design to be informed very directly by diverse contexts, so that we can build products that anticipate the kinds of risks people might encounter, and so that the ways we do that really articulate with how people think about safety in diverse contexts.

     So is it critical? Absolutely. Can one approach be applied as a sort of blanket solution? No. But I think we need precisely processes like Safety by Design that can get us to where we need to go.

     >> MODERATOR: I would also add that context and culture matter so much. So much of what we see falling through the cracks in content moderation, where we are able to advocate on behalf of users, comes from moderators missing the cultural context, or missing the context of youth-based cyberbullying. A post on its face may not seem violative, but if you know what is happening in terms of a conflict in the school yard and how the post is an extension of it, it matters a lot. We see this with image-based abuse: for a woman of Muslim faith, a photo in a bathing suit or without a proper hijab can be damaging if it is shared non-consensually, depending on her culture. And, whether it is intended or not, there is bias in algorithms, and it depends on who is programming the algorithm.

     And then there is the machine learning it is tested on. So culture, bias, all of these things are part of Safety by Design, I think, because we want everyone to be safe, particularly those who may be most vulnerable or least understood.

>> DIETER CARSTENSEN: Maybe to challenge the two statements here a bit. We don't differentiate between a Danish, Australian, Chinese, African, or American child user. We want to protect them all equally and give them the right tools to tailor what is right for them and their family, culture, and context.

The starting point, and I think this was one of the calls from the youth voices, is that they want Safety by Design to have the highest settings on by default. You can then lower them, making a proactive choice to open up at your own will. That, I think, is empowerment in its true sense. We provide the tools, the starting point is the soft entry, and you open up when you feel comfortable and competent to do so.

     It may be an extension of what you are trying to say, but we do treat everybody equally.

>> STEPHEN COLLINS: I will jump in as well. One thing to note here: I don't think Lego and Snap are typical of the industry, so it is great that we are on the panel. Look at our public content: we design for ages 13 and above, none of our content is adult, and our public content is pre-moderated and curated.

It goes through human moderation before it hits the platform. That's not typical; it's atypical. I would not suggest that is the only business model, just as Dieter wouldn't suggest that Lego's is the only one, with defaults set to off. The one-size-fits-all piece isn't going to work for everybody.

We created something that works for our users, which is pre-moderation and curation, but it also means that free expression and free speech don't operate on Snapchat quite as they do elsewhere. Ours will be different from the likes of the big social media platforms. How they all then address Safety by Design is for them to decide, and where those defaults are set, for example, is again for them to decide, based on assessing the risk, looking at the principles, and working out how best to apply them for their particular audience. Back to the slightly cliched point: no one size fits all. You have, I would say, two good guys from the industry here, and it would be a different conversation, I think, if we had other platforms in the room.

     >> MODERATOR: Well, can you tell me how can we actually get other platforms to buy into this?

     >> STEPHEN COLLINS: From my perception, a bit of carrot and a bit of stick. This is my personal view, and I don't know whether it is the company view, but my personal view is that voluntary codes are attractive because they are easy to sign up to and they make companies look good.

Mandatory codes, on the other hand, are there to ensure that the codes are actually adhered to. There is not much point, I think, in having a code if there is no enforcement mechanism to back it up.

     And what the Australian government has done well is to have a carrot, which is: hey, this is a really helpful code, the Safety by Design code; you will have safer, happier users, you will make money, and it will be great. And the stick is if you don't do it. We have been flexible and non-prescriptive; you decide how you want to implement the principles. But if you fall short, and consistently fall short, there is Julie Inman Grant with a big stick and you will be in a lot of trouble. I think that carrot-and-stick approach works. I'm more skeptical of codes pitched at such a highly abstracted level that they seem to cover everybody and say everything but at the same time say nothing.

     So that is where I would be.  I think it is carrot and stick.

     >> MODERATOR: I agree. I think a lot of codes become a dirt floor rather than an aspirational ceiling. And we do see some situations where companies are signing on to things they are already doing or can easily achieve, rather than setting a real stretch goal, which I think is better for the user and better for bringing other companies up.

And my prediction, actually, is that the mid-sized companies, the Snaps and Legos and Zooms and even Match, will probably end up leapfrogging the really big companies in terms of some of these innovations and really embracing safety the way we believe all companies should, because the mature companies should have done this by now, really. I don't think any one of them can say they have been a paragon of virtue where Safety by Design is concerned.

It is time to wrap up. Thank you to everybody for tuning in, and thank you to the panelists for an engaging discussion.

     We have taken note of the questions people posed and we will be able to respond to them. But just to say: if you do have burning questions, please don't hesitate to reach out to us at Safety by Design at eSafety.gov.au and we will respond and direct the questions to the panelists, because I know a few have been posed to them.

Thank you. And I would like to thank the panelists for the engaging discussion this morning. Thank you to all of you, and thank you to the attendees as well for joining in the conversation. I am sure you got a lot from it, and I hope you will go to eSafety.gov.au to find out more about Safety by Design.

And we would love to share the journey with you and encourage you to ask questions moving forward. So thank you very much to the IGF for hosting us and making this session possible. I would like to thank Plan International and their youth advocates in particular, who put together the video for us. We had a few questions about the video; it will be put on the website in the very near future, and you will see socials going out from the eSafety account as well. Thank you so much.