
IGF 2018 - Day 2 - Salle XI - WS211 Technology, Suicide, and the Mental Health of Youth

The following are the outputs of the real-time captioning taken during the Thirteenth Annual Meeting of the Internet Governance Forum (IGF) in Paris, France, from 12 to 14 November 2018. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

     >> MODERATOR: Hello, everybody.  Can everybody hear me?  Is the microphone on?  Welcome to the session.  It is an unfortunate topic that we are forced to deal with in the world.  I think in the next 90 minutes we can get a better understanding of the problem, not only of teen suicide, but of self‑harm and mental harm in general and in specifics.  And hopefully have a solution, so that when we leave here we can take something that is dark and sad and have an optimistic approach as to how collectively we can do our best to turn the tide. 

What prompted us to do this session is that for many years the suicide rate in general, and especially among teens, was on the decline.  I have been in the Internet safety world since the '90s.  I'm the old man of the panel and probably the room.  I was optimistic to point out that as the Internet rose in popularity, there was an inverse relationship with many of the problems associated with young people: less crime among youth, less violence among youth, and even bullying, which we think of as an epidemic, was leveling off; it didn't go up with the Internet.  But since 2014, there has been an unfortunate reversal with suicide.  Not a huge spike, but statistically significant growth.  Looking at the Centers for Disease Control, which is U.S.‑based ‑‑ let me see if I can get the number up, my computer is not working ‑‑ well, it went up by almost 20 between 2000 and 2016.  So we're seeing an increase in the number of young people who are taking their own lives.  So why is that?  And what can we do about it?  There is a lot of speculation.  There are those who put the problem squarely at the foot of technology, the Internet and social media.  And it is very easy to hypothesize and say, well, more people are being cyber bullied, and more people are having to deal with image issues ‑‑ they go on Instagram and think everybody is beautiful except them, or they go on social media and find out all their friends are having parties they're not invited to.  It is easy to say that's the cause.  But it is not clear.  We don't know that it plays no role, but based on the research I've looked at, there is no definitive proof that that is indeed the cause.  There are other things: the economy gets better and worse, and mental illness is often a factor in suicides.  
And you have to look at why two people can experience exactly the same trauma and one person is resilient enough to bounce back and thrive, while the other person is severely affected, extremely depressed, and may harm themselves or take their own life.  What is different between those people?  The events can be similar.  You have to look into the factors of how a person responds to adverse conditions, which all of us on one level or another will face throughout our lives.  There are a few who go to the point of ultimate desperation, while others eventually bounce back.  These are important questions I would like the panel, and the folks in the audience, to think about.  There will be time for questions, answers, comments and hopefully smart contributions from people in the room.

So I'm looking forward to that.  Now, one of the things I have decided to do, partially to make my life easier and partially because there are names on this panel that I am going to mispronounce no matter how hard I try, is introduce everyone by first name and let them give you their full name and also talk about what they do and who they are.  I will ask them questions and we will break into a discussion.  I will go in the order we're seated.  Jeff Collins, who I can introduce by full name, is my co‑organizer of the event.  Thank him for many of the things that go right today and blame me for everything that goes wrong.  Jeff, talk about yourself and why you felt strongly about doing this.

     >> JEFF COLLINS: Sure, thanks Larry.  Thanks to everyone for coming.  I thought I would give a little background on the company I work for.  Most of you probably aren't familiar with it.  Our main product is called After School.  It is a social network for high school students, only in the U.S.

And I'll go ahead and give a little back story.  I think this kind of gets to the explanation of why we care about this topic.

So with tech companies ‑‑ we're a small startup, we have about 15 people.  When our two young cofounders started the company, like many tech company founders, they had very positive goals; they're optimistic people.  If you look at the huge companies, like Facebook and Google, it is the same story.  The founders have good intentions, and they still do, but now we know that the reality is there are a lot of unintended consequences that can come from something as simple as a social network. 

Our app launched about three years ago.  It allows high school students to communicate anonymously with other students in their high school.  It was a tweaked version of a different product they had built in college, and they were trying to introduce it into high schools.  It took off.  It exploded.  There were 200,000 users in a week; it was one of the most downloaded apps, with three people working at the company.  And their whole intention was to improve the high school experience by allowing the ability to communicate anonymously, so kids could talk about difficult things that they're not going to talk about on a public network like Facebook where everyone can see them ‑‑ things like mental health problems, or coming out as gay.  A lot of that happened.  I'll talk later about what we see today. 

But in the first few months, as we were exploding in terms of user numbers, we also saw tons of cyber bullying and people threatening to commit suicide.  The founders, who at the time were 23 and 33, didn't really know how to deal with this. 

So we took the app out of the app store, essentially put it on pause for about four months, and built in a whole bunch of safety features that are standard for most social networks.  So you have moderation ‑‑ technical moderation, where if someone types something like "I will shoot up the school," that will automatically be blocked ‑‑ but also teams of human moderators who are looking for posts that violate guidelines and blocking them. 

So we kind of shifted, and instead of allowing everything, we decided, because of the sensitive nature of the audience ‑‑ young high school kids ‑‑ we would take a lot of things down.  So that is the short story of how we got interested in cyber bullying.  That led us to this issue of mental health. 

We are in San Francisco, Silicon Valley, where it is pretty well known now that there have been some suicide clusters.  It is a very affluent area, and lots of young people have committed suicide over the years.  There has been a lot of focus on this interplay between social media, technology, the mental health of teens, and teen suicide.  So we have connected with a number of the expert organizations there; Stanford University has a great program looking at this.  I guess I will leave it at that for now, and we'll get into the things we're working on later.

     >> MODERATOR: I'm curious about the moderators.  It strikes me as putting a lot of weight on those individuals.  How do you equip them to know when to take action?  Obviously, you don't want to respond to every possible indicator, or you would refer everybody to a suicide prevention line.  How do the moderators know when it is time to intervene, and what in fact do they do?

     >> JEFF COLLINS: That's a good question.  So, actually, one of the things that we do is use anonymity in a positive way.  As I mentioned, people will talk about sensitive things.  They could talk about being depressed.  And we actually have a technical algorithm that detects if they may be talking about something that suggests they have a mental health problem.  If that is detected, they get a pop‑up that asks the student if they want to communicate with a mental health professional.  But we put it in kind of teen language, like, do you want to talk to someone about this?  And if they say yes, we refer them to Crisis Text Line.  Crisis Text Line is now a really big organization that works with Facebook and even city governments to help people who might be trying to jump off the Golden Gate Bridge.  They do counseling over text.  It may sound odd at first, but that is how young people communicate.  By detecting things kids have written, we have sent some 20‑some thousand kids to Crisis Text Line, where they have conversations with counselors, and there have been 30 active saves ‑‑ which they define as intervening with someone who is about to commit suicide to prevent them from doing that.  That is the technical part. 

On the human moderators, that is a tough issue.  Larger companies especially use teams of moderators overseas.  We used to be troubled about outsourcing ‑‑ Nike making their shoes in Indonesia ‑‑ but the tech industry does something similar with the really difficult images.  We don't have the most difficult content because we're a teen network, but if you think about things like beheadings, it is moderators overseas, in the Philippines or Honduras, who are reviewing them.  The way we do it is our moderators rate different posts and decide whether to escalate them to an American reviewer.  So I think you get to a good point: we and others need to do more ‑‑ and I think the larger companies do as well ‑‑ to ensure that the moderators' mental health is not worsening.

     >> MODERATOR: I will ask you another question, but first, I want to poll the audience.  How many people here work for a private company, for industry of any kind?  Very few.  How many work for government?  How many are NGOs?  Academia?  Anything else?  What?  Okay.  Bob.  Independent programmer.  We have a very famous American programmer here ‑‑ he wrote the very first spreadsheet.  Honored to have Bob Frankston in here. 

The next question: you're a private company, and I presume neither of your founders are psychologists; I know you are a lawyer by trade.  Why is this your business?  I know the answer to this, but I want to hear it from you.  Why you?  Normally, when people are at risk of suicide, governments or medical institutions take care of them.  Now a business that is nominally nowhere near that field is preventing suicides.

     >> JEFF COLLINS: That is a good question.  The short answer is that because we're kind of on the front lines, we get blowback from media, parents, schools and teachers if there is a lot of bullying happening, or if someone commits suicide and they were posting on your network and you didn't detect it ‑‑ it will be horrible for you in the media.  That is the business side.  But I think there is increasingly in the industry a realization that, although we don't have conclusive evidence of causation between use of technology or social media and suicide ‑‑ I think all of us here know this, and we will hear from a teen in a second.

We know how much young people ‑‑ our kids, our friends ‑‑ use social media, and we see how much we ourselves use it.  I think there is an innate understanding that this is changing the way we think.  With young people, it can increase the types of pressures they face. 

So I think we feel a responsibility ‑‑ really an ethical responsibility ‑‑ that we can't just promote kids using our app and other apps all day and all night, and using them in some negative ways.  Going back to the way I started: the founders have positive goals of using communication for good.

That is what we want to do.  You can't just go forward blindly.  In order to accomplish that, you have to work with a diversity of people, like we have here today, to figure out what's happening, to look at the science, and then to try to take steps to do things in the right way in your app and on your service, so we can use technology for good.

     >> MODERATOR: Thank you, Jeff.  The next speaker is Phillippine; I will let her give her surname.  She is younger than she looks ‑‑ I will not give away her age.  She's obviously mature for her age, from the little time I've known her.  She has experience with friends who have dealt with depression.  I will turn it over to you, Phillippine.

    >> PHILLIPPINE BEMIJE:  Hello, I'm Phillippine Bemije.  I'm French and American, I'm bilingual and I'm in 8th grade.

     >> MODERATOR: Okay.  Talk about your experiences.

    >> PHILLIPPINE BEMIJE: So in the previous year, I had a friend who wanted to commit suicide because she didn't feel well in her own skin and was really depressed.  So what we tried to do is help her.  We went to talk to the school counselors to ask for advice, to let her talk to them and try to explain why she was feeling this way.  And she's okay now; she is happy now.  It was really nice for her, but I think that social media had both a positive and a negative impact on her. 

For example, the negative impact would be that I think she was able to watch things to do with self‑harm and things that made her very depressed.  But then social media kind of let her heal by being able to talk with friends and knowing that she's loved and she's not alone.

     >> MODERATOR: Keep your mic open; you're not done yet.  Um, was it hard to reach out and help your friend?  Did you think about it, or was it obvious to you?

    >> PHILLIPPINE BEMIJE: I mean, I wasn't the first person to know about it, but we all decided together we had to do it.  If something had happened to her, we would have felt awful and we wouldn't have been able to live with ourselves.

     >> MODERATOR: Maybe this is an unfair question, because you're not a psychologist ‑‑ at least not yet.  Why do you think some kids are more vulnerable than others?  Say two are bullied; one is depressed from it, and the other blows it off.  Your thoughts?

    >> PHILLIPPINE BEMIJE: The personalities.  Some are fragile and take things to heart; others push it off and don't pay it any mind.  I think it just depends on the people.

     >> MODERATOR: You in a sense have proven this is true, but are there things the community can do to help the fragile people become less fragile?

    >> PHILLIPPINE BEMIJE: I think on social media there are always people posting things to do with self‑harm.  I don't think it should be okay to post those things.  Everyone else who watches them can get bad ideas.  So I think those posts shouldn't be allowed.

     >> MODERATOR: Getting back to the notion of the upstander versus the bystander ‑‑ the fact that you didn't stand by and watch your friend do something horrible ‑‑ what can young people like yourself do to make that contagious, to encourage others to be like yourself and stand up?

    >> PHILLIPPINE BEMIJE: I mean, I think we just have to talk about it openly.  Because if we don't talk about it, it will just become a growing problem.  I think that if everyone on social media starts talking about it more openly and freely, it will be considered something more okay to talk about, rather than something that we shouldn't talk about.

     >> MODERATOR: A lot of wisdom.  Thank you so much.  Victoria, your last name, where you work, what you do and why this issue is so important to you.

    >> VICTORIA MCCULLOUGH: Hard to follow that.  Thank you, Phillippine ‑‑ that raises a lot of the important challenges that platforms have to face and look ahead to.  I'm Victoria McCullough, and I work at Tumblr on most of the social impact work in this space.  Just a few things to mention for those who might not be familiar with Tumblr.  We're a blogging platform, and a lot of the content you will see is anything from photos to personal blogs to more journal‑like entries to work from general content creators.  We focus a lot on artists and content creators and really supporting their efforts to express themselves.  We are very focused on freedom of expression and on having our users bring their whole selves to the platform.  A couple of things feel important to point out for this conversation: there are features on the platform that are unique compared to other platforms and that I think really play a role here.

A few of those: one, similar to Jeff and After School, we place a strong importance on the ability to be anonymous.  Early on, and over the last 10 years, we saw a lot of users who did not feel comfortable in their own skin, whether in their family or community, and Tumblr provided a space where they could be anonymous and be themselves.  I don't know if we saw this coming 10 years ago, but it is also reflected in our community that there is no public follower count on our platform.  Most of our users who come to Tumblr talk about not having the vanity factor; they come to really connect with people.  One of our mantras and missions is around connecting people with their passions.

I think a lot of that is due to Tumblr providing a space that doesn't necessarily mean you are competing with other friends or even connecting with people that live in your town.  You are mostly connecting around things you are interested in. 

I think the other thing that has really developed, as more of our users have become more comfortable online, is something they often speak to themselves: this next generation coming up uses many, many platforms, and they'll say that other platforms, maybe Facebook or Twitter, are a reflection of the outside self, while Tumblr is more of a reflection of the inside self.  The challenge with that is that for some of the younger users, the inside self may be struggling ‑‑ may struggle with depression or anxiety, or may still be figuring things out.  We have a large number of L.G.B.T.Q. youth on the platform, and we are trying to cultivate our relationship with them and really cultivate opportunities to reach out to them. 

For so many living in areas where they don't have actual resources, that has been something we place a large emphasis on, and we have really seen that community grow over the full life of Tumblr.  It has really shaped how we as a company think about outreach in this space.  As we have seen users really struggle, we have focused on a few different things, and then I will pass it off ‑‑ things I think we need to continue to invest in.

One is that we recognize as a platform that we have these challenges, and there are proactive things we can do.  We focus on interventions: providing things not unlike Crisis Text Line ‑‑ hotlines that surface when users search for self‑harm terms and the like ‑‑ really putting in interventions so users are stopped before coming to that material.  We take down, and are mindful of moderating, content that could be harmful, and more importantly we really put resources in front of our users.  Another piece is working to shore up the trust and safety resources, making sure those teams are being as responsive as they can be and providing a lot of the crisis resources that might be helpful, especially when a user is very much in crisis and may be having suicidal thoughts.

And the last piece is that we as a company have tried to invest from a proactive standpoint in campaigns around mental health.  Several years ago ‑‑ I believe in 2015 ‑‑ there was a rash of teen suicides.  You saw companies across the board trying to invest to make their platforms safe and really invest in erasing the stigma around mental health conversations.  So we launched something called Post It Forward several years ago.  I will be honest about this: we thought it would be gone in a couple of weeks.  We thought this was a campaign where we would bring users together to have the conversations, bring resources to the table, shed some light, and maybe it would evolve into something else.  
And it turned into one of the most powerful places on the platform.  That Post It Forward effort has really become part of our D.N.A. as a platform.  There are things like partnering with mental health organizations that can help us better provide a safe place for users, or taking a proactive stance ‑‑ in the U.S., this is part of the larger healthcare conversation ‑‑ and, more importantly, making it a safe space to name some of the challenges that come with mental health.  Whether we talk about depression, anxiety or eating disorders, we want Tumblr to be a space where every user is comfortable talking about these challenges.  As we move forward, looking into next year and beyond, and as we are seeing suicides increase and mental health becoming increasingly challenging for this next generation, we're looking at additional ways to invest in mental health resources, to really partner, and to show our own responsibility.  There are a lot of things we can't do, or can't do effectively, but we can partner with NGOs to provide and set up supports, from how we work with parents to how we work with schools. 

So that is the lay of the land for us.  I think what we see are challenges, and that we can't solve everything.

     >> MODERATOR: You mentioned your service is anonymous.  I wrote an article called "Anonymous Is Not Synonymous With Ominous."  Others didn't agree; they said anonymity is bad.  And you made a deliberate decision to create an anonymous platform.  Why is that a safety feature versus a bug?

     >> VICTORIA MCCULLOUGH: I won't say it is perfect every time; there are examples where it doesn't work.  But as Tumblr has evolved, we have seen an incredibly proactive community.  What happens often is that when the community witnesses bullying or bad behavior, they will often police themselves: they will collectively support someone they see being victimized and really get involved.  Over the course of several years, Tumblr has really become a place that has built up a community of people who will step up, speak out, and to a certain extent protect each other ‑‑ there is another term they use, but they will often run off some of the bullies and collectively work together to do that.  That is where we have seen some of the better examples.  That is something that comes with time.

     >> MODERATOR: That is the next question.  Phillippine talked about how she and her physical community gathered and helped a young woman who was exhibiting suicidal tendencies.  How does that work in an online community?  How can you be a friend like Phillippine to someone you have maybe never met, who may live in a different part of the world than you do?

    >> VICTORIA MCCULLOUGH: We work with groups that ‑‑ you mean user to user?

     >> MODERATOR: You see something, have a feeling this person needs help, you may know them, may not.

    >> VICTORIA MCCULLOUGH: There are a couple of groups that encourage users to do something as simple as saying, you seem like you are not feeling great ‑‑ language that feels innocuous and doesn't run the person off; it can be as simple as saying, hey, are you feeling okay?  And then if it escalates ‑‑ we have worked with partners in the past who helped us with the language ‑‑ if it does escalate into something you can't address user to user and you haven't been able to help, they provide ways, language and resources, so you can respond and say, hey, you should call this person; this is a crisis resource to help you.  There is a real person on the other side, a counselor who can really provide that support.  That is some of the language we encourage our users to get comfortable with.  More importantly, we get them familiar with mental health organizations in the regions they live in, which helps train them.  They are particularly helpful.

     >> MODERATOR: The next speaker ‑‑ I won't try the last name; the first name I have is Heffna.  The reason I brought her here is I realized we were lacking an important voice on the panel to talk about mental health.  She's got a background in mental health and she's active on the Internet platform side.  Give me the proper pronunciation of your name.

     >> MS. HEFFNA: My name is Heffna.  I work for an NGO working with stakeholders ‑‑ for example, government ‑‑ on the education and well‑being of students from kindergarten to upper secondary school and onwards, really.  The aim is to strengthen the relationship between homes and schools for the progress and well‑being of the student.  A positive experience of and connection to school and education are important protective factors in the lives of teens.  Bad experiences from schooling can be predictive of problems with mental health, substance abuse and dropout.  So we think it is important that the relationship is good and that students have a positive experience of schooling. 

We produce educational material, run a help line, and more.  One of the larger projects we coordinate is the Safer Internet project, which is part of the European network.  We have the No Hate Speech project as well, from the Council of Europe.  With these projects we are working with youth, parents and teachers, and other stakeholders who are concerned about the online well‑being of children and youth.

We also have a help line connected to the project, run by the Red Cross, and a hotline conducted by Save the Children Iceland and the police, who collaborate with us on this. 

I used to teach in elementary school about a decade ago.  The change in the environment of children and teenagers over those 10 years is enormous.  Phillippine described it well, and I'm very happy to have a youth representative here.

There is media‑driven speculation on the role of social media in the mental health of young people in Iceland.  Most of the news tells us that there is a negative connection.  The research shows a clear correlation, but we don't really have concrete evidence on causal factors.

And more research is needed.  But I will report on recent research from the Icelandic Centre for Social Research and Analysis, called ICSRA, that shows this correlation.  We don't know the causal factors, although there is evidence that it is important how young people choose to spend their time, and that we need to take care of their basic needs.

But since we're here talking about the Internet of trust, I think we also need to recognize the role of media in creating or diminishing trust in our society.  Most of the news revolves around something negative, threatening or dangerous.  The media isn't necessarily presenting a positive outlook on society or fostering trust and well‑being. 

The suicide rate in Iceland is quite high, especially among young people.  It is one of the leading causes of death among young men in Iceland.  This has raised concern, and efforts are being made to meet this need.  We have been strengthening prevention and resources, but it is also interesting that most suicide attempts came at the peak of the economic boom in Iceland, right before the crisis.  So maybe the boom before the crisis was more of a stress factor than we realized.

As you all know, there are many factors contributing to depression and suicidal thoughts, but our approach is that prevention needs to start early, and we need to screen classes.  We need to pay attention to rates and reports of anxiety and sadness among young people.  The success and well‑being of youth is the main concern of our organization.

     >> MODERATOR: Okay.  Perhaps with your psychologist hat on.  There is a notion when there is a suicide that follows cyber bullying ‑‑ and we have had, I don't want to say many, even though the suicide numbers have gone up; it is a few among hundreds of thousands of people.  Someone will go on television and say she was cyber bullied and she committed suicide, or somebody distributed a naked picture and she committed suicide.  Unfortunately, the person who committed suicide can't say anything.  How much weight do we give to any one incident?  We hear about people committing suicide after something bad happened to them.  Is it fair to say there is a relationship?  Yes, there is a correlation, but do we have any reason to know that it did or did not cause the suicide?  Like I asked Phillippine, some people can have the same experience and survive, and others take their life after that experience.  Two questions.

    >> VICTORIA MCCULLOUGH: Of course,  ‑‑

     >> MS. HEFFNA: We don't have the person here to describe it to us.  How resilient are the kids who are meeting and facing these challenges?  One of the best things to do in prevention is build resilience in kids.  We have to think of new ways of building resilience, because we are living in a totally different society with different challenges.  And we also have to give our kids the ability to grow and develop; they have to meet challenges and take them head on themselves.  We, the parents, can't solve everything for them.  They have to grow and build this resilience.  There is a big project starting in Iceland, the UPRIGHT project.  Researchers at the University of Iceland will develop materials and teaching modules for elementary schools, in coordination with the Directorate of Health, with the aim of building resilience and a stronger self‑image. 

As we know, there are many different contributing factors in suicide, and each case differs as well.  It is a combination of D.N.A. and environment; you never know, really.  What we can do is try to open up the discussion, like Phillippine said, so it is not taboo, and offer ways for people to seek help when they feel badly.  And as we're going to discuss here, technology can open up new avenues in that sense, especially for people who are vulnerable and don't have good access to healthcare.  It can also assist healthcare professionals in tackling these problems.

     >> MODERATOR: Thank you very much.  Our final speaker ‑‑ and don't go away, because we will have more of a discussion; everybody here will have a chance to talk.  Again, I'll let her give her last name and what she does.  Monica is a hero.  Monica is here because we had another speaker in mind, Karuna Nain, who has an Indian passport and didn't get a visa in time.  Fortunately, the Brazilian came through.  Thank you for being here.

    >> MONICA ROSINA: My name is Monica Rosina, and I am a product manager.  I have a hard job filling Karuna's shoes.  She works in California and develops the programs I will talk about.  You are in worse hands, but I promise I will try to do my best. 

I did not need a visa.  That's why I was able to come so fast.  Ha‑ha‑ha. 
I will not introduce Facebook; I am sure most of you know how it and our family of apps work.  I want to say how happy I am to have you here, Phillippine.  It is an honor to hear from you.

I will base my opening remarks on a few things you said, if that is all right.  You mention that social media can have both negative and positive effects.  And I agree with that.  I'm going to try and walk you all through what we do in order to mitigate the negative effects that it might have, but also how we can strengthen the positive effects. 

So you mentioned that on the negative side, some of your friends see content online that might give them ideas, right? 

So on Facebook, we have what we call our content policies.  Our content policies are the rules of what we allow and don't allow on Facebook.  That is a really hard job.  And in what regards safety, and especially suicide, we do not allow any content that glorifies self‑harm or suicide of any kind.

So we are getting better at removing that content, even before it is reported to us.  The primary way in which that content is spotted on Facebook is that our community reports it.  Anyone is able to report anything they see on Facebook.  It is usually the three dots on top of a post or picture, through which people can say, I'm reporting this because it has to do with self‑harm, or something like that.  But we also use algorithms, and they help us spot suicidal or self‑harm material.  The final call is usually made by a human being, but the algorithms help filter the content to us.

So we have over 20,000 people globally nowadays working on content review.  They speak several languages.  They are in every region of the globe, so they can take action 24 hours a day, seven days a week.  Whenever we see content related to suicide, it goes way up the line, because it is a real priority for us.  Removing the bad content is really important for Facebook.  Here I'm talking about content that gives people the wrong ideas: content that glorifies or incentivizes self‑harm or suicide.

But on the other hand, we find, because we talk to experts all the time, that whenever people are in distress, especially if they're using Facebook Live, cutting that line of communication can be worse for the person, because it removes any chance that that person has of getting help.  I get this question a lot when I talk about these issues: why don't you cut the feed of potentially suicidal live streams?  It is important to keep that open, because the person could reach out in that moment, and that moment can be crucial to helping that person.

So on the bad side, we try to mitigate bad content by enforcing content policies and removing it, but we also want to provide support.  Sometimes providing support means leaving that channel of communication open.  People come to Facebook to share what matters to them with family and friends.  We do believe that we are a community where people look out for one another, and Facebook can be a place where people can get help through groups and through the communities they grow.  We're constantly working on building tools that can help people get more information and also get help.  We have partners all over the world ‑‑ nearly 100 partners with whom we work, who work specifically on safety issues ‑‑ so in most countries we're able to provide a link, a channel of communication, to a local organization that might be able to help that person.  So we have several tools.  We have a chat window that might pop up.  We provide a user who might be at risk of suicide with more information.  Friends can get through.  And we're also, all the time, making sure that a user who might be at risk knows the local organization that he or she can reach out to, and can also reach out within the community.  I will stop here.

     >> MODERATOR: Thank you.  So unless I'm missing something, it seems to me Facebook has sort of three tools when it comes to helping people thrive on the site.  You have the community ‑‑ friends, people who you interact with.  You have your professional human moderators.  And increasingly you have AI, algorithms, software.  We know you sometimes intervene.  You mention software can help find content that glorifies suicide.  What about spotting somebody who is in trouble as quickly as you spot fake news, to some extent, or child sex abuse images and other things you have done a good job at keeping off the site?  Can you find somebody who needs that help?

    >> MONICA ROSINA: We are constantly working to improve our algorithms and how they work.  We learn from our mistakes and hope to get it right.  We would like to move to a place where we can get people help not just at the time when the person is considering suicide, but way beforehand.  I want to say algorithms help us spot content, but we always have a human being behind every decision we make.

     >> MODERATOR: The other thing about reaching out: sometimes you can get false clues.  People can say things that look like they're in trouble and they're not.  On the other hand, you don't want to err in the other direction, because suicide is not something that you can reverse.  It almost gets back to the question I asked of Jeff.  Nobody elected you or appointed you ‑‑ my own little nonprofit, ConnectSafely, jumped in because we were interested.  You are a company out there providing a service.  I will leave it there.

     >> JEFF COLLINS: Suicide is the second leading cause of death for 15 to 29‑year‑olds, and growing annually.  A lot of our users are within this range.

We feel it is our responsibility to keep the platform as safe as we can.  We believe providing help is better than staying on the safe side and not doing anything.  I would rather run the risk of a false offer of help than not take action when that action was indeed needed.  As I said, if we offer help and it is a false positive, our users usually get back to us with comments ‑‑ they feel annoyed ‑‑ and with that our algorithm gets better, and our moderators do, too.

     >> MODERATOR: Jeff, you had a comment.

     >> JEFF COLLINS: Just reflecting on why companies have to work on these issues ‑‑ it is a good question, because we're not experts on mental health or suicide.  So we're not really the ones who should be making the tough calls, so yeah, we err on the side of caution.  When I think about this, I think the larger problem ‑‑ and I am saying this from a U.S. perspective ‑‑ is we have to do something because we're on the front lines.  A lot of the time, where we're failed here is by government.  In the U.S., we have this decentralized system where the federal, national government isn't setting policies on things like cyber bullying.  We see an ad hoc system where the states, who control policy on cyber bullying and how to work with young kids on the mental health issues they deal with, handle it very differently.  That is a problem.  What we see are systems that are not dealing with the problems in a holistic way.  You can contrast that with other countries.  What Europe is doing is great.  There is the Stanford Center on Youth Mental Health and Well‑Being, an organization that is standing up a system modeled on the way Australia and Canada work on youth mental health.  Australia's is called Living Is For Everyone.

We have a system where we send young people to crisis text counselors.  There are some points, when you are dealing with suicide, where a young person needs in‑person help from a psychologist or psychiatrist.  There is really no system set up for that.  The only way to do it is to go to a mental health office. 

They have integrated texting with physical centers where young people can go and get in‑person help.  There is a lot that has to be done by governments and we're lagging behind.

One other thing I wanted to say is I think the point Hrefna made on resilience is one of the most salient points on this topic.  This is what all the cyber bullying experts have realized: the real way to deal with it and tackle it is to improve the resilience of young people.  That is tough.  To improve resilience, you need to let people handle things on their own.  When you are a company, you will have people criticizing you and all over you, and that doesn't look good for your company.  We err on taking things down.  This ties into the societal issue of helicopter parenting.  There has to be something we can do.  We think about this in our app ‑‑ we want to build resilience, but we don't have the answers.  We work with academic researchers to find out how to deal with these things where there is not an answer yet.

     >> MODERATOR: I will ask you to weigh in.  Let me frame something.  I was thinking about Tumblr because you mentioned art, expression, the L.G.B.T. community.  It strikes me you have a lot of edgy content and you have a community that includes people who are somewhat marginalized, which typically carries a higher risk ‑‑ L.G.B.T. people have a higher risk of suicide than others, for example.  How does Tumblr deal with it?

    >> VICTORIA MCCULLOUGH: We touched on part of this.  It is partly companies investing in the cultural shift around talking about mental health.  That is the other thing here.  We see artists come on and use their artwork to express themselves around challenges with mental health and how they deal with it, but it is also a way for them to think about getting help and helping others.  What we're doing as a company is saying we want this to be a safe space to talk about every element of mental health: name it, find resources, and we want you to talk about it here, because you can help someone.  I think by providing the space and features, we are helping the cultural shift around mental health.  Platforms need to be part of that.  One thing that we have done in the last year, with a partnership, is a film called The S Word.  It was about ending the silence around suicide.  It was focused on suicide survivors and their stories.  We worked with them to develop content and get it on the platform, because with these issues you can't always ask ‑‑ it is so tragic ‑‑ but you can ask, what was the thing that sent you into a dark place where you maybe resorted to self‑harm?  Getting some light shed by real survivors was a powerful tool that sparked those conversations.  How we think about it at Tumblr is we want to be part of the cultural shift and we want to help by making the platform a safe space, but also the space that often will be the first place where people use the language.  It is interesting to watch millennials compared to Gen Z.  Millennials, and our parents above us, would talk about, oh, I'm feeling down, I had a down year.  Millennials would start to name it a little more.  Gen Z kids are really starting to put names on it ‑‑ I don't know if this is your experience ‑‑ to say, I am depressed, I am struggling with anxiety.  I think platforms need to go along with them.

     >> MODERATOR: Among your peers, Phillippine, how common are these conversations?

    >> PHILLIPPINE BEMIJE: I think we talk about it in general.  If a friend is feeling depressed, they open up to a close friend.  I have friends who walk up to me to talk about self‑harm, because we just need to talk about it, because we're more comfortable talking to each other than to others, I think.

     >> MODERATOR: We will open it up in a minute (?).

     >> HREFNA: You were talking about why companies should do this.  I think it is obvious they have a corporate social responsibility.  Of course, companies must take part in this as a part of society, really. 

The public sector should also do their part.  They should review their policies on a regular basis, base them on evidence‑based data, and be sure to provide adequate funding for research and education.  But as I said, corporate responsibility is a huge thing.  These two actors also need to work with the grassroots, the NGOs, the people working with the people in question.  And I think it is also extremely important to ask children and youth, the ones we're talking about ‑‑ to ask them and find out the reality they're living in. 

We did that in Iceland around 20 years ago.  I don't know if you have heard about the Icelandic prevention model.  We were the worst in Europe for drinking and drugs among teens.  Now we're the best.  In '98, 43% of 15 to 16 year olds had been drinking in the last 30 days.  Now it is 5%. 

The recipe for that was everybody came together and collaborated.  Academia ‑‑ it is all based on research into the reality of teens, and I was a teen back then.  Then we had a discussion between the stakeholders.  And the last thing, not the least important: it was community‑based, not a whole‑country approach.  You had research done in each community, each municipality, and you used the data to work on the specific problems as near to the youth as you could.  This is the thing we're trying to translate over to better Internet use, trying to reduce anxiety and loneliness and the challenges we face.  We tried to use the same approach.  The positive thing is the data from 2018 shows we had a steady upward trend in anxiety, especially with girls, and depression, and it was correlated with the use of social media.  Now it is going down a bit.  We have some reason to believe we're on the right track.  I think one of the most important things is that the discussion was opened up in society and everyone got on board, like we did with the Icelandic prevention model.

     >> MODERATOR: Monica, your platform has long prided itself on what is called real‑name culture.  Since we got the plug for anonymity, tell us why Facebook finds identity such an important thing for the platform.

     >> MONICA ROSINA: Sure.  We just believe that people are more responsible when their real selves are online.  There are apps that people can go on and use without their real name, but on Facebook, we believe people will be more responsible and held accountable if they're using their real name.  We invest enormously to remove fake accounts.  Most of the issues come from fake accounts.  I want to use the opportunity of the open mic to address one last point, which is that we would not be able to get anything right if we didn't work with experts.  We are not experts in suicide prevention.  We have a big, fully dedicated team for that, but we rely on external partners ‑‑ academics, Civil Society, psychologists and NGOs that work on this issue ‑‑ to work with us so we can provide more safety and Facebook can be a welcoming place for everyone.  There is still a lot that we don't know.  There is still a lot the experts don't know.  And that is why we're pledging, this year alone, $1 million towards research to better understand the relationship between media technologies, youth development and well‑being.  I think that is one area in which there need to be more numbers, so we can work better and sometimes develop tools that might not be good for business but are good for safety.  Instagram now allows you to track how much time you spend on the app, and Facebook does, too.  I personally never realized how much time I spend on Instagram.  When I looked at the numbers, which Instagram shows me in a graphic, I thought, you know, maybe I spend too much time on Instagram.  I can now set a limit.  I set that limit to 15 minutes per day on the app.  Business might look at us and say it is not in your business interest. 
I am truly proud to work at a company that can say we're taking this step because we believe people will be more responsible if they know their limits, and we're providing tools for that.  So I'm no longer on Instagram for over 15 minutes a day.  I'll close with that.

     >> MODERATOR: Congratulations.  Go ahead.

     >> AUDIENCE: I'm wondering you say it might not be good for business.  Isn't it good for business to keep customers happy and you have happy parents maybe and happy kids might it be a good thing for business?

     >> MONICA ROSINA: Absolutely.  I think we need to look at business through a different lens.  Our internal numbers show that people are happier when they're engaging actively and not passively consuming content.

     >> MODERATOR: A question from the audience or comment, go ahead.  Introduce yourself, please.

     >> AUDIENCE: My name is Alina, I work for the Dutch government.  Thank you for coming.  I'm glad to hear from social media platforms on this topic.  I have a question concerning Facebook.  You mention that you are proud to work for the company and that Facebook also invests a lot in the humans, the people who work for the company.  So I'm wondering what you think about the documentary on the human moderators who remove harmful content.  What do you do to prevent suicide amongst this group? 

     >> MONICA ROSINA: That is a good question.  Thank you for asking. 

We have a lot of people working on content that can be extremely traumatizing.  So our policy within the company is to provide full mental support, with the help of psychologists, and more time off for the people who are working with this.  And people choose to have this job; we never say, you now have to do that.  That is one important thing to be clear about.  I was personally involved last year with a lot of material that involved child exploitation, and I met a lot of mothers ‑‑ I'm a mother ‑‑ who choose to take that job and look at those images because they believe it is their way of contributing so that that material won't be on our platforms again.  I don't have the mental structure to pursue that job.  But first of all, people choose to do it.  And when they do, they have all the support, especially from human resources and psychological help, so they can cope.  It is a really hard job, and I'm extremely proud of the people who do it for us.

     >> AUDIENCE: Thank you.  I also notice that these jobs are often carried out by people in underdeveloped regions of the world.  Don't you think that these people see this as a way to get an income and don't really have a choice in this?

    >> MONICA ROSINA: No, I don't see it that way.  We have people working in different regions of the world because we need to have people working in all time zones so we can take action fast.  I would disagree with that point of view.

     >> AUDIENCE: Okay.  Thank you.

     >> MODERATOR: Yes.  Woman and then ‑‑ go ahead.

     >> AUDIENCE: Hello, I'm Diana from the Chinese YMCA.  There is a high amount of suicide, especially among young people, because of academic pressure.  Much of the media keeps reporting on cases of student suicides.  Students may therefore misunderstand what the media want to tell the general public.  Is this leading to suicides of children?  How should we deal with this?  Thank you. 
     >> MODERATOR: Anybody want to take that on?

     >> If I understand you right, I think you are talking about how we talk about suicide?  That is also a very delicate issue, because discussion about suicide can go either way.  It can make someone think about it, or it can make someone open up about it.  So we have to be very careful how we talk about suicide.  We shouldn't paint the picture of the world too dark ‑‑ like I was saying about the media, we have to have a balance.  There are many good things in the world as well.  If we take a few cases and make huge news about them, it might have a negative effect.  So that's right.

     >> AUDIENCE: Thank you.

     >> JEFF COLLINS: In the U.S., we have guidelines for journalists on how to cover, how to write about, suicide.  The problem is that not many journalists actually follow them.  Stanford put together a conference, and it was difficult to get representatives from the big papers.  But the kind of things the guidelines cover is that when someone is writing an article on suicide, they should be factual about what happened, but shouldn't go into additional details about the person, and should spend time on the resources that are available.  There is a debate among academics about whether it can be helpful or harmful to report on suicide.  Most experts seem to think it is not beneficial to report on every suicide in the community, because there is a copycat effect.  But yeah, you raise a good point.  Even where there are standards, not a lot of journalists are aware of them or follow them.

     >> Just one more point, because I think this is super important, even though it feels like such a small thing on this topic.  One of our partner organizations has really encouraged us to think and talk about someone dying by suicide, as opposed to committing suicide.  Changing really detailed things like that changes how we talk about suicide specifically.  Using the term died by suicide, as opposed to committed suicide, which suggests it is a criminal act, is a small but significant thing in changing the conversation.  You bring up a good point: there are small things.  I want to mention that because, as someone who has family members who did die by suicide, we used committed suicide throughout our family.  It is a small thing, but something I think this group of people can be part of thinking about ‑‑ how we use the right vocabulary.

     >> MODERATOR: And not to glorify somebody who died by suicide.  Lady Gaga put up pictures of young people who died by suicide, and then some experts got in touch with her and educated her, and she has now completely changed that practice.  She's got a much better understanding.  Obviously, you mourn the loss of anyone who dies, but you don't glorify somebody, because it could have a contagion effect.  Gentleman there, you had your hand up.

     >> AUDIENCE: I'm a software developer from the Netherlands, so I'm not a professional in mental health.  I see trends on the platforms where automatic mental health detection is being deployed on content; in case a user is flagged, so to say, they're guided towards a mental health specialist.  Is there the same priority for physical health as for mental health ‑‑ to detect whether users are, for example, spending too much time sitting?  I know when I'm on Reddit, I would rather stay on the couch instead of starting to exercise.  And does the software industry compromise users' health for short‑term goals like impressions?  Thank you. 
     >> MODERATOR: I sometimes wear an Apple Watch, which reminds me to get off the couch or chair and walk around.  It is interesting that I have never seen Facebook tell me, hey, Larry, you have been on your butt the last 45 minutes posting, maybe you should go for a walk.  Maybe that is something you should consider.

     >> MONICA ROSINA: That is great advice, we'll take it back to headquarters.

     >> MODERATOR: Give this man credit, not me.

     >> JEFF COLLINS: Good point.  I worked at a big corporation where the computer would remind us ‑‑ lock us out of the keyboard ‑‑ to stand up and do exercises.  It seemed silly at first, but it is a good point.  On our app we run positive campaigns.  We do things around body image, mental health, civic engagement, but I think you're totally right.  There is no emphasis on tracking physical health while sitting there using the app, like we do with the mental health issues.  Being a good corporate citizen is something we all should think about.

     >> MODERATOR: Gentleman right there, yes, sir?

     >> AUDIENCE: Thank you very much.  My name is Andre (?), I came from Russia, (?) economics.  We have big issues concerning suicide, because previously we had a game on social networks where some people engaged children into committing suicide.  It is called the Blue Whale; you can read something about it.  But here I'm closer to the legal issues.  Could we provide some responsibility?  Because in Russia, inciting someone to commit suicide is grounds for blocking a website.  But usually website blocking is used more in political situations, for political censorship, not for its proper purpose.  What legal measures could be taken to prevent suicides, or to prevent games like this on social networks, without infringing on human rights and freedom on the network?

     >> MODERATOR: I want to comment quickly on Blue Whale.  I have done research on that, and others have as well; aside from a few websites in Russia, we've been unable to verify that it exists.  There are many researchers looking into whether it is an urban myth.  People have done follow‑up as a result of the story coming out there.  But it does bring up the broader question ‑‑ thank you very much ‑‑ of the role of companies in making sure that their platforms ‑‑ you addressed it a little bit for Facebook and Tumblr ‑‑ are as healthy as possible without censoring free speech.

     >> MONICA ROSINA: Thank you for raising the Blue Whale question.  We have an expert in the audience: Thiago was the founder and president of SaferNet Brazil, and he followed up a lot on the response from Congress to the Blue Whale.  Maybe you should chat later on.  But what we did see is a spike, because the media was talking so much about it.  If you were to look at Google Trends, there was a huge spike in searches related to suicide right afterwards.  So I think journalism standards on how to talk about these issues are extremely important to have in place.  On our side, we were fast to act and to identify images related to the Blue Whale.  Larry is extremely right: we haven't been able to verify that there have actually been suicides connected directly to the Blue Whale.  But we were aware of it, and we worked fast to identify any trends.  But to Larry's point, I think it is the hardest job on Facebook ‑‑ how do you, or any social media for that matter, balance free speech with having to remove content that can be harmful to your users?  Sometimes we get it wrong.  We don't always get it right.  We're learning every day, and we're trying to do our best by our audience.  For sure.

     >> MODERATOR: Way in the back, you had your hand up, yes.

     >> AUDIENCE: Hi, I'm not going to tell you my real name.  I don't think it influences what my opinion is.  Sure, people commit suicide when bullied and influenced by the media, but people primarily commit suicide when they find themselves isolated, alone, with nobody to talk to.  Now, Facebook would love to tell me to use my real name and a persistent identity so they can create a detailed profile of who I am, serve me targeted ads, and maybe sell it to third parties when Facebook tanks.  That is great from your perspective.  From the grand historical perspective, adults have failed to protect children, especially from things like sexual abuse.  You are most likely to be sexually abused as a child by someone you know or someone in your family.  It can be really hard to talk about that, and to go on Facebook and discuss that with your real name and real identity ‑‑ that's really hard for some people.  Anonymity is very important.  I'm one year older than a millennial, and the first social network I knew was IRC.  That was a wild and dangerous place, but one went on IRC with an identity that could be burned.  If you started to be cyber bullied, or embarrassed yourself, or your identity was making you uncomfortable, you could choose a new one and come back.  It allowed me to seek help for my mental health, and yes, I did meet those pedophiles that existed on the Internet.  He only knew me by a nickname, my IP was masked ‑‑ there was no danger to me; I even had a chance to troll the person.  So I can understand how using real names makes people bully less and harm others less, but I don't see a net benefit in requiring real names on social networks.  As I said, I don't use my real name on Facebook, and I only change it if I get in a fight with someone.

     >> MODERATOR: I think you illustrate the fact that one size doesn't fit all, and I'm happy you spoke.  It is fine that you use that name.

     >> AUDIENCE: I have a question for whoever wants to answer.  It seems like there are three types of platforms here.  Two platforms are anonymous, two aren't.  For you, Phillippine: when your friends need help, do they reach out first to their friends, or to a platform like Facebook or the other platforms?  With your friend, did she go to you first, or reach out on social media?  I'd be curious, between After School and Facebook and Tumblr, who gets the most reaching out?  What are the steps?  Who do they contact first?

    >> PHILLIPPINE BEMIJE: So with my friend ‑‑ she didn't really go on social media first.  She wanted to talk to us, because I think she felt safer.  I feel like the thing with the Internet is you're never really sure about anything.  So, in our case ‑‑ maybe not for everyone ‑‑ she talked to us because we're who she felt most comfortable talking to.

     >> If I could just comment on this: we often go into schools with seminars for kids about better Internet use and seminars on bullying and friendship training, because that is the new approach we're taking towards bullying ‑‑ starting early with prevention, working with relationships, communication, and this friendship training, and changing the atmosphere in the classes.  After we go into the classes, and especially when we have our youth panel doing peer‑to‑peer education, there are often kids who come afterwards to talk to them, tell them about their troubles, and ask, what can I do, can you help me, do you have any good advice?  So the peer‑to‑peer relation is really important, as you mentioned before.

     >> MODERATOR: We have time for one or two more questions.  Way in the back and then I'll get to you, way in the back, yes.

     >> AUDIENCE: Hi, can everyone hear me?  Awesome.  My name is Baldip, I am an academic; I teach English literature at a university in Germany.  I have two brief comments to make.  Firstly, this semester I'm teaching a course called The Individual in Society in the Age of Technology.  The aim of the course is not to draw judgments on whether digital technology is good or bad, but to understand who we have become as people and as a society given these infrastructures.  In the first lecture, I asked my students to write to me and tell me why they chose this course.  Why are they here?  These are students in their early 20s; some are around 19 or so.  Something that was common to all of the writings was that when they were in high school, teenagers new to the Internet, they were just absolutely confused by the entire space.  A lot of them faced mental health issues from being online.  One student told me that she actually liked it initially, because she could create an account on Instagram and post makeup photos, and her friends would never know, because she didn't have to have her classmates as the audience.  Then she made a Facebook profile, Facebook linked her account, everyone found out about the content on Instagram, and they bullied her about the makeup and photos.  It got me thinking.  I feel like it is not enough to improve the online spaces; we need formal segments, lessons, trainings in high school, where we take our time and our resources and really train students in using online spaces and technology.  I feel like it is kind of like driving a car: if you don't know what you are doing, you will harm yourself and others on the street.  Some suggestions for what the trainings could cover: first of all, using social media for positive personal growth; lessons on being safe online and making stronger passwords; lessons on being anonymous online without harming others; and reporting online abuse. 
I feel like a lot of cyber bullying boils down to microaggressions and online abuse that people ignore until it is too big.  Kids need to realize online abuse is real, valid and something that can be tackled.  They need to know about legal and extralegal measures they can take, so they don't feel they have no alternative but to commit suicide.  The second thing ‑‑

     >> MODERATOR: We are actually just about out of time.  If you can finish up quickly.

     >> AUDIENCE: I wanted to point out a gray area, which is life after a suicide attempt.  I don't think we addressed that enough, because the support systems you need for someone who is thinking of suicide are very different from what you need for someone who attempted suicide and survived.  If you could talk about that a little.

     >> MODERATOR: Thank you so much.  Unfortunately, we're out of time.  Jeff has an interesting statistic from the online poll.

     >> JEFF COLLINS: We were running a live Twitter chat.  We can't share it live because of a system malfunction.  We had over 100 people in the online chat, including a bunch of suicide prevention experts.  One of the questions asked was, can technology aid in preventing suicide among youth?  74% said yes, 12% said no, and 15% said they don't know.  So that is kind of interesting ‑‑

     >> MODERATOR: By the way, that is not idle speculation.  I know for a fact some people are alive today because of platforms represented in this room.  I know there is a great deal of help social media can provide.  On the other hand, I have to applaud Phillippine for the help she gave her friend.  I think we can deal with this problem: technology didn't create the problem and can't solve the problem, but it can help.  I want to thank everybody in the audience and thank the panel.  I hope everybody learned as much as I did.  Thank you very much. 
