
FINISHED TRANSCRIPT

 

EIGHTH INTERNET GOVERNANCE FORUM

BALI

BUILDING BRIDGES ‑ ENHANCING MULTI‑STAKEHOLDER COOPERATION FOR GROWTH AND SUSTAINABLE DEVELOPMENT

OCTOBER 24, 2013

9:00 BALI

WORKSHOP 58

ITU‑UNICEF JOINT OPEN FORUM

 

********

This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.

********

 

>> THOMAS HART: Good morning, everyone.  

Thank you for coming to the open forum. There was an amazing conversation on sleep patterns; I hope ours is equally interesting.

So, we have a great panel here today. We hope that the audience is ready.

The primary reason for this open forum is to get your views on the industry guidelines that we have been working on with our partners for at least a few months now. So we're looking for your feedback. We have also posted them online so that we can gather feedback more widely, and UNICEF will be telling you more about that.

Let me quickly introduce the panel. I'm from the ITU General Secretariat.

We have UNICEF, Dominique Lazinsky from the GSMA, Susan from the Internet Watch Foundation, John Carr, and UNESCO. We have the European Broadcasting Union and Kim Sanchez from Microsoft. Thank you, everyone, for joining us.

I'm going to start the presentation.

So, my colleague Carla was to be here. Unfortunately she couldn't make it, and she asked me to do the presentation. I realize that the focus is on the guidelines, so I'll very, very quickly run through the presentation. It is on the Child Online Protection initiative.

So, quickly, on cybersecurity: we have been working in this area for some time. In 2005, through the World Summit on the Information Society, we got the mandate to act as a facilitator for building confidence and security. In 2007 the Global Cybersecurity Agenda was launched, which is a framework for cooperation, and we have launched many initiatives under it. The two big ones are IMPACT, assisting countries in establishing CERTs and other actions, and the Child Online Protection initiative.

I'm going through this very, very quickly. Five minutes is all I'll take.

The Global Cybersecurity Agenda takes a holistic approach to international cooperation. It covers harmonizing national laws, technical and procedural measures, organizational structures, helping countries establish institutional frameworks, training and awareness, and international cooperation frameworks.

The Child Online Protection initiative was launched in 2008, and under the GCA we're trying to bring together stakeholders from across the board, all groups. We're trying to raise awareness, deliver practical tools to help governments, educators and parents, and share knowledge and experience.

We have partners from across the board under the initiative: international organizations, UNICEF of course, UNICRI, and many others. Civil Society, of course, is one of the most valuable partners, and we are grateful for the support we get.

Private sector, you will see that many of them are here, supporting us, working on the guidelines, Microsoft, GSMA.

We have had a discussion on this. We have our patrons, global champions, people who are really motivated to support us and people with global reach. The President of Costa Rica has been a supporter of the initiative. We have the former FCC commissioner. We have Her Excellency the First Lady of Nigeria, who was recently appointed as the Child Online Protection Champion for the Africa region.

The guidelines: So the first set of guidelines was developed in 2009, again working with our partners, and you can find them on the ITU website. We have guidelines for children, parents and educators, industry, and governments.

So, that was 2009. We realized the world has moved on and it is time to update the guidelines. The first set we decided to update was the guidelines for industry. The recommendation for this came through our working group, which is a group of the ITU membership and is open to all stakeholder groups. We constituted a working group, and we are grateful to UNICEF for taking the lead on the guidelines, taking overall charge of them.

Of course, we have gone through the process of revising it. We have a stable draft, but we really need input from all stakeholder groups, and we're doing that by going to different forums where we can meet stakeholders and also by putting it online so that we get feedback from the widest possible audience.

Some other activities: national surveys, statistical frameworks and indicators in 2010, developing case studies, and collecting country profiles of what countries are doing on child online protection. We have prepared a draft, we presented it at the recent working group meeting in October, and what we're trying to do now is get the countries to validate it so that we can post it online. It gives you the focal point for the particular country for child online protection, the various legislative measures that country has implemented, and other initiatives the country has undertaken. It is a valuable resource. I encourage you to contribute to it.

The National Strategy Framework is one of the big focus areas: helping countries develop a national framework so that you have a more harmonized approach to the issue within countries. You need your law enforcement agencies to work with your hotline. You need your Civil Society members to work with the government, and you need different government agencies to work together. You need a national strategy framework, especially in developing countries. We're working with countries to help them establish this. Some of the recent ones we have worked with are Nigeria, Cameroon, Ghana and Sierra Leone, and we're active and looking for your support.

We have just started a project in Ethiopia, in collaboration with Facebook. It is a 6‑month project, primarily on building online safety in schools.

We recently had a big global youth forum, which many of you participated in and contributed to, featuring "Be Smart, Be Safe," a session primarily on online protection, and we are grateful to UNICEF for their active participation and support in organizing it.

We had the train‑the‑trainer program, and we also had a global competition there. We have put everything online, and you may be interested in going to the website to look at it.

We have a working group on child online protection. It is one of the more formal groups within the ITU but, because of the topic, the conversations are quite informal, intense and interactive in nature. It is a place where the 193 countries, the focal points, plus industry, plus Civil Society, everyone, can come and discuss public policy issues, technical issues, capacity building, and training and awareness initiatives, and come out with good solutions to implement.

Some upcoming events: I spoke to a few of you yesterday who are going to a policy workshop. ITU Telecom World is coming up, and we have planned to have a child online protection presence there. We're working on a cybersecurity conference on the 2nd of December. We're planning an African Child Online Protection Summit in February.

Some other planned activities: as I said, the working group is a very active group; we discuss these topics and come up with ways of helping countries. These are some of the initiatives. Again, I invite you to contribute to them.

Thank you very much. I'll pass on the floor. Thank you.

>> Thank you very much. You can help me with the slides as you have access to them.

Just before starting, I would like to say a few words about UNICEF and why we volunteered to take the lead in editing the guidelines.

Part of the explanation lies in the structure of UNICEF. The team that has been most involved in this initiative, which I'm also a part of, is the Corporate Social Responsibility Unit, a fairly new unit within UNICEF. Our primary mandate is to engage with the private sector on issues related to children's rights and business.

I don't know if you're familiar with the Children's Rights and Business Principles, which were introduced last year. That's the framework within which we operate and engage with the private sector, either directly or through different kinds of platforms. We also engage with governments on the children's rights and business agenda, and we now have General Comment No. 16 on the Convention on the Rights of the Child, which specifically looks at the issues of business and children's rights.

So we had good reason to get very engaged. Obviously UNICEF has been part of the platform for quite a while, though the main participants have been more from the programme side of UNICEF, the Child Protection Unit.

That's the background. We're excited about the guidelines. We hope that the broader consultation, beyond just the group of members of the initiative, will interest people and that we'll get feedback and commitment for the guidelines.

Let's have a look at the slides. It is wonderful to have a male support staff here helping with the slides.

I'm happy to be able to be here today to introduce you to the guidelines. I'm going to explain a bit about the background and the development of the guidelines, give you the overall concept of what the guidelines look like today, and speak about the general part of the guidelines. I have colleagues here who are going to speak about the specific parts for the different subsectors of the industry.

So, let's start with the background. This is probably repetition of my colleague from the ITU. A key thing: when we started working with the ICT industry two years ago, when the unit was set up, one of the challenges was that there was a lot of guidance available left, right and center.

At the same time, technology is moving forward fast and a lot is going on in the regulation area. When you look at the convergence of what is happening with technology, all of these issues and the guidance around them, it is difficult to stay on track. Normally the U.N. organizations work in our little silos and each start to work on our own guidance for whatever we see fit. So, being part of the initiative, we thought this would be a great opportunity to work together with industry experts and colleagues at the U.N. to start putting together a framework which would be more meaningful than the existing guidelines from five years ago.

That's a bit of the background.

The other thing, which was kind of a key trigger, if you can have the next slide, please.

It is also the broadening discussion about children's rights. It is not only about the protection of children; it is a much broader discussion. There is a very strong participation element when it comes to children's rights in the virtual world, which also needs to be brought into the framework of how the private sector should engage in this area.

More specifically on child online protection, I think the audience is well aware of the number of initiatives and of the challenges with legislation and the global nature of the issues. Where UNICEF comes from is very much the thinking that whatever protection exists in the real world should also be possible in the online world, with parents and educators all being very important. That is much more challenging in the online world. All of these issues and challenges are the discussions behind the revision and the work we started on the guidelines.

Some of the key topics were discussed, and some of them came through in the previous guidelines already, more specifically highlighting the core risks when it comes to child rights.

An obvious topic is child sexual abuse. This is the term the U.N. prefers to use, as opposed to "child pornography," which has a different tone to it, with an implied element of consent. It is all about content, about conduct, about contact. The growing discussion is really about the ICT value chain. How do you define it? How broad does it have to be? If you look at the value chain and start adding all of these additional services, not only applications or content but also e‑commerce, et cetera, how broad can the area of responsibility and activity in terms of child protection be?

So, what came through from these discussions was an element of not only looking at protection but also the importance of the ICT industry as a driver for children's rights, and how the participation element of children's rights could be incorporated.

Here are some of the topics related to that. I'm not going to repeat it all.

So that's really the background for the development and the scope of the discussion. I think it has been a very fruitful process, and here you see the number of institutions that were part of the discussions, contributed, and basically formed the new framework for this guidance.

Looking back at the original guidance documents, I think there were challenges, especially when you look at the membership structure: the private sector is not very well represented there. Having the opportunity now to have a broader consultation on the guidelines, specifically looking at the subsectors of the ICT industry, is relevant to really nail down the detail and gain agreement from the industry on these issues.

So, what's happening now: the current draft of the guidelines is available. It has just been posted on the Business and Human Rights website for consultation, which we're announcing here right now. As you see, the IGF logo is underneath.

Basically, during roughly the next month, we're looking forward to getting feedback on the guidelines. The guidelines are there; there is also a questionnaire which provides you a framework for providing input, and if you have any queries, both ITU and UNICEF are available for consultation.

There will be another kind of presentation about this at the ITU Telecom World '13 in Bangkok.

What are the guidelines? As I mentioned, there was discussion about how to structure the new set of guidelines in a framework which respects the advances in technology and the convergence of the challenges, but also the regulatory framework.

One of the aims of the guidelines is to set the scene strongly within the framework of the U.N. Guiding Principles and how they were introduced. That comes through especially in the first section, which talks about policies and management processes: not looking at child protection in isolation from everything else, but really framing it within the other rights‑related processes and policies that the company has.

The second section looks specifically at child sexual abuse content.

The third section, the general guidelines, looks at a safe and age‑appropriate environment: very much the content, contact and conduct elements, and how to make sure that the private sector can provide an environment which is as safe as it can be.

Then the fourth section is about advancing children's rights through the children themselves, parents and teachers, who are key elements in this. Finally, it is about the positive use of technology to further good citizenship.

These are the core elements, and I'll drill down a bit more into the different sections. Just to repeat what the aim of the framework really is: it is to have a broadly accepted framework, which will obviously be generic and will need to be adapted to the national context when it comes to legislation, et cetera. What we're really seeking is for the private sector participants in this initiative to take leadership of it.

When we looked at the different industry participants, it obviously depends on how you want to define the ICT value chain; you can go as broad as you like. These were the industry subsectors that we started to look at, which then formed what is now the subsector‑specific guidance, if you look at the next element.

We have six specific checklists for the subsectors within the framework. That's the structure of the guidelines.

More specifically, I already mentioned the policies and management processes section. That really ties into the broader framework of corporate responsibility within organizations and how the child rights element, as well as overall child online protection, fits into it. The U.N. Guiding Principles call on the private sector to develop policy, to carry out due diligence to identify the impacts within their operations through consultation with stakeholders including young people, and then, through the due diligence process, to report and be transparent about the issues and how they are being tackled.

The next element, on child sexual abuse content, is very much the legal compliance framework. There are references to terms and conditions and to notice and takedown processes, and the goal is really to work with national law enforcement and national hotlines.

The core online‑environment‑related part of the general guidelines is about implementing the technical measures that are needed, depending on the subsector of the ICT industry you operate in, to make sure that there is a framework in place that is as good as it gets, communicating this very clearly, and also using age verification and age classification systems where those exist, or pushing forward initiatives to define them and make them standards.

Parents, caregivers and teachers bear responsibility when it comes to children's online protection and participation as well. There's an invitation to really look at this area and also provide technical solutions for those who are engaged in children's well‑being and educational activities. The last section looks at the broader children's rights elements: supporting the participation elements, protecting the freedom of expression of all users, and promoting good practices for advancing children's rights in the online world.

That's a very quick introduction to the ‑‑ I don't know if it was very quick ‑‑ in any case, it was an introduction to the general part of the guidelines.

Now I pass on the microphone to hear about the sector specific elements.

>> Hello. Fiona from the Foundation in Australia. Thank you for welcoming our input into these principles.

I have been searching that site while we're online and it is difficult to find. Do you have any further guidance?

>> Sorry. I got a message as we were starting that it is online, so I don't have the URL yet either. I'll provide it by the end of the session.

>> Terrific. Thank you.

>> MODERATOR: Just to add to that, we'll give the link to the site from the ITU page and the UNICEF page. It is all a work in progress. Probably by the end of the day everything should be there.

Just the order in which we'll present the guidelines, Dominique followed by Suzy, John, Kim. Thanks.

>> Thanks.

I'm Dominique from the GSMA. Thank you for the overview. That was a thorough, in‑depth overview.

I'll touch briefly on the mobile operators and the checklists that were discussed in the opening. I don't want to spend a lot of time; I would rather leave more time for discussion and questions.

The GSMA does a number of different things in this area. Unfortunately my colleague Jenny, who participated closely and wrote the guidelines in conjunction with UNICEF, couldn't make it today. I'm filling in for her, and I'll try to answer your questions as well.

One thing we undertake is the Mobile Alliance Against Child Sexual Abuse Content. That's one of the many participation groups we're involved in, including the working groups within the ITU.

In terms of the checklist, you will see that the mobile operators have a particular section, and again I'm going to touch briefly on it. One of the things that we promote is the fact that there's a world of opportunity for youth right now.

Actually, a number of you have received the most recent survey, and I have some shorter versions if you want to take a pamphlet. We survey a number of countries to look at children online and their activities. We realize that in the mobile industry there is a lot going on.

We believe in self-regulation and in cooperation on that self-regulation, especially with a number of groups including INHOPE, which was mentioned.

Our checklist covers these issues: we believe we have to integrate child rights into all corporate policies and management processes. You will have more details when you see the report.

We have also developed standard processes for cooperation between businesses and law enforcement. We collaborate to create hotlines where there aren't any, as well as to work in conjunction with existing hotlines and provide training.

We have mechanisms, and recommend them to organizations, for reporting illegal content, and again for cooperating with investigations.

The other thing is that it is important to be very clear, as a mobile operator, about the terms of service and conditions, what will or will not be tolerated, and how the mechanism for reporting content works when consumers come across such content.

We recommend a process for the mobile industry for notice and takedown as well, coming up with guidelines and information relating to the content and how that works in particular. In terms of interacting with consumers a bit more, mobile operators should develop clear house rules: what's acceptable and not acceptable, what is legal and not legal. Beyond what is legal, the terms of service should spell out what is acceptable, including what content would be banned, as well as things like swearing and bullying. Nothing particularly new, but we go through that, highlight these issues, and set out what constitutes a breach of terms and what leads to a suspension of service.

The important thing around all of this, as I mentioned briefly, is to be transparent, to be very clear, and to communicate all of this both to consumers and internally to the people working at the mobile operators.

Finally, just briefly: mobile operators should complement technical measures with education and empowerment as well as enforcement. These activities are nothing particularly earth‑shattering, but we remind operators that they are important in terms of educating both parents and children about what they can do, what is good and bad behavior, what is legal and illegal online, what services are age‑restricted or appropriate, and how to be safe online.

If you have any questions, please let me know after.

>> Thank you very much. I'm delighted to be here on behalf of the Internet Watch Foundation.

We're the U.K. hotline for reporting criminal content, and we're proud to be a partner in the initiative. I have a 2‑minute film to show you, if the sound works, which explains what the IWF is. The reason I'm showing it is that this is the approach we're taking with the ITU and the country‑wide templates that we have created. We'll give it a go.

>> When a child is sexually abused, it affects them for life. When they're filmed or photographed being sexually abused, it leaves them exploited by those who seek their images.

By reporting these images, you can help stop this exploitation.

At the Internet Watch Foundation we work with the online industry to eliminate images of child sexual abuse from the Internet. Each year we receive thousands of reports to our hotline. We need your help.

If you see something online which you think violates the laws on child sexual abuse, it's important to report it to the IWF. Your report will be treated confidentially. You can choose to leave your details if you'd like to know the outcome of your report.

Our highly trained analysts assess the images and videos you report against U.K. law. If we find the content to be potentially criminal, we'll take steps to ensure it is removed.

For U.K.‑based web pages, the Foundation works with the police to confirm we can take action. The U.K. host or ISP is then notified to remove the content, and they do this typically within 60 minutes.

If it is hosted abroad, we work with other hotlines through INHOPE, the Association of Internet Hotlines, to take action. If it is hosted in a country with no INHOPE hotline, the IWF works directly with international partners. While the content is awaiting removal, the web page address is added to the IWF URL list, which is voluntarily used by the online industry to make sure that users don't see it.

Sometimes the analysts see abuse of a child that they don't recognize. When this happens we take action to find the source of the image and notify the relevant police body.

Our efforts result in success stories of children rescued from their abusers. With the collaboration of the IWF, the wider Internet industry and the police, the U.K. is now a hostile place to host this content. It is our vision to achieve this worldwide and eliminate child sexual abuse content online for good. The continuing support of the online industry, government, police, other partners and, of course, the public makes this an ever more achievable goal. Together we can make a difference.

>> Right. Thank you very much.

The reason I showed you that is that we very much advocate a multistakeholder partnership approach to the removal of child sexual abuse content. We came into working with the ITU, and we're delighted to be partners with them, looking at ways to do this in other countries.

So, we have developed a country‑wide assessment template for countries without a hotline, where we can go into those countries and bring all of those stakeholders together. I was in Cameroon with Jenny Jones from the GSMA and Carla from the ITU; we ran a joint workshop with the six African countries that were mentioned before. The point of this is that we can then assess what their needs are, work from the ITU country assessment, which covers the whole issue of online protection, and use that to drill down and focus more on the child sexual abuse side.

We have now done the first of our country‑wide assessments for the ITU, recently completed in Uganda, where we met with all of those organizations: Civil Society, the police, the Internet industry. One of the things we need to do is raise the issue; it is part of the education and understanding of how and why everybody has a role to play in the fight.

I think getting people together to understand that alone, on our own, we cannot tackle this. Together we can deal with it effectively.

Having done the assessment in Uganda, we developed a standard report template to report back on that country‑wide assessment. The mechanisms we recommend when going in are very clear. If a country is hosting a lot of content or has a lot of content to report, it may well need its own hotline.

If they need their own hotline, they can set it up on their own or work directly with INHOPE, the international association of hotlines, and they'll be given full support for that. We also offer reporting portals where people can set up their own local home page and report directly to us at the IWF, and we'll analyze the reports on an individual basis, which is a cost‑effective solution for many developing countries.

The ITU partnership really does welcome involvement. One thing I wanted to say: when we engaged in the process, we were welcomed with open arms. They really do want Civil Society, industry, all of us, to work together with them. That's been a very positive process for us, and something I hope everybody engages with if you're not already. It is a really effective process, and I think it is really starting to bear fruit.

Thank you.

>> JOHN CARR: Good morning.

John Carr. I represent the European Alliance for Child Safety Online.

We were involved with the drafting of the guidelines from the very beginning, in fact from the time the first edition came out. It is good to see the ITU joining up in a more organized way with UNICEF to sustain that initiative.

Certainly from my travels around the world, and I travel extensively, there is a great deal of interest in what the U.N. agencies are doing. People pay attention, particularly in the developing world, to what the U.N. has to say and do, and obviously two bodies like the ITU and UNICEF carry a great deal of weight and prestige in many, many different parts of the world. We're very glad to be a part of this process, and I'm going to talk very briefly about the Wi-Fi component of the guidelines.

I think we can claim to have the shortest component in terms of the number of words, but I hope it is nonetheless interesting and important. The experience we describe and the recommendations we have made are rooted in what happened in the U.K.

This is, briefly, the story. When 3G started to emerge, when rapid access to the Internet via mobile devices started to become possible back in 2003/2004 in the United Kingdom, kids obviously started to go online through their mobile devices on a much larger scale than previously. It was technically possible before, of course, to go online through your mobile, but the speeds were so slow that in reality not many people did. The arrival of 3G and faster connectivity began to change that.

With that change, more and more handsets became available which, as I say, enabled anybody with a handset to connect online from that mobile device. The point we made to the mobile phone industry at the time was that whatever view you take about the theory of kids being supervised by their parents, their teachers or the librarian in the old world of fixed Internet access, that simply ceased to be a practical proposition when kids began to go online through a mobile device. I'm happy to say that the mobile phone industry accepted that point, and accepted as a consequence that they should do more to help kids stay safe when there is very little possibility in practice of parental supervision, teacher supervision or support from the librarian. So, from the 1st of January 2005, they began by default to block access to all adult content on the Internet that might be accessed through a mobile device. Every one of our mobile phone networks does that. One finally came on board recently thanks to pressure from our Prime Minister, David Cameron, who has given a great deal of attention to this general question of child safety on the Internet.

Now every mobile phone network in the U.K. by default blocks access to all adult content. If you want access to adult content through your mobile device, you can get it; what you have to do is go through an age verification process. It doesn't take very long and it is not very difficult to do. It acts as a brake, a way of ensuring that only adults are getting access to that content.

What happened next, of course, was the emergence of Wi-Fi. All handsets became available with Wi-Fi connectivity built in, and more and more companies, Starbucks, railway stations, hotels, more and more places, made Wi-Fi available. We were seeing kids with the same mobile device who couldn't get access to adult content through the mobile phone network going into Starbucks, into a railway station or another major retailer, simply switching from the mobile phone network to the Wi-Fi provider, and gaining access to absolutely anything that was out there. It completely undermined the investment the mobile phone companies had made in all of those safety measures. It made it trivially simple for kids to bypass those protections.

We started to speak to the Wi-Fi providers before the last general election in 2010. It was only after that election, when our new Prime Minister made clear that this was an important issue for him and for the government, that the Wi-Fi providers finally began to take notice and give it serious consideration.

What happens now, and this was finally made clear about four or five months ago, is that all of the major Wi-Fi providers in the United Kingdom will by default block access to pornography, and many of them will also by default block access to other types of adult content. The difference with the Wi-Fi providers, however, is that you will not be able to get that block lifted. It will be on for everybody.

By the way, I should have said, and this is incredibly important: this only applies where Wi-Fi services are provided in a public space where children and young people would normally be expected to be found. That obviously covers Starbucks, cinemas and so on. If the Wi-Fi is provided in a casino, in a bar, in a strip club, a sex shop, whatever it may be, the rule will not be applied, because you would not expect to find children or young people in there.

The whole purpose of this was around children; it wasn't around censoring content that adults may want to access. And part of this was not just about what children themselves may access. Imagine you go into Starbucks and you're sitting at a table, and the guy next to you has a tablet or a Wi-Fi device and he's bringing up disagreeable content that you can't avoid seeing. It is not just who may access what but who may be exposed to what that was taken into account.

So we're suggesting, and again this is out for consultation, that Wi-Fi providers outside of the U.K., in other parts of the world, may adopt a similar approach. This has absolutely nothing to do with censorship; nobody is asking for the content itself to be taken down. But in public places where children and young people are present, Wi-Fi providers need to think about the consequences of making that provision available.

A last, quick point: we had one example early on in the U.K. where a Wi-Fi provider included in their blocking list a website that provided advice to kids about sexual health. Somebody in the filtering company had decided that this was pornographic; maybe they were not well educated, or God knows what it was, but it was a very bad move. It was drawn to our attention, we raised it with the provider, and they changed it within 48 hours.

A lot of kids, when they go out of the house, may want to access certain content through the mobile or Wi-Fi connection that they wouldn't necessarily want to access while in the home. It needs to be done sensitively and properly. We think it is important for the companies to consider this.
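The Wi-Fi policy described in this segment, default-on blocking in venues where children are expected, no blocking in adult-only venues, and a manual allowlist to correct miscategorised sites such as the sexual-health example, could be sketched roughly as follows. All venue and domain names here are hypothetical.

```python
# Rough sketch of the venue-based public Wi-Fi policy described above.
PUBLIC_VENUES = {"cafe", "railway station", "hotel lobby"}   # children expected
ADULT_VENUES = {"casino", "bar", "strip club", "sex shop"}   # children not expected
BLOCKED_CATEGORIES = {"pornography"}
ALLOWLIST = {"sexual-health-advice.example"}  # hypothetical wrongly-flagged site

def wifi_blocks(venue: str, domain: str, category: str) -> bool:
    """Return True if this venue's Wi-Fi should block the given page."""
    if domain in ALLOWLIST:       # manual correction takes precedence
        return False
    if venue in ADULT_VENUES:     # the rule does not apply where no children are expected
        return False
    return venue in PUBLIC_VENUES and category in BLOCKED_CATEGORIES
```

Unlike the mobile-network filter, there is deliberately no per-user lift here: the decision depends only on the venue and the content, since the block applies to everybody on the network.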

>> Jack, you know, your alternative career is as a nightclub singer?

>> Too late. I can't change now.

So, I represent the European Broadcasting Union, the public service broadcasters, not only in the European Union but all over Europe, and we participate in this process in conjunction with the World Broadcasting Unions, the association of all the unions of broadcasters in the world.

We have been involved in this exercise with the ITU since the first edition of the guidelines; we have been participating since the beginning. We're thankful to the ITU for the initiative, which is very much in line with the prerogatives of public service broadcasting.

As you know, each Public Service Broadcaster has a number of obligations that are part of its identity, and among these there is special attention to be paid to children and to the audience of young kids, because of course specific attention has to be given there. What you can read in the slide is "Public Broadcasting Services Providers"; this is the part of the guidelines that we're working on.

Broadcasting faces a specific issue here. In the past we worked in a more or less safe environment that was easy to keep safe: you could use signals on the screen to advise parents about programs that are difficult to watch without parental guidance, or you could simply use the broadcast time to protect children from the risk, being sure that a program would not be accessed by children. But in the online world, with video-on-demand systems, with catch-up TV, with the various providers that work with the broadcasters to make programs available at any time of day to anyone, the systems of protection that we used in the past are no longer effective. We have to move to a different solution.

As you see from this, we have tried to make a number of checklists that we suggest to our members. This is a compilation made from recommendations by broadcasters, some of which are more advanced than others in the online world. As you know, the progress of the different broadcasters in transferring to the Internet world varies greatly within Europe.

There are countries where digitalization is very much advanced and others where the transfer of access to the Internet is going very slowly. These recommendations mainly come from countries like the U.K. and the Scandinavian countries, where this transfer into the Internet and the online world is very advanced.

As you can see, there are a number of recommendations. Some are general, concerning the precautions you need to take when you go online. For instance, the distinction between what is broadcast content and what is content not coming from the broadcaster. This is very important.

Another one is age verification. We have a problem: not all of Europe has the same rules, and when you transfer this to a global level it is complicated. We also have a number of recommendations specifically for this audience, which requires particular attention. For instance: listen to and respect children at all times, don't patronize them. This is very important.

One of the rules that we have suggested to members is that when you have interactive online services like chat, or the possibility for the young audience to exchange and interact with the online world, that interaction always has to take place in the presence of a webmaster who can intervene and eventually remove or ban inappropriate behavior, which of course can happen when there is live interaction. There are, in fact, a number of chats in which children could be involved that are closed, that are not accessible, when there is no webmaster present online.
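The webmaster rule above amounts to a simple gate: the chat accepts posts only while a moderator is signed in. A toy illustration, not any broadcaster's actual system:

```python
# Toy sketch of a moderator-gated children's chat, per the EBU rule above.
class ModeratedChat:
    def __init__(self) -> None:
        self.webmaster_online = False  # chat is effectively closed by default
        self.messages: list[tuple[str, str]] = []

    def post(self, user: str, text: str) -> bool:
        """Accept a message only while a webmaster is present; else reject."""
        if not self.webmaster_online:
            return False
        self.messages.append((user, text))
        return True
```

A real system would also let the webmaster remove messages and ban users, but the essential safeguard is the same: no moderator, no live interaction.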

I won't go into all of these rules; these are some of the suggestions that we give to our members. Of course, for some of them this doesn't mean anything yet; they are not at this stage of online consumption or of the online transfer of programs and services. But it can be seen as a model to be used as the progress of digitalization and the transfer to the online world continues.

>> KIM SANCHEZ: Good morning, everyone. Thank you for being here.

I would also like to commend the ITU and UNICEF for taking on this project. I think it is very important to get this work out.

I'm Kim Sanchez. I'm with Microsoft, in a particular organization called Trustworthy Computing.

Our team focuses on education and awareness efforts to help all of our consumers worldwide understand the risks that are out there and what they can do to stay safer online.

So, I have been asked to talk about the application portion of the checklist. The title of it is "Content Providers, Online Retailers and App Developers." I'll just go through the high-level points that we're trying to make here and why.

There was an article I was looking up: in 2010 Wired magazine ran the headline "The Web Is Dead. Long Live the Internet." The point they were making in that piece was that it is all about apps; you can live all day on all of your devices just using apps. You don't necessarily have to go onto the Web to find what you're looking for. There is an app for weather, an app for checking your time zones, currency exchange for those of us that travel; Facebook is an app, Instagram is an app coming soon to this phone! There is really an app for everything.

The concept we're looking at here with the guidelines is what developers should be thinking about as they provide these apps. A lot of app developers, some are big companies, some are very small, what we call mom-and-pop shops, just a couple of people developing these applications, and they're not necessarily thinking about privacy or safety or security as they build them. What we want to get ahead of is bad things being harbored on the applications. In the U.S., Congress has been looking at Apple and the iPhone over tracking, which is very much privacy specific. They're going after the big guys first, but it won't be long before the big guys are made an example of and the smaller developers have to follow suit.

I'll talk at a high level about what we want to accomplish and some of the things that are in the checklist.

Content providers: we're asking them to take the following actions, and we want them to think about balance. We know that the Internet is a fantastic place for all of us to be. It is woven into the fabric of our lives; we can't get away from it, nor should we. But there is something called balance, and it is good to be outside every once in a while, to kick the kids out, even for us to take a break from whatever device we're using. We want to get that concept across.

We're asking for a process for handling child sexual abuse images: that they collaborate within their organizations, and certainly with law enforcement, when illegal content is reported and discovered on their applications. We would like processes and tools in place to identify the images when they're found, and then removal and blocking processes to stop the proliferation of the images.

We want to make sure that the developers are working within their organizations and passing on illegal content to hotlines and to law enforcement for criminal investigation.

Then we get to the notion of developing a safer and age-appropriate online environment. In the gaming world we have the Entertainment Software Rating Board (ESRB), and PEGI in the EU. And of course when we go to the movies there is a rating, PG-13, R, that type of thing. We're thinking about that in the app space. I know that the ESRB, and I believe PEGI, is looking into this. We want to make sure that there is a very clear, external label describing the app's content and who the app is suitable for, so that if it is for kids you can expect it will not have some of the more explicit content that's out there.

We would like app developers to ensure transparency in terms of the pricing of the services and the information that's collected about users. This really goes to complying with the relevant laws concerning the privacy of minors. In the U.S. we have the Children's Online Privacy Protection Act (COPPA), originally born out of trying to keep advertising away from kids many years ago. That has become the de facto law in most parts of the world: it means anyone under 13 can't use a lot of the sites, or must get parental permission to go online and onto certain sites.

So we want to make sure that, where possible, we adopt age-appropriate verification methods to prevent children from accessing services where risks of inappropriate contact or content may exist.
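As a rough illustration of the COPPA-style rule Kim mentions (a sketch, not legal guidance; the 13-year threshold and a simple consent flag are the only inputs, and real services must verify consent far more rigorously), an age gate might look like this:

```python
# Illustrative COPPA-style age gate: under-13 users may proceed only
# with verified parental consent.
from datetime import date

COPPA_MIN_AGE = 13

def age_on(birth: date, today: date) -> int:
    """Whole years of age on the given day."""
    years = today.year - birth.year
    if (today.month, today.day) < (birth.month, birth.day):
        years -= 1  # birthday hasn't occurred yet this year
    return years

def may_register(birth: date, parental_consent: bool, today: date) -> bool:
    """True if the user is old enough, or a parent has consented."""
    return age_on(birth, today) >= COPPA_MIN_AGE or parental_consent
```

As the next speaker from India points out, a self-declared birthdate is trivially faked, which is exactly why the panel treats robust age verification as an open problem rather than a solved one.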

There is a lot of detail in here so I encourage you to go, check it out, you don't want me to read all of this. Trust me.

Then, most importantly, the notion of education, guidance and awareness. We had a session yesterday talking to youth, and one of the best things that came out of it was a young woman saying that parents just need to be curious about what their kids are doing online. I couldn't agree with that more. You ask somebody about their kids' games: what games do they play? "I don't know. Oh, those violent things!" They're not necessarily engaged with their kids. I think it is a huge missed opportunity to really understand what's going on.

We do want parents to be curious and involved, not just in what their kids are gaming but in everything they're doing and interacting with online, and the devices and services that they're using. I think keeping the conversations open is very important. There is some language in here about that as well.

Lastly, promoting digital technology as a way to further positive engagement. That means making sure that we offer a lot of rich, compelling, age-appropriate content for kids. For younger kids it will be more on the entertainment side, but as kids get older we need to think about content that's not just about entertainment; that's not all there is out there. What's going to challenge them? Maybe thinking about starting a business, or standing up for a cause they believe in.

It is fun to talk to some of the youth here, particularly the kids from the Netherlands. There is a 17-year-old who has two or three businesses that he's working on, and he's got venture capital money. Wow! That's so impressive, and really cool. It is that concept of making sure that we're educating about the risks that are out there but also promoting the Internet and the apps as a way to really change the world. Why not?

That's it. Thank you.

>> Hello. Is it okay if I stay here? Okay.

Good morning, everyone. I do apologize for joining you late. I was presenting in another session at the same time. Happy to be here.

I missed the first part, where they probably explained the whole concept of the guidelines. My name is Anga; we have been one of the ITU COP members from the very beginning, I can say. Even though we were not part of the first edition of these guidelines, we felt there was scope for us to engage in the process. Thank you for offering us that opportunity.

I have to admit, the section of the checklist that I'm going to present covers user-generated content, social networks and so on. That doesn't fall within what we do directly, because I'm by no means coming from a social network or a content provider.

Having said that, I did work closely, in terms of policies and guidelines, on helping understand the different ways in which a particular social network is being used, a very popular one and probably one of the biggest; you can guess who I'm referring to. What they kindly did was share with us the set of internal guidelines and policies that they had developed in order to keep their platform safe, and the principles taken from that were introduced and incorporated into the guidelines, with formatting and with the work of UNICEF and the ITU.

Our role was more to provide feedback, to strengthen them and to provide guiding directions rather than creating the whole body of text. Some of the things that you see in that content are taken from open sources used by other social networks. That's just the background of how our interaction with this process started and developed.

I think Dominique and Kim alluded to the frameworks and the structures, and I'm sure that Ayla introduced the basic fundamental principles that will guide all of the sectors. There is a lot of commonality between what Kim and Dominique said in terms of child sexual abuse content: what types of content need to be defined, how they are defined, what needs to be done when they are detected, who reports, and how the reporting mechanisms can be enhanced. These are some of the things that are common across the board.

In terms of social networks, it is a challenge. The content is generated on the fly, and how do you prove and vet what's legal? Unless someone is looking at it, there is no automatic detection available to us apart from known child abuse material, where we can use hashes to identify previously detected images. If I were to produce a new image now, upload it and make it public, who stops me from doing that?

The terms and conditions, this is an important part we think, and it is incorporated: what is acceptable and what is not acceptable needs to be very clearly defined and made visible. In most cases you just check a box when you subscribe; you don't necessarily read what's written there. It is very important that some of the pragmatic points are made more visible and more usable.

In terms of social networks, another important element is privacy: the default settings for minors when they log in, the friend list, who can access the nesting of friends. If you're a friend of a group of people, are you automatically linked to all of the friends that they have? I don't want to get into the details of how to work through all that; you can go through the document and make your comments.

Fundamentally, what we suggested is that all social networks and user-generated content platforms that allow users to generate their own content and interact with others should have certain basic structures. One is clear policies regarding illegal content: what constitutes illegal content, and what the procedures are for handling it with law enforcement and taking it down. All of these matter, and it should be clear what is in place and what app developers should know, from the very onset, regarding the security and safety of users.

Technology, we have mentioned that already. We should use technology solutions as much as possible, by as many providers as possible; ideally, all social network and user-generated content providers should have a system of using technology to detect content, because the time it takes to detect and report content can otherwise be too long.
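The hash matching against known material mentioned above can be sketched as a simple lookup. Real deployments use robust perceptual hashes such as Microsoft's PhotoDNA, which survive resizing and re-encoding; the plain SHA-256 used below is only for illustration and matches byte-identical copies only.

```python
# Illustrative hash-list scan of uploads against known illegal images.
# A real system would use a perceptual hash (e.g. PhotoDNA), not SHA-256.
import hashlib

KNOWN_HASHES: set[str] = set()  # in practice populated from a hotline / law-enforcement list

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def scan_upload(data: bytes) -> bool:
    """Return True if the upload matches a known illegal image."""
    return sha256_of(data) in KNOWN_HASHES
```

This is exactly the limitation the speaker notes: a match list only catches previously detected images, so a brand-new image passes through unless a human report or community flag brings it to moderators' attention.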

Did you want to say something? I saw the hands being raised. Sorry.

>> And community intelligence is another section. I will not repeat the education part; that's integral to it.

One thing that I want to say about the education bit, it is that these platforms are used globally.

The language issue is very important, and how these messages are distributed to the population is important. Most of the platforms come from, I would have to say, the West, the U.S., and the language and the way things are expressed do not necessarily translate well or get understood well in different cultural contexts. They may have to be adapted to the cultural context, and you have to have measures to distribute them as best as possible to make sure they are understood by the users. That's the kind of framework that we have.

There is a lot in common between the different partners. The challenge here is the rapidity with which the content is produced, so we need community intelligence to flag it and report it as quickly as possible so that appropriate actions can be taken. Those are the key messages that we tried to bring, apart from the age verification systems that have already been mentioned.

I hope that once the public consultation is open we can have more robust feedback and discussions, because it is an ongoing process.

Thank you very much.

>> Thank you. I'm working with the incident response team in Bangladesh, so I have a small question here.

Policies need to be in place, but every country has its own views, and it is very challenging to implement another country's policy in our country. How can we bridge local and global policies?

And the second issue: how can we get help and global content from the ITU and UNICEF? As the response team from Bangladesh, how do we get solutions from Google, Microsoft and others? When we go to the government to ask, we cannot get a proper response. How can we resolve this? I would like your views on how developing countries can solve this.

>> Thank you very much, I'm a member of the U.K. parliament.

Thank you for the presentations.

Some of the presentations and presenters I'm familiar with. I think we are in a position whereby technological solutions will be able to provide a lot of the answers in terms of protecting children, that is, where parents show sufficient interest in wanting to do that. However, we have a difficult situation where children understand more than their parents. Therefore, the way in which technological measures are developed is quite important, in order to help educate and inform parents so they can take educated and informed decisions.

Therefore, I think a lot of responsibility falls to the technology companies, the software developers and the ISPs to come up with easy filtering models that are going to be used effectively.

Is there a genuine will, from the experience of the panel, is there a genuine will by the companies to come up with easier, comprehensive filters? And a specific question to John Carr on the Wi-Fi arrangements: what recommendations are there for the gray areas? I appreciate it is not black and white. Hotels are the obvious example: is a private room in a hotel a private space, or is it a public space, bearing in mind it uses the same network?

Secondly, relating to Public Service Broadcasters in the U.K.: we have the benefit of the BBC, and I think they can be quite fantastic in terms of communicating things effectively. Going back to the first point I made, where parents very much don't understand the risks, the BBC, or the Public Service Broadcaster in any nation, could be a trusted source of advice and use its programs to communicate risks to parents: partly through drama programs and documentaries, and partly through website portals which explain what the risks are and how the filtering settings can be adjusted to reflect their own different cultural demands and needs.

Thank you very much.

>> I'm from the Global Opportunities Foundation from Germany. In reply to the question that was put: we run a so-called benchmark of parental controls, and referring to the question of filtering, we have to say that the effectiveness at this point in time is really not high enough. I will not say it is low, but it is still not high enough, so filtering can only be part of the solution, I would say.

Also, referring to what John Carr said, we really appreciate this approach of having safety built into the devices as well as into the services. We call that the safety-by-design approach, and I would like to draw your attention to it. But I also have a question for John Carr.

John, you mentioned the situation in Starbucks where someone is sitting looking at porn on their own device with children sitting beside them. I wouldn't say this is a technical question but one of ethical and moral values. If you know a child is sitting beside you, you should just shut down your computer and not look at porn in that situation, I would say. Maybe you have another answer.

Thank you.

>> So, we are running out of time. We have three questions. Maybe we can take those and probably handle the others offline.

John, would you like to start? A lot of the questions were posed to you.

>> JOHN CARR: Okay. So, two questions directed at me.

The first was about people sitting in public places looking at porn with children nearby. Obviously in Germany this never happens. Sadly, in the U.K., and this may come as a shock to you, occasionally it does. In the United States there have been cases where guys have actually been arrested in burger bars looking at child abuse images on a website. I wish the world wasn't the way it is; unfortunately it is. You would expect grownups to know how to behave. We have an expression, "the brass neck": you would hope that they wouldn't have the brass neck to sit looking at this in a public place, particularly where there are kids. A casino or a bar is a different matter.

Where there are kids around, you would hope people would know how to behave. Starbucks and so on provide the facility, and they absolutely market to children and young people. McDonald's was great from the word go; they have had filtering installed, so it couldn't have happened in a U.K. McDonald's.

What's interesting is that companies like McDonald's and Starbucks have different policies in different countries. In the U.K. both apply filters to block porn; neither company does that in the United States, and in fact neither company does it in most countries where they operate. It is puzzling to me: what is it about British kids that is so special or particular that they need protection, and what is it about American kids that means they don't? It seems rather odd.

On the other point, there are gray areas, and hotels are the most obvious one. The view we took was that if it is a public space, a reception room, a meeting room, the rule applies. If it is a bedroom that people are renting, that's a private space, and it is up to the hotel to make a decision.

You raised a very good point I ought to have referred to in my earlier remarks. We had said that a publicly trusted, well-known body ought to be involved in setting transparent standards. In the U.K. we have a body, the British Board of Film Classification, that does the ratings for cinemas and so on, the PG, 15 and 18 ratings, and our suggestion was that the BBFC could have a role to play in setting transparent standards. At the moment it is done by companies like Symantec and McAfee. I have no problem with that as long as it is transparent, but I think it would probably be better for everybody if there was a publicly accountable or publicly recognized body with a level of engagement in setting standards.

>> I'm Mohammed from India.

I have a 7-year-old son. He's asking for a Facebook account. I said no, you cannot have a Facebook account because you need to be 13. He said all of his classmates have Facebook accounts.

So, you're talking about age verification systems. I have now said to my son he can have the Facebook account as long as I'm a friend on it; otherwise he would fake one without me knowing.

The problem is, we talk about age verification systems from a European and U.S. perspective, but elsewhere, in India specifically, many people understand the checks and they just lie: "I'm 18, 19," when in fact they're 6 years old. A proper age verification system has to be discussed. That's number one.

Number two, as John Carr was saying, with Wi-Fi and with mobile apps, you don't have control of them. So how are we going to stop or block those types of content distribution? It is a very big problem. I'm not talking about anything in particular, but if you have a mobile application, we have no understanding of what data is being pushed through that particular connection. We work as business intelligence analysts and we have gotten good at blocking and taking care of this, but children can get accounts for $1 and different connections, and the moment they're connected, the government has no control and the ISP has no control, so they can do all of these things. We have to step out of the European and other perspectives, face the realistic problems we are dealing with, and think about a solution.

Thank you very much.

>> JOHN CARR: I completely agree with the burden of what you were saying there. The fact is, some things are harder to do than others.

Things like VPNs and encryption are at the hardest end. That's not a reason not to do what you can at the easier end. I'm amazed to hear that every child in the United Arab Emirates has a VPN; that shows a sophistication that's not matched in the U.K. I'm not sure what we can do about VPNs and encryption; we may have to confront those types of issues eventually, and people are thinking about it.

There is a lot of other stuff we can do ahead of that which will reach a very large number of kids, and which I think we ought to be doing.

>> I'll just add on to that and address a bit of what the U.K. MP talked about as well.

It goes to the notion that education and awareness are very important. Kids are going to be a step ahead of parents when it comes to technology all around the globe, but as we always say, adults have wisdom; they have life experiences to offer. They may not know how to do this or that with a particular service or device, but they can say, "that doesn't feel right."

Let's talk about that. Let's see.

If an offer is coming to you that sounds a little too good to be true, it probably is. It is those experiences that adults should be talking to kids about. I think education and awareness are key. You're not going to be able to stop the kids that are going to go do bad things, but hopefully you're giving them the literacy to think about what they're doing.

>> Moving on to the question from the gentleman from Bangladesh, we'll go to Ayla.

>> We fully appreciate the challenges in countries where the corporate responsibility frameworks and the supporting legislation or regulation are not fully in place, and where there may sometimes be issues with law enforcement.

In terms of looking specifically at Bangladesh: we do this in many countries around the world. One core framework that our team within UNICEF is using as a tool is the Children's Rights and Business Principles, which are, let's call it, a business responsibility framework looking specifically at children's rights, and fully built upon international standards.

What we have done with this framework, which was launched last year, is that in many countries around the world, actually in over 30 countries, we have launched initiatives around it, inviting the private sector and governments to work together both on issues related to children's rights in general and on industry-specific issues. It so happens we have done it in Bangladesh as well.

I would strongly suggest to contact our office there.

I don't know the frequency with which the workshops are taking place, but there have been several already, where businesses as well as other stakeholders have met to discuss the framework of policy development for corporate responsibility and due diligence processes and how to incorporate this. I know we have had some discussions with a kind of ICT industry organization, so they have also been involved and invited into the discussions. For anybody coming from other countries: we have these platforms in place in many countries. In India, for instance, we work with the government; here in Indonesia I have just spent the last week collaborating with the government on a child-friendly company initiative and working with the chamber of commerce in this area. It is not ICT-industry specific, but it supports the implementation of responsible business frameworks with special consideration for child rights.

Thank you.

>> We're almost out of time. There is one gentleman here that we have to respond to, and then we have to wrap it up quickly.

>> Okay. I'm from one of the Internet companies in Indonesia. I'm working together with the government to provide Internet to all the rural areas. Right now, as you can see, the users are mostly children. That's why I feel responsible for the content itself. I don't have full control of the content, because we're working together with the Indonesian government; we provide the infrastructure and the government controls the content itself. I'm quite worried.

Sometimes, as this gentleman from India said, the children are very, very smart. We created a content filter; its name is in Indonesian, but it stands for something like Healthy Internet and Safe Internet. Of course, we cannot keep up with all the pornography on the Web, which is growing every day. So we have already deployed the Internet into the rural areas, where the users are children.

I'm so worried. That's why I would like to know more about this program, and I'm reaching out to the panelists here to help us. It is very good that you have already been working together with the Indonesian government; hopefully it will be implemented, and I'm open to any other suggestions that we can use or work out, even with the Indonesian government.

As a partner of the government, I can also suggest to them how we're going to do things. I think that's all from me.

Thank you very much.

>> We have to finish here.

I would like to say that one of the initiatives with the Indonesian government is a study we have just completed about the use of the Internet by children. It is a partner initiative, and we're looking to get the results out any time now, so maybe that will feed into the discussions as well.

We can have a discussion offline. Thank you.

>> To answer your question about the role of public services: we know this is part of the responsibility we feel. Many parents rely on public broadcasting as a substitute for family control, and they probably expect that the programs we show, and now also the online services that we provide, offer the same guarantees and control as in the past. This is a very difficult exercise, and we're doing our best. It means, for instance, that we restrict access to the chat when there is no web master who can moderate it.

You also mentioned another point about raising awareness. Of course, all Public Service Broadcasters provide parents with suggestions on how to deal with children on the Internet. As you know, though, most parents don't have the time to go through that, so you have to embed it as much as possible in your activity. It raises another level of awareness to talk about it in the news, and we do so regularly. Last year there was also a fiction production about a young girl who started to chat and expose herself on the Internet and got trapped in a big problem, and this created a large debate in the country. These kinds of things need to be done more frequently in order to raise awareness.

Thank you.

>> Thank you. I think it is time to finish the session.

Thank you so much for your input and feedback. Just to remind you, the site where the documents are now posted is up on the slide. It is a bit of a complex URL for the time being; we will be adding links to both the ITU site and the UNICEF CSR site. For the ITU it is itu.int/upc, and for UNICEF it is UNICEF.org/CSR.

Looking forward to a lot of input. Thank you very much. Thank you.

[Applause]
