IGF 2016 - Day 3 - Room 8 - DC on Child Online Safety

 

The following are the outputs of the real-time captioning taken during the Eleventh Annual Meeting of the Internet Governance Forum (IGF) in Jalisco, Mexico, from 5 to 9 December 2016. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

  

>> MARIE-LAURE LEMINEUR: Good morning. We'll be starting in one minute. Thank you.

>> MARIE-LAURE LEMINEUR: Good morning. I'm told that we need to start. The technicians are ready. So welcome to the Dynamic Coalition on Child Online Safety session. Let me see. My name is Marie-Laure Lemineur and I work in Thailand. The topic bringing us together is the Internet of Things and the rights of the child. So let me see. Abraham Lincoln said the best way to predict your future is actually to create it. I do believe that all of us here together, collectively, somehow have some sort of influence on what the world we want for children will look like. This is why we have the obligation to trigger discussions around issues that might potentially impact children as Internet users. Such is the case of the Internet of Things. Before I introduce our speakers, allow me to very briefly explain what we mean by Internet of Things. I think it will establish a baseline before the speakers start delivering their remarks.

Basically the Internet of Things comprises devices that function as sensors, actuators, controllers and activity recorders. I'm reading the definition right now. Is that correct? These devices interact with software running elsewhere on the network, for example on mobile phones, laptops, the cloud, or a combination of those. But what is important from our perspective is that those devices function autonomously, without human intervention. Those multi-functional devices can go online without a conscious decision on the user's part. I think we'll discuss this later on. It has a lot of implications from our perspective.

The term Internet of Things has a broad scope and can refer to the deployment of devices in homes, businesses, manufacturing facilities, transportation facilities and elsewhere. Of course, we are all aware that the number and diversity of consumer IoT devices is growing very, very rapidly. Those devices offer many applications for end users. Many of them are already available, and those and others are being developed for deployment in the near future. That includes a wide range of devices: sensors to better understand the patterns of daily life and monitor health; monitoring and controls for home functions, from leak detection to heating and water systems; devices and appliances that anticipate consumers' needs and can take action to address them, for example devices that monitor supplies and automatically reorder a product for consumers; and devices designed to be used by children, such as toys targeting specific age categories. According to a report issued by the Broadband Internet Technical Advisory Group (BITAG) a few weeks ago, in November 2016, they already see devices on the market that do not abide by security and safety standards, which is, of course, of concern. You might be aware that the U.S. Department of Homeland Security also issued a report in November 2016 supporting that statement and stating that they are very concerned that security is not keeping up with the pace of innovation.
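
A minimal sketch (not from the session) of what the "goes online without a conscious decision" point looks like in practice: a device reads a sensor and posts the value to a cloud endpoint on a timer, with no human in the loop. The endpoint URL and the sensor function here are hypothetical placeholders.

```python
import json
import random
import time
import urllib.request

ENDPOINT = "https://cloud.example.com/api/v1/telemetry"  # hypothetical vendor endpoint


def read_room_temperature() -> float:
    """Stand-in for a real sensor driver."""
    return 20.0 + random.random() * 2


def report_forever(interval_seconds: int = 300) -> None:
    """Send a reading every few minutes; no user action is ever required."""
    while True:
        body = json.dumps({"temperature_c": read_room_temperature(),
                           "timestamp": time.time()}).encode("utf-8")
        request = urllib.request.Request(
            ENDPOINT, data=body, headers={"Content-Type": "application/json"})
        urllib.request.urlopen(request, timeout=10)
        time.sleep(interval_seconds)
```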

So I think we can start envisioning what the implications of all this can be, or could be, for children as end users, including children too young to speak. It becomes very obvious that discussions around how to foster an ethical and rights-enabling Internet of Things, with a careful balance between innovation and regulation, need to take place between child rights activists and other sectors.

So we will structure the session as follows. First of all, our speakers will make introductory remarks. Then we open the floor for questions. Then we will have a third block with specific questions for our speakers, and of course the audience, to react to. This is meant to be a very interactive session, and then we'll end with five minutes of concluding remarks.

So, Maarten -- I need to introduce you before you start speaking. My apologies. Maarten is currently an ICANN board member and also the Chair of the Dynamic Coalition on the Internet of Things here at the IGF. Maarten is also a senior strategic advisor to major corporations and governments and was formerly an official of the Dutch government and the European Commission, in addition to many other positions.

Then we have Sonia Livingstone. She is well-known by most of us, all of us: a full-time professor in the Department of Media and Communications at the London School of Economics. Sonia has authored or edited around 20 books. Sonia is an advising professor at many universities and has also directed EU Kids Online, a 33-country research network funded by the European Commission. Sonia also serves on the executive board of the UK Council for Child Internet Safety.

And we have John Carr, also well-known by most of us. John is an expert advisor on child online safety to the European Union and UNESCO, a senior advisor to the organization I work for, ECPAT, and John also serves on the UK Council for Child Internet Safety and acts as an advisor to the U.N., the European Union, etc. He has advised many of the world's largest Internet companies. He is also part of a group formed in Europe in 2012, if I'm not mistaken, which published a paper in January of this year assessing how ICTs are evolving, including a chapter on the Internet of Things and its evolving security and privacy issues.

To all of our speakers, thank you very much for agreeing to speak on the panel. And, oh my god, last but not least, Arda. My apologies once again. Arda is the president of INHOPE, the managing director of the Dutch national hotline and a Dutch Senator, and Arda served as a Dutch Member of Parliament for eight years. Is that correct? From 2002 to 2010, is that correct? Once again, thank you very much for agreeing to speak here today. And Maarten, if you want, you can start. Thank you.

>> MAARTEN BOTTERMAN: Thank you. I'm very happy to be here. I find that IoT is really penetrating life at every level, and it is very relevant from this perspective. As Marie-Laure said, I'm a member of the ICANN board, but here I am speaking in my own capacity as a researcher on this topic and as chairman of the Dynamic Coalition on the Internet of Things. That coalition is really set up to build a common understanding, from a multi-stakeholder perspective, of what IoT is and of global good practice. I appreciate your definition. It is mainly things connected to the Internet that do things together and exchange data together, and that goes very far.

It is very clear that it's not about whether we can stop that or hold it back. First, it is there already; second, it is moving at an enormous tempo. At the same time, it is also a global thing, which means that no single jurisdiction can control what's happening there. What we can do is get to a global understanding of good practice and let everything else follow on from that. I think that's also Marie-Laure's approach in this Dynamic Coalition. And we also need these things. We need them and they bring a lot of benefit. Technologies in themselves are never good or bad; it is how we use them. We need to use them consciously. We need them for healthcare, in support of those people in society who need that, for environmental challenges, but also for dealing with scarce resources, whether it's traffic or clean water, whatever. In this increasingly dense society there are things we cannot do without technology. We need it to make things doable, not only in developed countries but also in developing countries, where a little technology can already make a big impact on, for instance, agriculture or safety. Take the tsunami warning networks. It is also recognized that technology is a necessity in achieving the Sustainable Development Goals, and there is more to say about that as well, but I won't do that right here. The focus isn't there.

So with that, the good practice principle that we embraced, and that we continue to test, improve and give more meaning to -- because a high-level principle in itself means nothing, but it is a good guide for the interaction between stakeholders -- is really about developing IoT products, ecosystems and services while taking ethical considerations into account from the outset at all levels: in development, deployment and use. The aim is a sustainable way of working that helps to create a free, secure and enabling rights-based environment for the future we want. Important elements of that are awareness, which is crucial at all levels, and transparency: greater transparency is what helps the commercial sector to be even more honest and gives consumers more choice in what they want from life. Accountability goes with that, and I think there is also a joint stakeholder responsibility to ensure there is clear choice for the crucial services we want, so we are not locked into one single environment.

A very important element in this is, of course, data protection. We just published last week a report on data protection issues, comparing the U.S. and EU frameworks, and I think that's interesting for the world. You can find it on the PICASSO project website, picasso-project.eu. It may help you get a good feeling for the different sides, if you don't have that yet. I found it a very useful discussion among data protection experts from both sides of the Atlantic, but with that also global.

Now, I probably also need to disclose that I have an interest here. I have four daughters and one granddaughter, and the granddaughter is one month old; I would like her to grow up in a safe and good way as well. As I'm Dutch, I also have the pleasure of participating in the Dutch delegation led by my colleague across the table. It was funny that this came up there as well: we need to talk about these issues, and she shared a picture of the doll that can interact with people. Let's make sure it is only you doing that and not anybody else. The other example came to me on a panel that included the European Data Protection Supervisor, who brought up diapers: a diaper that will indicate whether it is wet or not. Why do I use that example? It is also a sign of how you can deal with business models, because in a diaper that signals when it is wet I can see some usefulness, although I am still from the generation that used its finger. But these data do not necessarily need to be shared in databases all over the world. What is the added value of that for the consumer? So these kinds of considerations are important in taking the world forward.

I hope this framework of transparency, accountability and choice, with ethics taken into account at all levels of the ecosystem, is a useful background for you to take forward the specific issues related to children and their safety.

>> MARIE-LAURE LEMINEUR: Thank you very much, Maarten. Sonia, do you want to go?

>> SONIA LIVINGSTONE: Good, thank you very much. So Maarten talked from the Internet of Things perspective and I'll say something from the child rights perspective. Just let me get this in the right place. It is not obvious, perhaps, why you would bring a child rights perspective in. Questions of protection will be obvious to many, but what does a child rights framework add? I think it gives us a way of thinking more broadly about a child-centered perspective on many of the issues that Maarten raised and that others are concerned about with the Internet of Things. A child rights framework begins by saying that children are independent rights bearers: thinking of rights from the child's perspective, always thinking about the best interests of the child from a holistic perspective rather than a top-down view of what governments want. A child rights perspective is broad, looking at how children engage in the world; it puts questions of protection, which are really key, in that broad perspective, but it also adds a whole set of rights around provision and participation. What we very often see in debates around children and technology use that are not framed within a child rights perspective is that they become very focused on protection issues, which is needed, but often at the cost of participation issues, or of other ways in which children benefit or need provision to support their Internet use, and of their rights also to be heard, to participate, to meet others, to have access to information, and so forth.

So a rights perspective gives us that broad focus. I think a rights perspective also clarifies terms. We're talking about children from zero through to 18, and that's important because many think you can forget children from zero to six, seven or eight, or whenever they get their own phone. But what's helpful in the Internet of Things framework, of course, is that we are thinking about babies and their rights, as Maarten eloquently illustrated, and we're also thinking about children up to the age of 18. They often get forgotten as well because they begin to make their own decisions, which is important, but it doesn't mean that we forget about questions of both harm and rights to participate, and the literacy that they need in order to participate. So the full range. And within the U.N. Convention on the Rights of the Child we also see some clarity about the relationships involved: the role of the State, which holds the backstop responsibility for child rights, and then the question of parents and parents' responsibilities in ensuring child rights; and some of the tensions that we see are about the responsibilities of the State versus parents. Though what is very notable, as I want to say in a minute in relation to the Internet of Things and indeed the Internet more broadly, is also the relationship between the State, the child and industry. And then I think it is also helpful to think about the way in which a child rights framework links children into a broader human rights framework.

There are a lot of human rights debates, including at the IGF, where people are talking about the right to expression, to privacy, to dignity and so forth, not always remembering that children are included within those frameworks as well. So a rights framework puts children into that wider context, and reminds us that while protection is key, children in their own right have the right to freedom of expression, privacy, dignity, non-discrimination and so forth.

So the Internet adds a kind of new layer of challenges to a child rights framework, and the Internet of Things, I think, absolutely crystallizes many of the anxieties and struggles that people already have in articulating children's rights in relation to the digital environment and in relation to Internet governance. Some of the issues that I think are challenging at the intersection between the online world and child rights are, for example, the question of whether Internet access is a right. The child's right to seek information was written before the Internet, yet the Convention says precisely "in any medium of their choice". I think that really does state something about children's rights to Internet access. So for those who hope that we can resolve the question of protection by excluding children from the online world, there lies a particular tension. I think that whether or not we grant that Internet access is a right, the Internet is a crucial means through which children's rights are enabled and through which children's rights are infringed. Several of us in this room are working with the Council of Europe right now on how children's rights can be articulated and supported, but also infringed, through the digital environment.

What we know about the online environment is that in a sense it intensifies all kinds of phenomena we've already been struggling with in the offline world around children's rights to protection, provision and participation. It intensifies the opportunities: children can participate globally, they can access many kinds of information, they can gain many kinds of benefits. And it also intensifies the harms: the way in which abuse can happen at a distance, the way in which children's online activities are under the radar of the supervision of their parents and teachers, new ways in which companies have unprecedented access to children's data. Indeed, everything they do begins to be tracked. So these are, as it were, newly intense ways in which child rights considerations must now be re-examined in relation to the digital.

As John and I have written in our paper on children being one in three Internet users in the world, we also try to address the problem that it is not clear who is a child online. We know that many children are online and we know that children are a significant proportion of those who are online. But knowing exactly which users are children, and which data comes from children, is a particular challenge. We now have a lot of debates about boundaries. We see that in the General Data Protection Regulation, we see it in debates about pornography and debates about the strangers who can contact children and so forth: knowing who is a child, and how we re-establish our traditions of oversight, visibility and accountability. I think these are all new kinds of challenges. So there is a lot of debate now about age verification and indeed about privacy. How am I doing for time? Stop. Okay. Just one last point, if I may, is to say something about the new kinds of conversations, and I think this session illustrates that: bringing together child rights and cutting-edge Internet developments is bringing new kinds of conversations into Internet governance spaces. Of course, I think all of this work should be evidence-based, and to that end I'll end by waving the Global Kids Online research reports, which Clara and others have been working on, to say there is a real need for an evidence base in these discussions.

>> MARIE-LAURE LEMINEUR: Thank you, Sonia. Those reports are available online; these are hard copies. Thank you very much. So now we move on. It's John's turn. You're going to link the Internet of Things with toys.

>> JOHN CARR: I'm going to make toys central to what I'm about to say. Just before we do, let's get a wider take on what's actually happening with the Internet of Things. I have been to every single IGF except the first one in 2006, so this is my 10th. I think there have been 11 altogether, and I love coming to the IGF. It is a great networking event. Every time I come here I am reminded that there are at least two parallel worlds in existence. There is the Internet Governance Forum, where a priesthood of multi-stakeholderism forms groups like the one we heard about, talking about standards for the Internet of Things and the need for ethics and all of these other very good things. And then there is the real world, where companies are actually doing stuff, right? I'm not sure how much they listen to any of the discussions taking place here. It is quite clear that many of them haven't been. I'll illustrate that with the following. I won't read it all, because most of you will have heard about it.

Less than two months ago, around the 20th of October to be precise, Spotify and Twitter went offline. Shopify went offline. A number of major Internet businesses ceased to operate, not for very long. They went offline because of a distributed denial of service attack mounted through connected devices. Some of these devices were toys, some were monitors and computers. What emerged subsequently was that some of these devices that are out there and are being used as a botnet cannot be patched or updated. The geniuses who created these devices and sold them, first of all, never thought hard enough or well enough about the importance of securing them against botnet attacks of the kind that succeeded in this case. Neither did they think about the possibility of an attack succeeding that might require the device itself to be patched or updated. So we already have, potentially -- nobody knows exactly -- millions of devices that are part of the Internet of Things and of a botnet and that cannot be patched. We'll have to find them, kill them and take them off the network one way or another. And it is not just the lack of foresight on the part of the people who make devices of this kind -- some of them will be toys; I'm pretty sure we'll find that some of them will be toys. It wasn't just that.
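
A minimal sketch of the missing piece John describes -- a device that can be patched safely later because it only accepts updates signed by its manufacturer -- using Ed25519 signatures from the Python cryptography library. The key handling and payload here are hypothetical; on a real device only the public key would be baked into the firmware, while the private key would stay with the vendor.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# In reality the private key never leaves the vendor; both halves are generated
# here only so the sketch is self-contained and runnable.
vendor_private_key = Ed25519PrivateKey.generate()
vendor_public_key = vendor_private_key.public_key()   # this is what ships on the device


def apply_update(firmware_image: bytes, signature: bytes) -> bool:
    """Install an update only if the vendor's signature over it verifies."""
    try:
        vendor_public_key.verify(signature, firmware_image)
    except InvalidSignature:
        return False                       # reject tampered or unsigned firmware
    # write_to_flash(firmware_image)       # device-specific step, omitted here
    return True


# Usage: the vendor signs a new build, the device checks it before flashing.
firmware = b"firmware v1.1 (hypothetical payload)"
good_signature = vendor_private_key.sign(firmware)
assert apply_update(firmware, good_signature) is True
assert apply_update(firmware + b" tampered", good_signature) is False
```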

There is a special search engine -- I don't know how many of you in the room have heard of it -- called Shodan. The purpose of Shodan is to allow anybody who goes to it to identify devices connected to the Internet which have a username of "username" and a password of "password". How do these things happen? Manufacturers of devices that are connected to the Internet, part of the Internet of Things -- some of them, again, will be toys -- in the interest of saving money, of making security less complex, and maybe because they didn't want as many phone calls to their customer support services, thought: right, let's make it easy. Let's make the username "username" and the password "password". Then, just to be completely clear, intending to be helpful, they published the information about the username being "username" and the password being "password" on their websites. Anybody can look and find out what any device's username or password is. It enables any hacker to locate devices that they can hack into and take over, okay? And some of these will be children's toys. Let's just bear that in mind. So this is not a story of success. This is a story of incompetence and cost-cutting, and I'm very pleased that you are talking about standards and ethics, and maybe in ten years it will be working. God knows how much damage there will be between now and then. There is nothing in law to make that happen.

By the way, somebody made one good point touching on these issues a few days ago. One good thing that is going to come out of this distributed denial of service attack is the realization that the Internet is finally gaining a much more physical presence in lots of complex areas, and issues of liability, therefore, will suddenly start to become much more immediate to the manufacturers and distributors of these devices. Take a monitor that is meant to warn of a house flooding: if it could be established that the damage done in Larry's house happened because the device didn't warn him, and that it was the manufacturer's responsibility, Larry can sue. This is good in the sense that the Internet is becoming a much more physical presence in our lives, in ways that aren't just about accessing Google or Wikipedia or sending emails. There is a mixture of things to be said about the Internet of Things, and not all of them are very good.

Let me turn to toys. There are two aspects to toys on the Internet of Things. One is the parenting angle. Is it really good parenting to buy a toy to give to your child so that your child can talk to the toy and form a relationship with it? Maybe there are certain situations where it isn't a bad thing. I'm not saying that any toy a child forms a relationship with is bad; that's been going on since time immemorial. Kids have had dolls and little trains and formed all sorts of relationships with them. But up to now those toys haven't been connected to the Internet, with the possibility that strangers can hack into them or what have you. And there are these particular dolls, called Cayla, that you can talk to. You talk to your doll -- presume you are a little girl or little boy -- you talk to the doll, you have a confidential relationship with your doll. What you might not realize as a kid is that the whole conversation is being recorded and sent to your parents, and also to somebody in the company who then analyzes it for the purposes of marketing or analysis, blah, blah, blah. Parents sometimes stand at the bedroom door listening to children saying their prayers and things of that kind, and that's what parents have done from time immemorial. But the whole conversation between a child and its doll? It raises issues of whether it's good parenting practice or not.

By the way, a legal action was mounted the other day by -- is Kathryn here? -- by a number of people, including a consumer group in Holland, asking for these toys to be withdrawn. And they have been withdrawn in Holland. Holland is the first country where this has happened -- I don't know if it's every shop in Holland or just the big chains, but the toy shops, following the action that Kathryn Montgomery and her friends took, have actually withdrawn those toys from the shelves. We have had the case of VTech, computers made in Hong Kong that children have been using. There was a hack there: 64 million photographs and messages that children had posted or made on their computers became public property because they were leaked onto the Internet. Then there are cases where parents buy baby monitors. You are downstairs watching TV and you have a baby monitor; some of these baby monitors are WiFi-based and need an Internet connection to work. Some parents have walked into their babies' bedrooms and heard people who had hacked into the baby monitor using bad language and saying horrible things to their children. We've had cases -- I've got a Samsung TV, right, connected to the Internet. It has a camera in it, so I sometimes use it for Skype calls; you get this gigantic picture. People have hacked into Samsung TVs and filmed everything going on in people's living rooms, including, on one occasion, a married couple having sex on the settee. It is not a crime for couples to have sex on the settee, assuming there is nothing else going on around it. These things have been happening.

The Internet of Things has great potential, but the carelessness of the industry takes your breath away and has a huge potential to do damage. My last word is about that distributed denial of service attack I mentioned at the beginning. Nobody knows exactly who did it. But people think it was pretty much a trial run to see if they could do it at all, and they very strongly suspect that certain governments in the eastern part of the world may have had a hand in it and were just testing how to do it for a much bigger attack that might come when international relations go bad or there is the possibility of a war or something of that kind. So this is what the geniuses in the hi-tech industry are serving up to us. Children are in the mix.

>> MARIE-LAURE LEMINEUR: So thank you, John. Maarten wants to react.

>> MAARTEN BOTTERMAN: A very short addition to that discussion. Yes, indeed, in a way it was a good thing that the impact was relatively limited, but it woke people up to the fact that there are a lot of things out there connected to the Internet that have computing power and hence can be used for these kinds of attacks. One of the things that is rapidly emerging is that people who produce connected things should be held to a standard where they no longer have a standard password, or where the standard password can only be used once, when installing the system. So that is one of the things. What you say about the Internet of Things is true for the entire Internet: we need to continue to improve it all the time. And on the Dutch situation, it was the Consumers' Union, a non-governmental, non-business organization, that advised pulling the dolls out of some shops.
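
A minimal sketch, with hypothetical names, of the standard Maarten describes here: the published factory default John mentioned ("username"/"password") works exactly once, during local installation, and is never accepted for remote access, so every deployed device ends up with its own unique credentials.

```python
import secrets

FACTORY_DEFAULT = ("username", "password")   # the published default John describes


class DeviceAuth:
    def __init__(self) -> None:
        self._credentials = None             # no usable login until setup is done
        self._setup_done = False

    def install(self, default_user: str, default_password: str,
                new_user: str, new_password: str) -> None:
        """One-time local setup: consume the factory default, set real credentials."""
        if self._setup_done:
            raise RuntimeError("factory default already consumed")
        if (default_user, default_password) != FACTORY_DEFAULT:
            raise PermissionError("wrong installation credentials")
        if len(new_password) < 12:
            raise ValueError("choose a longer, unique password")
        self._credentials = (new_user, new_password)
        self._setup_done = True

    def remote_login(self, user: str, password: str) -> bool:
        """The factory default is never accepted here, before or after setup."""
        return self._setup_done and (user, password) == self._credentials


# Usage sketch:
device = DeviceAuth()
assert device.remote_login(*FACTORY_DEFAULT) is False           # default rejected
unique_password = secrets.token_urlsafe(16)                      # per-device secret
device.install(*FACTORY_DEFAULT, "parent", unique_password)
assert device.remote_login("parent", unique_password) is True
```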

>> In the early days of the Internet we worried about online predators, and we discovered that while they exist they aren't as prevalent as some had feared. But I think when we look at the Internet of Things we have to realize that one of the elements we at least have to think about is that we are no longer talking about cybersecurity in the sense of access to information, access to data, or being able to meet people, but about actual physical control of devices that can affect our personal safety. So, for example -- I gave this example the other day -- I have an app which is capable of unlocking my front door and opening my garage door. When my garage door is open it's visible from the street. I have on three occasions put the phone in my pocket with the app running and pocket-dialed the button that opens my garage door, and then I'm looking out thinking: wow, my garage door is open; somebody could steal my bicycle. If we want to get paranoid, imagine the baby monitor -- not even the baby monitor. Imagine someone can see that the 14-year-old is at home alone and has access to the door lock. I don't think that is a likely scenario, but it is certainly a scenario we need to protect against.

Then we get into the situation of automated cars, which are already here. Eventually, of course, they will be self-driving; today they're quasi-self-driving. There is the potential of being harmed or injured because of this technology. It's a compelling argument to elevate this way beyond traditional issues of Internet safety and really into the territory of what, in the United States, we handle through consumer product safety authorities, like the Samsung phone that can catch fire. We're talking about safety in the physical world, and typically authorities pay much more attention to that than they do to cyber safety.

>> MARIE-LAURE LEMINEUR: Thank you very much. We need to move on with the last two speakers and then we can -- I'm tempted myself to make comments. I'm holding my tongue. We can open the floor for comments after that. Thank you.

>> JUTTA CROLL: Thank you for giving me the floor. Could you put on the slide? No, not me -- the slide. Okay. Expand the screen. Yes, that's me as a baby, but back then the Guardian was not available. Thank you. It's fine. I just wanted to introduce you to one example and then expand a little bit on the technical aspects of the Internet of Things. You see the image of Teddy the Guardian, which is a toy, but like a baby monitor it also monitors the heart rate, the blood pressure and the oxygen level of the child, and it has -- you can see -- that little camera in its paw. All the images and data taken by the toy are transferred to the parents' cell phone to monitor the child's health and well-being. Parents would do that in the best interest of the child; they would not do it with bad intentions. But when parents put the Teddy in the child's bed, most of them are not aware that they have connected their child to the Internet.

John has already been talking about smart TVs, the Samsung TV set, and if you compare the two things, both are connected to the Internet. With the first you would say: I have a lot of technical functionality within the smart TV, and I have an interface through which I can control it -- a remote control with which I can steer the device, do the configuration and set my preferences. But what about Teddy the Guardian? How do I do that? Do I have a good interface through which I can control the device? I would not blame parents for not realizing that all the health data and geolocation data, and maybe the noises the baby makes or the speech of the baby, are going to the Internet and also to the company that produced the device, and most of the time not via a specifically secured Internet connection. I wouldn't blame the parents for that. Parents need to learn, but they also need to be supported by industry when they use these devices in households where children are present. So the organization I'm working for has also done some research into what we call safety by design. This would be a way to address two aspects.

The one thing is that from the very beginning of considering whether to produce a product or to develop a service, it should be taken into consideration what use of that product by children would mean. Would it put children at a specific risk? And then to consider how this risk could be addressed.

And then, as a second step, I would say safety by design means having a very high degree of usability of the device and its interface, so that you can set preferences and do the configuration of the device. From a technical perspective, a lot more safety could be built into the device right from the start, and possible risks for children could be addressed by that. The devices would obviously have higher usability, which could mean they could be better marketed. It is not just a matter of saying put safety into it, which might be a cost factor; it could also improve the marketing possibilities for the product. John has already mentioned the Mirai attack, which was carried out with a botnet of many connected devices. Most of them were devices that did not have a high degree of safety or security. And you have also been talking about the default username and password. It is quite striking that so many such devices are out there. You might think it is stupid not to have an individual username, or not to change a password like "password" or "1234567". But with many devices you get a long alphanumeric string, and you think it must be a secure password if it is made of letters and some numbers. But what would happen if someone steals from the company, or hacks into the company, and gets access to the algorithm that generated these alphanumeric strings? In one second they would get access to all the devices that have these complicated passwords. So we need to take care of that as well.
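
A minimal sketch (hypothetical, not any vendor's code) of the contrast Jutta draws: a complicated-looking default password derived from the device's serial number falls as soon as the derivation scheme leaks, while a truly random per-device secret gives an attacker no algorithm to steal.

```python
import hashlib
import secrets

VENDOR_SECRET = "factory-constant"   # hypothetical constant baked into every unit


def weak_default_password(serial: str) -> str:
    """Looks random, but anyone who learns VENDOR_SECRET and this scheme can
    recompute the password for every device from its printed serial number."""
    return hashlib.sha256((VENDOR_SECRET + serial).encode()).hexdigest()[:12]


def strong_default_password() -> str:
    """Generated independently per device; there is no shared algorithm whose
    theft would reveal other devices' passwords."""
    return secrets.token_urlsafe(12)


print(weak_default_password("TOY-0001"))   # reproducible by an attacker
print(strong_default_password())           # unique and non-derivable
```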

Going back to the children as users, it is obvious that with connected devices in very small hands we cannot rely on education only. You just cannot say that you can educate that small baby to be careful when playing with the toy. But we also cannot rely on saying that parents should simply decide not to allow their children to use connected devices, because that would deprive the children of educational opportunities and benefits. So I think industry has a role to play here. It has already done something, but I think it must go a step further. I would just like to give one final example. If you look at this toy tablet, it looks like it was made to go into small hands. It has a colorful screen, it is easy to navigate via the touch screen, and it is nearly unbreakable. It doesn't have any plugs or holes where little fingers could poke in. From the outside, the device looks as if it is made for small hands. That's the first step towards child safety. But the safety doesn't go so far as to address the risks of the device being connected to the Internet. So there is no built-in safety. It could be built in, but so far we don't see that. I will have something more to say afterwards in response to your questions.

>> MARIE-LAURE LEMINEUR: Very interesting points you raised about securing the supply chain from manufacturers, and also the fact that for many IoT devices the data is sent in clear text, not encrypted. That's another issue we can discuss. Finally, Arda.
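
A minimal sketch of the fix for the clear-text issue just mentioned: the same telemetry payload, but sent over a TLS-wrapped socket that verifies the server's certificate, so it cannot be read in transit. The host name and payload are hypothetical.

```python
import json
import socket
import ssl


def send_telemetry(payload: dict, host: str = "telemetry.example-toy.com",
                   port: int = 443) -> None:
    """Encrypt the connection and verify the server before sending anything."""
    context = ssl.create_default_context()          # verifies the server certificate
    with socket.create_connection((host, port)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            tls_sock.sendall(json.dumps(payload).encode("utf-8"))


# send_telemetry({"device": "teddy-01", "heart_rate": 92})   # encrypted in transit
```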

>> ARDA GERKENS: Not all of us here are very young people. I remember that when I was younger the marketing and the commercials we had were far fewer than we have now. We have gotten used to television and the Internet; everywhere there are advertisements, right? It just didn't used to be like that. Also, children were not a target of advertising in the old days. The youth became a target around the eighties, actually, when the big companies found out that targeting the youth would be really worthwhile because they have a lot of money, and if they don't have the money, their parents do. That's when they started to seduce the youth and try to target them. We have to understand that collecting data from children is gold. Companies can market to them not only now, when they are young, even as a baby, but for the rest of their lives. That's worrisome and something we need to understand when talking about this subject. Now we're in a different world. I can remember my grandmother telling me that her grandmother said: you know, one day you will just press a button and the lights will go on. That's what she said. Now we are here in 2016, and if I told my grandmother what we can do now, she would be flabbergasted.

And we ain't seen nothing yet, because the Internet of Things and the Internet of toys will bring us a world where we can't imagine what will be possible. It will go enormously fast. Even for us it is hard to keep up. How can parents keep up with all these things that are going on? And we also have parents -- I'm one, so I can say this -- who are very busy in their lives, and we all know that feeling that sometimes it is very convenient to give the kid the iPad so at least you have some moments of rest, so you can do the cooking or the last emails you still need to send in order to get on with your life.

Remember, it's not too long ago that we got these iPads and suddenly we were startled because children were buying strawberries within the game, within the app, and suddenly your credit card would be overcharged. You didn't think it would be possible. We were all angry at Apple for making this possible. But again, this is just the tip of the iceberg of what is coming to us.

So we also have to realize -- and I completely agree with John, with what you said at the beginning -- that the people who are creating these devices, the developers, are there to develop. That's what they do. They want to develop new, cool stuff: new gadgets, new things; that is what they are all about. The problem is they don't think about safety and security. They should. We want them to. But they aren't doing it. It is also because it is not in their education, which I find very worrisome: there is not one bit of their education that says you need to make the things you build secure and private. So what actually happened with Cayla is that there was no security, or not enough, and there were also data protection laws that weren't being complied with. So we have these toys that we can play with, and you can have another debate about whether it should be part of children's rights to be creative, not to have a toy that talks back, to have to use their own imagination. That might be another discussion. We have the toy that talks back, and we don't think of the fact that a predator might use that, too. We don't think it could happen to us anyway; why would it happen to our kid?

So we don't secure our new toys against that. And the children will be watched 24/7 with all these toys. So what does that do to a child, not having the right to grow up unwatched? Like one of you said, listening at the door is one thing. But being watched 24/7 -- what will that do to this kid?

I think we need to strengthen the rights of kids, not only the right to be disconnected but also rights over their data. Because, like I said before, big data on this group is very interesting, and as I think one of our panelists said in another panel, we should make sure all children can start with a clean sheet when they're 18. Whatever data is collected -- and I think there should be laws about what kind of data you can collect from a child -- you should be able to start with a clean sheet when you are 18.

That is still a challenge, because we need to find out how you identify yourself and how to establish your age when you are online. But these are challenges we can solve with all the technical means we have.
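
As one hypothetical reading of that "clean sheet at 18" idea -- an assumption about implementation, not anything proposed in the session -- a service could purge every record it collected while the data subject was still a minor once they come of age:

```python
from dataclasses import dataclass
from datetime import date

ADULT_AGE = 18


@dataclass
class Record:
    subject_birthdate: date
    collected_on: date
    payload: str


def age_on(birthdate: date, day: date) -> int:
    """Whole years between birthdate and day."""
    return day.year - birthdate.year - (
        (day.month, day.day) < (birthdate.month, birthdate.day))


def purge_childhood_records(records: list[Record], today: date) -> list[Record]:
    """Once a subject is 18, drop everything collected before they turned 18."""
    return [
        record for record in records
        if not (age_on(record.subject_birthdate, today) >= ADULT_AGE
                and age_on(record.subject_birthdate, record.collected_on) < ADULT_AGE)
    ]
```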

Then, I'm very happy with the action taken by the consumer rights organization on Cayla, because I think we should actually get consumers engaged in this discussion. They are still very much on the sidelines as long as they don't know what's going on. You can educate, but you can't educate a baby, and it is hard to educate parents. So you have to be loud and tell them what is wrong with this toy and why they shouldn't use it, so that they can put pressure on it and get it pulled from the shops.

Then, lastly, I think that to solve this issue of the Internet of toys we need a multi-stakeholder approach. We need the industry to get engaged and think about privacy and security, but I think it's too easy to leave it mostly to those two actors; I think it is very important that the policymakers and the legislators are involved too. There should be more and stricter laws on the Internet of toys, because we're talking about children here. We need to protect those children. We need to protect children's rights. We cannot just say: okay, it is the responsibility of the parents. Like I said, I absolutely agree with what Jutta said: you can't blame them for not knowing, and we must realize there will be many parents out there who will not know. There is a responsibility for the legislator. One remark on what you said about the app: you said it's an unlikely scenario. Let's just assume it is a likely scenario. Anything that can happen is already happening out there; we just don't know it. So, yes.

>> MARIE-LAURE LEMINEUR: Thank you, Arda, very interesting points. So we can now open the floor and take questions or comments from the audience, and react to some of the points that were made. Does anyone want to start? Yes, please. Can you state your name when you speak?

>> My name is -- I am from Brazil, a federal prosecutor there. I work with cybercrimes and also with hate crimes and child pornography, against sexual abuse online. And I totally agree with what you say, because we see not just crime reports but also notices about some toys being commercialized in Brazil without much instruction. I think we need to take much more care about the industry, not just the parents. Of course, the parents have the choice of giving the toy to the child or not, but sometimes a child who is not so young can come across this toy in a friend's house or in another place. So I think, at least in Brazil, we must take more care about what comes with the toys, and make the toy industry and the firms that commercialize the toys put advice on them. We need to focus more on this prevention. Thank you.

>> MARIE-LAURE LEMINEUR: Thank you. Yes, please. At the back. Can we have the mic? Thank you.

>> Thank you. My name is HOFFA, from Tunisia, and I work in ICT -- I am a professor at a university. I have both technical engineering knowledge as well as -- the question is this: I was in a workshop yesterday about the future of the Internet, organized by ISOC. We can't predict the future, so we have to create the future. In the last debate of the WTSA, which the ITU held in Tunisia, they discussed the Internet of Things. It is a danger: the Internet of Things is going to create many ethical problems for the Internet. It is a threat to the future of the Internet. The debate is starting today, and I think that manufacturers and the private sector are pushing these small things to be connected to the Internet. I think today we have to discuss how we can save the Internet from these machines. The Internet is a network; the network connects people, and people help create the knowledge. If we connect many things, machines will create the data, and I think this is a big issue today when we discuss the Internet of Things as part of the Internet. When connecting machines to the Internet, I think we have to treat the Internet as a safe and open network for the whole world. Thank you.

>> MARIE-LAURE LEMINEUR: Yes. Okay. So Maarten, then the lady over there and then maybe Larry and then next.

>> MAARTEN BOTTERMAN: If Vint were in the room he would disagree with you, basically because this is a thing, too -- it is connected to the Internet. This is a thing, too; it is connected to the Internet. We cannot take it away from the Internet. The Internet is the means; IP is the means used to transfer all the data. I know there is a strong call for securing the Internet even more strongly, not only the DNS but also beyond, but it will be impossible to see it in isolation. Some people would say it is not the Internet of Things but the Internet of everything. Some would say that when we are all connected it stops being just "the Internet": it could be this device, that device, the camera, the end nodes, or even me, because I'm connected to the Internet, too.

So I hear what you say, but I think it is more important to take into account that this is a whole exchange of connectivity, observations, data and even actuation that is more and more run autonomously through machine learning and artificial intelligence: algorithms doing things I couldn't come up with, collecting a lot of data and doing all kinds of things that use technology to deal with the complexity and the possibilities. So I would warn against spending too much time on the discussion of whether it is the Internet or not, and really focus on what it's about. This is the pervasive information society, where we are all connected ever more densely, and we need to find a way forward. I very much appreciate the contributions so far, in that also from this perspective it makes a lot of sense to make people aware -- not only consumers but also, I think, policymakers -- and when the commercial sector sees that people pay attention to it, it begins to care. Let's make sure people know, and then commercial offerings will include safety.

>> I'm Natalie, I come from Hong Kong. I'm 15 years old. As a teenager I would like to ask the question: do parents have the right to access their kids' private information? I refer to the example of the Teddy bears monitoring the babies. What kinds of people can have access to that sensitive information -- the technology companies and researchers? We could have more discussion of who should have the right, and who has the need, to access sensitive information.

>> MARIE-LAURE LEMINEUR: That is an excellent question, and it goes back to some of the points that Sonia and others made: we should not only look at the problem from a protection perspective but also consider the rights of children, their right to privacy and so on, and the right to be disconnected, as Arda said. Do you want to react to that?

>> ARDA GERKENS: I was going to say that I think we haven't thought at all about how the passwords and the management of reporting will be shared within the family. Yes, the Teddy -- people keep their Teddy until they're adults. At what point does the Teddy stop sending the temperature of the person holding it to the parent and start saying: excuse me, this is a person who has their own data? I think we could exemplify this for all kinds of technology. At what point does the password get handed over to the young people? At what point do young people claim their own data? Are these tools reporting whatever the child says and does to the parents forever? These are my questions, actually, yeah.

>> JUTTA CROLL: There is something called disposable devices. That could be a way to go: to have toys that are basically used for a specific number of years and then you just throw them away and get rid of them.

>> SONIA LIVINGSTONE: But it's the Teddy bear: the gift you give your child that they keep all their life and that might stay with them. And the market is not going to want disposables. Disposable means you buy more and more, but they want a significant investment that becomes a loved part of your child's life, and that's exactly where we have to ask: can we have the child forever infantilized by their parents?

>> ARDA GERKENS: Sonia is saying this, not me for the record.

>> SONIA LIVINGSTONE: Even if you would throw away or dispose of the Teddy bear, the data is still there, and I think the question was about what happens with the data that is stored by the company and exploited by the company. It could be for a good purpose, when the information is used for making new medicines to address health problems, but it could also end up elsewhere. So far we see advertisement, and for some people that's very good; other people think it's strange to get advertised the products they have been looking for on the Internet, for example. But with Teddy the Guardian or with Hello Barbie, the products would be personalized. The little girl tells her Hello Barbie: I want this dress or that dress for my Barbie, and you get offered exactly the product the girl has dreamed of. We have to talk about what happens to the data stored by the product, not only about the relationship to the product.

>> It is not just about the parents listening in, what they are doing with it, and what that does to their relationship with the child and the child's privacy. Who are the people employed by the companies who are listening to this information and analyzing it? Larry, you probably remember, a few years ago now, AOL employed a guy as a moderator in their chatrooms who, on at least one occasion, got the personal information of one of the children he was moderating, met them in real life and raped them. He was caught and went to jail. By the way, I just checked one fact earlier on and went to the BBC website, where there are two big stories running today. One is that Mick Jagger, at 73, has become a father again. I'm sure we're all glad to know that. And for the other I'll read one sentence: a Japanese city is keeping track of elderly dementia sufferers by attaching barcodes containing personal information to their fingers and toes. There you are. Human beings will be part of the Internet of Things in a more intimate way. Mick, leave me alone.

(Laughter)

>> MARIE-LAURE LEMINEUR: We have Larry, you and we have Clara.

>> First of all, I want to add that it is not just baby toys. There are smart basketballs and smart clothing that teenagers will be wearing, and then we get into the situation of parents having data they perhaps shouldn't have. I want to apologize for speaking out of turn; I assumed John Carr had the last word. I didn't realize there were other speakers. He will, of course. I kid John. But I also want to offer a word of caution as we talk about this. Of course I agree this is an extremely important discussion, and I agree that industry and policymakers need to be made aware of these issues and we need to start thinking about what the appropriate policies are. But I have a confession; I feel a little guilty. In 1994 I wrote a booklet on child safety on the information highway. The advice I gave in that booklet, more than 20 years later, is still being spread around, including many of the things I said in 1994 based on absolutely no research. I was -- that's my house talking to me. So I sat there in 1994 and I imagined all of the things that could happen online. I said: oh, that 12-year-old girl could be a 40-year-old man. I said some things. It turned out, once the research was done and followed up by others, that some of the things I said in 1994 were not actually true, or they were true, as I said, only in very small numbers.

And so I feel a little bit guilty that I helped create what I now refer to as predator panic. As a result of some of the early things that we in this field said back 20-some years ago, legislators began to overreact and over-legislate, and it became a crazy situation in the United States where it was on the nightly news and all these laws were being passed, and eventually we began to pull away from that. What concerns me about this, as much as I appreciate this discussion, is that I want to make sure that we are talking, first of all, in broad strokes, not about specific technologies. It is counterproductive to tell tech companies exactly what they need to do to protect against things, because what happens is that as technology evolves, better solutions probably come out -- not the ones that governments legislate, but better ones. It is appropriate to talk in broader terms, such as: you must have a privacy policy, you must adhere to that privacy policy, and if you fail to you will be fined; you must have consumer warnings. We can talk in those terms. But we have to be very careful about getting into very specific technologies, which could not only stifle innovation but stifle the innovative solutions that industry could work on.

I also want to make one quick point in the conversation about privacy. I met with the people from ToyTalk, the technology company behind Hello Barbie. They do indeed provide parents with access to the audio conversations of their children. Not only does that violate the child's privacy, but it could jeopardize a child's safety if she is talking to her doll about an abusive parent and that information gets into the hands of that abusive parent. We need to always remember that privacy and safety go hand in hand, especially when it comes to children. I think there is room for legislation to make sure that children's information is not shared inappropriately with parents. Although I do think it's cool for parents to know their baby's temperature -- I would support that. Maybe not their 15-year-old's.

>> Thank you. Heather, from the University of Applied Sciences, working on cybersecurity and coming with the Dutch delegation to the IGF. Yesterday, in fact, we had a very good discussion in the group on extortion. And in fact, maybe I'm a little bit more optimistic than the majority here, but I truly trust technology, and especially the following kind. Preventive technology, I think, could eventually be a way forward -- not the absolute solution; I think it always takes a multidisciplinary sort of solution -- but you have seen it already with the PhotoDNA applications that some of the online platforms are using in order to detect pictures that have child pornography content. On the other hand, being somebody who works as a researcher, I do represent an interdisciplinary team. You have to see safety, privacy and security by design through this lens. If you see it only from the legal perspective, missing the technological, engineering and other angles, then there is no way to really move beyond the points that were made 20 or 10 years ago. So it needs a truly multidisciplinary ambition. That is just a reminder for the floor to bear in mind, especially because there is so much good research at the moment on online safety for children and youth.
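
PhotoDNA itself is proprietary, so the following is only a rough open-source analogue of the preventive idea mentioned above: perceptual hashing (here via the imagehash and Pillow libraries) flags images that are near-duplicates of already-classified reference material. The file names and the match threshold are illustrative assumptions.

```python
from PIL import Image
import imagehash

# Perceptual hashes of reference images already classified by an authority
# (the file name is a placeholder).
KNOWN_HASHES = {imagehash.phash(Image.open("known_reference.png"))}
MATCH_THRESHOLD = 8   # maximum differing bits still counted as "the same picture"


def is_known_image(path: str) -> bool:
    """True if the image at `path` is a near-duplicate of a reference image."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)
```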

>> MARIE-LAURE LEMINEUR: Thank you. There is a consensus here that the problem needs to be tackled from a multi-stakeholder perspective, and that technology can be used in a positive way but can also be abused; most of the cases we commented on are abuses of technology. Clara, you wanted to make a comment?

>> Thank you, I'm Clara and I work for UNICEF headquarters in New York as a child protection specialist. I think this is a very interesting discussion, but I also want to bring up one of the issues that was raised in the other discussion the other day on the Internet of toys and things, which I thought was very, very interesting: to think about what kind of impact these toys have on the child's psychological development. I think this is something we as a community, and as a research community, also need to look into much more. Because with children -- perhaps very young children, but also older children -- interacting with these toys, I don't think we know what kind of impact it will have on their behavior, and we know that children's brains in early childhood are developing extremely fast, right? We also now know much more about adolescent brain development, which is still very much ongoing.

I want to raise this issue here. I don't think we have discussed it, and I think it is something we definitely need to look into much more when it comes to the Internet of toys and things. And then just one more thing. I really liked your presentation, Sonia, on provision, protection and participation, and I want to reiterate that I think these three principles and rights go hand in hand. It is in the best interest of the child, and you have to balance protection, privacy and participation. We should try not to let the discussion slip back into being only about safety and being too afraid -- and now Larry has left the room -- or being too sensationalist without really knowing what we're talking about. It is also true that the Internet has huge potential for young people in terms of access and empowerment, and at the same time, of course, as a child protection specialist I say we need to protect them from harm. But I think the two go very much hand in hand.

>> MARIE-LAURE LEMINEUR: Yes, Arda. And then Maarten.

>> ARDA GERKENS: I agree with what Larry said: you should be careful about the way you shape your legislation, and about the technology too, of course. Sometimes legislation is not very good. We need to strengthen children's rights; then you always have something to fall back upon. That would give children more possibilities within a safe environment, as opposed to prohibiting things, which I never think works well with new technologies.

Jutta said we should think about what data to keep about our kids. I think there is a step before that: should they keep this data about our kids at all? Do we want them to? Is it actually ethical? This is why, already in 2013, I asked our government to look at whether we need an ethical commission on the digitalization of society; I had hoped to get some news this year, but it will probably be next year. We have an ethical commission on medical developments, because in the medical field almost anything will be possible in the coming years, and we draw lines there, saying this is no longer ethical. The same applies to the story of the Japanese walking around with bar codes. This is something we need to debate. Is this what we want? Is it really how we want our future life to be? It is coming towards us; the possibilities will be there, and before you know it we have crossed a point where it is very hard to go back. I think we need an ethical commission on all these questions, and from there we can establish children's rights, or other rights for that matter.

>> MARIE-LAURE LEMINEUR: On the question of the data, who owns it and how it is handled, I think Maarten had a point when he mentioned the EU/US perspectives on data protection. In the context of IOT this is going to go even further and get bigger, and I just don't think we have the answers yet. Specifically, most of the major global companies, including the toy companies, are U.S.-based, so there will clearly be a clash at some point, depending on where the companies are based and which laws and jurisdictions impose specific rules on them.

>> MAARTEN BOTTERMAN: Data protection is certainly something we need to agree about. But first I want to react to the earlier discussion. The Internet of Things doesn't replace the relationship between parents and kids, and we shouldn't count on it to; there are other means. Take what John mentioned, for instance: environments where parents who are slowly becoming less capable of taking care of themselves are supported in staying in the environment they want to be in, and where we are supported in supporting them. At some point there must be a conscious decision about whether it is okay to have 24/7 access to that, or whether it can be turned off. Technology is here to help us; it is not a danger in itself. What is crucially important is, first, that we understand what the technology does: that we understand the Samsung TV sends our voice commands to a database to compare them, not because it intends to follow our conversations, but the capability is there, and we didn't know. Understanding that motion control means the device also monitors motion, and that the command "TV on" is something it would recognize if you enabled that function. You can disable that function, and I'm sure you may be able to disable the Teddy bear, too. But you need to be aware, and it needs to be possible. That is important.
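
One way to read "it needs to be possible" in practice: a connected device could gate any cloud upload of audio behind an explicit, revocable opt-in, and process everything else locally. The sketch below is a hypothetical illustration of that design choice; the class and method names are invented and do not reflect any vendor's actual API.

# Hypothetical sketch of a "disable-able by design" voice feature: cloud
# processing happens only after an explicit opt-in, and opting out stops
# uploads immediately. All names are illustrative, not a real vendor API.
from dataclasses import dataclass

@dataclass
class VoiceFeature:
    cloud_opt_in: bool = False  # off by default (privacy by default)

    def enable_cloud(self):
        self.cloud_opt_in = True

    def disable_cloud(self):
        self.cloud_opt_in = False

    def handle_audio(self, audio_clip: bytes) -> str:
        if self.cloud_opt_in:
            return self._send_to_cloud(audio_clip)
        return self._recognize_locally(audio_clip)

    def _send_to_cloud(self, audio_clip: bytes) -> str:
        # Placeholder: a real device would call the vendor's service here.
        return "cloud transcription (data leaves the device)"

    def _recognize_locally(self, audio_clip: bytes) -> str:
        # Placeholder: limited on-device keyword spotting, nothing uploaded.
        return "local keyword match only (nothing leaves the device)"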

So this is a call for that, and it is crucial. What we also talk about in IOT is that we need to understand where privacy sensitivity is high or less high, where safety requirements are high or less high, and the same for security. That matters because data sharing is not only a threat to society; it is also something society needs for other purposes. Yes, we don't need to know it is Maarten and Arda and whoever, but some data about us can help prevent diseases and disasters, or help us make better use of the space we share. So again, I very much agree that people should be aware, and when people are aware, the toy providers will be aware, whatever data protection framework they are subject to. I very much appreciate the work of this group and very much appreciate going out with that message as well.
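
A minimal way to picture those three dimensions is to score each class of device separately for privacy sensitivity, safety impact and security requirements, and let the scores drive how its data may be handled. The devices and scores below are invented purely for illustration, not an established classification.

# Illustrative only: toy risk profiles along the three dimensions mentioned
# (privacy, safety, security). Entries and scores are invented examples.
RISK_PROFILES = {
    "connected doll":      {"privacy": "high", "safety": "medium", "security": "high"},
    "smart thermostat":    {"privacy": "low",  "safety": "medium", "security": "medium"},
    "baby health monitor": {"privacy": "high", "safety": "high",   "security": "high"},
}

def requires_strict_data_handling(device: str) -> bool:
    """Devices with high privacy sensitivity default to minimal data sharing."""
    profile = RISK_PROFILES.get(device, {"privacy": "high"})  # treat unknown devices cautiously
    return profile["privacy"] == "high"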

>> MARIE-LAURE LEMINEUR: Thank you. Some of the points you raised are linked to the issue of age differentiation. So far we have been talking about children as end users in general, but we must be aware that there are different age categories. It is not the same to have a pre-verbal child, a baby, connected to or monitored by a device as to have an adolescent who may have a legitimate claim to be disconnected, to have some privacy at some point, without parents monitoring what he or she is doing. Those are issues linked to age categories when it comes to children as end users. Sonia, you wanted to react to what Maarten said.

>> SONIA LIVINGSTONE: I'm afraid I'm feeling more pessimistic. It may be that technology is here to help us, but I don't know that technology companies are always here to help us. And, yes, it depends who "we" is. An ethical commission in a country can be aware; busy parents cannot always be aware. What we see from the technology companies is that the functionality that would allow parents to make informed choices in the best interest of their child is not always there. A lot of the functionality offered to parents, indeed to all of us, is take it or leave it: you can have it with this functionality, including its health benefits and the way in which it will monetize your data, or you have none of it. I think it will be very hard for parents to opt out of the health benefits on offer even though they come with data collection and monetization, and it will be very hard for parents to opt out of knowing what their child has said to the Teddy when they know that data has been collected and is knowable. The data will be collected, and the parent will feel the need to see it. It is about making the companies work in the child's best interests and bring technology that helps us, rather than treating the technology itself as neutral.

>> What is happening at the moment is partly because people who make toys have children too, you know? Sometimes they become aware of that. The other thing is that this is indeed where legislation helps. What we see at the moment in the U.S., in Silicon Valley, is new business models being built by organizations offering IOT services and tools that are based on not keeping the data at all, because the data becomes a liability for them as legal pressure goes up. So you see both happening, and this is why we need this debate in the U.N.

>> MARIE-LAURE LEMINEUR: Can I have a really quick comeback? I won't be comforted by knowing that technology company executives also have children. The reason, apart from the obvious, is that through this whole conversation we have had a very comfortable image of nuclear families and well-informed, educated parents paying attention, while there are many parents in difficult circumstances. There are many reconstituted families where exactly who counts as a parent and who has the knowledge and the password is unclear; parents may be in conflict with each other, they are busy, they don't have the education. It is about remembering the entirety of parents and children in all their diversity, not, as it were, my comfortable child with well-educated parents.

>> MARIE-LAURE LEMINEUR: Thank you, sorry.

>> Here is a bit of good news. There is a very well-funded international project, with something like 40 million pounds, being led by a consortium of universities called PETRAS, which stands for privacy, ethics, trust and responsibility around the Internet of Things. It is all about the Internet of Things and it has a great deal of support from industry. It is mainly doing test-bed work, testing real-world applications of the technology, but it also has a strand on ethics being run by Oxford University in the UK, and I'm a member of the ethics working stream. So don't get me wrong, there are bits of this that are going very well. Industry is aware of the dangers and the risks, although we might have hoped they had arrived at this position a little sooner than they have. And I hear Larry saying let's not scaremonger. I don't think the big businesses have too much to worry about: if they do their job right, we have nothing to worry about. But it is only because voices like ours are being raised at all that some of this pressure is being applied to them.

>> MARIE-LAURE LEMINEUR: I'm so sorry, we are being cut short; we have five minutes left, Larry. My apologies, and we have somebody else who wanted to say something before they cut us off. Veronica.

>> I want to follow up on Sonia's point that the responsibility of the industry is crucial. There is so much technology out there, but the most advanced technologies are usually applied not to e-safety but to other things. The technology exists, and it should be built into the products. Toys and devices available to children should be safe by design. It is not just about being user friendly but about being safe by design; that is the point Jutta made before. We cannot rely only on improving media literacy, because this goes beyond what ordinary people are able to understand. Parents and schools should not carry all the responsibility, because many times they simply don't understand it, they are not informed, and that approach does not scale. Products that go out into the market targeting children must be safe in the first place, and I think that also calls for responsibility not just from the industry but from regulatory bodies and governments. How will we ensure and demand that these products are safe before they reach the market and target children?

>> MARIE-LAURE LEMINEUR: Excellent, thank you. You have the last comment and then we wrap up.

>> Very quickly, because it is about what Veronica said, and UNICEF, on the influence on children. We had a game again in Brazil, and bullying increased after it. When the government allows a game or toy to be commercialized, I think the government and the firm should be charged with providing public psychological follow-up for these children.

>> MARIE-LAURE LEMINEUR: Thank you very much. We have one minute left. Can I ask our speakers to make a concluding remark if they want to? John, are you okay? Do you want to say something?

>> I'm very happy that Veronica stressed the principle of safety by design. I would just like to give you a short example from the analog world of how safety by design has worked very well. In the 1990s, law enforcement could not cope with the huge number of stolen cars all over the world, especially in Europe. Insurance companies lost a lot of money on stolen cars, and the car industry feared losing its reputation for reliable cars, but the technology industry already had the electronic immobilizer. In 1997, I think, it was made mandatory to build that safety device into new cars. It could not have happened with legal regulation alone; if government had simply said we need these things, it would not have been a success. The success was based on the fact that the various stakeholders all saw their opportunity, their business, their interest at stake, and that was a kind of model for bringing stakeholders together. We should not only talk about doing multi-stakeholder work; we need situations where the multi-stakeholder approach genuinely fits. And I don't think we should wait until many children have been put at risk by the Internet of toys or the Internet of Things. We can be proactive, and maybe in some years we will have something like a built-in digital disconnector as a safety-by-design solution. Thank you.

>> MARIE-LAURE LEMINEUR: Do you want to say something, Sonia? Arda?

>> Exactly what I wanted to say. Where it concerns the child, I think that within the multi-stakeholder approach we need a little more legislation to strengthen the position of the child, and we should not be afraid of that. If we cannot build trust around this subject, the whole development will simply stop, because people won't buy the products; they will be afraid that something will go wrong. Things will go wrong, there will be scandals. So to carry on with the innovation you need legislation, and I agree with Larry that it doesn't have to be heavy legislation. It can be warnings. You have to think broadly, but we need something in place.

>> Just something about the position of the child. It is easy to think that children are the exception, that business can get on with things while those of us in this room say "but what about children". What I really think is important is that we treat the debate we are having about children as one that affects all of us. Actually, we are all anxious about these things, and we all have our rights at stake and at risk from many of these developments. If we use the child as a way of thinking about how the technology should be developed more generally, we are in a much better position than if we say let the companies run riot and then build in an exception for children. That will never be a secure world for children.

>> Thank you very much, I agree with that. Children have a special vulnerability, but they also have caretakers, and how does that relate to this? I do believe that actions like what you do here, awareness raising, will help, because in the end you will see on the packaging of the doll whether you can disconnect it or not, and that is information that informed parents will seek. Legislation is always the backup, but legislation on protection in many ways already exists; it is just not being applied to this specific area. So that awareness is there, too. In the end, I also encourage you to consider where technology could help us take better care of our children. Because that is there, too.

>> MARIE-LAURE LEMINEUR: Thank you for finishing on a positive note. We have to wrap up. Thank you very much for a very interesting discussion. From the perspective of the Dynamic Coalition, this is a starting point for continuing to share ideas on the mailing list and, of course, a call to interact more with the Dynamic Coalition on the Internet of Things, to keep working together, to see what we can do and how we can work with the outside world, as John was mentioning, and whether we can exert some pressure and make the world safer for children. Thank you very much.

(Session ended at 11:53 AM CT)