FINISHED - 2014 09 03 - Dynamic Coalition on Child Online Safety - Room 5

FINISHED COPY



NINTH ANNUAL MEETING OF THE

INTERNET GOVERNANCE FORUM 2014

ISTANBUL, TURKEY

"CONNECTING CONTINENTS FOR ENHANCED

MULTI-STAKEHOLDER INTERNET GOVERNANCE"



03 SEPTEMBER 2014

14:30

DYNAMIC COALITION ON CHILD ONLINE SAFETY

 





***

This is the output of the realtime captioning taken during the IGF Istanbul, Turkey, meetings.  Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors.  It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.  

***



 (Technical Difficulties)



>> MARIE-LAURE LEMINEUR:  -- randomly selected police forces.  So it was a gigantic sample.  Those five forces, when they replied, indicated that they had seized about 8 million images in that two-year period.  Now, if you did the statistical work on that, what it suggested was that within the same period, within England and Wales -- not Scotland or Northern Ireland -- the number of child abuse images in circulation was likely to be in excess of 350 million.  Eight million are absolutely definite, because that's what was actually seized.  The 350 million is a statistical calculation based on extrapolating from that population area.  That's one country.  That's the United Kingdom, right?  Or rather, in fact, it's less than the whole of the United Kingdom.  It's just England and Wales.

    Not long after we got this information through that Freedom of Information process, the head of the British police unit that deals with online crimes against children, a man called Peter Davies, went on national television, and he said that the police had been monitoring the traffic in child pornographic images and child abuse images within the United Kingdom, using technical tools that allow them to do this.  I'm happy to explain how they do it if you want.  They had identified between 50,000 and 60,000 individuals in the United Kingdom -- and this is UK-wide, by the way -- who had been trading in, and dealing with, child abuse images.  Overwhelmingly, the images I'm talking about were being traded over peer-to-peer networks and so on.  So 50,000 to 60,000.

    Now, in no year since records began have the police in the United Kingdom ever arrested more than 2,000 people for child pornographic image-related offenses.  The average is probably around 1,500 to 1,600.  So what that means is, even if the police were able to sustain that number of arrests, 2,000 per annum, which they couldn't do, by the way -- I'll explain that later -- but even if they could keep arresting people at the rate of 2,000 a year, it would be 2032 before the British police were able to arrest the last person they already know about.  By the way, let's remember that we're not just talking about police capacity to go and arrest somebody, seize their computer, do all the analysis and so on; we've also got to think about the courts, and potentially the prison system as well.  We have 91,000 places in our prisons in England and Wales, and at the moment there are about 91,500 people in them.  It's a bit overcrowded in a few places.  So even if the police could arrest them all, the courts would probably collapse trying to process and deal with them, and there wouldn't be any places in prison to put them anyway.  Wherever I've gone around the world -- and I do a lot of traveling around the world -- police officers essentially say pretty much the same thing.  The numbers are just too big.  We cannot get out there and arrest them all, even if we would like to.  They all want to, but they can't.

So it seems to me if we are going to have any hope at all of dealing with this trade, this enormous trade in images that's going on over the Internet, the idea that we're going to be able to arrest our way out of it is simply a nonstarter.  We have to find new and better technical means of disrupting the trade.  We can go on -- I'm sure we will go on to speak about the importance of the images and why it's important to stop this traffic, but it seems to me any notion that we're going to persuade people to stop circulating these images over the Internet, because they are frightened of being arrested, is a pipe dream.

So a lot of progress has been made with technology and with building human resources, but it will never be possible to reach a point where this is no longer a feature of the Internet, just because of the way technology evolves: criminals evolve too, and they find new ways to do new things.  We are playing, essentially, a game of catch-up here, and it is extremely difficult to be 100% on the winning side.

>> STUART ASTON:  Thank you for the opportunity to speak.  I have to agree with the other panelists so far.  I think knocking down doors, as emotionally satisfying as it is, is not going to solve the overall problem.  The breadth of the problem is too large.  You have to think of the Internet as a representative soup of humanity.  Whether we like it or not, these criminals are a percentage of humanity, which means we have to find other technical means of dealing with them.  Whether that is technology like Photo DNA to help with the interdiction of images that have already been identified, or other technology to identify video images, that starts to address some of the concerns.  But as probably everyone on this panel knows, as abuse moves to realtime it becomes extremely difficult to use a signature-based method to actually track that type of abuse.  And so we are going to have to develop new techniques.  That involves social change.

How do we balance the need for individual privacy with the personal safety of a child?  It becomes an extremely complex equation.  I'm not sure there is a right answer, and I'm not sure we have had the debate at the right level to reach that answer yet.  I think it is one of the things we have to tackle, not only as industry and government, but as individuals as well.  What are we comfortable with in terms of introspection, so that our children might be safe?

>> PANEL MEMBER: Thank you for inviting us.  I would absolutely agree with the other panelists.  We need to point out that child abuse is a societal issue, and the abuse of children on the Internet is also a societal issue.  So when we look at technological solutions, which is the reason we are here today, we also have to remember the environment in which these crimes are taking place.

>> AMY CROCKER: I think that's essential.  I represent Inhope, which is the International Association of Internet Hotlines.  We focus on reporting and the removal of child sexual abuse material from the Internet.  Our shared vision is an Internet free of child abuse material, but I think we have to be very realistic about the challenges of that, and I would agree with the others about the difficulty of arresting our way out of this problem.  We cannot.  It comes back to a societal problem, and we need to find societal responses: combining technology, combining social responsibility, and combining public awareness of the problem.

>> MODERATOR: Thank you.  Before we move on and discuss the challenges of some of the technological solutions that are available -- Mr. Aston raised a very valid point, as did the other panelists -- maybe we can discuss the positive use of some tools that are on the market.  Maybe some of the participants in the workshop are not aware of or familiar with them, so it could be interesting to describe them a little.  Could you talk about Photo DNA -- is that fine? -- and maybe why and how it has been developed, how it works, and who is using it.

>> STUART ASTON: I can talk about some of that.  Who is using it is partly a private matter.

Can you hear me now?  Thank you.  

So, I can talk about some of that.  I can't necessarily talk about every user of Photo DNA because that information is up to them to reveal.  It is not up to us.  But I can talk about the ones that are in the public domain.  

Photo DNA is a technology that enables known child sexual abuse imagery to be identified by means of a signature.  Unlike previous signature mechanisms, which were fragile, Photo DNA signatures can identify the same image presented in a variety of ways and formats, even when a criminal intentionally changes the image.  In the past, a single changed bit would require a new signature, even if the image looked identical from the user's perspective.

We developed the technology with Dartmouth College and donated it to NCMEC in 2009, and we make the code freely available to law enforcement and service providers alike.  Software companies like NetClean have included it in their products, and they make those freely available to law enforcement.

The technology works by essentially taking an image, converting it to a common format, shape and style, and dividing it into a number of segments.  For each segment a histogram is taken in four directions, and that generates 128 unsigned characters, which are used as the signature.  Then you take the distance between two signatures to determine similarity against a threshold.  We use a number in our online services of slightly less than 40,000.  It's a somewhat arbitrary number; it just provides a good experience in terms of a low false positive rate and a high probability of a match.  A number of people are using this in the public domain, including Google and Facebook.  Would people like to know more?
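A much-simplified sketch of the scheme just described, for illustration only: Photo DNA itself is proprietary, so the normalization size, the histogram construction and the distance metric below are assumptions standing in for the real thing.  Only the shape of the pipeline (normalize, segment, histogram, 128-byte signature, thresholded distance) follows the description above.

```python
# Illustrative Photo DNA-style signature; NOT Microsoft's actual algorithm.
# Requires: pip install pillow numpy
import numpy as np
from PIL import Image

GRID = 4                   # 4 x 4 segments
ORIENTATIONS = 8           # 8 histogram bins per segment -> 16 * 8 = 128 values
MATCH_THRESHOLD = 40_000   # illustrative; the talk cites "slightly less than 40,000"

def signature(path: str) -> np.ndarray:
    """Normalize to a common format and size, segment, and build a
    128-byte signature from per-segment gradient-direction histograms."""
    img = np.asarray(Image.open(path).convert("L").resize((64, 64)), dtype=np.float32)
    gy, gx = np.gradient(img)
    magnitude = np.hypot(gx, gy)
    direction = ((np.arctan2(gy, gx) + np.pi) / (2 * np.pi) * ORIENTATIONS).astype(int) % ORIENTATIONS
    sig = np.zeros((GRID, GRID, ORIENTATIONS))
    step = 64 // GRID
    for i in range(GRID):
        for j in range(GRID):
            seg = (slice(i * step, (i + 1) * step), slice(j * step, (j + 1) * step))
            for b in range(ORIENTATIONS):
                sig[i, j, b] = magnitude[seg][direction[seg] == b].sum()
    flat = sig.ravel()
    return (flat / (flat.max() or 1.0) * 255).astype(np.uint8)  # 128 unsigned chars

def distance(a: np.ndarray, b: np.ndarray) -> int:
    """Squared difference between two signatures (a stand-in metric)."""
    d = a.astype(np.int64) - b.astype(np.int64)
    return int((d * d).sum())

def is_match(a: np.ndarray, b: np.ndarray) -> bool:
    """Signatures closer than the threshold are treated as the same image."""
    return distance(a, b) < MATCH_THRESHOLD
```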

>> Do you want to add anything about that -- about Photo DNA, John?

>> JOHN CARR: Photo DNA has been around for a few years now.

(Off Mic)

All right.  So Photo DNA has been around for a few years now, and it was a fantastic initiative by Microsoft to develop it in the first place.  They were under no obligation to do it; they chose to do it as part of their corporate social responsibility work, and it was even more fantastic when they gave it away.  I can say this; by the way, he can't, because he works for Microsoft and I don't.

Google's Marco Pancini will be here soon.  Google has developed a similar tool for video that performs a similar function, but I won't speak about that, because Marco won't have anything to say if I do.

My view, actually, and in the U.K. we are going to be pushing this line, we have a general election coming up soon so it is a good moment to push out new policies.  My view is that any Internet Service Provider, any company that is providing access to the public, whether free or paid for, to online services, should be under a legal obligation.  They are certainly under a moral obligation, but should be under a legal obligation, to use the available technology to try to ensure that their networks, the services they are providing, as I said free or paid for, are not being abused or misused to assist the distribution of child abuse images.

If you are providing online storage facilities or network facilities to the public, as I say, free or paid for, you either already know, or should know, that there is a strong probability that the service you're providing could be being misused by bad guys, bad people, for the purposes of distributing child abuse images.

And if you know that, or if you should know that, why wouldn't you use whatever tools are available to try to stop it?  Well, in England, in the U.K., that is a view we are going to be pushing forward.  We are not going to say you must use Microsoft's Photo DNA.  We are not going to say you must use Google's video ID, or whatever in the end they choose to call it.  But we are going to say: you should be doing something.  You are in a business that is being used in the abuse of children.  You don't want to be, but you are.  And if you are, then you should act to deal with it.

>> AMY CROCKER:  To go back to the point about positive uses of technology, I have to agree.  Developments such as Photo DNA, and the various video fingerprinting technologies currently in development as a response to the increasing amount of video being shared, are all really positive.  They have influenced and changed what law enforcement can do, and what Internet hotlines are able to do as well.

But I think one thing to point out certainly from the hotline perspective is that using these technologies to disrupt access to known materials on the networks is also based upon us knowing that the material is illegal and it being assessed as illegal in the relevant jurisdiction.  So, you know, we have also got to look at focusing on feeding into the system, reporting, how are we detecting the material, and that is something that the Internet hotlines play a major role in.

I don't know if we want to talk a little bit later about some of the projects we are involved in to try to bring things together and move that debate forward; perhaps we will talk about that at a later stage.  I think it's important to say, yes, the technology is very positive and can be applied in many different ways.  But we must not become complacent and think that because we have the technology we have solved the problem.  We need to feed into the technology and help it develop, as much as it helps us to reduce access to these materials.

>> MARIE-LAURE LEMINEUR: Thank you, Amy.  I have another question related to Photo DNA.  I think one of you mentioned that it is meant to deal with still images, and my understanding is that law enforcement is reporting that they are finding more and more videos among the collections.  So are there any solutions, or are you aware whether Microsoft is working on some kind of solution that would apply a similar approach to videos?

I'm sorry.  Marco is not here so when he comes here, I'll ask him on behalf of Google.

>> STUART ASTON: I'm not aware that we are working on a video solution at this point in time.  I think one of the opportunities for industry in this area is to collaborate and work on our areas of strength.  If we come up with a good answer for static images and somebody else comes up with a good answer for moving images, that should be embraced.  Why would you want to reinvent the wheel, especially when what we're looking for is the ability to have common signatures we can apply across multiple platforms and jurisdictions?  That starts to become a driving factor -- provided the technology works out the way you hope it does.

>> JOHN CARR: It might be worth reminding ourselves why the images matter.  And I say this because in Europe, a few years ago, there was a big, big fight, basically, between us and the free speech lobby, when the European Union -- Hi, Marco.  We've been talking about you.

(Laughs)

There was a big fight in the European Union a couple of years ago.  The Commission proposed a new law to make it compulsory for every member state of the European Union to make provision for the blocking of any websites, any URLs, known to contain child abuse images.  And a whole set of people jumped in and said, that is completely wrong, you shouldn't do that.  And they won.  The Commission dropped that proposal.

What the Commission did do, which was still a good thing, was make it an option for every member state.  In other words, they said it is perfectly legal to have mechanisms for blocking URLs and websites known to contain child abuse images, but it wasn't compulsory.

The people who argued against making it compulsory -- and we supported it being compulsory, by the way -- said: no, no, no.  They didn't say you shouldn't be bothered; they said the issue of blocking access to the URL is not the important thing.  The important thing is to get the image deleted at source.  The server might be on the top of Mt. Everest, or in Outer Mongolia, or in some country with very few police officers who knew anything about this.  As far as they were concerned, the important thing was not to block it but to get it taken down.

And of course that is what we would all prefer.  Everybody would prefer that the images were removed at source, so that they were never available to anybody, anywhere, ever.  But that is not the way it works.  Suzi from the Internet Watch Foundation is here; I'm sure Suzi will be speaking at some point about how quickly the system can be made to work to get images taken down, but I'm sure she will also mention some instances where it is proving difficult to get these images taken down.

So the question then becomes why are the images themselves so important?  Why is blocking or restricting access to these images important?

Very quickly, I'll say why.  First of all, it's the rights of the child in the image.  A child doesn't consent to being -- bear in mind, what is it?  70% of these pictures are of children aged 10 or under being raped by an adult?  Some percentage shows two-year-olds being sexually abused.  These children have never consented to being raped or sexually abused, and they certainly have not consented to pictures of it being published or distributed on the Internet.

And every time that image is republished, in a way they are being re-abused.  Who knows who might see an image with a child in it, and what they might do with it.  There are all kinds of reasons why the image itself is important and why it matters that the images themselves should not be accessible over the Internet.  That is why blocking and notice and takedown are so vitally important, in my view.

Secondly, it's a vital evidential tool for the police or social agencies to try and track the child and get the child the help it so obviously needs.  Because the child that appears in the child abuse image is obviously in some kind of desperate state in the first place and needs help and assistance.  That's the second reason.  

The third is this:  Why is it important to get these images removed quickly, removed as quickly as possible?  Because there is evidence, of course, that they help and sustain pedophile networks.  People who have perhaps never been engaged with these images come across them, and it takes them on a path that in the end leads to more children being abused.  So there is a third reason why dealing with the images in and of itself is a very, very important task: getting them off the Internet as quickly as possible, or restricting or blocking access to them.

And the fourth and final reason: for as long as we keep reading stories in the newspapers about child abuse images being available and being downloaded and so on, it sends out a big signal to criminals of every kind that the Internet is an okay place to do your --   because if they can't deal with child abuse images on the Internet, which probably even the majority of criminals don't like, if they can't deal with that, then of course it must mean it is possible to get away with many other types of crime on the Internet as well.  So there is that fourth additional reason why I think it is incredibly important that things like Photo DNA and Google's video ID, or whatever they will call it -- Marco will tell us in a minute -- work to get the images removed, or access to them restricted, as rapidly as possible.

>> AMY CROCKER:  Absolutely.  I want to talk about how there are many debates surrounding whether we should block or whether we should remove.  Inhope is a network that promotes the removal at source of illegal content, child sexual abuse material, on the Internet.  However, as John mentioned, within Europe there are countries that also block access and countries that don't.

One of the interesting things to look at -- to go back to your original question, I think, about what technologies are available -- is that the Inhope network is piloting a joint project with law enforcement that combines the analysis of URLs and the assessment of content on the Internet, feeding into law enforcement systems.  Because it's important to remember, as John mentioned, the processes of bringing together what law enforcement and what industry are trying to achieve through investing in these technologies.

So we are working on a joint project, a pilot project in the EU, which we are hoping to roll out more widely, and it includes video technologies.  It is allowing our hotlines to apply image hashing technologies, and also video hashing technologies, to the assessments they are making, and ultimately to contribute to law enforcement databases that can then be used to investigate cases and identify children.

And this is the thing we must remember, and come back to every time we mention technology.  We can talk about technology for as long as we want, but ultimately what we are talking about is protecting real victims of sexual abuse and sexual exploitation, and this is really important.  I believe if we keep coming back to this point, we can reach consensus on how to move forward.

>> MARIE-LAURE LEMINEUR: I think now it is time to introduce Marco Pancini from Google, who just joined us.  Marco is senior policy counsel for Google.  When you walked in, we were discussing the benefits of Photo DNA, with a bit of background information about what it provides; I'm aware that Google is using it.  We were also discussing the fact that there are more and more videos in the collections of child exploiters now.  Is Google working on some kind of solution to deal with the videos that are on the network?

>> MARCO PANCINI: So thanks again for the invitation, and I apologize for the delay.  For those who don't know me, I'm Marco Pancini from the Google policy team.  I deal with media policy issues, and child safety is one of the areas where we have the pleasure of interacting together.  I think the real focus for us at this moment is to try to understand what technology can do to improve the hard work of detecting child abuse material online.

For a few years we were experimenting with our own technology, very similar to Photo DNA and actually interoperable with it.  We decided to go public about it when we were sure it achieved exactly what we were trying to achieve as a technology solution.

Some of the results of using these tools are actually public.  There was news a few weeks ago that, thanks to this algorithmic matching of images floating on the network against images from the database, we were able to detect a repeat offender.

I would like to open a bracket here and say that the way the news was reported in the media and discussed in the public debate was absolutely the other way around, without looking into the facts: this guy was a repeat offender, somebody who was already under investigation.  So it was not only because of the match between the image in the database and the picture in his inbox.

The real question everybody started to ask was: what about somebody sending me, without my consent, a picture that matches the database of child abuse material?  This was without looking at the context, where it was clear that we were supporting an ongoing investigation and that the information was handled in line with the rule of law and so on.  All that said, I think technology can do a lot to help.

We are also looking into videos, but I think we are not ready yet.  There is also a problem in getting a database of videos available that we could use for matching, because we already have a technology called Photo DNA -- sorry.

(Laughs)

Content ID.  It is used on YouTube to fight against content faking and against piracy.  But there we can rely on a database of videos that is provided to us; here we would need to build the first inventory, and then I'm sure technology will be able to find a solution.

So, again, there are the opportunities of technology and big data.  If you look at Photo DNA, and at our system of photo detection, they are based on big data, because it's an algorithmic match.  You don't scan the whole video; you just need a few frames of the video, and the contextual content of the video, to create the match, automatically raise the red flag and generate the alert.  So the use of that innovation in big data, also in the field of detecting child abuse activity, will represent a very important strategy in the coming years.
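A minimal sketch of the frame-sampling idea described here: rather than scanning the whole video, hash a few sampled frames and raise the flag once enough of them match a database of known signatures.  This is an illustration under stated assumptions, not Google's actual system; it assumes OpenCV (pip install opencv-python), and the `frame_signature`/`is_match` helpers are hypothetical, in the spirit of the image-signature sketch earlier.

```python
# Illustrative frame-sampling video identification; NOT Google's Content ID.
import cv2

FRAME_INTERVAL_S = 5.0    # sample roughly one frame every 5 seconds (assumed)
MIN_MATCHING_FRAMES = 3   # frames that must hit the database before flagging (assumed)

def sampled_frames(path: str):
    """Yield grayscale frames at the sampling interval, skipping the rest."""
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
    step = max(1, int(fps * FRAME_INTERVAL_S))
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            yield cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        index += 1
    cap.release()

def flag_video(path: str, known_signatures, frame_signature, is_match) -> bool:
    """Return True once enough sampled frames match known signatures;
    the flagged file would then go to human review, not automatic action."""
    hits = 0
    for frame in sampled_frames(path):
        sig = frame_signature(frame)
        if any(is_match(sig, known) for known in known_signatures):
            hits += 1
            if hits >= MIN_MATCHING_FRAMES:
                return True   # raise the red flag
    return False
```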

Now, the need to find a balance, to stick with the rule of law, to make sure we have a process in place and to make sure there are no abuses, is also very important, because any issue arising in the enforcement of this policy could really lead to a debate that undermines our efforts to make the Internet a better place.  So, for now, I think we are doing the right thing.  All the actors involved in this very important activity are super serious, but transparency and accountability are things we need to treat as a priority.

>> MARIE-LAURE LEMINEUR:  This is very interesting.  Speaking of new trends, there is something that is worrying me a lot; maybe you have heard about it: the live streaming cases taking place in some countries.  Basically, there are live sex shows of children.  It is happening in some Asian countries, specifically the Philippines and others.

So what the abusers are doing is basically paying, and it is made into a show.  They pay for this from their own home country and watch it through some channel.  So I'm very worried about how we can deal with this, about the kind of technological solutions we could use to track down this form of exploitation, about whether it is possible to do it, and about the kind of challenges that would appear if we tried.  I don't know.

>> STUART ASTON:  So it is a very worrying trend, and you're quite right to call it out.  It's becoming a more -- I hesitate to use the word popular -- form of abuse.  Part of the challenge is that the traditional methods we have to identify material based upon a signature, be it a video or a static image, rely on us having an original copy of that image.  Because this is new material, we have nothing to compare it to at a signature level.

There is also an interesting challenge -- I mentioned it earlier -- in the balance between people requiring, or desiring, a right to privacy on the Internet.  A lot of technologies used for chat and video sharing or video streaming actually protect the channel of communication, so you have to be on the inside of the channel to have any introspection into the content.

And then you have to look at the scale.  There are billions and billions of minutes of legitimate video communications every month and year.  So whatever the solution or approach, it has to sit on the inside of the secure channel, because that is where the content is.  And secondly, it needs to somehow identify abuse automatically -- or are we going to turn around and say, let's have a whole bunch of people in a room looking over our shoulders at every video?  I'm pretty sure that's not the answer everybody wants.

So what is the balance?  I have to say, I don't think there is a really good answer today, and it is worrying and concerning.  One option, obviously, is to start following the money trail, because this is a paid-for service.  Unlike a lot of child abuse imagery, there is money changing hands for this capability, so the option is to follow the digital money chain back to the originator.  Obviously that relies on the intelligence that police forces may or may not have into the ongoing abuse.  And that is speaking of the live streaming of events between offenders.

There is also live streaming of activity between victim and offender, because there is a trend of moving very quickly from contacting children on social media, to compromising the child on a live stream, to using the compromising images to force the child to perform further acts of abuse.  And again, that is another form of live abuse that we have to be cognizant of as it becomes more and more common; both are worrying trends, and both are very difficult problems to solve.

>> MARIE-LAURE LEMINEUR: We are talking about sextortion, I believe.  That is the term that needs to be used.  Marco, John, or Amy, do you want to add anything?

>> JOHN CARR: I agree there isn't an obvious solution for how we deal with live video streaming.  It's not obvious.  Isn't all the traffic encrypted anyway over Skype?  That's the point that was being made about having to be somehow inside the stream, because you won't be able to spot it from outside.  And video ID and Photo DNA only work because they are dealing with an image that has already been looked at and analyzed, so they have a signature for it.  By definition, if it is live streaming, that can't possibly be the case.

I mean, I know that the general heading of this is that we have to look to technology for better solutions for dealing with this problem, and we do for a good part of it.  But there are some bits where technology may well not be able to provide answers, and this may be one of them.

I know that the cops in Britain -- I mean, we arrested 180 people, by the way, in the United Kingdom for being involved in live streaming cases arising from the Dutch "Sweetie" operation, the one where they put up the false persona and so on.  They got intelligence out of that operation which led in the end to 180 arrests.  So I know that one of the things the cops in the U.K. have been thinking about is simply putting more police officers from the U.K. out into the countries where they suspect this type of activity is taking place.

Since a lot of the demand is coming from the rich West and the rich North and so on, maybe one of the responses we will have to develop for dealing with this is simply greater engagement by foreign police services, working with the police in the Philippines and these other poorer countries, to try to make sure that intelligence and information about this type of thing gets through.

Because the people in some of the villages where this was taking place -- everybody knew.  Everybody knew what was going on.  But the local cops, the local police, were so overstretched, or so not bothered, or whatever the explanation was, that they didn't pick it up and weren't able to act on it.  But our cops can, if we go out there as part of a wider workforce or something.

>> AMY CROCKER: I absolutely agree with that.  I would echo both points.  I think looking at the money trails is a really important way to address this issue, because it is an exchange, a commercial exchange.  Of course, then you have the question of how they are paying, because a lot of the payments are not using traditional payment methods, and we have the issues around investigating virtual currencies.  There are a lot of associated issues with payment, but we've seen great success with financial coalitions, within the European Financial Coalition, and there are huge efforts, for example, to establish an Asia-Pacific Financial Coalition -- efforts to engage the financial industry to actually contribute to the disruption of organised commercial payment for the sexual exploitation of children online.

But I think also, to go back to it, it is important to look at law enforcement capacity both in the countries of origin of traveling sex offenders, or of those offending online against children abroad, and in terms of local law enforcement capacity.  This is an area that is hugely under-resourced, and it is a challenge when dealing with crimes committed across borders.  And there is something that is spoken about a lot, and you hear it regardless of whether you're talking about technology or law enforcement:

It comes back again to education and prevention, and to actually trying to disrupt the supply within communities, and that goes exactly back to John's point.  Certainly some of the live streaming we saw was happening in concentrated areas and villages where everyone knew it was happening, and there was a low level of understanding that this was harmful to a child, because a lot of the adults in the community would say, well, you know, it's only over the Internet; there is no physical contact, so the child can't be harmed.  So it is also about raising awareness of the dangers to children, as much as looking at ways to disrupt the networks and to go after the offenders in the countries of origin.

So it is a very complex issue, and yes, we don't have all the answers, and, as we've heard, applying technological solutions is challenging at the moment because of the nature of the activity.  But, you know, this is in the context of needing to deal with the problem, again, as a societal problem, and of actually working with communities to prevent the supply of children for these activities.

>> MARIE-LAURE LEMINEUR: So what I hear is that technological solutions are part of the solution, but for some very specific cases, specific forms of sexual exploitation, we also have to think of other ways of tracking down the bad guys, and of combining different types of solutions.  That would be the case for live streaming.

So, speaking of money and financial coalitions, that triggers a question I have about the cost of actually developing technological solutions, and I wanted to ask the panelists about that.  We all know that it is very costly to develop databases and application software.  So who do you think should be paying for it, and for its development?  And when it is available, what type of business model could we use so that law enforcement agencies and civil society -- NGOs operating a hotline -- could bear the cost of licensing?  So, I don't know.  Marco?

>> MARCO PANCINI: First, one word on what we discussed before.  We really believe following the demand is an effective way of going after the criminal organisations exploiting child abuse online.  We are a member of the European Financial Coalition.  There should probably be ways to give more support to the European Financial Coalition in Europe in order to fight against any kind of misuse of money services or financial services to extract revenue from this traffic, including, for example, the whole area of human trafficking, which is another important area to take into consideration.  So I would say we should focus on following the demand and making it work under the rule of law, but in a way that is effective and empowers law enforcement and prosecutors with tools and remedies.

On the cost side, I can talk about my company and our policies.  For sure, when we develop these kinds of tools, we usually give them away for free, because we believe our main goal is not to make a business out of them but to increase trust online and make the Internet a better place.

And again, that is because of the nature of the service that we provide, so we don't believe there should be a business model behind it.  But if there are companies that believe in making a business out of it, I think that is totally legitimate.  At the same time, we should really think about making sure that the cost of these tools, and the way these tools are developed, is inclusive and open to all law enforcement, especially because, as John said, there are law enforcement agencies in the world with different priorities and different means.  So collaboration between law enforcement globally is very important, and the Commission has taken a lot of steps in this direction.

>> STUART ASTON: I agree.  I think these tools should be, where possible, freely available.  Certainly Photo DNA is made freely available to both enterprise and law enforcement.

I think there is a broader question about some of the infrastructure you need to support it.  For example, how do we get the database of illegal image signatures in the first place?  Is that a service provider responsibility or a government responsibility, or does it lie between the two organisations in question?  I think the answer is probably somewhere between the two.

But functionally, we can't have a database of illegal images; we can have a database of signatures, because we are a commercial organisation and we cannot commit the crime of holding those images.  So there is an aspect that needs to be performed by government, an aspect that needs to be performed by industry, and perhaps an aspect that needs to be performed by consumers as well.  So there are costs here, and the question is where they actually lie.

>> (Off mic)

>> JOHN CARR: There is no issue about companies holding hashes, as long as those hashes cannot be translated back into actual images.  But somebody somewhere needs to guarantee that that is the case, and guarantee that they are hashes of the right material.  Otherwise, we only need one or two mistakes -- a legal image of somebody's holiday snaps on the beach or something ending up in one of these databases -- and it will discredit everybody working in this field.  So it is really, really important that the images that create the hashes are looked at by experts, which is what happens in the United Kingdom, where the IWF does it along with the police, to make sure that we are only dealing with genuinely illegal child abuse images.

A lot of people think it's all about secret censorship anyway.  It's not.  But we must never, ever make mistakes which allow that idea to gain currency.
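A minimal sketch of the arrangement John describes: the provider holds only a quality-assured list of signatures, never the images, and checks uploads against it.  Plain SHA-256 file digests are used here purely for simplicity; real deployments pair exact hashes with perceptual ones such as Photo DNA, and the list file name is a placeholder.

```python
# Illustrative hash-list check; the provider never stores the images themselves.
import hashlib

def sha256_of_file(path: str) -> str:
    """Hash a file in chunks so large uploads don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def load_hash_list(path: str) -> set[str]:
    """The distributed list contains digests only; the assessed images stay
    with the authority (e.g. hotline plus police) that quality-assured them."""
    with open(path, encoding="utf-8") as f:
        return {line.strip().lower() for line in f if line.strip()}

def should_block(upload_path: str, known_bad: set[str]) -> bool:
    return sha256_of_file(upload_path) in known_bad

# Usage (hypothetical file name):
# known_bad = load_hash_list("qa_hash_list.txt")
# if should_block("incoming_upload.jpg", known_bad): ...
```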

>> AMY CROCKER: And certainly, among the 51 international Internet hotline members of Inhope, the vast majority are making these assessments against their national jurisdiction.  That is important to remember as well, because Internet hotlines and law enforcement in each country are governed by their national legislation, which defines how they assess an image and what level is illegal or not illegal.  There is huge diversity.

And when we talk about shared hash sets, one thing there has been consensus on for a number of years, when we talk about different levels of illegal content and illegal child abuse material, is that the thing to focus on, which could be shared globally, is the baseline: a list of child sexual abuse material that would be illegal in all jurisdictions around the world that have legislation on the issue.

And touching on that: yes, there are such lists available, but I think possibly there is space for the separate hash lists stored and developed by different organisations to be shared more widely, and to move towards a truly global list of known content.  I think that's something that perhaps needs to be looked at in the future.

But that doesn't answer your question about costs and who should be paying for it.  And I don't want to be the first person to use the multistakeholder word in the room.  I think John may have already mentioned it, but I think it is an important thing to talk about.  I don't think it is the responsibility of any one sector.  I think industry has already done a huge amount, certainly in the last 10 to 15 years we've seen a sea change in what's been happening and the investments made.  But I think everyone needs to be doing more.  Governments need to be doing more.  

We are moving in the right direction.  No doubt about it.  And the fact that we are having a panel like this today is just evidence of that.

>> P. MALOOR:  As a U.N. agency, we see this as a public/private partnership for the greater good.  It is very obvious to us.  What we should probably be talking about is building more platforms for sharing such tools.  We are thankful to Microsoft and Google for sharing these and giving them to the world.  We need platforms where these tools can be picked up by any member state that wants to use them, and probably for open-source developers to come in, develop their own solutions, and extend these solutions.  So that should probably be one of the key focuses of our discussion.

>> MARIE-LAURE LEMINEUR: Before opening the floor for questions, I would like to ask two last questions myself.  The first one: we are aware that Google and Microsoft have both made changes to the way their search engines work in relation to child abuse material.  I am curious to hear how this is going to work in languages other than English, if you could tell us a little bit about that.

>> MARCO PANCINI:  So we started with English.  The basic principle here -- and this also speaks to the discussion on search neutrality and so on -- is that what we want to provide to the user is the best possible experience, which means we absolutely do not want the worst of the worst of the content available online to be displayed.  Therefore, we take into consideration a clear signal denoting a website where we know the posted content is bad, and that is one signal, together with others, in the algorithms that come into play when somebody puts such a keyword into our search.  So we take into account content that is not in line with our policies and in some cases is absolutely not acceptable.

For some specific content, we block access for specific keywords that are given to us by law enforcement and that lead to the worst of the worst.  That is to complete the picture.  And again, the goal here is to provide a good experience for the user.

We are scaling up to other languages; the same approach will also be made available in other languages.  We don't know the exact timeframe, but this is something that is happening right now and is ongoing.

>> STUART ASTON:  It is a very similar position for myself and my colleagues.  We look for specific single-use, single-intent search terms, and in that case we also return a public service announcement to the users who use those terms, directing them towards a charity for help.  And again, we don't have timescales for rolling that out in multiple languages that I'm aware of at this point in time, but work is in progress.
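A minimal sketch of the search-side intervention both companies describe: queries matching a list of single-intent terms supplied by law enforcement or expert bodies return a public service announcement instead of results.  The term file, helpline URL and search backend are placeholders, not either company's actual system.

```python
# Illustrative keyword intervention; the term list and helpline are placeholders.
def load_terms(path: str) -> set[str]:
    """Load the expert-supplied term list, one term per line, normalized."""
    with open(path, encoding="utf-8") as f:
        return {" ".join(line.lower().split()) for line in f if line.strip()}

PSA = ("Child sexual abuse imagery is illegal. If you are worried about your "
       "own behaviour, confidential help is available: https://helpline.example.org")

def handle_query(query: str, blocked_terms: set[str], search_backend) -> str:
    normalized = " ".join(query.lower().split())
    if normalized in blocked_terms:
        return PSA                      # suppress results, show the deterrence message
    return search_backend(normalized)   # normal search path
```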

>> MARIE-LAURE LEMINEUR: One last question to wrap up this section before we open the floor.  Looking a little bit at the future, and at all the technology changes we can foresee -- IPv6, the expansion of the domain name system with new extensions, the use of virtual currencies -- what are your thoughts on that, and how do you think this is going to impact child abuse material and how it is produced and used?

Personally, I'm very worried about the impact of IPv6 and the availability of new domain name extensions.  I don't know if either of you has any thoughts on that.  John?

>> JOHN CARR: I have thoughts about almost everything.

(Laughs)

Obviously the landscape is getting more complicated, and therefore, in principle, you would expect it to be easier for bad guys to do more bad things, particularly as we all know that every law enforcement agency in the world is up to here and beyond with the stuff they are trying to deal with now.

So any new developments -- and it's always been this way since time immemorial: criminals have been among the first to find and exploit new technologies for doing bad stuff.  We didn't invent the motor car to drain the planet of carbon-based fuels and endanger the human race through greenhouse gases; that's what we discovered later on.  So I'm guessing that as these new technologies come out, the bad guys will be among the first to work out what to do with them, and we are all going to have to find a way to get ahead of them or keep up with them.

But I agree: the Internet of Things, these gigantic databases of personal data that exist around the place, they all offer huge opportunities for bad stuff.  They also offer huge opportunities for good stuff.  And I am fundamentally optimistic about the determination of the multistakeholder environment to solve these problems.

>> STUART ASTON:  So I'm not particularly worried about IPv6, per se.  I think as I alluded to earlier, there is a tension between technologies that will enable people to have both privacy and anonymous behavior on the Internet and the ability for both technology service providers and law enforcement to be able to track people who are using child abuse material, consuming it and distributing it.  

The very nature of an absolutely private Internet is almost incompatible with the ability to enforce laws and to say with certainty that such and such a person, on such and such a date, committed such and such an act.  There needs to be a balance.

But that is a tension we have always had, and we have solved it before with laws and processes: we turned around and said, this is what we as a society are prepared to live with in order for our children to be safe, in order for our people to be safe.  It is that balance.  But we haven't had the debate, and as a society we keep shying away from it.  We need to engage in that debate, to actually have it and deal with the societal issues that technology is going to present.

>> MARIE-LAURE LEMINEUR: This is a very nice way of wrapping up.

If you please, speak into the microphone.  It's very hard to hear over the noise.

    >> AUDIENCE MEMBER:  (Off mic)

>> MARIE-LAURE LEMINEUR:  This is actually a very nice way of summarizing some of the challenges of the topic we are dealing with today, and I think at this stage it is a good moment to open the floor for questions, because I'm sure the issues Stuart has just raised might trigger a lot of questions from the floor, as might the other issues we mentioned.  So please feel free to ask questions if you have any.  Do we have someone who can take care of the mic?  There is a lot of background noise and it is disturbing.

>> AUDIENCE MEMBER: My name is Crow (sp), I'm representing the German Centre for Child Protection on the Internet, and I wanted to pick up on the question about more and more pictures arising and the need to somehow use technical tools to dry out the morass.  When I read the last report from the European Financial Coalition, I saw it stated that there was a decrease in the financial transactions they discovered, but that doesn't mean there is a decrease in the trade of images, because there are virtual currencies and other ways of payment that need to be addressed.  The financial transactions are only one means of stopping it.

And on the other hand, we also see that the images themselves have become a currency in some pedophile forums.  It's just like an entrance fee: if you can provide a new collection of images, you get access to the other collections that are already there.  So that entrance fee is a reason why new images are being produced, just to get access to the other images.  I think that is just not acceptable, and we need more instruments to dry out the morass.  Thank you.

>> MARIE-LAURE LEMINEUR: Does anyone want to react to what you just said?  Jonathan?

>> JONATHAN SIMBAJWE: Jonathan Simbajwe from Uganda, from a foundation supported by the IKIZ Foundation.  My question goes to John.  You have talked about some good work you are doing in taking down and fighting against child abuse materials.  I'm wondering, have you made any attempts at raising awareness among children, so that they report those who try to engage them in producing child abuse materials?  Have you tried to empower these children, because they are affected the most?  Have you done that so far?  Thank you very much.

>> JOHN CARR: I work for and represent, within the United Kingdom anyway, children's organizations whose entire work really is about doing what you've just said.  Child sex abuse is never the fault of the child.  It's a serious crime, and finding ways to reach out to abused children, getting them to come forward, is an exceptionally challenging thing to do.

Most children never report the abuse that they suffer.  And when you do find reports of abuse, it is usually when they are adults.  And that is why in the U.K. at the moment, for example, there has been a lot of what we call historic abuse cases.  And one of the reasons for that is, children don't tell.  Often their abuser is a relative, a family member.  They are persuaded by the abuser that if they tell what happened, that means the family will be broken up and destroyed.  Dad will go to jail, it will all be terrible.  

So, you know, child sex abusers are very often very, very clever people.  That's the reason why children like them: they charm them and manipulate them into being silent.  So your question is a fantastic question.  It is, in a way, at the root of all the work that the organisations I work with do, but it is exceptionally difficult to achieve.

>> MARIE-LAURE LEMINEUR: I believe the gentleman over there is before you, and then at the back.

>> AUDIENCE MEMBER: Patrick Curry (sp) and I'm from the Alliance for the Sharing of Cyber Information Internationally.  And I have a question, which I will arrive at in a minute.  

What we find across the board is that identity fraud and the inability to be able to get accountability, not just for people, but for organisations and devices, provides the anonymity that fuels all sorts of criminal activity.  So, for example, in what we call the OCO Report (sp) from EC3, ID fraud is named as the top enabler for all aspects of crime across Europe.  

The point here is that we are now seeing an array of measures coming in to start to reduce anonymity, and to introduce partial anonymity where it is required, for a whole range of requirements.  But we completely recognize a lot of the social challenges in this space that John has identified, and I understand that this is not the whole answer.

But my question is: to what extent can we start to reduce anonymity in every aspect of the transactions that occur?  And I'm talking about users.  I'm talking about location.  I'm talking about cross-border traffic.  I'm talking about ISPs.  I'm talking about devices.  If we can start to introduce more authentication into those processes, would that be seen as a good thing from your point of view?

>> JOHN CARR:  Not only is it a good thing, in my view, in the end it's inevitable.  I don't think the current regime that we have across the world is sustainable.  Unfortunately, I think it will probably take a gigantic catastrophe to trigger action by enough governments.  

And by the way, let's be clear what we are talking about here.  If I want to log on to an Internet service over the weekend using my weekend name of Maureen -- by the way, that is a joke -- that's up to me.  That's nobody else's business.  What matters is not how I log in, whether it is as Maureen or Arnold Schwarzenegger, or I can't remember which one I'm using at the moment.  It's: am I traceable?  Rapidly traceable?  Because if I knew for sure that my identity could be rapidly traced, it would have a profound influence on how I behaved.

People behave badly if they think they can get away with it -- or more people behave more badly.  Maybe if St. Francis of Assisi were alive today and online, he would always be virtuous and there would never be an issue with him.  But in general, most people do bad stuff because they think they won't get caught and can get away with it.  It's not what they present themselves as being, and it's not how they log into different places, that I'm concerned about.  It is: if they do something bad or wrong, can they be traced?

I think we need, or are going to end up with, the cyber equivalent of a car registration number.  When I drive down the road in my vehicle, I know very well that if I do something bad or stupid, my registration number is there, and I can be apprehended and brought to account.  It doesn't mean anybody walking along the street, seeing me or my car, knows who I am.  But what I do know is that I can be traced if I have to be.  If I go through a red light or hit a pedestrian, my registration number gives me away.

And I think we are going to end up with something like that on the Internet -- and we should.  Now, people talk about countries where there are no free speech rights and no democratic rights.  Well, you know, I'm deeply sympathetic to that, of course.  But I don't think we are going to solve those problems simply by everybody else allowing criminals to carry on doing bad stuff.

Issues of free speech and fundamental human rights are political problems within the jurisdictions and countries where they exist, and by and large, they exist in countries that have always had those problems.  The rest of the world shouldn't have to pay the price for that.  So that is a big yes.

>> STUART ASTON: So I'm speaking into the mic for the gentleman back there.  Patrick, you make an excellent point.  Having a form of identity traceability, as John put it, is probably going to be an important factor moving forward.  But then again, if you look at some of the technologies being used by criminals, their original intent was to provide the anonymity we are now seeking to avoid.

Will other technologies come out to improve anonymity, even given a verifiable identity?  I think they will, because the nature of this game is move and counter-move.  We are playing an arms race, if you will, with the criminals.  I think having a traceable identity is probably a good thing for the majority of normal users on the Internet, who have nothing to fear and nothing to hide.

>> MARIE-LAURE LEMINEUR: The gentleman in the back.

>> AUDIENCE MEMBER: My name is Mohammed Mustafar.  Let me start where you stopped there: traceable identity.  You know, with bulletproof hosting and fast-flux hosting, these hosting setups now allow criminals to host illegal content without being traced, basically.  Within a few minutes the IP addresses can change; the content is hosted on compromised computers or personal computers, and it can be drawn off other servers.  So this is highlighting a new challenge for combating child abuse materials online.
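A minimal sketch of how the fast-flux pattern the questioner describes can be observed from the outside: resolve the same hostname repeatedly and watch how quickly the set of IP addresses churns.  Illustrative only; real investigations also compare DNS TTLs, autonomous systems and geolocation, and the hostname below is a placeholder, not a real fast-flux domain.

```python
# Illustrative fast-flux observation using only the standard library.
import socket
import time

def observe_flux(hostname: str, rounds: int = 10, interval_s: float = 60.0) -> set[str]:
    """Resolve `hostname` repeatedly and report newly seen IP addresses.
    A large, fast-growing set of addresses is one indicator of fast-flux."""
    seen: set[str] = set()
    for i in range(rounds):
        try:
            _, _, addresses = socket.gethostbyname_ex(hostname)
        except socket.gaierror:
            addresses = []          # resolution failed this round
        new = set(addresses) - seen
        if new:
            print(f"round {i}: new addresses {sorted(new)}")
        seen |= set(addresses)
        time.sleep(interval_s)
    return seen

# Usage (placeholder hostname):
# all_addresses = observe_flux("example.org", rounds=5, interval_s=30)
```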

One more thing I would ask you; there are two different questions.  For the second one: I'm coming from India, and we have had a couple of issues recently as far as sextortion is concerned.  It's typical blackmail: I take images from children, probably innocent images, and I say that I have your nude images, and if you don't act on what I'm saying, I will publish them on the Internet.  That's the typical old blackmail.

Now we have a new issue.  They are taking images, already publishing them on a couple of websites, and sending the URL to the victim, saying: I have your images already published on the Internet, and this is the username and password.  If you don't do what I say, I don't need to do anything; your images will automatically be broadcast to the Internet.  So this is a new threat coming up.  How do we address this kind of issue?  I mean, at least talk about it.  Thank you very much.

>> AMY CROCKER: I think -- and probably my moral responsibility is to say so -- that it is not really an ICT issue.  This is a human, relational issue.  Sextortion is a growing problem, but it's a problem that has been around for a long time: people abuse each other and extort each other, and now, with the use of ICT increasing, children are being targeted.  There have been some high-profile cases and, you know, unfortunately, I think one or two suicides of young people.  So it is an issue of huge concern.

But ultimately, the technology being used in these cases is not advanced technology in a sense; it's just human behavior on the Internet, people abusing each other.  Now, of course, there may be a possibility for people to anonymize themselves so they can't be traced back; these are challenges we face.  But yes, I take your point.  It's a serious issue, and it is a concern to educators and to law enforcement in many countries.  But I think we need to look at the source of the problem, and at the way people relate to each other, as well.

>> M. PANICINI:  I would make a comment more as a citizen than as a representative of the ICT industry here.  I think this kind of issue, as you say, should be a focus for law enforcement and for Governments, to discuss together what kinds of tools or measures they can put in place to prevent these things from happening.  Where we need to work as an ICT industry is to improve the collaboration and discussion with law enforcement in a very transparent way, as John says, because we want to avoid any misunderstanding from this point of view.

But, okay, that is a job for law enforcement and public prosecutors.  We need to make sure they understand this issue.  If we have information on this scheme that they don't have, it is very important for us to give them that information.  If law enforcement or a public prosecutor doesn't have information, or needs information from us, it is very important that we make sure we are talking with each other and moving toward a positive solution.

And I'm sure, from this point of view, we are already having discussions with law enforcement.  We need to make sure they know about these issues and are thinking about ways to fight against them.  And if they need support from the industry, we should be ready to provide that support, to help law enforcement understand how to tackle these issues.

>> AMY CROCKER:  One last point on that; it's a very good example.  I've seen examples from an organisation the INHOPE Foundation is working with in Mexico.  They have a model of co-responsibility, and they've had huge problems, in their case with telephone extortion, something that will undoubtedly move to the Internet and offers a comparison to what is happening online.

They've had a huge impact on reducing the incidence of extortion through mobile phones: through reporting, through creating hotlines and reporting lines, and through actually empowering citizens to report to industry, which then feeds through to law enforcement.  And honestly, the statistics are incredible.  There is a huge amount that can be done just by making people aware of what they can do and how they can report, and then getting the information through to the right people for investigation.

>> MARIE-LAURE LEMINEUR:  Thank you, Amy.  Suzi, and then the lady from World Vision in the back and then -- can we have the mic, please?  Thank you.

>> SUZI HARGREAVES: Thank you very much, everyone, and thank you for the discussion from the panel.  My name is Suzi Hargreaves.  I run the Internet Watch Foundation in the U.K., and we are the U.K. hotline for removing criminal content, child sexual abuse content.

In terms of how we are using technology to fight the issue, we are funded by the Internet industry and working very closely with other companies.  We are doing a number of projects.  For example, we are doing one with Bing and with Google at the moment to block pathways to torrent sites.  We are doing various other projects.  We are discussing with Google having an embedded Google engineer at the IWF.  We are looking at lots of different ways to work together.

But the most important thing we are working on at the moment is the National Image Database in the U.K.

That's a new image database developed by the police, and they are working very closely with us.  The important thing is that the images on the database are quality assured.  We are working with them at the moment to supply the images to start the database, which will go live in December.

What that means is that every image that goes on the database has to have what the police call the three yeses for its specific grading.  We are supplying the third yes at the moment, which means our analysts are each looking at 4,000 images a week.  The database will be hashed in total, so we will also work on inputting into the database, taking out of the database, and hashing all the images we see.
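
[Editor's note: the "hashing" described here means computing a digital fingerprint for each image so that known material can be matched automatically without anyone viewing it again.  A minimal Python sketch follows; it is purely illustrative, the directory name and hash value are hypothetical, and real systems typically rely on perceptual hashes such as PhotoDNA, which survive resizing and re-encoding, rather than the plain cryptographic hash shown here.]

    # Sketch: flag files whose fingerprints appear in a set of hashes
    # exported from a quality-assured database of known images.
    import hashlib
    from pathlib import Path

    def sha256_of_file(path: Path) -> str:
        """Compute the SHA-256 hex digest of a file, reading it in chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Hypothetical hash set; in practice this would come from the database.
    known_hashes = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    for image in Path("uploads").glob("*.jpg"):  # hypothetical directory
        if sha256_of_file(image) in known_hashes:
            print(f"{image} matches known material; flag for review")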

And that's really important for two reasons.  One is that we can provide the interface between law enforcement and industry, so we can work closely with industry in a way that law enforcement can't.  The second reason it is really important is that we are the first hotline in the world with the power to proactively seek out content on the Internet.

So actually, we are actively going out there and finding much, much more content.  We have been doing it now for three months, and I can't announce the results because the Prime Minister will be announcing them at an international summit in December.  But already, the amount of content we are bringing down is staggering.

So once the hash dataset is with us as well, it will be an important initiative in the world.  So thanks for your contributions as well.  Thank you.

>> AMY CROCKER: Thank you, Suzi.  It is important also in relation to the international hotline network, and it relates to the project that INHOPE is currently developing.  There are great initiatives going on around the world, and what I'd like to see in the future is a really sustained bringing together of all of those initiatives, to have a truly global impact and not just a national impact.  That is a fascinating project, and it connects with the work that we are doing around the world; I think it deserves a lot of attention going forward.  Certainly the joint project that I mentioned earlier, which INHOPE is developing with INTERPOL, is looking at exactly this: creating a hash list of known content that could be used in investigations.  So I think it is really important for these initiatives; it really moves the debate forward.

>> MARIE-LAURE LEMINEUR: We have a couple of minutes before wrapping up, so we can take one last question, I think, in the back.  We have three questions?  Then please be very brief so the speakers are able to make some concluding remarks.

>> AUDIENCE MEMBER: Thank you.  My name is Mandy from World Vision, and I have a concern, which I have expressed in other sessions during this week, about the gap between the great work that is being done on safer Internet in Developed Countries and the situation in undeveloped countries.  My question to all of you up there today is: how can we build this platform, as one of the speakers mentioned, a platform which can be free, to help countries and cybercrime units?  In some countries they don't have a cybercrime unit, or if they do, it may have only one or two members.  How can we provide resources to train and build the capacity of these cybercrime units?  How can we show them how to use these DNA tools, which are free?  And not only that, how can we open up help lines and allow them to join for free, as opposed to having to pay fees, so that we are all working towards the same cause?  We are all working to stop child sexual abuse.  I would like to open this up to all of you, to see how we can work closer together for undeveloped countries.  Thank you.

>> P.  MALOOR:  Thank you very much for the question.  In 2008, we started the Child Online Protection Initiative, which brings together all stakeholders from Governments, from Civil Society and from many, many U.N. agencies.  UNICEF was a valuable partner.  Most of the people here are contributing to the initiative or are key members of it.  And we built a platform for information sharing, for sharing tools, for sharing best practices, for sharing policy experiences.  So what you say is one of the most important things to be done, and I think we need such platforms to be -- (Indiscernible)

>> AMY CROCKER:  Thank you.  We are contributing to that initiative, and I know that we have spoken a little bit before about this issue.  Certainly what the INHOPE Foundation is doing, as the membership association, is supporting the establishment of Internet hotlines and reporting mechanisms in Developing Countries and emerging ICT markets, or however you want to put it.  We work with a number of criteria to identify -- I don't like to say priority countries; I think focus countries -- where there is a particular concern, and to support them in establishing reporting mechanisms.  The IWF also has a solution that they offer to support Developing Countries.  They are different solutions, and we agree that different countries will require different responses.  There is a lot being done.  I don't think we have time to talk about the law enforcement side, but you're right: it's a problem, and I know that organisations throughout the world are putting resources into this.  But it is an ongoing challenge.  We realize that.

>> MARIE-LAURE LEMINEUR: Thank you.

>> AUDIENCE MEMBER: I am (Indiscernible)

First of all, I want to thank the panel.  These are very important issues.  So I'll start by informing you that Sonia Livingstone will hold an EU Kids Online session in less than half an hour, and we should all attend that as well.

Now to the question.  It struck me that Mr. Preetam seems to be the only one who understands what the Internet is.  All of you others are talking about getting more resources for judges, for police and so on.  But what the Internet is all about is getting users of the Internet to do the job the police could have done, because it would be much more efficient.  So, what I wonder, when you are arguing to get more judges and more police, more resources for investigations: who really decides which kinds of pictures are criminal and which are not?

>> MARIE-LAURE LEMINEUR: I'm afraid we don't have much time but that's a very interesting question.  Maybe I can talk about it for a second.  The law decides.  There are legal standards.

>> AUDIENCE MEMBER: But the letter of the law and what the judges decide are two totally different concepts.

>> JOHN CARR: If you don't think we know anything about the Internet, I'm not sure why you want to hear our answers to that question, but I'll give you one.  I'll give you two if I get the chance.  There are well-developed mechanisms within each country for making decisions about what constitutes an illegal child abuse image.  The police and the courts in the U.K., in France, and in many other countries set standards which are extremely clear.  You would have to be a complete idiot not to be able to recognize a child pornographic image.  It is easy to do and it's easy to manage.  And the hotlines, in INHOPE and the IWF, around the world -- how many are there now all together?  51 -- are doing this job every day.  And I haven't heard yet of any of their decisions being overturned by the courts.

>> MARIE-LAURE LEMINEUR: Thank you very much.  And the last question.

>> AUDIENCE MEMBER:  It is unfortunate there isn't enough time.  I really enjoyed the presentations, but I can't help but notice that contributions and input from the African situation are completely lacking.  Our silence is very loud, and this is worrying, because East Africa, especially Kenya and Uganda, are the largest consumers of pornographic materials in the world, according to Google, if I'm not wrong.  So this shows that there is a deep, deep need for this, and you find situations where there is a huge gap, like you said.  There might be wonderful systems in France and the U.K., but they are not there.  So you have a situation like in Uganda, which in 2014 passed an Anti-Pornography Act with a very vague description of what pornography is, and three lines on child pornography.  So while the intentions seemed to be good, it is not being realized that way.  And what has happened now is that child pornography is equated with homosexuality, so that all homosexuals are seen as pedophiles and all pedophiles as homosexuals.  We aren't even ready to have a conversation about sex, let alone pornography, let alone child pornography.  To highlight that, I think there is a need for such a space to offer more opportunities for conversations between states and civil societies around creating safety online, while recognizing that there are gaps between what the law says and what the reality is, and how this should be realized and lived.

>> MARIE-LAURE LEMINEUR: Thank you very much for your contribution about the African context.  It is a very important issue, and I'm sure that most of us here would love to spend more time discussing it.  I can speak on behalf of ECPAT International: the access to and use of child abuse materials in Africa and from Africa is on our Agenda.  And this is something that maybe we can take up and discuss after this panel.  Thank you very much to all of you, the distinguished panelists, and that will be it, then.  Thank you.  Have a good afternoon.

(Applause)

 

***

This is the output of the real-time captioning taken during the IGF 2014 Istanbul, Turkey, meetings.  Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record. 

***