


WS 17

This is the output of the real‑time captioning taken during the IGF 2014 Istanbul, Turkey, meetings.  Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors.  It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.   

>> MODERATOR:  Okay.  Hello.  We are running a little bit over time, so I will just start.  Welcome to this workshop, Privacy as Innovation II.  It is organized by my organisation, the Danish Media Council, together with the Dutch ECP and the University of Copenhagen.  There is a "II" after this workshop because we had a "I" at the IGF in Bali, and that workshop was a little bit different from what we are doing this year.  Last year's workshop addressed, on a more general level, the dominant discourses concerning privacy.  You could say that what we tried to do last year, and I think we somehow succeeded, was to rebrand privacy.  
Just to recap, I think it is important to talk a little bit about how privacy as a concept has moved between different stages alongside the evolution of the World Wide Web.  We had the early stage, where privacy and also anonymity were described as a unique opportunity to experiment and to challenge established forms of power and constituted market models.  All these discourses are, of course, still very alive.  We had a second stage, where anonymity and privacy in general were named and blamed for a lot of things, including being an obstacle to innovation and, for that matter, an obstacle to everything that is social, open, public and shared; privacy was contrasted with innovation.  We are doing something different here: we are calling privacy innovation.  
Now, what we are addressing with last year's workshop and this workshop is the third stage that we are moving into, the one that we are in the heart of right now, pushed forward by increasing or changing user demands.  Over the last couple of years we have increasingly seen users who want to be in control of their data, their information and their social interactions online.  They want to be able to set their own boundaries, to create circuits of inclusion and exclusion.  We are addressing a paradigm shift in what users want online.  Over the last couple of years, and particularly this year, we hear more frequently about services, networks, businesses and new innovations that directly address privacy, data protection and anonymity.  I am going to mention a few; some are more known than others, and we hear the big company names all the time, but we don't always hear about the smaller businesses.  So we have things like Silent Circle, the Dark Mail Alliance, the Indie Phone, Respect Network, MaidSafe, Blackphone, Mailpile, and there are many, many more that directly address privacy and user control, many of them built on truly ethical ideas.  (Internet outage).  Others might only just be a growth opportunity and the concept from ‑‑ so part of what we will do today is try to move beyond the private concept and to think of the two ‑‑ to control ‑‑ (Internet outage).  
With the more hidden layer which businesses (Internet outage).  At the core is moving beyond looking at privacy as ‑‑ it can be very practical today.  As for speakers, we have a very large panel.  Some have direct experience with working on these innovations.  But I will introduce to you my co‑Moderator.  (Internet outage) ‑‑ I just want to say that.  A privacy expert ‑‑ aspects of privacy.  All the speakers (Internet outage) sitting over here to make this as close to ‑‑ and so the experiences of young people with these services and ‑‑ either wish lists or something ‑‑ or they will show what they do online.  So I don't know who will be the first one.  Yes.  
So let me introduce you by name.  From the UK (Internet outage), and second, who is the one ‑‑ and we have (Internet outage) from the Netherlands.  And then we have Olivia Bang Brinck from the IGF panel.  Just stick up your hand.  
>> ZACH:  Like a few (Internet outage) ‑‑ Childnet.  But I focus more on privacy from one another in this aspect.  So there are features that I like and use that have to do with privacy and (Internet outage).  One service that I really like ‑‑ you get it on different social media sites ‑‑ really sticks out, because it is not just one‑to‑one messaging; they also have group messages, so if you are part of a social event you can create a kind of chat platform that is not all your friends, just a select few.  
The next thing I focus on and like to use is Snapchat.  The whole point is that it is a really kind of private app, because it is all (Internet outage).  So after you send something it deletes itself within ten seconds.  I like that because there are different levels of privacy that you can have within the Snapchat app.  The first level, which is probably the most exclusive, is sending snaps back and forth from one person to another.  So, for example, I can be talking to Harriet and I can send her a photo and it will delete itself within ten seconds, and she can reply, back and forth, et cetera.  Then you can also choose to go to the next level up, a bit more public, where you send snaps to a group of people that you choose from your contacts.  
So you can Snapchat pretty much everyone in this room.  The youth peers are here and I can just send them one direct Snapchat.  And then the next level up is the story feature, which basically means you send it to every single one of your contacts and it is visible for everyone to see.  Another thing with Snapchat that I really like is that the privacy settings within the app itself are very good, because you can choose if you want everyone to see your snaps, and you can choose who can send you snaps: only your friends, the public, or certain contacts that you may have.  
So the fact that you can choose your privacy through easily accessible settings is very great.  And even though that is more of a privacy platform, I also really like public platforms where you can really go and express your ideas.  The first thing that I think of when I think of a public platform, in social media anyway, is Twitter.  I really like it because it is kind of freeing your voice and it is more based around public perception.  So instead of things like Snapchat and Facebook, which are more about your friends, you have Twitter, which is basically all about public perception, putting out your ideas and broadcasting them to the public.  I am sure there are many other apps, like Instagram, which is more based on photos that are public, but Twitter sticks out in my mind.  These are the many different features that I enjoy using on the Internet.  
>> MODERATOR:  Thank you, Zach.  Any of you?  
>> HARRIET KEMPSON:  I am Harriet.  I am also with the child and youth IGF project.  Building on what Zach said, the thing that came up in our discussions when we were preparing for this workshop is that, in our experience, the young people we know really enjoy having a choice between private and public: private or group Facebook chats, or the public features they can choose.  A good thing about Facebook is that you can have the private, direct messaging service, but you can also post a message to somebody's wall, and then all of your friends, or all of their friends, can see it.  That creates a platform where you can directly ask someone a question, but other people can join in if they feel it is relevant to them or includes them as well; it is public or private, depending on what people feel they want to do.  You can also share a post: you can like something on Facebook by tagging people, and you can share it with people you think would appreciate it.  As Zach was saying, in my experience young people think of privacy as privacy between peers.  I might want to share something with Olivia that I wouldn't want Zach to see, because you can personalize your settings.  These are the features, making privacy a choice, that young people use a lot.  
>> MODERATOR:  Thank you, Harriet.  Olivia.  
>> OLIVIA BANG BRINCK:  I wasn't told about this.  I just found out.  Is what is said to be private actually really private?  Zach said the pictures get deleted, but that's not correct.  That's not correct.  They are keeping them in their database.  It is Snapchat who says that the pictures are gone, which they aren't.  And I had an incident with my friend.  We were sitting next to each other, writing on Facebook, just because, yeah, it was fun.  Then I saw that I could see a phone number in her information.  I told her, because, yeah, she didn't want her phone number where everyone could see it, and then she looked in the settings and understood that supposedly only she could see this phone number.  Yet there I sat beside her, looking at that phone number.  So if you can't trust a company like Facebook about what's private, then who can we trust?  We have to remember that nothing is free on the Internet.  If something is free, then you pay with your data instead.  So we are always talking about, yeah, I want to be private.  Of course, some things I want to keep private from Harriet.  I don't want her to see everything.  But maybe I should also think about what I want to keep private from a company like Facebook or a company like Google.  Yeah.  
>> MODERATOR:  Thank you, Olivia.  
>> MODERATOR:  So we have the Dutch team.  Should we take it from ‑‑
>> Yes, of course.  I have been with the NIGF, and where the other panel members are focusing more on front‑stage privacy, I would like to think about privacy as innovation in the back stage, because when we were discussing and preparing this meeting I was thinking about a service that doesn't exist yet, and that has to do more with the back stage of things.  We don't have a platform yet ‑‑ like a privacy library of personal social services, a central place to review the data that different services have, like Instagram and Facebook.  At Twitter you can ask for the data they have on you.  I was thinking about a kind of dashboard where you can see which data Facebook has on you and which data Twitter has on you, and you can choose to share that with other people if you want, or be paid for it by people or companies who want the data.  It would be a service that doesn't own all that data itself but uses the APIs of those different services to collect it and show it in a way that's understandable and easy to use.  It is probably because we are interested in this topic, but I think a lot of youngsters actually aren't, and they don't care that much.  So it should be as easy as pie to see and maintain your data and have a nice overview.  So that's what I was thinking, back stage.
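The dashboard idea described above can be sketched in a few lines.  Purely for illustration, assume each service offers a "download your data" export that returns JSON; the service names and field names below are hypothetical examples, not real Facebook or Twitter API schemas.

```python
import json

# Hypothetical per-service data exports, standing in for what each
# service's "download your data" API would return. The fields here are
# made up for illustration, not actual Facebook/Twitter schemas.
EXPORTS = {
    "facebook": json.dumps({"name": "Jan", "likes": ["music"], "phone": "+31 6 1234"}),
    "twitter": json.dumps({"handle": "@jan", "interests": ["privacy"]}),
}

def build_overview(exports):
    """Merge per-service exports into one dashboard view, keyed by
    service, so the user sees at a glance which kinds of data each
    service holds about them."""
    overview = {}
    for service, raw in exports.items():
        data = json.loads(raw)
        overview[service] = sorted(data.keys())  # categories of data held
    return overview

if __name__ == "__main__":
    print(build_overview(EXPORTS))
```

A real version would fetch the exports over each service's authenticated API rather than hold them in a dict, but the aggregation step would look much the same.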
>> MODERATOR:  Thank you.  So does anybody ‑‑
>> NATHAN BIERMAN:  My name is Nathan, also NIGF.  I also want to point to back stage privacy: cookies and the cookie law.  Every time you enter a new site you have to accept cookies, but where cookies are used you don't know which cookies are collected about you or where they are sent.  So I would like a certain kind of dashboard, so that on every site you can see which cookies it uses, and you can choose which ones to turn on or off.  It could be like Facebook apps, which I really love, because you can use your own settings: which apps you use and which you don't, and for every app you can read a list of which of your information they use ‑‑ you, your friends, your phone number.  In the same way we could see which cookies and which information about you they collect, and where this information is sent: to third parties, or just kept inside to make your experience of the site better and easier, or sent to a business that may use your cookies for advertising in your Google results.  So I think it would be nice if this function came to Internet sites.  Yes, this is it.  
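A minimal sketch of the cookie dashboard Nathan describes, using only Python's standard library: parse the Set‑Cookie headers a site sends and flag cookies whose Domain attribute points outside the site being visited.  The headers below are made‑up examples, and real tracker detection is more involved (it also considers which request actually set the cookie).

```python
from http.cookies import SimpleCookie

def classify_cookies(set_cookie_headers, site_domain):
    """Parse Set-Cookie headers and split cookies into first-party and
    third-party, using the Domain attribute as a rough heuristic."""
    first_party, third_party = [], []
    for header in set_cookie_headers:
        cookie = SimpleCookie()
        cookie.load(header)
        for name, morsel in cookie.items():
            # No Domain attribute means the cookie belongs to the site itself.
            domain = morsel["domain"] or site_domain
            if domain.lstrip(".").endswith(site_domain):
                first_party.append(name)
            else:
                third_party.append(name)
    return first_party, third_party

# Example headers as a site might send them (invented for illustration).
headers = [
    "session=abc123; Domain=example.com; HttpOnly",
    "trackid=xyz; Domain=ads.tracker.net",
]
print(classify_cookies(headers, "example.com"))
```

Browser extensions that show per‑site cookie lists do essentially this, plus blocking: the "turn off" switch in Nathan's dashboard would simply drop the third‑party list before the cookies are stored.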
>> MODERATOR:  Thank you.  This is extremely important, because it is really nice to hear these very practical experiences of what you need, and you are representative of users in general who are asking for choice and more control.  One thing that I see in these presentations you made is also that some of you are a little bit surprised by the fact that you thought you had privacy but you feel that you don't really have it.  But let's move on to the other speakers.  I will introduce you one by one as you speak.  And as I said, all of you here have two minutes for an introduction, and afterwards we will open up for discussion, where everyone in this room can participate.  I will take it from the beginning.  Please.
>> Hi.  I think it is very difficult to have any sort of meaningful conversation about privacy without understanding, without a doubt, what privacy actually means.  Because one of the things that has happened, due to the efforts of companies like Google and Facebook, is that we have redefined what privacy means.  Privacy used to mean: for you alone.  It is about having the choice of what you want to keep to yourself and what you want to share with others.  But if you ask Google and Facebook what privacy means, they say oh, it means just between us.  Just between you and Google.  Just between you and Facebook.  So in a sense, if you ask Google or Facebook what privacy means, they tell you in essence that it means public.  Orwell has a great term for this; he calls it doublethink.  We have to engage in doublethink in order to make sense of it.  
And what we are actually seeing there is a setting on their service, right?  When we set something to private on Google, or when we set something to private on Facebook, like you guys were saying, it is not really private.  It is as private as telling your creepy uncle a message to give to somebody else.  The creepy uncle is Google and Facebook.  You tell Google what you want to tell your friend and Google tells your friend that, but it takes notes.  Why?  Because that's how they make money.  It is a very simple thing.  There is no conspiracy theory.  It is simply their business model to learn as much about you as possible.  We are starting from a point where we are living in the home of a creepy uncle, and we are trying to see how we can best protect people who are forced to live in the homes of creepy uncles who make money by learning as much about them as they can.  We need to reframe this conversation so that you have your own home.  We can reframe it.  We can create technologies that you own and you control, where you are living in your own home and you have the option of sharing what you want to share, but you also have privacy in the sense of private as we have always known it.  
In the sense of private as it is enshrined in Article 12 of the Universal Declaration of Human Rights, which means: for you alone.  
>> MODERATOR:  Thank you.  Next speaker is Gitte Stald, associate professor.  
>> GITTE STALD:  My participation is based on the research I have been doing in the EU project, and the survey we have been doing there, of course, provides us with a kind of overall picture of how much children and young people know about privacy settings.  There is diversity across Europe, some countries more than others.  Older children are better at it, and those children who go online from various devices are very often more alert about which privacy issues are at stake.  But when we go to the interviews it is very interesting to dive into what the children say about their considerations: what privacy is, as we heard examples of, but also what they do and what their thoughts are.  One thing is personal privacy, what is going on underneath the surface.  The children are aware of what they share with whom, and so on, in order to protect their own privacy, and all of them have heard the story that if you want to get a job when you grow up you shouldn't share this.  But increasingly I hear children talk very much about protecting their peers' privacy.  It is also a collective thing, and I think that's really interesting.  It is not just your own privacy; it is actually something that goes on in social relationships, and that's one of my points.  That's definitely something we need to build on and to take with us.  
We also heard about some of the technological issues here from the youth panel.  There are, of course, a number of good technical solutions and opportunities out there, including more advanced and more alternative ones, but not that many of the children have been taught about them or know how to exploit them.  
And the final thing I would like to put forward here is that in the interviews, and in general when I talk to children and young people about this, I see this increasing awareness but also a strange kind of self‑censorship: some of the children I am talking to actually avoid sharing things, putting things up, creating things and exploiting all the opportunities that you have online in digital contexts.  And I think that's really ‑‑ it is stupid.  I mean, it is crazy, because it cuts them off from all the positive opportunities; they actually stop doing things in order to protect their privacy.  So we need to find a way of balancing this: using the technical opportunities, sharing that with children and letting them know, but also giving them good advice on how to exploit the opportunities.  And I can see a very direct way of doing this is having youth Ambassadors, because what works best is sharing the information and the good inputs between young people themselves.  
>> MODERATOR:  Thank you.  And now we have Ader Garnes from the Dutch Center.
>> I liked your comments, because you can see there are different kinds of privacy, different perspectives on privacy, and that is very interesting to know.  It is also very interesting to know that you can already see what cookies are doing on a website; there are programmes with which you can see how a site tracks you and everything.  There are a lot of products and applications out there which we can use and which protect our privacy much better, for instance a lot of the open source software that is made.  But the problem is we don't know about it.  I think the big problem is that if you want to have your privacy protected you should go to a product which really protects your privacy, and that is not actually the deleting of a photo after so many seconds, because we don't know what Snapchat does with those pictures.  
In the Netherlands we were forced, as Facebook users on mobile, to use Messenger.  Oh, good.  What does Messenger do?  It can look at my contacts, look at my video, and it can take over my SD card, so I don't use it.  And my oldest son told me: mom, what are you talking about?  They are all using WhatsApp.  It was very weird; they were all talking about this app which already does the same thing.  If we want to innovate in privacy we should actually think the way the people who are using those applications think, and they don't want too much hassle to change.  We all love to use Facebook, or maybe some people around here don't want to because of their privacy, but let's be honest: a lot of people use Facebook because it is a nice application.  It is a nice service.  You can do a lot of nice things with it.  
I think one of the biggest problems we have is that most of those applications come from the more dominant countries; they come from Silicon Valley.  They don't think about privacy.  They think about getting rich, and they use your data to make money.  We should see how we can put more focus on products that don't need your data to make money, that have a different business model, and they are out there.  I will present you one example which is at this moment being developed in the Netherlands: LocalBox, which will be an alternative to Dropbox.  With Dropbox you would probably say it is a conversation between the people I give access to.  It is not.  Dropbox uses the data you have; they can do things with it and they can look into it.  LocalBox is an open source application, which means we have the many‑eyeballs principle: people can look at the source code.  It is not only encrypted during traffic but also encrypted on the server itself.  Even if somebody were to get into your server, it is still encrypted and hard to get.  If Apple had done that with iCloud, we wouldn't have nude pictures of celebrities all around.  
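The encrypt‑before‑upload idea described here can be illustrated with a short sketch: the client encrypts locally, so the server only ever stores ciphertext that is useless without the key.  The toy keystream cipher below (SHA‑256 in counter mode) exists only to keep the example dependency‑free; a real system would use a vetted authenticated‑encryption scheme such as AES‑GCM from an established library, never home‑made crypto.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy SHA-256-in-counter-mode stream cipher - for ILLUSTRATION only.
    XORing with the same keystream both encrypts and decrypts."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# Client side: encrypt BEFORE upload; the key never leaves the client.
key = secrets.token_bytes(32)
nonce = secrets.token_bytes(16)
plaintext = b"holiday-photos.zip contents"
ciphertext = keystream_xor(key, nonce, plaintext)

# Server side: stores only (nonce, ciphertext) - meaningless without the key.
assert ciphertext != plaintext

# Client downloads and decrypts (the XOR stream cipher is symmetric).
assert keystream_xor(key, nonce, ciphertext) == plaintext
```

The design point is where the key lives: with client‑side encryption a server breach leaks only ciphertext, which is exactly the property the speaker attributes to LocalBox and finds missing in iCloud.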
>> MODERATOR:  Thank you.  I have also asked Bart to give his two‑minute points on this issue.  He is my co‑Moderator.  
>> BART SCHERMER:  Yes, thank you.  In my role as co‑Moderator I will try to put some things together.  The way I see it, also from hearing the comments, is that if we really want to have privacy as innovation, there are three prerequisites that we need before we can actually innovate in the field of privacy.  The first one, and it is a little bit off topic for this discussion maybe, although Ader touched upon it, is that we need an international level playing field when it comes to privacy legislation and privacy settings.  I am from Europe, and generally speaking the level of privacy protection there is uniformly very high, whereas in the United States, where they have a different idea and conception of the notion of privacy, you can argue that the level of privacy protection is lower.  But what we see in Europe is that all we do is use American services, because there is a lower barrier for innovation in the United States than there is in Europe, with its many formal requirements in the field of privacy protection.  So a level international playing field could be very helpful.  
The second, and that is what most of the discussion is about, is that we need a real market demand for privacy friendly services.  As it stands there is no real market value in privacy.  Everyone wants to have privacy friendly services, but in practice everyone flocks to the services that are most privacy‑unfriendly, because they are very user friendly and because they are very useful, of course.  So I think we need to change the tone of the debate and maybe move it away from privacy and more toward what is actually happening: self‑censorship, the creepy uncle looking over your shoulder.  That is what is really happening, and it is a lot more tangible than just saying this is a privacy debate about how you should protect personal data, et cetera.  And the third, which I think is also a very important one, and I am allowed to say this because I am a privacy lawyer: keep privacy away from privacy lawyers.  The debate about privacy and data protection is too much about what is allowed, how the law should be interpreted and what is in the law.  As the third prerequisite for privacy innovation we need privacy engineers.  We need engineers who can implement systems like user friendly privacy dashboards and user friendly services that give you real choice and access to your data.  Currently the privacy debate is dominated by privacy lawyers and not by people who come up with working solutions.  So those are my three suggestions: a level playing field, real market demand for privacy friendly services, and more privacy engineers.  
>> MODERATOR:  Thank you.  You are combining expertise which is what we are doing today.  Next on our list is Hanane Boujemi.  
>> HANANE BOUJEMI:  Yeah, my microphone is working.  I am going to give it a twist.  You spoke briefly about the U.S. and the comparison with the EU.  I am living in Holland and I can see that people there are more aware of privacy.  I relate a lot to the cookies issue in Holland, how your data is tracked and how you don't feel really private in the way you use the Internet.  Sometimes some websites actually don't work if you don't accept the cookies, for some odd reason, if you want to load material, which is quite concerning.  In the case of the Middle East I think there is a completely different level of discussion that we should have, because it starts from the behavior of the consumer in the Middle East and how consumers are actually compromising their data.  They are more concerned with their card details being stolen, which is very telling.  I can see there is maybe an increasing level of people caring a little bit more about protecting their privacy, but according to the statistics that I see, people are mainly concerned about whether their bank accounts will be hacked, rather than about the government tracking them or their data being mined by Google or Facebook and so on.  There is a need to increase awareness of this issue there, because the less concerned people are, the more problems we will have in the future.  
Now, in Europe, I can definitely see ‑‑ I mean, I am really thrilled with your inputs and the discussion, and I wish we had the same kind of environment where we could exchange experiences from people who are actually using the Internet.  I am sure you guys just want to do your thing and you don't want your parents to know about it, but you happen to share all your information with the creepy uncle.  I will borrow that word from you, because it is a really very good description of big company services like Google and Facebook.  Due to the high level of hospitality of our countries, I think they are very happy to host the creepy uncle, because people are mostly concerned with using Facebook rather than caring about what information they are sharing with the creepy uncle.  So that hospitality applies to everything, unfortunately.  And I think we should do a little bit more work to bring the issue to the surface.  I know that many organisations are trying to raise awareness about privacy as a prerequisite, now that we know that obviously we are exposed.  The Internet exposes everyone.  If you are not very careful, the history will haunt you forever, because whatever data you have locked on the servers, it is a really difficult thing to erase it completely.  In Europe you have the luxury of the right to be forgotten.  If we talk about the legislative side of how data is protected in the Middle East, it is not even implemented.  The penal code or the constitution may guarantee a right, but there is no track that people can take to sue Google in the Middle East, because whatever is guaranteed in the constitution is not implemented in practice.  And that is why I totally agree with you that we should involve more engineers, people who are engineering the technology to protect the user, rather than relying on a legal system, because I think it is more efficient.  
And basically people from the very beginning, and even companies, should actually be bound by these specifications.  So when they want to sell products, those products have to come with specific privacy settings which will guarantee the privacy of users.  Otherwise we may, for example, reach the level where nobody will be using Google if they don't provide standardized privacy guarantees.  And I know they have been working very, very hard after the revelations to encrypt all their Gmail, because they started to realise they would start losing their customers.  The whole business model of Google is based on the users; losing them is not going to be in their benefit.  Maybe it sounds weird now ‑‑ you feel that Google is a giant company and it is never going to fall down ‑‑ but you never know.  
The future is bleak for these companies if they don't take the privacy of their users more seriously, because we are noticing that the young generation, people who plug into the Internet at a very young age, are more aware.  I can see from the discussion that they are plugged in and on it from now.  So I think these big companies have to literally revise their policies a little bit.  I know that most of us here probably don't even bother reading the terms and conditions when we subscribe to a service.  This is almost a culture, but I don't think that is the future.  We need to rely on technology to help us gain our privacy back, gain our life back, and that is my contribution.  
>> MODERATOR:  Thank you.  We have one last speaker, and I can see some people sticking up their hands, so we will start the discussion in a little bit.  But let me bring in Pernille Tranberg.  
>> PERNILLE TRANBERG:  I am a person who wants to control my data, and I want to decide who knows what about me, and when.  That is why I work with a lot of different identities.  For example, I go to fakenamegenerator.com, where I can create identities.  I block all cookies.  I block Facebook from tracking me on other websites.  And generally I use the Tor network, where you can browse anonymously.  It is a lot of work and it is really annoying.  I want to control my own digital life, but I can't expect that all of us are going to do that in the future.  It is too expensive and it is too time consuming.  That is why I want products and services with privacy by default and not tracking by default.  We should all share all these new innovative products which are alternatives to the trackers.  For example, startpage.com.  I bet most of you use Google as a search engine.  Try startpage.com; it is a Dutch search engine giving you Google results, but anonymized.  They won't track everything that you are searching for.  Already now you can find products and services out there which are alternatives, and we need to promote that.  In my professional job I am working to find ways to finance all these new products coming up with privacy by design.  Thank you.  
>> MODERATOR:  Thank you.  Sorry to say are you one of our speakers?  
>> AUDIENCE:  No.  
>> MODERATOR:  No, it is just because we are missing one today and I don't know the face of the person.  Okay.  
>> (Off microphone).
>> MODERATOR:  Yes.  Well, now we have come to the even more interesting part, because everyone can participate now, which is our roundtable discussion.  Just to recap a little: we have been talking about reframing the conversation around privacy.  Yesterday there was the metaphor of the Internet as a family member; today it is the creepy uncle.  And I think one thing I have heard that is really important is the effects of not having privacy, the self‑censorship, and the request for an international level playing field, which we are trying to create here with a combination of expertise, so that the debate is not dominated by one specific type of expertise; we also have some of the technical community here.  And then one thing I find really important, which is a prerequisite to innovating in privacy, is the market demand: users need nice, easy services that have some kind of idea of privacy built into them that we can all agree on.  Let's start the roundtable discussion, the discussion in general where everyone can participate.  Bart, my co‑Moderator, and I talked a little about topics we would like to cover, and I see people already raising their hands.  So let's start with the hands that were raised, and you were one of them.  Sorry.  
>> AUDIENCE:  Okay.  Okay.  So my question is actually a bit different.  I am interested in whether we really care about our privacy, and by privacy I mean the classical understanding of it.  What I am trying to say is that every day we exchange so much data with each other.  We send thousands, if not millions, of messages every day.  And if I text my friend and tell them something, who cares about it?  Who reads it?  I mean, who is this creepy uncle?  I know you call it Google, but in reality, yeah, who cares about it?  And I know the data is so big; there are so many files and documents being exchanged on a daily basis that I don't think anyone can keep track of it.  I don't know if anyone has read how much of all the data the NSA was actually possessing: it was only 0.13 something percent.  So no one can handle this data, and what I was trying to say is that we still can have our privacy in this big data era, I think.  
And another thing I want to talk about is our attitude.  Usually I meet these young people who talk about how Google is so bad and Facebook is so bad, that they are surveilling us, et cetera, et cetera, and at the end they text me something on Facebook or Google something.  If you are not okay with someone reading your messages, if you really believe they are surveilling you, then don't use it.  If you like Facebook ‑‑ I like Facebook.  I like all the Google products.  I use them.  Then you should stop saying that and use them normally.  
>> MODERATOR:  Thank you.  We have to keep the interventions short.  Just to say that you are addressing something on a very general level, and I think we have to keep the discussion very practical, but ‑‑ yes.  To respond.  
>> Right.  Thank you for that and by the way who are you with?  If you could introduce yourself.  
>> AUDIENCE:  I am Anna from Georgia on behalf of myself.  
>> Okay.  Great, Anna.  Thank you for that question.  I think you raised two very important points there.  The first one you said well, who cares about this data.  There is so much of it they couldn't possibly look at all of it.  Right?  This is not entirely true.  We have algorithms that go through data and they flag certain things.  Now today, for example, you may be sending a message to your friend and through that message because of the words you have used I understand that you favor people of the same sex.  Now in the country that you live in today that's not a problem.  But that's been flagged and there is data retention.  So that data is going to be kept.  Even in your progressive country the next election happens and the next election happens and the third election happens and you get a very far right Government in and they think that same sex relationships are not okay.  And let's get those people together and let's have a conversation with them and maybe we can change their minds or, you know, we can take some action against them.  That data has been retained.  And they will look through that and they will flag up that message that you sent to your friend and say okay well, Anna is gay and we are not okay with that.  Let's invite her in for a chat.  
This has happened in the past, and technology is a multiplier.  Technology is progressing at an astronomic rate, on an exponential scale.  Even the stuff we have encrypted today will be crackable.  So if we keep data long enough, even the stuff we think today is encrypted and private, we will be able to crack.  Because with quantum computing we take problems that cannot be solved within the lifetime of the universe and we solve them in linear time.  
>> Your second point: if you don't like these services, don't use them.  But the business model of Google and Facebook is spyware.  What's Yahoo's business model?  Exactly the same thing: to spy on you.  Okay, I am not going to use them, I am going to use Snapchat.  What's their business model?  It is to spy on you.  It is not that one company has a monopoly; the business model itself is the monopoly.  There is no real choice.  The real choice you are presenting people with is either stop using technology and become a hermit, or be spied on.  
>> MODERATOR:  Thank you.  What I would really like to do, instead of discussing already existing things and models and the way the world looks right now, is to spend some time in this session being very concrete about specific principles, specific areas that we think are important in terms of really talking about privacy as innovation.  We talked about some areas that we could address, but one thing that I can hear coming up is the business models, of course, but also data ownership.  If anyone has some input on these specific areas, that would be good.  Do you have ‑‑
>> Also touching upon the previous intervention, on why we keep using Facebook and the predominant business model being the free business model: that's what I meant by real market demand.  We need to change ourselves as consumers.  Taking the metaphor of the creepy uncle: nobody likes him, but he has a very nice house, and he has an Xbox and a PlayStation, and he has a big bed, and there is free food in the kitchen.  I like using them.  Is there a tradeoff being made between my privacy and this service?  Yes, and still it turns out in favor of using the free service.  So it comes down to changing business models.  Is anybody here willing to pay for a social media service that is not free?  Say 100 Euros a year for a service that protects your privacy.  Who is willing to pay for that?  
>> Who can pay?  
>> And that's also a very good point: not only who is willing to pay but who actually can pay, and especially for poorer countries that's very relevant.  Unless we start changing ourselves as consumers and fight the dominant business model on the Internet, nothing will change, because everyone wants to keep using this model.
>> MODERATOR:  You.  You have been pointing.  
>> AUDIENCE:  Thank you.  I dislike the idea of the creepy uncle because it is saying something that might be right, but in a sense it is not.  As an industry we also need the idea of a level playing field for consumers and businesses regarding reliability.  Using a service from the U.S. is not the same as using a service from Japan or the EU.  But on the other hand, if you are talking about privacy and data protection, we learned from Europe a very scary thing: we like to regulate the use of data, when we should be talking about regulating the abuse of data.  So we should shift that discussion and say okay, if we are using a service like social media or Google services, it means those services are that good because they are using data; they need data to offer further services.  It only works with data.  The issue we have as an industry is that we might address this with the idea of transparency and control: we say okay, we have made some mistakes in the past, so let's explain our business models and tell what we are doing and what we are not doing.  This also means we need in this discussion some sense of how we can use data.  An itemization for tracking data: at the end of every month you see what advertising services and content cost, and if there is no advertising, each user would have to put in 600 or 700 Euros for content services.  We need data to refinance our services, and we should talk about different ways of doing that without having the image of a spy.  Because it is not economically reasonable to say I would like to read the profiles of one billion users.  So we have to find limits on what the use of data means.  Therefore I think we need an open dialogue, and I am happy to shift this debate in that direction.  
>> MODERATOR:  Yes, last year was also about reframing privacy as not necessarily a box or an obstacle, but as a normative, more human way of using technologies and sharing data.  But I have you first and then I have several people down here.  Yes, I think you were first.
>> AUDIENCE:  I will keep it short, because I don't agree with Bart and I'm sorry about that.  Let me make two points.  First of all, we need a level playing field, and I have said it before and I will say it again: I think the IGF should take that role.  There are many, many products out there, not only from the States but from Europe, from Africa and Asia, which are worth looking at.  We just don't know about them because they don't have the big funds to launch.  So I think the IGF could be a platform just to share what's out there.  Secondly, the Internet of Things is coming.  This means our privacy will be at stake much more than we think.  And what I found is that in the Netherlands we have websites that ask for personal data over an insecure connection, and when I e‑mail those people and say you have an insecure connection asking for Social Security numbers and everything, you can't do that, you have to make it secure, they say: well, I thought it was right, because I had a website builder make it.  What we need is technicians who get education at their schools that they need to think about privacy.  If you are a website builder and you make a site that asks for Social Security numbers, you should think immediately: whoa, wait, this should be a secure connection.  They don't even think about it.  The technicians should get more involved.  The Internet of Things is all about new innovations, and they don't think about the end user.  They just think about a product.  So I think we should make sure that the end users get together so we can talk with the producers of the new innovations and let them see what we want.  
>> MODERATOR:  I couldn't agree more.  And the idea of having the IGF as a place where we can at least start some more exchange in this area globally is very good.  I saw you, actually, in the very back; you have been waiting for a while.  
>> AUDIENCE:  Thank you.  I missed the beginning part, but I think I got here at the right time.  Let me first introduce myself.  My name is Siray and I work for the Turkish National Council of Science and Technology.  I am here on behalf of myself.  I am a computer engineer who is getting a degree in cyber law, so I can see both the lawyer part and the technical part.  It has been suggested twice since I came here that the engineers should sit down around the table and talk about this idea.  But I think the first comment, made by the person from Georgia, was a good demonstration of the general public view.  And this general public view is shared also by the engineers.  If we see engineers as innovative people who are trying to make better things, they are like scientists, and their objective is not to be the creepy uncle.  They are trying to find a new, more fun way to share something, or something innovative.  I have a lot of friends who work at Microsoft and Google, and they see me as the paranoid friend who is avoiding all the social media because everyone is watching her, but they say: who cares?  We also talked about the business models.  The engineers are just creating new techniques and systems.  They don't really care about the outcomes.  The people who care about the outcomes are the companies who use the data, that process information to sell more products or serve some other objectives of Governments or something else.  Personally, I think if there were engineers here who sat around the table, they would just feel more restricted, that they won't be able to create more.  We should find a win‑win model for the people who are getting a profit out of this system, the monopoly, and try to find a better way, so that they can give up the monopoly by also providing more privacy protective systems.  Thank you.  
>> MODERATOR:  Thank you.  Bart would like to comment on this and I have seen Olivia at one point with her hand up, a few people here.  Some in the back down there.  So I recognize you, one up there and there.  I recognize you and I will try to be fair.  
>> BART SCHERMER:  Just to explain a little bit more: by saying privacy engineers, I don't necessarily mean technicians.  What I mean is maybe a person like you, who combines a legal background with a technical background; and the business and marketing side is also very important.  I work for a lot of big companies, and currently there is very little incentive within the companies to listen to the privacy officer, because it is not a requirement demanded by the market.  Consumers don't stand up and walk away from the service.  Why invest time and money in building something that consumers don't really care about?  When I am talking about privacy engineers, I mean actually putting together the business people, the legal people, and the people from the technology side who can build it and also make it user friendly.  So a little bit more than just a technician.  
>> MODERATOR:  Yes.  Thank you.  Olivia has actually had her hand up.  I don't know if both of you want to comment ‑‑
>> OLIVIA BANG BRINCK:  You first talked about who cares about the data, and that we don't care.  Yeah, I kind of don't care that somebody is looking over my shoulder.  I just want to know about it.  I think that's one of the problems: you have to be really technical, you have to work with it, to know all about how we are being surveilled and how people are looking over our shoulders.  I don't know all of that stuff, and yeah, you can look up what they are collecting, but you have to read like seven pages of dense words.  If we are going to build an Internet built on trust, which I think is the best, then it has to be in a language, you have to put it down in a language, that all the children and young people and normal people can actually understand.  
>> MODERATOR:  And Zach, if you would.
>> ZACH:  Kind of on to that point: as a young child you really don't realise what could happen and what the effects are if you don't have privacy.  That's my big issue.  I don't know what really happens with Facebook; all I know is from my personal experience when I go on Facebook.  So I was thinking: if the problem is that we don't have ways to be private, and what is privacy, then what are the actual solutions?  A few points were raised on money.  If money is the problem, then services that really were private but maybe cost money would be one solution.  But what about if we took the actual businesses and raised awareness for the companies that maybe don't cost money but have privacy built in?  So maybe if we raised awareness with the kids who don't actually know or care if they have their privacy or not, if we get an interest sparked in them and make it relevant to them: okay, if I don't have my privacy settings on, or even if you do have the privacy settings on, then this can happen.  The service you were talking about was StartPage, or whatever service that is.  Maybe kind of boost the credit for that and take away the power from the big monopolies like Google and Facebook.  
>> MODERATOR:  Very good point, that you have privacy by default and you don't have to deal with what they are doing.  But let's keep things in order.  You have had your hand up the longest, I think.  So...
>> AUDIENCE:  Thanks.  So there are two comments I'd like to make.  The first one, I know you may not think it is related, but the business model of privacy is a lot connected to demand, which I want to talk about.  I'll try to keep it short though.  The first thing: say you want to pay for Facebook, like a premium Facebook where your data is private, and the cost will be like 6 to 700 Euros a year.  If you put a cap on that and say Facebook can't make more than 600 Euros from you, then later on, as they get to know you better, they might sell their advertising for much more, and the value you are getting from Facebook for free will be even more than 600 Euros.  Platforms like Google and Facebook are a very important part of your life.  Right now you are paying phone bills, and they add up to 600 Euros.  If you think about it, there is large demand for the products they are building, but not yet for the privacy, and people will be able and willing to pay 600 Euros if they see it as being as vital as a phone or an Internet connection.  And secondly, on demand: there is not yet demand for great privacy settings and group privacy.  But the creepy uncles are the ones that are actually creating the most demand for privacy right now.  There is the whole story about Facebook Messenger: you get a popup that shows you which information they get.  This popup has been developed by Google; Android uses it for their information.  I am in the ground troops, working between startups and engineers building great stuff.  All the apps and tools that connect with Facebook are being asked: okay, you don't have a privacy policy on your website, you should show it; please be clear about what you are asking from users.  Though it doesn't really affect the relationships we have directly with Google, this in a sense creates a bigger demand for privacy, because we get used to being shown what our information is.  The next stage is that we start to demand it from them.  
And I think the only thing that we can do right now is be really open and nice to the creepy uncle and just ask him to be more involved and ask for ‑‑ create more demand for privacy.  
>> MODERATOR:  Thank you.  I will jump down in the corner down here.  Yes.  
>> AUDIENCE:  Thank you.  My name is Pastor John.  I am an editor at a think tank on digital society, and thanks for the interesting comments this morning.  I have heard a lot here about users coming together, and a lot of faith in market forces being able to put pressure on the Internet giants to fix this.  However, in other areas of Consumer Protection we rely on authorities.  We don't trust the makers of makeup, for example, to do what's best for consumers.  We have inspectors appointed by the Government to do this.  So I think this is the missing piece in this debate: what is the role of Government and regulation in protecting consumers online?  Thank you.
>> MODERATOR:  And does any one of the speakers want to ‑‑
>> Yeah.
>> MODERATOR:  Let's start with you.
>> I love your point about that, and I think it ties in to what you were saying about the pricing of these services as well.  When we price those services, say if you were to pay for one, well, they potentially make 600 Euros from your data, so we would have to price it at that.  That is almost like saying slave owners used to make this much from slaves.  It is not how it works.  But also, if I may, I would like to steer this a bit back to the original topic, which was innovation.  We have mentioned business models several times, and I think that's a key thing to understand.  This is just one business model.  Spying on people for money is just one business model.  If we are going to have other business models ‑‑ we talked about levelling the playing field, and you mentioned that in terms of the EU and the U.S.  I would like to bring another aspect to this.  You have to understand that companies like Facebook and Google are subsidized.  They are subsidized by venture capital.  The only way you can build a free business is to support it during the time that it is not making any money.  The free service model works like this.  I have a startup, and I go to investors and they say: we will give you a million pounds for your startup in exchange for a certain percentage of your startup, right?  And at that point I have to tell them how I am going to exit.  So I am just starting, but I have to tell them how I am going to exit.  That's how the game works.  I either exit by selling to the public with an IPO or by selling to a larger company like Google or Facebook, and that's how they are going to make ten times their money back.  That's the venture capital cycle.  
So from the beginning I am thinking about how I am going to exit.  This is part of the myopic nature of this: short‑term thinking.  But at that point I am locked in.  So I say to people: come on to my lovely and free platform, it will be great fun, look at all the things that you can do ‑‑ knowing from the very beginning that I am going to sell them out at some point.  Knowing that I need to build as big an audience as possible on my platform, because that's what I am going to sell at the end.  Right?  So that's one business model.  And if we are going to have an alternative to that, how is that going to be subsidized?  It cannot be subsidized by venture capital.  We need to start thinking about how we support these alternative companies.  And that's a crucial, crucial issue here: how do we support and subsidize those companies so they won't be forced into that model of exits.  
>> BART SCHERMER:  I think that's a very good natural bridge to move on to the role of this group, of this Forum and the IGF.  So looking towards the floor and also towards our speakers: what do you think are crucial steps to take to move this issue forward?  To remove it from the level of discussion and take practical steps to improving this in an international context.  You already mentioned the role of funding, of venture capitalists.  Maybe you also already have an inkling of a solution in that area.
>> I would urge you to take a look at Indie tech.  If you go to ind.ie/manifesto, that's how we feel we should be moving forward.  We have a social enterprise in the UK that I founded, and we are trying to build a new platform and a Smartphone that you are in control of and that you own.  So you start in your own home, and then you decide.  We are not trying to protect you from anyone who is in your own home.  Look at that manifesto; that's what we are trying to address.  
>> HANANE BOUJEMI:  I am inspired by the engineer who is studying law now.  Technology is evolving so fast, and the engineering mind often can't comprehend why people don't like Facebook when it doesn't protect their privacy.  Re‑enforcing this notion does not mean we don't want people to use big services.  We want these smart engineers to start thinking, to plug into their brains that certain Human Rights need to be protected, and that's what we need.  Now, it is not cool that engineers feel restricted when they have to think about Human Rights and privacy specifically.  So what the future, and the industry in general, should look at ‑‑ and I don't want to focus on Facebook and Google, I have nothing against them.  I have a Facebook account as well, I am cool with it, and I use it every day.  But what forced Google to have a whole department on Freedom of Expression now?  There is a lot of talk about that, and they had to.  Basically it is a business company, but they have a whole section on Freedom of Expression.  So in that section maybe they will be forced in the future to think more about how they build their applications and products to cater to guaranteeing privacy rights online.  This is what we are talking about, and it is a good thing that we have new business models like social businesses.  It is a good thing.  But we are not bashing big companies here.  We are just trying to raise awareness about the issue if we are going to go down this route.  Our lives revolve around technologies, and it is also about the psychology of the user.  
When you have something that is so cool and easy to use, you don't want to lose it.  We don't want that either.  We want to protect you and protect your privacy, and we need to make sure that you are aware of that.  Now, the majority of people unfortunately have the same kind of mentality as that young lady.  Which shows me, from my own perspective, from my line of work, that there is a lot of work that needs to be done to raise awareness about services which mine data and base their whole business model on basically providing a free service, when in reality we know that data has a huge price now.  That's my point.  
>> MODERATOR:  Let's go to Pernille, because I am sure you have some practical solutions to this?  
>> PERNILLE TRANBERG:  You can't pay for a service like Facebook, because they make more money on you as a free customer than as a paying customer.  We need to understand business models.  And there are a lot of new business models coming up, like paid‑for services, for example, but we are also seeing smaller companies now who are trying to deal with your data: if you can trust them, they will take your data and sell it to some of the companies on your behalf.  Because we are paying too high a price for these free services.  We don't know what the price is.  How much is my data worth?  How much is my political opinion worth?  We don't know.  We can only see that it is probably a lot.  So we need to share, in the EU and the IGF, knowledge about these business models, about new services with privacy by default, and we need to make the EU finance some of these new companies coming up, because in the EU we have a tradition or culture of more Government funding, whereas in the U.S. they have private funding.  So the Government funding could be used to pave the way for these new companies and services.  
>> MODERATOR:  You down there have been waiting for a very long time.  Yeah.  
>> AUDIENCE:  Hi.  I am Ian, a NetMission Ambassador from Hong Kong.  About the new business models: recently I have heard of software installed by private individuals that can defend against traffic surveillance.  I don't know if you have heard of Hotspot Shield and software like this.  It basically lets users publish websites and other things without needing to reveal their location, and they can also use it for social reasons, for communication.  I don't know much about business, but the software companies are now at small scale; if they can cooperate with the manufacturers or the technical industry, together they may make devices that prioritize protecting users' privacy, instead of just how convenient it is or how nice the system is or how fast the processor is.  It is not only the age of the high tech gadget; we also have to defend our privacy.  People need to raise awareness of this privacy problem, and I think it is just an idea of new business models.  Thank you.  
>> MODERATOR:  Thank you.  Let's see who has been ‑‑ I think you ‑‑
>> AUDIENCE:  Thank you.  I am coming from a national data protection authority.  The Dutch speaker reminded me of an initiative that existed two years ago, brought at a conference of international commissioners for privacy and personal data protection, to deem every web application developer a data processor.  (Internet outage)
>> AUDIENCE:  We have to care.  We don't have the luxury of saying I just want to play with my toys anymore.  We don't have that luxury.  And if we are going to create alternatives, they have to be design led.  People wake up thinking: I want to share my photos with my friends.  So let's build a great solution for that which is private by default.  We have to build great alternative products, and that's how we are going to win this battle, and I am very sure that we can win it.  But first we have to understand that there is a problem, and next we have to do something about it.  Thank you.  
>> Well, it is a bit difficult to come after that.  But I will refer to what you are saying, because I think you are absolutely right.  One of the biggest challenges in creating alternative solutions is actually to meet the user needs and user interests, because we all want to be where the others are.  We want to make sure that when we communicate we actually find our friends out there and reach those we are communicating with.  That was the case when e‑mail started, and texting, and now the social media.  It has to meet the exact needs that the users have.  That is something where we also have to listen to the users at different levels.  
I would like to end by commenting on what our Moderators tried to prompt us to talk about as well, and that's the role of the IGF in this.  It was mentioned a little bit just before.  I think we have to really remember that as an organisation the Internet Governance Forum is an advisory group for the Secretary‑General.  This is my seventh Internet Governance Forum, and I have heard a lot of discussions over the years.  I can see there is progression, and I can also see a lot of redundancy.  What is very important is that we make things happen in between the fora.  That we actually take action and make concrete recommendations higher up in the power structure, where the decision power is and where things matter.  And that at one point we get together and bring all these good experiences, good advice and best practices together in some way or another, and that somebody actually takes charge and finds the funding, because it also costs money.  And then on another level we will actually communicate the good advice and insights that we share from different stakeholders.  Thank you.  
>> Okay.  I will keep it short.  
>> MODERATOR:  As I said before send us an e‑mail and we will think about how we continue because I don't want this to end here.
>> I think you should name the e‑mail address later.  Two things.  Alternative products: yes, they are out there.  A lot are being built, like alternatives for what ‑‑ we need to get them out there, and I think the IGF can and has to play a role in that.  The second thing is we have to get the end user involved.  We really need to do that, and to get the end user involved we have to unite them.  Frankly, at this IGF there is no end user, not in the way I mean.  Everyone here is already involved, is already on some level concerned about this problem, but the end user is concerned and doesn't act on it.  We need to get the end user involved.  And we need to ask them: what do I have to do to get you worried about your privacy in the way that I am?  Then we will know what kind of products we need.  
>> BART SCHERMER:  Well, a lot has been said already.  I just want to pick up on the point that was made on the involvement of Government and the role of regulators and legislation.  Indeed, I think that is also very important.  I agree with the previous speakers that we cannot just trust market forces to turn things out for the better.  So there is definitely also a role for regulators and Government.  In the EU there is a new proposal for a general data protection regulation, with massive fines for not complying with the rules on data protection and privacy.  While I think it is necessary, I also think we shouldn't put too much faith in the ability of politicians, regulators and lawyers like myself to come up with rules that can actually work in practice.  So legislation is important, and I think there is also an important role for the IGF to take that on a global level and avoid the legislation being split between different countries, because companies will go to the country with the least resistance and they will leave the countries with the highest levels of privacy protection in law, because it hampers and hinders them in innovation.  So I think there should be a global Consensus and a global level of privacy protection.  But even then I think we need to have the discussion that we are having here: how can we involve end users and how can we bring together different types of stakeholders, different types of expertise, technicians, lawyers, business, et cetera, and together come up with practical solutions for this problem.  I look forward to working with you on that towards the next IGF next year.  
>> MODERATOR:  We'll take just two quick ones, and then I would like to go to the ones we started with and end with the end users.  
>> HANANE BOUJEMI:  Well, I personally have faith in technology, and obviously a practical solution is alternative enterprises like you suggested, social enterprises, and establishing an effective regulatory framework.  Even though you are advising not to rely a lot on regulation, because that's probably a long track, we have to be realistic as well about what we can achieve in terms of finding immediate solutions for the users of the current business models.  I know that social enterprise is a great initiative, but I think it is a long shot, and we have to be in a position to find solutions for services that are being used by billions of people all around the world.  You can't claim that social enterprises will replace the services we are using now, and that's why I think regulation should definitely be plugged in to the whole process to be more efficient in solving these problems, because big companies need licenses from Governments to be able to operate, so Governments are definitely in a position to act.  Another practical solution is to plug more policymakers in to the technology.  The problem we have now is that the legislators are not wired to technology, and they don't understand the specification of standards.  Hence they are not able to engineer a policy that is effective in solving these kinds of problems.  
So I think policymakers need to be more up to speed, you know, when it comes to how we can apply policy to technology.  And that's why I think it would be very efficient to have engineers as policymakers at some point.  If they turned in to politicians at some point in their careers, it would be great.  I know it sounds like a completely different twist to the career, but that's the only way we can solve these kinds of problems.  I will never forget, at the European Parliament, when she addresses politicians at the European Union and mentions servers, they think she is talking about waiters.  We have to be realistic.  And we need to keep all our objectives within a certain limit, to understand that we can't really change the world.  We can influence things, and that's a long track also, but we have to be a little bit realistic.  But I would support your alternative, social enterprise, no doubt.   
>> PERNILLE TRANBERG:  I want to say that privacy is going to be so hard to get.  It is so hard to get that everyone will want it in the future.  The big question will be whom you can trust, because Facebook, Google and most big companies will promote themselves on privacy.  So we need to learn how to distinguish whom we can trust and whom we can't.
>> MODERATOR:  We are running over time, but please, would any of the youth panel like to offer final comments?  I see Harriet first.  
>> HARRIET KEMPSON:  Well, I am just going to be short, of course.  I am going to move away from the technical side of things because I don't know much about that; I just know the experience I and my friends have had.  Although lots of us know that Facebook and Google can look within our e‑mails and track us, and we know that in theory it is bad, I think the reason people, in my experience, don't do much about it is that we don't know why it is bad.  We don't know what is wrong with them knowing about it.  We don't know what's wrong with them being able to advertise to us.  It is a bit creepy that they know the sort of things we like, but it doesn't seem that it is much of a danger to us.  And with these new social networks, even if they are private, people are already attached to what they have been using.  They feel they have invested themselves.  It is hard to change people's habits when they have already set everything up and they are confident in using it and feel comfortable with it.  
>> MODERATOR:  Thank you.  And Olivia.  
>> OLIVIA BANG BRINCK:  They talked about how it was never going to be totally safe and everything.  I don't think so either.  But we can always hope.  We can always think about how we can do a little bit better, instead of thinking it is never going to be perfect.  Then we have to take a little step, and then it will get better.  And the thing we have to be aware of is just: what do they know about us, and what do we want them to know about us?  
>> MODERATOR:  Thank you.  Yes.  Zach.  
>> ZACH:  So I would like to say, as an end user, basically my whole point is that, from hearing this and from my own personal experiences, I personally want to have privacy in everything I do on social media.  So basically what I am thinking is: what can we do, what can kids our age do, to focus on getting that in the end?  Another point was raised earlier by someone over there on the panel: they said that the first thing we have to do is raise awareness, basically.  So, as we were talking about earlier, we kind of want a practical solution for helping to better the situation right now, not solving it entirely.  So we should think about what solutions we can try, what steps we can take, and what the next step is in going forward, and then finally end up making a safer and more private Internet.  
>> MODERATOR:  Thank you.  And Pim.
>> PIM TEN THIJE:  The last point I want to make is that we youngsters don't care about privacy at this time.  We don't care.  I can say it again: we don't care.  Or we don't know, and that's, I think, not the first challenge to tackle, but it is something everyone in this room should think about, because we are your future users, and either we don't know or we don't care about privacy.  And if you think about business models that are not free and are an alternative to what exists now, you should maybe think about them without us, or maybe with us youngsters, and later on come back to us and talk to us.  I hope that in the end we will care, and we do care.  
>> MODERATOR:  And I will just say already now that the invitation to continue this work practically extends to you as well.  
>> PIM TEN THIJE:  Thank you.  
>> And at last, I think, really, Hanane, about privacy use: in my timeline I get asked to parties and I like that, but I don't like having to get a new app for new houses and give my data to companies.  What we could really use, for good innovation, is for users to control by themselves which privacy they give away.  That would be good for innovation, for new apps, and yes, this would be good.  Thank you.  
>> MODERATOR:  Thank you.  I would love to continue this conversation, but we're already 15 minutes into lunch, which is a success in itself; look at how many people are still here.  But thank you very much.  As I said, look at the programme and see the Moderators' names to get our e‑mail addresses.  I don't want my e‑mail on a transcript; that's why.  But you can find us in the programme.  So please write to us if you feel something for this.  Make a little bit of an effort.
>> I am sorry, before we go, can we also give a round of applause to Gry for organising not just a great panel but a very diverse panel?  Because I think it is very important, if we are going to solve the problems of a diverse world, that we have diversity; diversity is essential.  So thank you, Gry.