
IGF 2019 – Day 2 – Raum II – WS #83 Different Parties' Role in PI Protection: AP's Practices - RAW

The following are the outputs of the real-time captioning taken during the Fourteenth Annual Meeting of the Internet Governance Forum (IGF) in Berlin, Germany, from 25 to 29 November 2019. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 



>> Hello.  It's great to have you here and to welcome you to our workshop.  It's a great pleasure also to have such a wonderful panel here, which is, I would say, at least very gender‑diverse.  We also have quite some regional diversity, but unfortunately, due to different circumstances, we are now very western‑European here.  Apologies for this.  The idea of this workshop is to identify the best ways to ensure consumers can make informed security choices about consumer devices.  I don't think we will be able to find the solutions in 90 minutes, but the idea is to try to bring the discussion to the next level and to identify questions which maybe should be taken on board for the next IGF.

The idea was also to review the current landscape, and in the different presentations and interventions to have an overview of the best existing frameworks and practices that can help to drive security for consumer IoT, but also how to approach it, not only how to standardize it.  And to discuss and develop further ideas on transparency and security in IoT.  One very important issue for us was to take into account the previous IoT workshop from IGF 2018.  And because IoT is a very big topic, what we are trying to do here is to focus on consumers, and specifically on smart home, well‑being, and maybe also different examples like food traceability.

I'm very happy to introduce my panel.  I will start with Estel.  She is a global data protection senior analyst based in Brussels.  I have Martin, to ensure continuity with last year's IGF workshop, because he was the chair of the IoT workshop in 2018.  On my left, I have Kristina, also called Kris.  She is a CEO and a cybersecurity researcher and practitioner; I would say she has a very strong background in cybersecurity but also in cyber defense and cyber warfare.  And on the far left, she is chair of Women in International Security Germany, but she is also a lecturer and consultant on data protection and cybersecurity.  Remotely, we will also have Sayed.  He's online.  And my online moderator, also from Google.

So I would start.  The idea was to give a bit of input with this panel, and maybe at a certain moment to have breakout sessions, but I don't think this room really qualifies for breakout sessions, so that's why we would love to discuss with you directly.  Please raise your hands.  You have to use the microphones to be able to be heard.  But please feel free to jump in and make it as interactive as possible, and not just a panel here.  To set the scene, I would ask our first panelist: IGF last year, what were the outcomes, and maybe also frame a bit the sectors and key layers.  The floor is yours.

>> Thank you very much.  I know we have the graveyard slot.  That's never the easiest when you have a session, and on top of that, we cannot see you, but you can see us.  So we would really appreciate it if you use the microphones.

I would like to do three little things.  One is to summarize a little bit what was discussed at the last IGF and what were the outcomes of the workshop there.

The other thing is I would like to say what are the challenges ahead of us.  And I would like to bring one example.

At the last IGF in 2018, Martin was there and will talk right after me.  The key findings and take‑aways were the following three.  IoT good practice principles must factor in at least four primary goals: security, consumer trust, meaningful transparency, and affordability.  It will be important to revisit established and emerging principles in the future to ensure they both reflect the current environment and continue to achieve their intended goals.  And it is very good that we have this workshop, because we are doing exactly this here.

The second take‑away from last year was: it is the responsibility of the larger IGF stakeholder ecosystem to educate and engage with public and Government stakeholders on the progress of these discussions, to receive feedback on the key points made, and to relay those points and messages back into the IGF stakeholder ecosystem.  The third point was that more needs to be done regarding better formulation of ethics and better understanding of the activities underway towards the longer‑term sustainability of IoT applications in society.

Now, in the last year, Governments have been quite active, and so has the industry.  Many groups have been active on this: the Business and Industry Advisory Committee of the OECD, for example, standardization bodies, you name it.  I mean, there are at least five or ten who have been working on these issues.  But consumer IoT applications need to be divided into different steps, because each of the steps has basically different requirements.  So therefore, it is important to take a look at who works on which of these different steps.

And the last point: consumer IoT applications and their management fall into really very different sectors.  So therefore, they have completely different requirements for transparency and security.  And because the industries know the requirements of their respective sector best, it is important that we try to bring together not only stakeholders in terms of Governments, but also from the various industries.  And there it would be wonderful if we have several representatives from different sectors here in the room who can pitch in and talk about the specifics of their industry.

One point that I would like to make with regard to a specific industry is the food industry.  Not necessarily something that comes to your mind when you talk about IoT and consumer products.  However, when I was, in the 2000s until 2011 or so, the chief privacy officer of Bibbing, we already started to work on a project for food traceability, because we saw this as an amazing field, especially with cow diseases, et cetera, at that time.  And now we have fTRACE, which, although we thought it was long gone, is still alive.  It is a GS1 bar code system.  You can download an app and see whether the product you're buying has an fTRACE label on it.  You can type in the code, and immediately you get who has produced it, when it was produced, and what the quality standards of that producer are.  So this is just one example that I wanted to leave you with of how different the requirements are for the different industries.  And with this, I'm very happy to give it back to Lucy.
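To make the traceability example a bit more concrete: GS1 codes pack data into so-called Application Identifiers (AIs), and a lookup app first has to split a scanned string into those fields.  The sketch below is an illustrative toy parser for two fixed-length and two variable-length AIs, not the fTRACE app's actual code, and the sample values are invented.

```python
# Toy GS1 element-string parser (illustration only, not a full GS1 implementation).
# AI "01" (GTIN) and "17" (expiry YYMMDD) have fixed lengths; "10" (batch/lot)
# and "21" (serial) are variable length and end at a group separator (0x1D).

FIXED_LENGTH_AIS = {"01": 14, "17": 6}
VARIABLE_AIS = {"10", "21"}

def parse_gs1(data: str, sep: str = "\x1d") -> dict:
    """Split a GS1 element string into {AI: value} pairs."""
    result = {}
    i = 0
    while i < len(data):
        if data[i] == sep:          # skip group separators between fields
            i += 1
            continue
        ai = data[i:i + 2]          # every AI here is two digits
        i += 2
        if ai in FIXED_LENGTH_AIS:
            n = FIXED_LENGTH_AIS[ai]
            result[ai] = data[i:i + n]
            i += n
        elif ai in VARIABLE_AIS:
            end = data.find(sep, i)
            if end == -1:
                end = len(data)
            result[ai] = data[i:end]
            i = end + 1
        else:
            raise ValueError(f"unrecognized AI {ai!r}")
    return result

# GTIN 04012345678901, batch LOT42 (made-up values):
print(parse_gs1("010401234567890110LOT42"))
# → {'01': '04012345678901', '10': 'LOT42'}
```

A consumer app would then use the GTIN and batch fields to query the producer's database, which is roughly what the fTRACE lookup the speaker describes does.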

>> Thank you.  And don't get confused, people call me Lucy.  This is a privacy‑protective nickname, for when they search for me ‑‑

>> I apologize.

>> No worry, it's fine.

>> As a former privacy officer, I should know that.

>> It's not a secret.  So thank you very much.  I think this is exactly what we're trying to do: to see the different devices and different applications where IoT will actually be part of everyday life, but also to understand what kind of security requirements we need where.  Because for food traceability there is one requirement, and when you go along the food chain and, say, cool something down, that is another security requirement.  Just to throw a question into the audience.  I think he put the document online.  Max?

>> Yeah.  We're using the IGF Wiki page for this workshop for the document we are discussing, and we've put the comments there, including the questions that she just raised.  So if you want, we can continue the conversation and use that tool as well.  Thank you.

>> Thank you, Max.  I would ask Martin to give us a bit of his experience, a bit of the global setting of the different approaches from the different systems, and what international cooperation is maybe possible.  Martin.

>> Well, thank you very much, Lucy.  Yes, over the last 10 years, we've been working on this with the Dynamic Coalition on IoT, and tomorrow morning at half past 9:00 there will be its 11th meeting during this IGF.  The specific focus of this session on consumer IoT security is very useful, because the world of IoT, just like the internet, started out being used just to see how it could help us make our lives more pleasant: ranging from networks, to devices, down to the pure consumer stuff related to counting your steps in your shoes and whether you move enough, and things like that.

It wasn't designed with security in mind from the start.  And now we see so much happening that it's about time we address this, because otherwise we end up in unsustainable situations.  And this is not an issue you can solve in one country, because this stuff comes from across countries, the interaction with the devices runs across networks that span the world, and industries are from everywhere, consumers are from everywhere.  How do we make this work in a responsible way?  That's one thing.

Within that, how do we work with things going forward?  How can we design future IoT in such a way that we can manage it better?  Recognizing that anybody who can put a couple of chips together can create their own IoT device and hang it on the internet.

The other thing is that there is already a mass of devices out there.  Now, the focus on consumer IoT is useful because we can't expect consumers to manage all this complexity in a responsible way.  We need to help.  We need to help by informing them better.  We need to help by making sure that the devices they buy are clear on what they do in terms of how secure they are and what they do with data, so that people understand better and can make smarter choices.  And we need to back that up, of course: if there's something like certification and labeling, then what is said on the label and on the certificate must actually be true.  So that's one of the elements.

The second element is that, even then, it's important that we consider the challenges for what they are.  It's not just a challenge for the device.  The device can be weaponized.  It's known that consumer devices like cameras, but also payment machines, have been used for such a thing: generating tons of signals towards one target and basically stopping all possibility to interact with it.  That's one element.

The next element is abuse of the thing itself.  I actually had a personal experience, having installed cameras because I travel at times.  In my house, I have cameras so that if unexpected movements take place, you get an alert on your phone and you can actually see what's happening.  Sitting on my couch at some point, one of the cameras turned towards me, even though I had changed the password.  So it just shows how vulnerable this can be.  And because we focus on consumer IoT, we will not go into how such devices can be weak spots in corporate or enterprise networks.

So how do we help consumers?  One thing is better information on what they buy, the new stuff, so they can make smarter choices.  Second, to provide more tools to help them do that.  One of these is, for instance, captured in new standards: you can't just keep the standard password an IoT device comes with, because the device asks you for a new password when you put it online.
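The "no universal default passwords" rule the speaker refers to can be sketched as a gate in the device's onboarding flow.  This is a minimal illustration of the idea, not code from any real standard or product; the function names and the list of known defaults are invented for the example.

```python
# Sketch of a "no default passwords" onboarding gate: the device refuses
# to go online until its factory credential is replaced by a user-chosen one.

KNOWN_DEFAULTS = {"admin", "password", "12345", "root", ""}  # illustrative list

def accept_new_password(candidate: str, factory_password: str) -> bool:
    """Reject the factory value, well-known defaults, and very short strings."""
    if candidate == factory_password:
        return False
    if candidate.lower() in KNOWN_DEFAULTS:
        return False
    return len(candidate) >= 8

def onboard(factory_password: str, user_choice: str) -> str:
    if not accept_new_password(user_choice, factory_password):
        return "refused: choose a non-default password"
    return "device online"

print(onboard("admin", "admin"))
# → refused: choose a non-default password
print(onboard("admin", "correct horse battery staple"))
# → device online
```

The point is that the check runs on the device itself, so a consumer cannot accidentally leave the factory credential in place, which is exactly what botnets scanning for default logins rely on.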

But the other thing is that, with all these devices that are not all safe, it's good to provide, for instance, something like a door lock in front of your home network, where you can manage all the access, like your front door.  Inside your house you may not lock every door, except perhaps where you keep your keys or money, but you do lock the front door.  In a way it is the same with IoT devices in the household.  So there are solutions being developed for that as well.

And these will become more and more natural, and maybe even part of the service provider's offer: the same provider that offers access to the internet, to the telephone network, to your cable TV.

And that brings me to the third part, which is that all these service providers, starting with the one who supplies you with an IoT device, to the access provider, the cloud provider, the application provider, the service provider, they all have their role.  And if future devices are recognizable by the system as intended to do certain things and not others, you can expect access providers to block them if something goes wrong there.

So it's a whole series of actions that need to take place.  Some approaches focus on consumer education, labeling, and network resilience.  Others, as in the Netherlands, take a complementary approach: they set standards and, for instance through Government procurement policy, steer towards better products being delivered, but also use legislation to back it all up, and liability to make it expensive if you don't do your expected duty.

Other examples include the UK, where they are really looking at good practice and offering that.  So there's a range of things moving at the moment that are helping us to manage this better going forward, recognizing that this is a global thing.  You can't solve this in one country alone.  But also, yes, you can help consumers in your country to be more aware and have better tools to deal with it, and enable all players in the value chain and service to take their responsibility for the whole.

So I think that sketches the landscape.  And I'm very happy to listen to examples here in the room as well.

>> Thank you very much, Martin.  You mentioned consumers being better aware, and I think this is interesting.  When we had the preparation meeting, we touched upon what the last workshop at IGF 2018 was talking about, which was exactly ethics.  On that preparation call, she said that rather than talking about ethics, we want to talk about users' rights.  And because it's all about consumers here and we want them to be aware, I would like to hear exactly why the difference is necessary.  Maybe you can elaborate on this.  Thank you.

>> Okay.

>> Sorry.

>> Thank you for having me.  As Lucy mentioned, I work on policy at Access Now, and our mission is to defend the rights of users around the world, which means protecting human rights in the digital era, which is why I want to focus on a human rights approach rather than an ethics one.  And I will explain the impact of IoT on human rights and the things that should be considered.

You have probably seen that nearly every year there's a new product put on the market, whether in the form of home assistant devices, toys, jewelry, obviously connected cars, connected objects that you can put into your home.  And while we see there are efforts to keep some of those things secure, those discussions are often limited to security, which is an important component, but they don't look specifically at the larger human rights impact that those products can have.

So in which ways can IoT interfere with human rights?  It can do so either actively or passively.  First, there is the obvious impact on the right to data protection.  IoT devices often collect a lot of data, which can include private communication in your home.  And if you live in a country which does not have an appropriate and enforceable data protection law, you may be vulnerable if protection is not baked into the design of the IoT product.

And this is not just about your own rights.  We are also trying to figure out how modern law adapts to new technologies.  Let's say, for instance, I purchase an Alexa for my home, and it does some recording of my conversations.  But then I invite Lucy, who has not consented to the use of her conversations with me, over to my home; how do we deal with that?  IoT is not only actively collecting information about you; people passively coming to your home are potentially impacted too.

Then there is interference and risk with the right to privacy.  This is often linked to potential vulnerabilities that products can have.  External actors can enter the device and use it against you.  We had the example of the camera being accessed by someone else and turned against its owner.  This can be done by private actors, and also by Governments, if there is not enough protection of the security of those devices.

And then, in that sense, there can also be an impact on freedom of expression.  It is widely known now that surveillance and the risk of surveillance harm privacy and have a chilling effect on freedom of expression.  And what are the risks when we hear that different home devices listen to your conversations, and also that employees of the company can access your information?  It may mean that people purchasing the devices will start moving private conversations to a part of the house where they don't have the devices, or people will feel that, if there are no human rights safeguards included in the production, deployment, and sale of the products, they are basically buying a surveillance device and putting it into their home.

And the solution to address these issues would be to build human rights into the product.  We have a few large recommendations on how to do that, which would include having data protection, privacy, and security by design and by default.  That would mean that from the moment a product is designed, specific rules are baked into it in order to protect those rights.  This goes beyond mere compliance with whatever law may exist in the country regarding data breaches or minimum data security or data rights.  This is really something around which we would like to see more innovation in product development.

There's also an important part that was mentioned about transparency and the information given to users.  On that issue, we're seeing in the IoT market at the moment that IoT companies may be bought by others and the rules may be changed, while the users only learn of it afterwards and are neither informed nor consulted, nor given the opportunity to say no to the change of rules.  So there should be much greater consultation with users in case a company decides to change the way an IoT system operates.

And more largely, we should consider the discussion around the right to disconnect.  You might initially be purchasing an IoT device, a device that connects to the internet, but later on decide that you no longer want that to happen.  So the product should always be able to function in a way that doesn't require it to be connected to the internet.  That may have some limitations; not all products could do that.  But to the extent it's possible, a consumer who purchases, let's say, a smart fridge should be able to expect that the fridge still functions if you unplug the internet.

I want to highlight that some of these safeguards are not limited to countries that do not have privacy, security, or data protection laws in place, because in places like the European Union, where we have some standards that apply, there are still a lot of questions, including the one I mentioned before: maybe there's protection for your privacy, but not for that of others around you.  What is the passive impact on their rights?  That is not necessarily addressed.  So some protections still need to happen here.  And this is not just good for users, whose rights companies have a duty to protect.  We think it will also be good for the IoT market, which may experience failures if users are constantly seeing in the news that there is such and such a vulnerability, that an IoT product for kids is used to spy on children, that there's a faulty design, or that software is no longer maintained and therefore the product can no longer be used.  If we don't have the safeguards built in by design, there is a risk that the IoT market will not develop the way it could, and that would hinder innovation.

Coming to the ethics versus rights discussion: we saw a lot of ethics discussions, not just on IoT but also on artificial intelligence and other developments.  And these discussions are valuable.  But they need to happen once we're sure human rights obligations are baked into the products and protected.  When that happens, if we see there's a gap in the level of protection, let's have a discussion on ethics.  But ethics should not be a replacement for human rights, because ethics are a variable concept depending on where you live, and ethics are not enforceable.  So you need the human rights protections.  And I look forward to discussing these points with you.  Thank you.

>> Thank you very much.  Now we will go to the darker side and to a few more examples.  I know that Kris has prepared some examples, and knowing her experience, they could be very scary, so I hope that at the end of the session we will still be digital optimists here.  But Kris, please tell us about your experience from the cybersecurity side with IoT and its vulnerabilities.  Please, the floor is yours.

>> Thank you very much.  So in 2012, the world experienced the most devastating cyber warfare attack that has yet been known.  How many of you in the audience are aware of a company called Saudi Aramco?  There's a few.  It's the world's most valuable company, and a good deal of the energy that comes to us right now in this room is actually produced by that company or by various joint ventures.  What happened in 2012 was that the I.T. and IoT systems and some of the industrial control systems that belong to the company were weaponized: they were infected with malware which wiped them.  And the company was forced to disconnect from the internet.  It got to the point, around day 13 or 14, that the refined petrol supplies of the country of Saudi Arabia were halted, because the IoT devices that helped automatically load petrol into trucks were no longer functioning.  Imagine if you go to a petrol station and you're told there is none.  Imagine if you call 112 and you're told: listen, we can tell you what you should be doing if someone is having a heart attack, but the ambulance system and emergency services do not have fuel to come to your location.

One of the challenges is that anything can be turned into a weapon and used for dual use.  And in our modern world, we all depend on IoT devices.  I would venture to guess that all of us in the room right now have smart phones, which are IoT devices that we use on a consumer or business basis.  But we also depend on other things related to IoT to keep our modern world running.  So we have things like solar panels that many of you may have in your home; those are IoT devices.  Wind turbines, either for your personal use connected to the grid or for industrial uses.  IoT is everywhere.  There are billions and billions of these devices, either consumer grade or industrial grade.

And one of the challenges is that in many cases security is not really thought of when they are developed.  They're designed to meet a certain price point.  We also have to look at it from the angle that we all enjoy functionality, we all enjoy usability.  But to enjoy those things, there seems to be a bit of a push‑back on security.  Now, security is not an easy thing.  And unfortunately, one of the least thought‑of things around the world, other than in the European Union, is also privacy and how that can be leveraged for security purposes.

Japan is about to host the Olympics in 2020, and they recently passed a law stating that their access providers are allowed to scan their networks and find weak IoT devices that could be turned into weapons and used in attacking the country.  One of the primary concerns is that the 2020 Olympics go smoothly.  And these different types of IoT devices in your homes could suddenly be blocked from being usable whatsoever.  So if you have a connected doorbell, which is very, very common, more and more around the world, and suddenly you cannot answer your doorbell remotely, that is basically the least concern of the Japanese Government, because they are looking to protect their national infrastructure and obviously their reputation.

Now, last year I had the privilege of speaking at the EU Commission at an event related to the presidencies regarding the smart grid for the European Union.  Some of the examples I showed were that not only are industrial wind turbines vulnerable to attack, because many times they are not secured and they use default or hard‑coded credentials, but that this also extends down to things like actual smart homes and smart appliances.  Your fire alarm might now be a smart fire alarm.  Your burglar alarm might be a smart burglar alarm.  And in each case, they are shipped with very little security, and they expect you as the consumer to set up security, which usually involves a lot of complication.  And security is difficult for everybody.  One of the examples I showed was that I could remotely get into someone's smart meter, bypass the authentication due to very poor coding practices, and then see the electricity being used and also adjust the price for peak and off‑peak.  More recently, while I was preparing for this particular workshop, I found a whole bunch of, we'll say, Zesla power walls connected to the internet.  And someone could script up an attack using default credentials, because most of the time they are not changed, and the interface that connects to these does not require them to be changed.

So what a person or attacker could do is get into the power walls and force a mass dumping of the electricity onto the grid.

Now, I am talking about rather dark subjects, but I want to stress that I am actually very positive about our digital world, both currently, I have a smart phone, and for the future as well.  But there are certain things we absolutely need to consider.  A lot of the consumer grade devices that are purchased right now are basically throw‑away devices.  They are produced at very low cost.  They ship with older versions of operating systems, such as older versions of Linux, because it's inexpensive to do so.  But when the items are shipped and produced in this manner, they have no secure software development life cycle, and they will be shipped with known vulnerabilities.  Things that can be used, for example, by the bot which got out of control from its original creator, turned against various critical national infrastructure, and tried to disrupt various parts of our everyday world.

We also have a problem where, even if these devices carry some sort of encryption, many times they are shipped with something called a known private key.  What this means is that you can actually find all the devices sharing a known private key, and that everybody actually knows your encryption key, which equals zero encryption and basically zero security.
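The failure mode being described can be sketched in a few lines: if the same key is baked into every unit's firmware, then anyone who extracts it from one device (or one downloadable firmware image) can produce perfectly valid authenticated messages for the whole fleet.  This is an illustrative sketch using an HMAC tag as a stand-in for whatever the shared key protects; the key and command names are invented.

```python
import hashlib
import hmac

# Sketch: a product line ships with one hard-coded key in every unit.
# The attacker's copy of the key comes from a firmware dump, not from
# breaking the crypto -- "known private key" means the math is irrelevant.

HARDCODED_KEY = b"same-key-in-every-unit"  # baked into the firmware image

def sign(key: bytes, message: bytes) -> str:
    """Authenticate a message with an HMAC-SHA256 tag."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

# The device expects commands from the vendor cloud to carry a valid tag:
vendor_tag = sign(HARDCODED_KEY, b"open_valve")

# An attacker who lifted the key from ONE firmware image forges an
# identical tag, valid for EVERY device that shares the key:
attacker_tag = sign(HARDCODED_KEY, b"open_valve")

print(hmac.compare_digest(vendor_tag, attacker_tag))  # → True
```

The fix is per-device keys provisioned at manufacture (or on first boot), so that compromising one unit reveals nothing about the others.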

So we're all looking for a good deal.  I know in the U.S., and I have seen a bunch of sales around Europe as well for Black Friday, we need to consider the fact that looking for the lowest price point at times also means looking for the lowest level of security or security testing.  And in most cases, these devices are not security tested whatsoever.

But I do see, both from last year and from preparing for this workshop this year, that there is a lot of opportunity to change that.  And if we want to keep the modern world going in a nice manner, because I certainly like running water that's clean, and electricity, and all the trappings of modern life, then we need to understand some of these risks and how they can unfortunately be used against us, and then try to change the way these particular devices are regulated, try to certify them, and also make it easy for consumers and easy for businesses, because not everybody is a technological expert or engineer or what have you.  So we need to take what we know is wrong about it and change that for the better, so that these types of devices are not our enemy.  Thank you very much.

>> Thank you very much.  This is a wonderful bridge to our next speaker: we have Sayed.  For the non‑German speakers among us, it's the technical inspection association.  And he will be talking exactly about testing, bringing in the private sector perspective and what possibilities are actually out there.  I think you are online, yes.  Yes, I see you.

>> Hi, yes, I am online.  I hope I'm not on the big screen.  So hi, everyone, and thank you for having me.  I will basically start with something other than certification.  Even today, only a few companies are actually following the principle of secure by design.  Secure by design basically means manufacturing an IoT device in a secure way from the beginning, in the areas of people, process, and technology, and covering the whole life cycle of a product.

To go into details: if we talk about people, you would of course make sure your people are trained so they can develop IoT devices in a secure way and have security awareness.  If we talk about technology, as mentioned just now, we should make sure that the technology used in IoT devices is secure; for example, we shouldn't use outdated encryption algorithms.  And talking about processes, we should definitely make sure that we are following a secure development process, which again includes security testing, scanning, and maybe also attack surface detection.  If we do this on a continuous basis, manufacturers can make sure that they identify vulnerabilities very fast and mitigate the risk, which helps them to reduce cost because they don't have to fix all the findings just before go‑live.

Having told you the benefits of secure by design, where do I think a certification can help?  Of course, a certification will never stop a hacker from penetrating your system.  But where certification can help is in making sure that all manufacturers of IoT devices are following these principles.  If, for example, you could only enter the market with a cybersecurity certification, we could make sure that everyone is actually following the principles and that there are no low‑hanging fruits anymore in your system.  And that's it, actually.  I wanted to keep myself very short.

>> That was really very short.  Thank you.  But you actually threw quite a lot of buzzwords at us and at the audience.  We've been talking about different elements.  We talked about security by design, which is basically that when you start to design a product, you think about security, or privacy by design, which is also now, under the GDPR, an obligation for manufacturers and service providers.

The question is also how to deal with all the different elements: education, which Martin mentioned; the globally different understandings of ethics, which is why, for example, the question arose of taking well‑known norms such as human rights and users' rights as the angle for how to see and approach IoT; the question of being aware that what I have at home and what I am wearing or using can actually be misused; but also the different elements of the continuum.  And about certification, there is the question: what is a certification?  When I see policemen and policewomen on the street, I feel secure, but that doesn't mean the security level is actually constantly high.

So this is the same with certification.  It is a moment in time where you have requirements which you fulfill as a product or service.  But that doesn't mean that afterwards you're secure.  Vulnerabilities could be unknown before that and then appear afterwards.

So that's why I would like to challenge my panel and think about what could be.  We talked about the different perspectives right now.  We tried to address the different elements for users, for the private sector, for the public sector, and what Chris mentioned about the question of certification and regulation.  But I would also like to know what the solutions could be.  So does anybody want to dare ‑‑ the audience, please, if you have any questions.

So do we have to regulate security?  Is it something that we need to be constantly aware and educate non‑stop?

>> No.

>> No?  Okay.

>> Sure, the answer is clear, right?  All of the above.  We cannot solve this with a single solution.  That's very clear.  Being aware and educated is a key part.  Partly this happens because of media reporting.  You really should take care.  And in this I refer back to the time that PCs came into households, in the early '90s.  Nobody made a backup until PCs became more important and people started realizing that PCs crash.  Now it's normal.  We store it in the cloud the minute we create it.

And such things will happen here too.  It will take time.  People need to be aware and take their measures because they become aware that phishing, for instance, is also something that you should think about.  The stories in the media about people being blackmailed, people being extorted because of these kinds of emails, help other people not to follow the same track.

And I guess the new generation also ‑‑ I hesitate to place myself in the same generation as everybody at the table here ‑‑ but the new generation will automatically be more aware of what the risks are, and they should be made aware.  Talking amongst peers but also in education.

Producers of the IoT devices will need to take responsibility.  So far, it's like pollution.  People never had to pay for the cost of the pollution they caused.  It was never part of the price.  Now you get taxation on top.  Maybe the same goes for making producers that deliver insecure devices pay for their failures.

So in that way, you see that standardization, certification, and good practice all come in.  What can you expect from partners in the value chain of delivering these services to the devices?  And last but not least, Government will need to step up, particularly when needed.  For many years we talked about privacy, and insufficient action was taken.  This forced governments to take measures that are far‑reaching.  They're not perfect.  But they came because the industry didn't manage to regulate itself.

So I guess that's a call and a warning also for all of us that make money by offering services that people want: to build it in now from the outset.

>> Thank you, Martin.  I see ‑‑

>> Yeah, well, we all know that laws are not necessarily the panacea that will fix every problem.  However, I remember vividly that we developed lots of privacy enhancing technologies in the late '90s and early 2000s.  And nobody wanted to buy them.  And the reason was that it didn't cost anything if you violated the privacy rules.  Along comes the data protection regulation, and all of a sudden, we have interest in privacy enhancing technologies.  How about that?  So I found it very interesting that this law was a catalyst for privacy by design.  The other thing that we will see is that in IoT it will be very important to see who is responsible for what.  And that will be decided by courts and by insurance companies.

Because the insurance companies, needless to say, are working on the question of what you can insure and what you cannot insure in the whole IoT chain, of course especially in the industrial IoT but also in the consumer IoT.

So therefore, while laws are not, so to speak, the fix‑it‑all, sometimes they help.  Not always.

>> Thank you very much.  One thing I heard recently is that insurance ‑‑ I don't want to make an advertisement for anything, but cybersecurity insurance is obviously getting very popular.  And that exactly forces companies to put security processes in place in‑house, because insurers want you to be more secure and won't cover you if you're not.  So it's interesting that you mention insurance; that's an interesting checks‑and‑balances approach.

 

>> Thank you.  A little bit along the same line, it was mentioned that obviously laws and regulation are not necessarily always the answer to everything.  But they were partly an answer to the failure in the market of companies to comply with a specific obligation to protect those rights.  If it's baked into the law, there's an obligation to protect security, data, and privacy, not just something adopted freely and voluntarily, and that gives certainty to users.  It provides more confidence to the users that there are at least some minimum standards.  And in case of failure, there's a remedy mechanism.

And I think the point we need to make sure of is that those laws or those measures, in particular around security or technology, are neutral and not imposing specific standards, but just put a positive obligation on companies developing those products to always protect them.

>> Thank you.  Max, do we have something online?

>> Nope, not so far.

>> I'm just asking ‑‑

>> Oh, sorry.  He raised his hand.

>> Okay, thank you.

>> One moment.  Can't hear you yet.

>> I think I will repeat myself a little bit, or repeat what my colleagues just mentioned.  Basically I think regulatory bodies should enforce cybersecurity, for a simple reason.  If you look at the manufacturers of IoT devices, the reason they are probably not pushing cybersecurity in the development life cycle as much as other things is very simple: at the moment, customers are probably not asking for it.  So why should they invest the money to make the IoT devices secure if nobody, at least not the customers, is asking for it?  And it basically makes the IoT device more expensive.

Of course, cybersecurity is an important topic.  Therefore, in my humble opinion, regulatory bodies should encourage or even enforce cybersecurity in the development life cycle of the manufacturers of IoT devices.  Thank you.

>> Thank you.  Chris?

>> Yes.  Now, one of my angles is that I definitely don't want to hinder our technological advancement.  But we need to make people aware of the risks that they face with the different types of devices.  Now, going on to her earlier point about awareness: many of us are taught from a very young age different ways to keep ourselves physically safe.  However, that is not done as much in our digital world, even though our digital world is taking over more and more of our physical space and becoming part of us.  With the advancement in technology, we are looking at the next type of communications protocols beyond 5G, into something called 6E.  That would enable IoT devices to be placed inside our bodies.  And we're still struggling with how to represent that risk to the everyday person.

Now, there's a very good example that came out of the United States.  It's called SBOM, or software bill of materials.  When you pick up a produced piece of food such as a bag of chips or a bag of crisps, you see an ingredients list.  You will see certain things that are, as we would say, a description of the open source materials that are used, such as a carrot, such as a potato.  And then you also see proprietary things such as flavoring agents.  And the idea behind the software bill of materials is to say: please list the open source libraries and things that make up this particular product, while at the same time you can still maintain intellectual property rights by keeping proprietary source code elements private, in a way that we could then all understand.  Now, she made a point while we were prepping for this that it would be nice to see a way that when we purchase an IoT device, even in a Black Friday sale, it might have an image or an icon that shows how secure it is or how much privacy has been baked into it to begin with.
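As a rough illustration only (the product name, component versions, and the vulnerability list below are all hypothetical, not taken from any real advisory), such an SBOM "ingredients list" could be represented and checked like this:

```python
# Hypothetical sketch of a software bill of materials (SBOM): the open source
# "ingredients" are listed, while proprietary code can stay private.
sbom = {
    "product": "SmartCam 2000",  # hypothetical device name
    "components": [
        {"name": "openssl", "version": "1.0.2k", "license": "Apache-2.0"},
        {"name": "busybox", "version": "1.27.2", "license": "GPL-2.0"},
    ],
}

# A buyer or auditor could compare the list against known-vulnerable versions.
known_vulnerable = {("openssl", "1.0.2k")}  # illustrative entry only

def flag_risky(bill, vulnerable):
    """Return names of listed components that match a known-vulnerable version."""
    return [c["name"] for c in bill["components"]
            if (c["name"], c["version"]) in vulnerable]

print(flag_risky(sbom, known_vulnerable))  # prints ['openssl']
```

The point of the format, as described above, is exactly this separation: the open source parts are disclosed and checkable, while proprietary elements remain private.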

So that you know visually, immediately, whether it's a higher‑risk thing or a low‑risk thing to you.  And last week I was presenting one of the keynotes at the Dutch police technology days.  And that presentation won't be shared publicly because of some items that I unfortunately found.  The police now use body cameras.  And teachers in some countries also use body cameras in certain types of schools.  And one of the things I found, which was quite unfortunate, was that for a major vendor of body cameras, their IoT devices and also their I.T. systems that connect to the body cameras for the images to be uploaded and processed were exposed to the internet in an extremely open manner and also exposed actual police officers' private information.  So here we think that we feel more secure by having more police officers, but because they're wearing IoT devices, that actually places us more at risk, because the systems behind them are not required, at least right now, to have security by design.

And unfortunately, that also leads to privacy matters.  So imagine if you found out that in your children's school, the body cameras that a teacher might be wearing are uploading images insecurely so that anybody could see images of the children.  So we need to address some of these things, because there are going to be billions more of these particular devices out there, more and more and more.  So we have to look at the simple fact that we all need to understand the risk in a clear and concise manner.  Thank you very much.

>> That was already built into the General Data Protection ‑‑

>> You want to ‑‑

>> I'm sorry, privacy by design is already built into the data protection directive.  So that is a failure on the manufacturer's side rather than in the legal requirements.

>> So, who sues this guy?

>> Well, not every country that produces these devices has signed on to or really agrees with those principles, which is also a problem.

>> But if you sell that thing in Europe, you have to comply with the European rules, and the data protection regulation holds.

>> We have just ‑‑ okay, I think Max has done something very useful.  Some conclusions maybe just to warm you up because I want you to come up and say something.  There are a lot of microphones.  So just stand up and go to the mic so we know you want to say something.  Max, you want to ‑‑

>> Yes, thank you.  So as Lucy mentioned, I tried to distill what you said, and I don't claim at all for this to be complete.  But please do point out what is missing and what we should add.  If you could switch to the Wiki.  Yeah, this is the Wiki page I mentioned earlier.  And it has the updated planning of today, and then down here it has some notes about the flow.  And at the bottom is where I started to type.  I heard that what is needed is really a mix of different elements, right?  Of governance tools, technology, awareness raising, education, literacy.  Some of the things that were brought up, I list a little bit below: privacy enhancing technology; create laws that force security standards and punish bad behaviors ‑‑ so I definitely need to edit that; that's what it should read.  And the most viable option for a multistakeholder group like the one up here and happening at the IGF would be some kind of nutrition labels, which were just brought up, or seals.  And then best practices in this space that could be promoted: no default usernames and passwords, to ensure those are of high quality; coordinated security updates; devices should be secured through automatic updates where feasible; and having an end‑of‑life plan.

I think that's at least tidbits of what you guys brought up.  So maybe now we see the first comments from the floor.  Please help us.  You can edit this as a user of the Wiki.  Thank you.
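The best practices on that list (no default credentials, automatic updates, an end‑of‑life plan) could be sketched as a simple check.  The field names and the credential list below are purely illustrative assumptions, not any real audit standard:

```python
# Illustrative sketch: auditing a hypothetical device record against the
# best practices collected on the Wiki. Field names are assumptions.
DEFAULT_CREDENTIALS = {("admin", "admin"), ("admin", "password"), ("root", "root")}

def audit(device):
    """Return a list of best-practice violations for a device record."""
    issues = []
    if (device.get("username"), device.get("password")) in DEFAULT_CREDENTIALS:
        issues.append("default username/password still set")
    if not device.get("auto_update", False):
        issues.append("automatic security updates disabled")
    if "end_of_life" not in device:
        issues.append("no end-of-life plan declared")
    return issues

camera = {"username": "admin", "password": "admin", "auto_update": False}
print(audit(camera))  # all three checks fail for this device
```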

>> Thank you very much, Max.  Okay.  The first, thank you for breaking the ice.  Go ahead.  Just maybe introduce yourself.

>> My name is Bert.  I am from the U.S., originally from Europe.  What about distributed security processes?  So instead of looking primarily at regulations, looking at technology to solve the problem?  And what I'm thinking about is something similar to how Bitcoin and blockchain technologies address banking issues and security issues ‑‑ of course introducing different ones, so it's no panacea.  But still, distributed access controls that are governed through consensus mechanisms that are mathematically controlled.  So for instance, in your house all your devices could be linked to your identity, the identity devices you're wearing can be linked to your identity and to other devices.  And you can combine this with encryption like proxy re‑encryption and so on.  I don't want to go into all the technical detail, but there are projects out there that are approaching security and privacy in totally different ways, leading to different challenges because they're all just being created.  But they are technological solutions to some of these problems.

>> Thank you very much.  Anybody wants to react on this?  Martin?

>> I think you will need to look at that as we look at the internet today.  The internet is a network of networks.  And I think it's really about the applications that you talk about and how they deal with security in that specific ecosystem.  Sometimes blockchain may be a solution.  Other times, it may need other solutions.  You already have examples today, for instance, a car, which is a combination of IoT devices with its own security system.  An airplane.  But I can also see that for different services this will emerge.  And my prediction is that we will have several solutions working together.

>> Thank you.  I think blockchain is still at a very experimental stage, and one of the biggest challenges of blockchain is probably the energy consumption.  In order to really use it, you have to use it for very specific issues.  Most of the companies are experimenting with this, but the question is then ‑‑ I'll allow you a remark, because we have a gentleman standing here.

>> Yeah, I'm not talking about using Bitcoin and the world's energy to solve the problem.  But I'm talking about devices that are linked together and governed by the same principles.

>> Thank you.  Please.

>> I'm from an association of Swiss globally operating companies.  These companies had, exactly 20 years ago, already quite an interesting moment, which was the preparation for Y2K.  And this showed for the first time, in some kind of a test case ‑‑ it was an announced catastrophe ‑‑ how the failure of essential systems might play out.  And thankfully it didn't, thanks to much preparation.  I see that our companies are today going back to the planning of that time, and also to the knowledge gained about tackling such situations.

But actually I have a totally different question to the panel, which is on open source software.  Every product has open source elements in it.  According to the panel, who should be responsible for that?

>> I will be silent.

>> So I'm all for open source.  It can have a lot of different benefits, but we also have to be concerned with the fact that a lot of the open source projects that are utilized are actually not very well maintained themselves.  And you also can't provide much, if any, integrity assurance that the open source code being used has not been tainted in some manner.  This is also a concern when we discuss the use of open source.  A lot of IoT devices use a lot of open source because, there again, it's a cost saving.  But at the same time, the average aircraft right now uses over 400 open source libraries ‑‑ that's Boeing, that's Airbus, and so forth.  And there are no controls on who can manipulate and taint the open source libraries.

>> Would you say that there is some kind of, I won't say regulation or control but maybe self‑control necessary?

>> Sometimes there is.  But at the same time, the EU recently started a bounty program for certain key open source projects to find any sort of exploitable vulnerabilities and report them, on things like OpenSSL and things of that nature, because they don't want them to be manipulated, because they are used in so many different devices: from your smart phone to an airplane, to a car, to a burglar alarm, to a regular system like the ones we have on our desks.

So pushing for some of that in a positive way, I think, would be one of the solutions to the security challenges and integrity challenges with open source libraries.

>> Okay, thank you very much.  There was one question raised to the panel in between, namely, how to deal with the IGF Wiki.  We will do this at the end.  Maybe Max can explain at the end how to go to this Wiki.  Oh, I see another gentleman.  Please.

>> Hi.  Back on the same thing as the bounty ‑‑ sorry, Jack from Australia.  Back on the same concept as the EU bounty for open source: given how much reliance there is on open source software, does it make sense for it to be publicly funded so there is proper maintenance of critical infrastructure like that?

>> I'll take this one.  In my personal, professional opinion, I would say yes.  Because we are reliant on certain core key open source projects.  So yes, I would say we should start looking at public funding for some of the things that are key to our modern world.

>> Thank you very much.  I will try to bring back maybe, I don't see anybody ‑‑

>> There is.

>> Oh, sorry.  You're exactly under the light.  So it's kind of like, please go ahead.  Thank you.

>> Yes, my name is Cameron.  So when we talk about solving the security problem through regulation, I cringe, because we all know how we haven't been able to solve the security problem on the normal internet.  And I think it goes back to the business models that IoT providers and the purchasers of the IoT services operate around.

With the exception of a very few geeks, no one goes to buy a product with the security implications as the very first thing on their mind.  They just want a smart home; they want to open it remotely.  So those considerations are secondary.  Which brings me to the fact that it's not even possible: security is a process, an ongoing process, not an event.  And when you talk of regulation, we are looking at Governments.  These are people whose understanding of the things that lead to most of the security vulnerabilities we're aware of is actually quite questionable.  And the process from regulation to reality is not necessarily the fastest.  So I think it is not really a good idea to start trying to come up with regulations on security.

Which to me then says the best thing to do is education.  Because once people understand, then yes, even though when you go to buy something you don't look at the security implications, it is possible to both get what you want and be secure.  Because privacy is as valuable as being able to do the primary activity of your product.  So education and no regulation.

>> Thank you very much.  I think you have raised a very important element, that regulation is something that mostly only helps when there is market failure.  The question on the security side is whether, when companies are actually offering secure products, people will probably just have more trust in them.  And I just want to bring us back ‑‑ because it's been a technical discussion, which I found extremely amazing, but I would like to come back, and she already read my mind, to the users.  What are the user expectations?  Do they need it to be secure, or do they just buy it because it's very cheap?  So maybe you want ‑‑

>> Thanks.  I agree with some of the points that were just made, that education is an important part.  And regulation may be slow.  But I'm not sure that regulation is only necessary when there is market failure.  There are some obligations that private companies have to comply with.  And we've just seen that if we don't put them, a lot of the time, into binding obligations, they just do not happen.  It's unfortunate, but that has been a lot of the reality.

And having an obligation to provide security, without having detailed rules on how you make the product secure, is positive for the users, but it's actually also what leads to innovation in that sector.  Arguably, we have had a lot of the data protection rules which are now under the GDPR in Europe for more than 30 years, and not a lot of privacy‑by‑design products.  It's only when the regulation came that the support was there.  Nothing prevented companies from doing that before.  But when you create those obligations, you actually create the opportunity for innovation.

And obviously education is an important part from the user perspective, but that alone would put too much responsibility on the side of the users.  It's great to have more transparency, and that's an important part, but alone it cannot really work.  And we need to make sure there are some security obligations that exist.

>> Can I add something?

>> Okay, Martin.

>> Well, just a little bit on that GDPR example: no new obligation, but what happened is that there's now a real price when you get caught for not living up to it.  So the risk of not living up to it goes up so much that it's worth a couple percent of your turnover to really invest in it and live up to it.  So I think what you say, the basis of technology‑neutral regulation that comes with functional requirements of good care, makes a lot of sense.

>> I think that actually fits with what she said at the beginning: there were privacy enhancing technology developments in the '90s, but nobody really took them onboard because there was no demand for them.

>> Yeah, basically somebody in a focus group said to me: I think this is really useful; I will implement it if the Government forces me to.  Otherwise, my budget is so stretched that I will not spend a penny on it.  So that was the situation back then.

With regard to laws on technology and requirements, Martin is right when he says they have to be technology‑neutral: by the time the law is passed, the technology is somewhere else entirely.  And we saw this with the 10 years we spent on the review of the data protection regulation.  So that was a funny thing.

With regard to distributed technologies, yes, there are several things out there and lots of very good concepts.  I remember we once developed something which we called the sticky policy paradigm.  Once you were to define your settings for privacy, these settings, or your preferences, would go with you wherever you went on the internet.  So you didn't have to log on each time and do something; it would be automatically done for you, which for the users would be helpful and nice.  And like other things, it's more of a concept than something that has become reality so far.
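The sticky policy idea can be sketched in a few lines.  The field and function names below are hypothetical, chosen only to illustrate that the user's preferences travel attached to the data:

```python
# Toy sketch of the "sticky policy" paradigm: the user's privacy preferences
# are packaged together with the data, so any receiving service consults the
# attached policy instead of its own defaults. All names are illustrative.
def package(data, policy):
    """Attach the user's policy to the data so it travels with it."""
    return {"payload": data, "policy": policy}

def service_may(packet, action):
    """A receiving service checks the attached policy before acting."""
    return packet["policy"].get(action, False)  # deny by default

user_policy = {"allow_tracking": False, "allow_marketing_email": True}
packet = package({"email": "user@example.org"}, user_policy)

print(service_may(packet, "allow_tracking"))         # prints False
print(service_may(packet, "allow_marketing_email"))  # prints True
```

The design choice that makes the paradigm attractive is the deny‑by‑default lookup: a service that receives the packet has no sensible way to act except through the policy the user defined once.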

So if we were to put together all the great ideas that already exist and put them into practice, we would be just fine.

>> Those are great final words.  But we still have to discuss a bit more, trying to challenge the panel and audience all the time.  So I think when we're talking about all the different elements, as Martin said, please do everything we said before, and she is saying just do everything that is already there.  The question is about the balance.  And maybe there is another example, another topic where self‑regulation helped, where peers actually set some best practices and there were companies not following the practices, but there was a kind of public promise to deliver on this with a code of conduct, which is something that the peers, the companies among themselves, are setting.  Would that be something that could help?  I fully agree with the last speaker that cybersecurity is not a final condition.  Cybersecurity is a continuous process where you actually need constant monitoring, constant development, and innovation, which cannot be served only by certification.  Maybe for the next steps we need to think about different approaches for certain devices and for different sectors, but also about the different international approaches around the world: how we can solve it, maybe technically, without entering the question of international norms in cyberspace.  So that's why the question is whether that would be something that would help.

Also trainings.  For example, should coding maybe be obligatory at school?  There is a lot of knowledge ‑‑ and Chris mentioned technology knowledge ‑‑ where we don't want the user to be burdened.  So the question is ‑‑ yeah, Max, I think you have ‑‑

>> He would like to comment.  Can you switch the screen, please?  No, the other one.  There you go.

>> I wanted to say basically I would also agree that the regulation must be, let's say, technology independent; it shouldn't tell you what to do, but it should tell you at least to follow security by design principles.  And one thing which is necessary is awareness.  I think it was just mentioned that coding in school, or just basically awareness in school about data protection, about cybersecurity, and about technology, is needed, because it's not just cybersecurity; technology in general is an ongoing process.  So of course, there is never the one state where we can say we are secure, as in: if you pass the test, then you are secure.  No, we have to continue to monitor and continue to test, and we have to continuously develop the processes and the methodologies we are following.

And again, one thing which was also mentioned: I think there needs to be, let's say, a layering of risk, where specific IoT devices which could harm a person need to be certified and need to be tested more than other devices, and also maybe continuously tested.  And of course, there could be IoT devices which don't cause any harm or can't affect you in any way.  Then they probably don't need as much security testing as an IoT device which could potentially harm you.  Thank you.
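The risk layering described here could be sketched as follows.  The tier names and criteria are assumptions made up for illustration, not any actual certification scheme:

```python
# Sketch of a risk-layered assurance model: the more harm a device can do,
# the more testing it needs. Tiers and criteria are illustrative assumptions.
TIERS = {
    "high": ["certification", "penetration testing", "continuous testing"],
    "medium": ["security testing before release", "periodic re-testing"],
    "low": ["basic self-assessment"],
}

def required_assurance(can_harm_person, handles_personal_data):
    """Map a device's risk profile to the testing it should undergo."""
    if can_harm_person:
        return TIERS["high"]
    if handles_personal_data:
        return TIERS["medium"]
    return TIERS["low"]

# A connected body camera: handles personal data and could harm individuals.
print(required_assurance(can_harm_person=True, handles_personal_data=True))
```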

>> Thank you very much.  I see another gentleman here.

>> Thanks.  Calling me a gentleman is probably a slight exaggeration.  I'm from Ireland; we're a hosting company in Ireland called Blacknight.  I think this is an interesting discussion, but the idea of self‑regulation is possibly a little bit naive.  The cost of producing a lot of the consumer‑level IoT devices is incredibly low.  While the bigger manufacturers might be in a position to set certain standards and regulate at the level that we need, I can't see that happening with the smaller creators, the guys going off to Kickstarter, going off and buying the components and using the cheapest staff to write the code.

So I think the focus really needs to be education, raising awareness.  Getting people to ask questions: what should they be asking the supplier?  Is the thing up to date?  All of those questions.  But this room is full of people who are quite technically aware.  How do we bridge the gap to the people who aren't technically aware?  Go on to any of the Amazon websites at the moment and they're pushing a whole range of connected devices as part of the Black Friday sales.  They are not targeting technically aware people.  They're targeting the average user, the average consumer who just wants to plug something in and get instant satisfaction.

You can train somebody to change the oil in a car, but we've all failed as an industry to train users to update their laptops and keep their websites' CMSs up to date.  Expecting them to ask questions about open source libraries in an IoT device is pie in the sky.  It's just not going to happen.

>> Thank you for the reality check.  And I like the expression, getting instant satisfaction.  But you're absolutely right; that's what we all want when we use the service or use the product.

There is a question over here.  Yes, please.

>> Yes.  I'm a law professor.  And first, I'm a digital immigrant.  And for me, the regulation of security of IoT is uncharted territory.  As our chancellor once put it, (indiscernible).  The second statement I want to make and share with you: we live in a transition period, meaning not everything is digitized.  And so perhaps now we discover the advantages and the efficiency and effectiveness potentials of things of the real world of the past.

And because this is uncharted territory, I want to confront you with my research on interactive toys.  You perhaps know about Cayla.  It was prohibited in Germany, and then Walmart took it from the shelves in the United States, according to my research, as well.

But the point is, I mean, there is spyware in your nursery.  And if the government agency in Germany prohibits the sale and usage of these toys that grandmothers buy for their grandchildren because they think we want to be up to date ‑‑ this is not ‑‑ and they have to destroy it afterwards, you know?  And we don't enforce it.  This is not the best way to deal with it.  And that's why I want to challenge your approach, as a law professor, on human rights and data protection.  I mean, I'm not into data protection law only; I'm into cyber law.  And I'm into protecting human beings.  And perhaps we could advance a differentiation: there are people who simply want to take the risk with Cayla.  Yes, they love it.  They love it.

And we shouldn't only focus on these people, whom I call anti‑cyber protagonists, who want to have data protection.  We should also include in our human rights perspective, in a new cyber law, people who really want to participate in cyberspace with all its opportunities, taking more risks, sharing more data than a lot of people of my generation have ever envisioned before.  I only want to differentiate, as a scientific thesis.  But innovation, not stymying progress, and taking into account different kinds of people, of data subjects, of cyber citizens.

>> Thank you very much.  We have only nine minutes left, as announced.  And I would also like to give the panelists a final word, but we have one comment online, which I would love to hear.

>> Yes, Luke recommended that we should look into security by design as an important approach, and in particular the NIS directive, which might be known to the experts.

>> I will start the final words with Martin because he has to leave five minutes early for the main stage.  So Martin, if you have some wisdom for the end.

>> Well, I very much like what the lady said.  Sorry for not remembering your name.  But you're right, this is a transition period where we have to get used to an environment which is increasingly digitized.  And it needs to be.  There's no way back.  We need it if we want to keep the world working well together.

So it's not about whether we want it done; it's about how we make it happen in a responsible way.  I'm afraid that regulation is needed, with the point made here that it needs to be technology neutral.  In cyberspace, as in real space, there's no difference in the real meaning of the law: to protect us as people from abuse, basically.

At the same time, the risks are different in a digitized world.  And we really need to realize that.  There are several aspects that we need to address in the years to come.  I think really making it risky to put people at risk with your products will be a good incentive for producers and service providers to think twice about how they want to offer their products and services.  And I think that's an essential part.  The other part is essentially value chain thinking.  Because we do go across borders.  We cannot solve this in one place alone.  And with that, I think, recognizing the transition, there will be more to address in the years to come that we are not aware of yet today.

>> Thank you very much, Martin.

>> Thank you. And thank you for your question. I think, from my perspective, when I talk about building a human rights approach, that's not from the perspective of being afraid of those innovations but of getting the most benefit out of them and making sure that it's sustainable. I think that's how the internet and a lot of technology has been built: to be open, secure, and free for all, free in the freedom sense. And not everything has developed that way. Building human rights into the technology is making sure that what was intended remains so. Calling for human rights or data protection doesn't mean that you don't want to use these products; you want to make sure that when you use them, you're not going to get harmed by them.

So I actually do think this human rights approach is a perspective fit for those who want to take the full benefit of those developments. Thank you.

>> Maybe just to add: human rights cannot be waived. So, as she mentioned, privacy enhancing technologies are something that enable users to exercise their rights without even having to know it, because the protection is built in by design. Chris, your final words.

>> Yes, I should add that one of the articles actually states that they must demonstrate security. It does not stipulate at what level, just that they must demonstrate security. On the other hand, when I was with Armco, I worked with the legal department to look at the contracts to ensure that the technology of the third-party suppliers had to adhere to things like a secure software development life cycle, because we knew it was coming, and we wanted to ensure as a company that we were as prepared as possible. We couldn't just hand the liability and risk over to a third party, which is stated in the EU GDPR. And just before I finish: I've got cards for everybody as you're leaving, or I will try to set them out, but I have to run to another workshop as well. They all have real-world case studies, just arranged differently. One of the things we would like you to do is take these back and think about them as decision cards, and I'll ask Max if he wouldn't mind putting them up on the Wiki as well so everybody can see the different questions. Thank you very much.

>> Thank you.

>> Yes, I would like to come back to the question of consumer education, because I think it's really an important issue. We see a lot of material on the internet, as well as material put out by agencies like the German BSI, the I.T. security agency: for example, recommendations for smart home security, and recommendations for smart car security. So there are tons out there. But you're absolutely right, it does not reach everybody. Now, with the digital natives coming of age, that might change a little bit, because of their interest, et cetera.

However, we also have something like independent product testing. In Germany, the latest edition tested wearables and smart watches. And what we will also see is consumer protection agencies bringing class actions, and I think that will make a big difference as well.

>> Thank you. The remote participant now, two minutes.

>> So basically what I wanted to say is that the important thing is making sure that more manufacturers of IoT devices follow secure-by-design principles, so that consumers of IoT devices can enjoy the digital world in a secure way. Therefore, we should spread awareness that cybersecurity is important. And yes, I still think that regulation can help. So thank you very much for having me.

>> Thank you for being remote; full 21st century here on this panel. Thank you all for being here with us. A few last remarks from my part. On regulation: we talked about a lot of moving targets, I would say, with different requirements necessary depending on the different sectors and devices we're targeting. There was the question of the user perspective, from a human rights and from a company perspective, but also the question of whether IoT changes our understanding of privacy, which she mentioned at the beginning: when IoT devices are recording my communication, they're automatically also recording someone else in the room with me. And then of course the question of data protection and privacy as well. Maybe IoT will also change our understanding beyond the binary dimension of "I am producing something, you regulate me" toward the whole chain of software developers, manufacturers, users, and suppliers. So a lot of this is a question of responsibility, but also a question of cyber hygiene. We didn't mention that word today, but we talked a lot about the user and what he or she can do: not just put the password on a post-it, but use different passwords or even two-factor authentication. It was also very interesting that in a cybersecurity workshop we talked so much about privacy and data protection. We mentioned the GDPR, which turns out to be a technology-neutral regulation. They are two sides of the same coin; one is not possible without the other.
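[Editor's note: since two-factor authentication came up in the closing remarks as a concrete piece of cyber hygiene, here is a minimal sketch of how the codes in most authenticator apps are derived (TOTP, RFC 6238). It uses only the Python standard library; the base32 secret shown is a made-up example, not anything from the session.]

```python
# Minimal TOTP (RFC 6238) sketch: the mechanism behind most
# two-factor authentication apps. Standard library only.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, timestamp: float, step: int = 30, digits: int = 6) -> str:
    """Derive a short numeric code from a shared secret and the current time."""
    key = base64.b32decode(secret_b32)
    counter = int(timestamp) // step                # 30-second time window
    msg = struct.pack(">Q", counter)                # counter as 8 big-endian bytes
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Device and server compute the same code for the same time window, so the
# server can verify the user holds the secret without the secret being sent.
print(totp("JBSWY3DPEHPK3PXP", time.time()))        # example secret, rotates every 30s
```

The point for consumers is that the second factor proves possession of a device-held secret, so a password leaked from a post-it or a breach is not enough on its own.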

First of all, I would ask anybody who wants to engage afterwards to come up to us at the panel. We will also explain the Wiki. If I'm not mistaken, it's ‑‑

>> IGF.Wiki.org, and I have put my email address next to the notes in case the cards run out but you would like to engage in the follow-up. In typical IGF manner, we could only start the conversation today, but I think it might be interesting to have a conversation that starts now. I would also like to point out that tomorrow the IoT dynamic coalition has its meeting, from 9:30 to 11:30 if I'm not mistaken. Martin will chair that meeting, and it will be nice to build on the discussion today and to have some continuation on that level.

>> Thank you, Max. So we're just starting the discussion. We're trying to take into consideration what was discussed in the past, but also to formalize the questions and move on to the next steps and what to expect in the near future. Thank you very much for joining us.

(applause)

 
