16 SEPTEMBER 10
DATA IN THE CLOUD: WHERE DO OPEN STANDARDS FIT IN?
Note: The following is the output of the real-time captioning taken during Fifth Meeting of the IGF, in Vilnius. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.
>> PRANESH PRAKASH: Good afternoon, everyone. I'm sorry we are starting late; I was under the impression that the session was supposed to start at 2:30. Today we are looking at data in the cloud and where open standards really fit in. We have among us Daniel Dardailler from W3C; Jeremy Malcolm, Consumers International; Karsten Gerloff of the Free Software Foundation Europe; Vinton Cerf, father of the internet; Wilfried Grommen, from Microsoft; and hopefully we will have Kevin Bankston from EFF for a short interjection later on. Today we will be covering a large array of themes, from the technical, to how the cloud actually affects consumers, to how developers work with it, as well as the general principles of how we deal with interoperability in the cloud, and why developing countries need to be interested in this issue. Without further ado, I would like to give the floor to Mr. Cerf.
>> VINTON CERF: Thank you very much. I hope I'm audible; I know we have competition. The headset is plugged into the speaker system here, and I'm not listening to the football game or something! Thank you very much for allowing me to open this discussion. I wanted to start out by suggesting to you that there is a very old concept that was very useful in the original ARPANET design work. In those days every computer had a different kind of terminal attached to it. When we were trying to get all of the computers of the ARPANET to talk to each other and offer remote access to time-sharing systems, there was a question whether the ARPANET needed to know about every possible character set and everything else. We invented the network virtual terminal. It didn't exist; it was just a concept. But every computer on the ARPANET would present to the rest of the computers this virtual terminal appearance, so any real terminal was transformed in real time and presented as this virtual terminal. There is a protocol called Telnet that made use of this. I would like to suggest conceptually that we may need to do the same thing in the cloud environment, because the clouds that we know about, like the ones that Google and Microsoft and Amazon and IBM operate, are not the same; they have different APIs and interfaces. So we might need to invent the notion of a network virtual cloud: a set of conventions that the clouds choose to present to each other. Each individual cloud can implement the virtual cloud however it likes, but it would need to have enough functionality that you could do useful things between clouds. So in some ways, in 2010 we are at the same place the internet was in 1973, when we didn't have a way of getting computer networks to connect in a uniform way.
The second thing is... I can't read my own handwriting! Here we go. So now, how will they interact? The simplest thing I can think of is that the user wants to get his data out of one cloud and present it to another. The user might want to tell cloud A to move data from cloud A to cloud B. The first problem is how to get cloud A to know about the existence of cloud B. We need a vocabulary that lets clouds refer to each other. The second thing is we need a way to express the data that's in cloud A in a way that cloud B can receive, understand and absorb correctly. Third, the data in cloud A may have metadata that tells cloud A how to protect that information, how to control access to the information in cloud A. We need to move that metadata to cloud B in a way that cloud B can correctly implement the access controls, so we need to have conventions for this kind of access control and metadata as well. If we decide to get ambitious, not only might we want to move data back and forth, but we might also want to be able to start computations or processes in cloud A and have them interact in real time with processes running in cloud B. Maybe cloud C and cloud D, too. This is getting ambitious, because now we need to have running processes find each other in the different clouds and actively exchange information during the course of a computation. So as we see the potential for using the special features of various clouds, we see this possibility of interprocess communication between the clouds. The only other point I would like to suggest in this conceptual framework is that users of today's clouds come in many forms, with laptops and desktops, but increasingly they're coming with PDAs and mobiles. So one interesting question is: how do those devices look to the clouds? And one conceptual idea is to treat a mobile as if it's a little cloud itself.
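As an aside for readers of this transcript, the inter-cloud conventions Cerf describes (a shared naming vocabulary, data exchange, and metadata that carries access controls from one cloud to another) can be sketched as a toy model. Everything below is illustrative; none of the names correspond to any real cloud API.

```python
# Toy model of a "network virtual cloud": every provider presents the
# same minimal interface, so data plus its access-control metadata can
# move between otherwise incompatible clouds. All names are invented.

class VirtualCloud:
    """The common face each cloud presents to the other clouds."""

    registry = {}  # shared vocabulary: cloud name -> cloud object

    def __init__(self, name):
        self.name = name
        self.objects = {}  # key -> (data, metadata)
        VirtualCloud.registry[name] = self

    def put(self, key, data, metadata):
        # Metadata travels with the data, e.g. {"acl": ["alice"]}.
        self.objects[key] = (data, dict(metadata))

    def get(self, key, requester):
        data, meta = self.objects[key]
        if requester not in meta.get("acl", []):
            raise PermissionError(f"{requester} may not read {key}")
        return data, meta

    def transfer(self, key, dest_name, requester):
        # "Tell cloud A to move data from cloud A to cloud B": the
        # access controls are re-implemented at the destination.
        data, meta = self.get(key, requester)
        VirtualCloud.registry[dest_name].put(key, data, meta)


cloud_a = VirtualCloud("A")
cloud_b = VirtualCloud("B")
cloud_a.put("report", b"quarterly figures", {"acl": ["alice"]})
cloud_a.transfer("report", "B", requester="alice")
```

The point of the sketch is Cerf's: once both clouds agree on the shape of `put`, `get` and the metadata conventions, the transfer itself is trivial.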
That is to say, it has equal standing with the gigantic, huge computational cloud that's out there somewhere in the internet. This is no different from the philosophy that drove the internet protocols, where a small host was seen as no different from the biggest computer. The idea of treating the mobile as a little cloud leads to some uniformity, and to the possibility that the little PDA or mobile that's doing a small amount of work for you could be moved functionally into another cloud and still be seen by the rest of the clouds as a cloud, so the uniformity has some attraction. Now, whether this is a good or bad idea, I honestly don't know, but I think having uniformity and not forcing distinctions because of computational power and memory and so on may actually be quite useful. So I leave that to the discussion and the presentations that will follow, and thank you for allowing me to participate today.
>> PRANESH PRAKASH: Thank you Mr. Cerf. I would ask the audience to hold questions until the end of the session where we will bunch them together and have participation as well. I would next call on Wilfried Grommen from Microsoft.
>> WILFRIED GROMMEN: Thank you, Mr. Chairperson. Good afternoon, everybody. It's in fact a pleasure to hear the forward-looking vision of the inventor of (Away from mic.) I am going to stay with the actual status of cloud deployment and cloud offerings, and as the title of the work session was really focusing on the open cloud and on standards, I would like to put that a little bit into the context of what we call the interoperability principles of cloud platforms, because in the end there are a number of dimensions to be taken into consideration in defining the openness of cloud platforms, today or tomorrow. So, in fact, I would like to cover four topics. One is data portability, the way that data gets in and out of clouds; it has been noted that this can go very deep if you talk also about a metadata layer. The second is the ease of migration from existing infrastructures to cloud computing models, and what that would involve, or changing from one cloud vendor's infrastructure to another. The third is the discussion about developers, meaning in what ways developers have freedom of tools and semantics, and I would like to conclude with standards as such. If you talk about data portability, the principle is clear: cloud platforms should facilitate the movement of data in and out of the cloud. How do you make this tangible? We document formats, and we adhere to web messaging standards, like REST and AtomPub, and at the application level there are open standards, like document format standards. There is still a lot to be done, and there is still a lot of custom data, so I think cloud providers should now document their custom data formats, and I give you one example. We just released, in fact, to open source two kits to decode Outlook files, whether they are in the cloud or on on-premises infrastructure.
We have launched an initiative called the Open Data Protocol, which is, in fact, an extension of the AtomPub and REST protocols and which makes it easy to publish data on the internet and for the cloud. That specification has been released, so it is, in fact, a free release from our perspective.
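For readers unfamiliar with this lineage: because the protocol builds on the Atom publishing format, any conforming client can read published data with nothing more than an XML parser, which is what makes the data portable. A minimal sketch using only the Python standard library; the feed content here is a made-up example, not real service output.

```python
import xml.etree.ElementTree as ET

# A tiny Atom feed of the kind OData-style services publish. Any
# client with a plain XML parser can consume it; that is the
# portability argument. The feed content is a made-up example.
FEED = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Customers</title>
  <entry><title>Alice</title><id>1</id></entry>
  <entry><title>Bob</title><id>2</id></entry>
</feed>"""

ATOM = "{http://www.w3.org/2005/Atom}"  # Atom XML namespace

def entry_titles(feed_xml):
    """Return the title of every entry in an Atom feed."""
    root = ET.fromstring(feed_xml)
    return [e.find(ATOM + "title").text for e in root.findall(ATOM + "entry")]
```

Calling `entry_titles(FEED)` yields `["Alice", "Bob"]`; no vendor library is involved at any step.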
So the concept, yes: platforms should facilitate the movement of customers' data in and out of the cloud. The second thing is the ease of migration. Cloud platforms should provide a secure migration path to preserve the investments which people have made in their infrastructure, so there should be co-existence between on-premise software, private cloud and public cloud. These discussions about migration today are, I think, covered by individual cloud providers; in our instance we do provide tools, measurements, and services to ease this migration, but I'm talking then about Microsoft's cloud offering. I think there is a need, and it was mentioned clearly here, to really come to a common kind of user infrastructure where these services are mentioned.
There are services where this seems to be easier already, and accepted. I am talking about identity management. We as an industry seem to have aligned on federated identity management with WS-* and other protocols, where bridging is enabled.
There are, of course, initiatives around virtual machine portability, like the one the OVF group is working on, which I think is another kind of evolution in standardization for the ease of migration. The third element is developer choice. Cloud platforms should offer developers choice in software development tools, languages, and runtimes. This is really our opinion today as a company, and we try to live up to that expectation: in the deployment and offering of our services around Azure, we clearly state that PHP, Java and software tools like Eclipse can today be used with the services of our cloud. But the concept is there that developer choice should exist in programming tools, languages and runtimes used with cloud platforms.
Then the last one, to conclude: what about standards? Cloud platforms should support commonly used standards. I don't think that anyone questions that. But we have to be realistic that today there is a need for new standards, to really comply and live up to what clouds are able to offer, and in the future road map, as described, there is a lot of work which can be done. So our position in general as an organisation could be summarized like this: as a cloud provider, you have to make it easy to move data in and out of your platform, and there should be a way that you evolve to do that.
Data within the cloud is available and can be exported; there is still a lot of work to be done about the process element of data, the metadata aspects of it, and I think we are at the beginning of discussing, as an industry, what to do with that and what the next step could be. Then in general, I think cloud platforms should support widely accepted industry standards, and where they do not exist, I think we have to start working on them and adopt them when they are industry ready. Thank you.
>> PRANESH PRAKASH: I would like to call on Daniel Dardailler to speak now.
>> DANIEL DARDAILLER: Hello, good afternoon. I'm going to try to present to you shortly what the World Wide Web Consortium has been doing with cloud work. Even though we don't have a cloud ourselves, we've been working for the cloud; the cloud existed before it was called the "cloud". The idea of using the web as an application platform, running a scripting language, using HTML to present that, and JavaScript to access a server, that was sort of the tip of the iceberg that we have been working on for a while. So recently (Away from mic.) Focusing on the application part of the cloud, it is web integrated, so that you can find the data in a meaningful way, with sort of an application model where the data is stored in the cloud. There is not much difference for us, because of the independence of those layers. The World Wide Web Consortium never actually started anything on computer languages; if you remember, ten years ago we were asked to standardize Java because it was part of the web, but at that time we decided there was not enough of a web technology there. Java is a computer language that happens to be useful for the web, but it's not a web technology, so we never worked on computer languages like JavaScript or Java. Now we see a need for standardizing those languages or others: how do you read and write and update data in sort of a universal way, using some cloud discovery mechanism and things like that? That, again, is something we're not planning on working on, because it's mostly a computer-level kind of work item.
At Amazon, the cloud technology is SOAP and web services, which are web technologies, and we think that they are stable enough today that they can actually be exploited like that by the greater community: by the cloud community, and business in general. We've been in sort of maintenance mode for these basic web technologies, but there is still some work to do, of course, as people working at the platform and middleware level have to use HTTP and other web technologies. So all in all there is a lot of cloud activity at the World Wide Web Consortium. There is not a name for our project, but we realise there are parts missing. Web identity is not something we have found a solution for yet, so that you can share private data across clouds and web sites, and that's something that everybody wants to standardize, but there is no clear place to start from, so we're probably going to look at that, because it's a need for the social web, not just for the cloud area. That's about all I have to say. Thank you.
>> PRANESH PRAKASH: Next is Jeremy Malcolm who will bring in the consumers' thoughts on this.
>> JEREMY MALCOLM: I'm going to be looking at the cloud from the perspective of access to knowledge for consumers. That may seem a little out there, but hopefully it will become clear. Consumers seek fair access to the fruits of their society's culture, and this is what we call "access to knowledge". Where this culture is locked up, through digital rights management and the products that you have to use to access it, you can already see why access to knowledge may have a bearing on open standards. I will look at access to knowledge and open standards outside of the cloud before I move on to the cloud: things like document and media formats. With document formats, it comes down to the well-known quote of Bob Young of Red Hat, who asked the question: would you buy a car with the hood welded shut? Proprietary document formats promote vendor lock-in: they stop you from moving from one software application to another with your documents. They inhibit the ability to maintain access to a document after the software that created it is no longer supported. On the other hand, open formats do facilitate those things. So you have proprietary document formats, such as those of word processors, proprietary devices, and other things. (Away from mic.) Even PDF has been an open standard since 2008. With media formats, it's a similar story: we have the same problems with lock-in and archiving of proprietary media formats. Additionally, proprietary media formats are likely to be locked up with DRM to a greater extent than document formats, and we also have the issue that they can't be used in free software; my friend from the Free Software Foundation Europe is going to talk to us about that. So ideally they could be read and written freely. Being an open standard in a narrow sense is not necessarily enough for a format to be a really free open standard, because it may be patent encumbered.
You've got a scale or continuum of media formats, from the totally proprietary, such as Windows Media, to the unencumbered, and in the middle you have standards like MP3. So really, from the consumer's point of view, you want the most open and least encumbered format, because that gives you the greatest freedom to use it in free software, and to create as well as to consume; consumers nowadays are creators as well. So this is outside of the cloud: outside of the cloud, open standards and interoperability support access to knowledge. Standards are a partial guarantee of that, but we need standards that can be implemented freely and that aren't locked up with DRM or with patent restrictions. Moving on to the cloud, it really builds on what I've said. The same concerns as outside the cloud apply: interoperability, lock-in and archival, exactly the same issues, except that the cloud raises additional concerns because the data and code are physically removed from the user. At least when the user buys a piece of software and installs it on their home computer, they can to some degree hack it more easily than when it's in the cloud and out of their physical control. Moreover, when users' information or documents are stored in a cloud, often they don't know what standards have been used to store them. For example, in Google Docs the user has no idea what format a document is stored in, or whether it complies with standards; sure, you can export it, but what format is it stored in? The user has no way of finding that out. The other additional problem with the cloud for consumers is that the use of proprietary standards may export restrictive intellectual property laws from where the cloud is hosted to where the user is consuming it, which may be a completely different country. So, for example, they may be infringing a patent because it's enforced in the place where the cloud app is hosted.
So you can see that cloud applications that don't follow open standards have the same problems as outside of the cloud, plus more.
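Malcolm's point that users often cannot tell what format an exported document is in has a small practical counterpoint: many common formats can at least be identified from their leading "magic" bytes. A stdlib-only sketch; the signature table is partial and illustrative.

```python
# Identify a document's format from its leading "magic" bytes, since a
# cloud export rarely tells you. The signature table is deliberately
# partial; it illustrates the idea rather than covering every format.
SIGNATURES = [
    (b"%PDF-", "PDF"),
    (b"PK\x03\x04", "ZIP container (e.g. an ODF or OOXML document)"),
    (b"{\\rtf", "RTF"),
]

def sniff_format(blob):
    """Return a best-guess format name for a file's opening bytes."""
    for magic, name in SIGNATURES:
        if blob.startswith(magic):
            return name
    return "unknown"
```

For example, `sniff_format(b"%PDF-1.4 ...")` returns `"PDF"`, while an unrecognised prefix returns `"unknown"`.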
Now, I'm going to look at some concrete examples of this. I'm going to talk about three consumer-facing cloud applications. You'll see from these examples that I'm talking about cloud applications in the broadest sense: social networking applications like Facebook, microblogging applications, Twitter being the example there, and peer-produced media such as YouTube. Beginning with the first of those, social networking: how do open standards for social networking impact consumers' interests? Interoperability with other social networks is important for consumers, and open standards can help with this. Users want to have a single social graph so they don't have to reconstruct their "friend" networks in every social networking web site that they sign up with.
They want interoperability between networks to share their status updates and their messaging, so interoperability with other networks would be assisted by the use of open standards. Secondly, they want privacy controls, and there is a problem with being locked in. Take Facebook, for example: it has been notorious for stealth privacy changes, slipping them in under users' notice and opening up their activities to their friends or their friends' friends. Open standards can be a way to give you greater control over your levels of privacy. Portability and ownership of user information is also very important. What if you want to take your information out of the social network and put it back into your own physical control? This is partly a technical issue and partly also a legal issue, because some networks lay claim to your own data. When you contribute your own data to an online web service or web site, they may claim to "own" that data under their terms of service. That's a bit out of the scope of this discussion, but there is a technical issue in that you want to be able to get your data out, and if the service uses an open standard, or exports data in an openly specified format, that is in the consumer's interest. So there is movement towards open standards in social networking. Just today or yesterday, Diaspora had its source code released, and it's being used by some people. That's the up-and-coming Facebook challenger that's based on open standards. There is another one called "OneSocialWeb"; it's based on XMPP, which is the standard best known for its use in Jabber. Microblogging is another area where we have concerns about the lack of use of standards. For example, if you put your reliance on Twitter, then when Twitter goes down or is attacked you still want to be able to have access to your updates and to continue tracking the same contacts.
If you used a federated network of microblogging sites that communicated between themselves using an open standard, that would be easier, and there is such a network, called StatusNet, deployed at Identi.ca, which you can use. It uses a standard called OStatus, which is admittedly young and not mature, and which developed out of a previous standard called OpenMicroBlogging. It would help keep you from putting all your eggs in one basket, and it can help with client software. For example, Twitter has enforced the use of a standard called OAuth for authenticating with the server, and the problem with this is that open source software is required to use a secret key to make use of this authentication, and the secret key can be read in the source code. This has caused problems for open source software developers who want to develop clients that can interoperate with Twitter. So it is better to look at something like StatusNet, which is designed from the ground up to be fully compatible with open source clients. I'm probably running out of time here. I wanted to talk about video standards, and I won't say very much there except that, of course, there has been discussion around cross-platform, cross-browser standards for open video in HTML5. At the moment you have to encode in two different formats if you want to catch all the browsers. Hopefully that will change in the future, because Google has purchased the VP8 video codec and has bundled it into a new standard called WebM, and the licensing authority has threatened to try to create a patent pool for VP8, so we'll have to see how that plays out: will we have an open standard for online video or won't we?
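The OAuth problem Malcolm mentions can be made concrete: a desktop or open source client has to embed its consumer secret in order to sign requests, so anyone who can read the program can produce identical signatures. A sketch of just the signing step using Python's standard library; the key and message are made-up examples, and real OAuth 1.0a signing involves more parameters than shown here.

```python
import hmac
import hashlib

# An open source client must ship its consumer secret inside every
# copy of the program in order to sign requests. The secret below is
# a made-up example; the point is that it cannot stay secret.
CONSUMER_SECRET = b"not-actually-secret"

def sign_request(message):
    """Sign a request body the way an HMAC-SHA1-based client would."""
    return hmac.new(CONSUMER_SECRET, message, hashlib.sha1).hexdigest()

official = sign_request(b"status=hello")

# Anyone who has read the source code can forge the same signature,
# because the "secret" is right there in the program text:
forged = hmac.new(b"not-actually-secret", b"status=hello",
                  hashlib.sha1).hexdigest()
```

Since `official == forged`, the server has no way to tell the real client from an impostor built from the published source, which is exactly the interoperability problem described above.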
I will conclude by saying that open standards can serve consumers' interests in choice, competition, privacy and access to knowledge, but we have to be careful that cloud services don't raise new concerns for consumers in these areas, because they take data away from consumers' control and they're not transparent about what standards they use. The use of open standards in cloud services can help to facilitate interoperability and alleviate consumers' concerns in these areas. Thanks.
>> PRANESH PRAKASH: Thank you very much, Jeremy. We will have time after all the presenters finish to carry this on, so you will get more time. We have Karsten Gerloff of the Free Software Foundation Europe.
>> KARSTEN GERLOFF: Thank you. I would like to thank the previous speakers. Jeremy, you've taken the load off me by explaining many things that I wanted to explain, and I'm happy I don't have to anymore!
(Laughter.) So I will move on to the visionary things, as is my privilege as the president of a foundation that prides itself on looking to the long term, looking into the future even though we might not know at this moment what it will turn out like. So I will not speak so much about technology as about power, and about control, and about freedom. We've heard about data portability. There are a number of clouds out there. What is a cloud, anyway? I actually bothered for the first time to look it up on Wikipedia this morning, and the explanation (it's a long article, but it's nicely summed up in the first paragraph) was that cloud computing is basically the idea of having computing be like electricity: you just plug in, you don't care where it comes from. I don't know where my power comes from exactly, but then I am not sending my private data up that outlet. So, data portability: we have many different clouds out there, and if we as customers or consumers want to move between these clouds, from one to another, maybe between social networks, or between computing platforms like Amazon's EC2 and something else, we want to take our data with us, and our metadata as well: not just the information but the meta-information about how the data is organized and how it should be read. That takes us to the second point which has been mentioned, interoperability, which goes further: you should not only be able to import and export that data, you should be able to have those clouds talk to each other. Now, an important element here is, of course, open standards. Since we've been hearing about open standards so much, I would like to restate what open standards actually are. These are standards that can be freely implemented by anyone without asking permission from anybody.
These are standards that are maintained in a public, participative process, that have several implementations, preferably at least one in free software. And an important aspect which is often forgotten: for a standard to be open, and for users to derive the benefits of openness, the patents that are referenced in the standard need to be licensed royalty-free, otherwise there can be no implementation in free software, and we will simply not have a competitive environment around this standard.
So, as clouds become ever more important, economically and from a user's point of view, it's important that they are based around open standards.
Now, this is what we've had so far; let me move on to the visionary part. Right now, clouds are mainly... you said that we are back in 1973, and I agree. When we talk to a cloud today, it's basically much like talking to a mainframe in 1973, except that I wasn't around then!
>> VINTON CERF: I know, but some of us were.
>> KARSTEN GERLOFF: I'll rely on my left hand neighbor to supply the comparison!
So 1973 might have been a good year, but we didn't want to remain there, and I don't think we want to reproduce the age of mainframes today, so we need to move forward. Why not retrace the development of the internet as it was? Why not have devices that are equals in speaking to each other? Why not have distributed clouds? Peer-to-peer clouds? Where will the data centres come from, you ask? Where will the servers be that run this distributed peer-to-peer cloud? They will be in your living room and in mine. They will be small servers; if I had slides I could show you pictures, but there are servers about this size that cost $100, presumably less if you buy them in bulk, or even routers that have been reconfigured with a free operating system as firmware to function as a server. These have low power consumption, can stay connected to the net, have extremely low complexity and, I think, no moving parts. So why not use this hardware? Why not pull together all the free software packages that are out there, develop the few that are missing, and build a stack that can be installed on one of these devices? Instantly, after you configure your name and email address and perhaps a few other parameters, it will connect you to other devices of this sort, or a subset of your choosing, and will give you a mail server, data storage, instant messaging, social networking of the distributed sort, anything you can imagine. Let's simply cut the mainframes out of the picture! I'm not sure we can do it today, but having seen how free software advanced from an academic pursuit and idea 25 years ago to a $50 billion economy today, I'm in no doubt that it's achievable, and within a rather short time frame, let's say three years. So why am I harping on distributed peer-to-peer? Because it removes the central point of control.
For us at the Free Software Foundation Europe, the reason we care about the freedom to use, study, and improve software is not just because we like downloading for free. Well, we do, but it's not the main reason! We believe that in a society which is essentially running on computers, where the computer is a central tool that we all use every day in so many different devices, in your phone, your refrigerator, your car, your train, it's essential that we be able to control those devices and the infrastructure, that we know what's going on within our tools, and that we can shape them before they shape us! So coming from this idea, let's look at the clouds as they are today. Free software is defined by the four freedoms it gives you. Ah, yes, the four freedoms: to use the software for any purpose; to study the software and understand it, for which you need to be able to look at the source form of that software; to share the software with anyone you want, just as I can share a hammer with my neighbour, so why shouldn't I share my tax programme? Sharing it doesn't deprive me of my copy, and it's a tool too. And the freedom to improve it to suit your needs, and to pass on those freedoms. Which of those freedoms does the cloud as it stands today give us? Let's go one by one. Freedom zero, as we like to count: the freedom to use the software for any purpose? Well, we can use most cloud services, and some of them don't even charge you. But most of them have terms of service which state clearly the purposes for which you may not use the service. So they're usable, but not for any purpose, so we get half a point on freedom zero. Freedom 1: to study the software and understand how it works. Many people love, for example, the Google Mail interface. But I don't think anyone outside Google has been able to look at the source and understand how it works; or take Facebook. Both services run, as far as I know, almost entirely on free software, yet you can't really look at how they do what they do.
So you can't study it, zero points on freedom 1.
Then freedom 2: to share the code. You don't have the code, so you can't share it: zero points. And since you don't have it, you can't improve it either. So out of four freedoms, these mainframe clouds as they are today give us half a freedom. This is why I say: let's build distributed infrastructures, let's build peer-to-peer clouds that we all control. Let's invent free software, which we can examine, that encrypts our data, breaks it up into different segments, and stores those segments on servers in encrypted form, with us as the only people able to pull them together on our client, put them back into the form they were meant to be read in, and decrypt them. For this vision, open standards are instrumental, but they're not sufficient by themselves right now. We have the hardware; we have most of the software; it needs development, but as Jeremy mentioned, just yesterday or today, depending on which time zone you count in, the Diaspora project released its long-awaited developer release, and there are a number of other free distributed social network projects out there which don't have quite as high a profile but are viable, so we're in an exciting phase. Let's build software that allows us not just to share but to distribute our data, and to be sure that we are the only ones who decide when and with whom it gets shared. That's, I believe, all I have to say about this for now. I'm looking forward to your questions.
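Gerloff's scheme (segment the data so that no single server can read it, with only the owner able to reassemble it) can be illustrated with a minimal n-way XOR split: each share on its own is indistinguishable from random noise, and only the XOR of all shares restores the data. This is only a sketch; a real system would add erasure coding and authenticated encryption on top.

```python
import secrets
from functools import reduce

# Split data into n shares such that every share is random noise on
# its own and the XOR of all n shares reconstructs the original.
# Each share would live on a different server; only the client that
# holds all of them can read the data. Sketch only: no redundancy,
# no authentication.

def split(data, n):
    """Return n byte strings whose XOR equals `data`."""
    shares = [secrets.token_bytes(len(data)) for _ in range(n - 1)]
    last = bytes(reduce(lambda a, b: a ^ b, column)
                 for column in zip(data, *shares))
    return shares + [last]

def combine(shares):
    """XOR all shares back together to recover the original data."""
    return bytes(reduce(lambda a, b: a ^ b, column)
                 for column in zip(*shares))
```

With fewer than all the shares, an attacker learns nothing but the length; with all of them, `combine(split(data, n))` returns the data exactly.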
>> PRANESH PRAKASH: Thank you very much, Karsten. A very rich talk, looking both at where we stand right now as well as where we could possibly go to and we have one quick intervention, Mr. Cerf?
>> VINTON CERF: So first of all, I love the scope of your vision; it's pretty ambitious. I need to remind you, of course, that there is a difference between vision and hallucination, so we hope we can distinguish between the two. I don't know how far we will be able to go with open source, but Google, as you know, is very committed to making a lot of open source available, for many of the reasons that you suggest. I wanted to bring up one other point, which touches on the standards issues and on, let me say, the persistence and preservation of data over long periods of time. I have been characterizing this as the bit rot problem, and what I mean, simply, is that when we have information in digital form, whether we put it in clouds or on our personal machines or on memory sticks, often that information is not understandable without an application that can interpret that format. And if the engine that runs the application is no longer available, even if the bits are there, the data isn't very useful. So the worry that I have, and I don't have a good solution at the moment, maybe not even an open source one, is how to make sure that over long periods of time, and here I'm thinking hundreds of years, information that we have composed using various tools, regardless of whether they are proprietary or open source, can continue to be correctly interpreted. There are intellectual property hazards associated with trying to maintain access to this information and its ability to be interpreted. If some company offers a proprietary package, and we create data with it, and later they say they can't support it anymore, or they say they will support it but only on these operating systems, and too bad if you don't have one of those, there are barriers to our ability to keep information correctly interpreted.
So I'm not suggesting a solution here but I am saying if we don't tackle that problem, and I think it can potentially be tackled in part in the cloud environment if the clouds, for example, can run multiple operating systems and multiple applications over long periods of time, if we don't do that, then we will lose our digital memory, and people in the 22nd century will wonder what the 21st century was all about because it will be documented in digital records that nobody can interpret. Just to make the problem of cloud computing harder, I add that to the list of things we should attempt to achieve.
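One partial mitigation for bit rot, in the spirit of the open-formats discussion above, is to make stored data self-describing. The sketch below is purely illustrative (the function and field names are invented for this example): it bundles each record with a plain-text schema inside a widely documented container format (JSON), so a future reader needs only the container specification, not the original application.

```python
import json

def archive_record(values, schema):
    """Bundle data together with a plain-text description of its own
    format, so the bits stay interpretable even if the original
    application is gone."""
    return json.dumps({
        "format_version": 1,
        "schema": schema,  # human- and machine-readable field descriptions
        "data": values,
    }, indent=2)

def restore_record(blob):
    """Any future reader that understands the container format can
    recover both the data and the meaning of each field."""
    bundle = json.loads(blob)
    return bundle["schema"], bundle["data"]

blob = archive_record(
    {"title": "IGF 2010 session transcript", "year": 2010},
    {"title": "document title (text)", "year": "calendar year (integer)"},
)
schema, data = restore_record(blob)
```

This only pushes the problem down a level, of course: someone must still preserve a reader for the container format, which is why an open, widely implemented specification matters.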
>> PRANESH PRAKASH: I think this is a wonderful discussion we are having here between the perhaps conflicting requirements of privacy and memory and with that I would like to move on to Viive Näslund.
>> VIIVE NÄSLUND: Thank you. I would like to talk about where developing countries fit into this debate when talking about open standards and data in the clouds. We know that developing countries are moving to implement cloud computing, and we can see this for example with the growth in data centres in many developing countries. But it remains the case that the majority of developing countries have scarce resources and are still making huge investments in basic ICT infrastructure, dealing with issues of electricity and ensuring reliable, high speed internet connectivity, which is the basis of making cloud computing work, so this is not an issue of the present. It is likely to be an issue for the future, but for the near present, cloud computing is still not widely used in developing countries. Notwithstanding that, the way in which cloud computing develops now will have a significant impact on developing countries. Cloud computing is in an infancy stage and therefore we need to give attention to building the proper foundations to create a cloud ecosystem that will be supportive of the information society that is based on the principles we have agreed upon. For developing countries in particular that means ensuring equitable access to information and knowledge for all, in recognition of the role that ICT plays in economic growth and development.
Now, we have heard a lot about the promise of cloud computing for developing countries, and there was a previous session today that also discussed these issues, so I won't say too much on that, but I will say that the promise of cloud computing for developing countries depends on how cloud computing develops. One of the basic promises of cloud computing for developing countries is to lower the costs of IT infrastructure and the management of such infrastructure: the idea that we might be using virtual devices, that we might have more computing power available to process data and to store data. The benefits that are usually highlighted are for governments, in improving e-government and providing government services; there is talk about the potential for small and medium enterprises of developing countries to access cloud computing as a more level playing field, where the entry barriers are lower for them to provide their services on these platforms; and for consumers, the ability to access multiple applications and even become developers.
So, again, to highlight the point: whether this promise may be realised or not depends partly on how cloud computing continues to evolve. Here we come to the question of where open standards fit in.
A lot on this has been said before, and what I would add is that the main issues that have been highlighted really apply in the same way to developing countries. An additional element would be the importance of affordability of and access to cloud computing services, and the other points that have been made previously are equally important. These being: interoperability among the clouds, we know now there is a mix of private and public clouds, and in the case of governments, for example, we will be seeing these challenges in the interoperability of clouds, and here there is an important role for open standards. On the issue of portability, when we are thinking about countries implementing cloud computing, we need to be sure they will be able to move data in and out from the cloud. On the issue of functionality of devices, I like the idea that mobile phones might be looked at as cloudlets, and we know they have the highest penetration in developing countries; particularly for rural areas this is quite significant, so we should focus on ensuring that we have functionality of these devices and really try to ensure that mobile phones can be used to access the clouds in an interoperable manner. The ease of migration from existing infrastructure to cloud computing models is, again, equally important. Related to open standards, the need to avoid vendor lock-in is usually crucial for developing countries; right now governments are being approached by vendors who are selling related services, and toward the future it will be extremely important to ensure that there will be continued choice among clouds and, again, interoperability between them. Finally, another issue: the freedom for developers is crucial in ensuring that there will be choice in these tools, but at the same time we must ensure that developing country developers will also be able to carry out their activities in an open and reliable and secure cloud. Thank you.
>> PRANESH PRAKASH: Thank you very much, Viive.
We had Mr. Cerf talking about cloud computing and pointing out issues of data and metadata portability; Wilfried underlined that and spoke of how we use standards as a means of ensuring interoperability and the importance of keeping those standards open. Then we had Daniel talking about the way protocols develop into standards, how this happens in a bottom-up fashion, how we need to account for that when we talk about standards, and how there are so many different issues that we haven't grappled with, including online identity, for instance, and how, when we are looking at cloud computing and cloud platforms, such issues really need to be dealt with. Then Jeremy spoke about consumer concerns in cloud computing and the importance, from the user's perspective, of being able to shift between different services, devices and applications and still retain control over their own data. Karsten reminded us what open standards really are and presented an interesting idea of federated and distributed peer-to-peer clouds as a means of eradicating central points of control and giving users much more control themselves. Then we had Mr. Cerf again talking about the bit rot problem, and this is, again, a fundamental problem: lots of the ideas that are being thrown out today are very new; as ideas themselves they aren't very new, but they are new given the way we are addressing them, given the scale of the problem, given that so much of the world's information and knowledge is now contained online as computing moves away from local servers. Then Viive told us about developing countries, how they fit in, how it's too early to really call, and how, while there is potential, we still have to navigate the waters carefully to ensure that openness of standards is maintained. 
I have a few questions of my own to ask the panel, but before that, I would like to throw open questions to the floor and I would request anyone who is asking questions to first introduce themselves and then perhaps mention who in the panel they're addressing as well.
>> MIKE SACHS: My name is Mike Sachs, and cloud computing is undergoing so much innovation in different areas, managing the cloud, clouds communicating with each other, and as we think about open standards, I think it's important that we allow room for that innovation to happen. We should choose standards that are flexible, that allow different people in the community or vendors or groups to create new innovations that can bubble up, that people can gravitate toward. I think HTML is a great example of a standard that has been very successful but has allowed different parties in the community to build on and extend it and then have the community gravitate toward things that then get rolled into the whole standard. If we have a top-down approach where basically a committee decides what's good for us, we'll be less productive for everybody. Even though in the short term there might be more chaos in a bottom-up approach where we allow different approaches to become more successful or not, HTML5 is an excellent example of where experimentations have ended up gravitating to a common standard that everyone can be happy with, and innovation cannot be forgotten.
>> PRANESH PRAKASH: You can respond, Mr. Cerf.
>> VINTON CERF: First of all I strongly endorse your point; we learned not to try to overspecify or overdescribe what you want, so be careful. In the internet there is a notion of layering: to build interface standards that are stable but allow for evolution in the implementation and coexistence of multiple protocols, for example, at one layer supported by those below. May I suggest a different vocabulary than layering, which feels restrictive now: compartmentation, so we have interoperable units that work and can be drawn upon in order to achieve things, allowing us to add new components over time when we realise that there are better ways to do things or there are new things to do. So I think the point is extremely well taken. I do remind you there are committees, and there is nothing wrong with a committee; it's just that the committee's mindset has to be to retain this open flexibility. If it doesn't understand that, then the outcome is a problem, whether it's a committee or some individual who is trying to drive a particular result.
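The point about stable interfaces with evolving implementations can be sketched in code. The class and function names below are invented for illustration: callers program against a fixed interface, so new implementations can be swapped in over time without changing the components built on top of them.

```python
class Transport:
    """A stable interface: callers depend only on send/recv,
    never on how a given implementation moves the bytes."""
    def send(self, data: bytes) -> None:
        raise NotImplementedError
    def recv(self) -> bytes:
        raise NotImplementedError

class LoopbackTransport(Transport):
    """One interchangeable implementation behind the same interface;
    others could be added later without touching any caller."""
    def __init__(self):
        self._queue = []
    def send(self, data: bytes) -> None:
        self._queue.append(data)
    def recv(self) -> bytes:
        return self._queue.pop(0)

def echo(transport: Transport, message: bytes) -> bytes:
    """Works with any Transport, present or future."""
    transport.send(message)
    return transport.recv()
```

The design choice is exactly the one described: the interface is specified tightly, the implementation behind it deliberately is not.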
>> WILFRIED GROMMEN: If I may react, I also think that in the definition of these open standards there is a maturity model developing in the industry. Sometimes I think the ICT industry has become used to concepts about hooking things together, and that is what we are really trying to get to. There are two things I wanted to add. So, endorsement for open standards, go for open standards, but there is a comment made once by a standards officer who said a standard is only as good as its industry adoption, which I think is an important fact. There are so many standards that I think we have to have a coexistence model, but finally it's going to be the consumers, the end users, the businesses, who are going to say whether a standard lives up to the expectation; that's the first comment. The second comment I wanted to make is that I do not agree with the definition of an open standard again. Open standard definitions are very well written down by the open standards councils, and in the discussion with the EU Commission, an open standard is not de facto intellectual property free; it is made available on reasonable and nondiscriminatory terms, and the whole discussion about open standards shouldn't be mixed up, because in the discussion about being, you know, flexible and being open and innovative, you should not underestimate the drive there is for younger companies, for companies who have invented things, to share it for the bigger purposes but keep the benefit of their intellectual property. The example was made about mp3; that is one of the best examples that these models do work, so I'm countering the thought that things must be IP free.
>> PRANESH PRAKASH: I would like to point out that some of the most popular implementations of mp3 have been those that are possibly illegal, like LAME, and those which have not been sued, and things like that. Royalty-free is not a debate that I would like to get into right now, but mp3 cannot just by itself be used as a glorious example of how a non-royalty-free format is a great success, because the implementations make a difference.
>> WILFRIED GROMMEN: I was hesitant to go into the debate because it would be a shame for this discussion to go into that debate, I'm saying I question the definition that was stated.
>> KARSTEN GERLOFF: I would like to thank you, Mr. Grommen, for making Microsoft's point very clear, because it's not always expressed in front of the European Councils and so forth, so I believe we can agree to disagree.
>> KEVIN BANKSTON: Hello, my name is Kevin Bankston. I'm a senior staff attorney with the Electronic Frontier Foundation. I prefer to think of Karsten's dream as something that may one day be fulfilled.
>> VINTON CERF: It's a collective hallucination.
>> KEVIN BANKSTON: It's a vision I share as well, and I will direct my question to Jeremy, although I would be curious for anyone's perspective. I think there is a fair amount of consensus in the social network realm that portability is important in terms of maintaining privacy, to allow users of dominant players such as Facebook the ability to move to services with new and better privacy policies, but I think that social media poses a counterbalancing privacy problem: unlike a basic cloud storage service that stores your data, on a social network the data is very often social, and it's information and communications that others have an equity in, not just you. So how should we address the issue of portability of this data? If we're talking about, say, a discussion on my wall, should I be able to export that? Or a private conversation in messaging through Facebook, should I be able to export that, or should we respect that it was shared in a particular context and not export it? This is a particular concern in terms of contact information which you did not upload but that your friends uploaded and made available to you, and the exportability of that data is the most important to facilitate competition, because if you can't move your network with you it's hard to move to another social network site. This is a problem that we at EFF faced internally when trying to formulate our bill of rights for social network users: we have a right of control over your data and we have the right to leave, which means the ability to delete or move your data, but we limited that to content that you upload, to avoid the sticky question. It is a sticky question; I don't have a silver bullet answer to it. I have a personal position but I won't express it, because EFF does not yet have a position on it. I was curious how you think we should treat this: as email that we can easily move around, or should there be more restrictions to respect the privacy of your friends on these services? 
>> JEREMY MALCOLM: It should be specified that the information is tagged differently so it can be treated differently from public information. So emails, obviously, are not going to be something that can be transferred as easily; email addresses and private addresses shouldn't be able to be transferred, that sort of thing; but public discussions that are posted to a public forum should be able to be. So we've got to create a set of rules, and it's not a technical problem, it's a social problem: we need to decide what default rules to set down for the portability of different kinds of information. The default rules are important because most users won't change from the default rules, so it's not good enough to say, well, if you want this to be private you can go into your privacy settings and click here and make it private; the default rules have got to make sense. And it's something that the developers of these new specifications should be engaging with the broader community in determining, and that includes the community here at the IGF.
>> I wanted to point out, if we're talking about public discussions, I think that should be exportable, but in terms of semi-public or private communications there is quite a range of ways we could deal with this. One perspective is you should only be able to export your uploaded content and not that of others, versus a more free-expressionist or innovator or hacker perspective, which would be that I should be able to export every bit of data that I have authorized access to. It seems there is a range of potential possibilities in that space, but I don't know where the proper balance might be, and I also don't know how much of that is a user expectation, so that it might be a problem if I can today export all the data that I have on Facebook, but if there were a new service tomorrow that offered perfect portability, we wouldn't have that problem, because people would go in with the expectation that it can be exported to another social network, client or service.
>> KARSTEN GERLOFF: Let me make a suggestion about a way to think about it. Let's imagine that you have a collection of information on some social networking site, it doesn't matter which one, and you want to move that information to another social networking place. Let's imagine that you clearly have access to the set of information in the first site. Nothing? Am I disconnected? How's that? Okay, thank you. I did push the button before, I don't know what happened; there is only one brain cell and it's being time-shared among all of these!
To come back to this: let's suppose you start out and you have information which you have access to in social networking site A, and you want to move that to site B. If the movement of that information lands at networking site B in a form which is private, that is to say, when you move it, it is only accessible to you, well, even if it was accessible to other people in A, we start out with the premise that you're moving this and only you have access to it. Then we leave it to the party that moved the information to take responsibility for making that information available more broadly, and there is now a moment of thoughtfulness about whether you should or shouldn't make that information more available. It doesn't solve the problem, but it contains the problem, at least initially, when the information gets moved. This may turn out to be a stupid idea because it's not very workable, but it's an example of trying to respond with containment, which is what you were concerned about.
>> PRANESH PRAKASH: Go ahead.
>> AUDIENCE: Good afternoon.
>> PRANESH PRAKASH: Introduce yourself and your affiliation.
>> SUSANNA SOGIANO: My name is Susanna Sogiano, and I am from one of the universities in Portugal. As an academic and researcher, I like to see this future vision of Karsten's on peer-to-peer and cloud computing, but on the other hand I was thinking to myself that these are two opposite paradigms, right? So we have peer-to-peer, where we have these resources somehow distributed within the network, and then we have the cloud, where we have, again, the resources in the network but somehow centralized in large, large servers. So my question is, do you think that these two paradigms can combine? And the other question: I think about all the complexity of managing all of this, the management of all the servers, all the information, so what do you think about this?
>> PRANESH PRAKASH: Is that addressed to?
>> KARSTEN GERLOFF: Can I take it? So as to your first question regarding the coexistence of the two paradigms, they already coexist. If you look at the way email is used and set up, you have a plethora of small email providers, or people who host their own server because it's not very hard to do, and you have huge email providers like Yahoo or Google or what have you, and I would see those in the centralized domain. Yet with your email server in your living room you're free to interact with Gmail or any other provider because it's an open standard. So I see these two paradigms coexisting into the foreseeable future. As for managing the complexity, that is the question we need to work on. I've been reviewing a number of free software projects in this space, and individually they're making progress exploring different routes; it's an exciting time, none of them is quite there yet, but things like GNUnet might be worth looking at: programs that enable distributed, encrypted data storage, very interesting. FSFE's web site is using a distributed search engine called YaCy, where even on a small laptop like this you can install a full search engine and start spidering other sites; you can share the index with the network of YaCy users, or just the parts that you elect to share. So there are interesting developments going on right now and I'm looking forward to seeing how it develops.
>> WILFRIED GROMMEN: I would like to add that today we have big clouds in mind when we talk about cloud computing, but I don't think we should have to wait long before we have an enormous proliferation of private clouds. If you look at the technology, the container-based systems with enormous capacity which are self-contained and can be deployed in any country, I think the network of clouds is going to be there faster than we think it's going to be there. I think the two models will definitely coexist; there is no other way.
>> PRANESH PRAKASH: There are so many topics that have been addressed; I made a note of a few of them. On one hand we are talking about the standards themselves, and on the other hand about the actual data and how easy it is to move that; so we were talking about governance and consumers, and standards play a role in both parts of this. And we've drawn distinctions between centralized and distributed, but even within distributed, I believe we're drawing distinctions between those that are distributed and controlled by one company and those which are peer-to-peer distributed, and this in itself is a different distinction from public and private clouds. We've also touched a little bit upon the role of privacy and encryption in managing this; some solutions that we have as of now, and some proposals, don't really account for encryption, hoping that being behind a kind of wall, being in the cloud, means that things like encryption don't really need to be thought about. And we're looking at cloud computing as something that drastically reduces costs, and hence as something that governments are very interested in. Well, those are just a few of the issues that we have addressed.
I have a few questions of my own. I was curious about cloudlets, as Mr. Cerf talked about them, and I was trying to understand the relationship between cloudlets, the idea of each mobile phone as an equal player in the cloud, and the idea of peer-to-peer federated clouds. If Karsten and Mr. Cerf would address that question, I would be happy.
>> VINTON CERF: I'm sure you all appreciate that the notion of the cloudlet is very thin, but the idea is that we may not care whether it's a mobile phone or a small cloud in your basement or under your bed or a gigantic data centre somewhere; the protocols that allow them to interact could be standardized. The little cloudlet may not have all the functionality and capability of a large cloud, and there may have to be negotiation over that in order to make sure that the large cloud doesn't overwhelm the cloudlet, but you know this is what happens in the internet world already: a small machine on a low data rate system can talk to a supercomputer on a 10 gigabit pipe, and TCP flow control allows, in theory, the two to talk without one being overloaded by the other.
So as we think about the range of things that could behave like clouds, we could accept the idea that some of them have small capacity and limited functionality compared to the others, and it's a question of making sure that both sides negotiate and agree on what the two of them are capable of doing while interacting with each other. I don't think that's out of the realm of reason to establish in the form of standards. Maybe you would like to amplify on that?
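A minimal sketch of the kind of negotiation described here, with invented feature names: each side advertises the features it supports and the limits it can handle, and the agreed session uses only the shared features, each at the smaller of the two limits, so neither side overwhelms the other.

```python
def negotiate(side_a, side_b):
    """Agree on the intersection of advertised features, taking the
    smaller limit for each one."""
    shared = set(side_a) & set(side_b)
    return {feature: min(side_a[feature], side_b[feature])
            for feature in shared}

# A phone-sized "cloudlet" talking to a gigantic data centre: the
# session is bounded by what the smaller party can actually do.
phone_cloudlet = {"store_bytes": 10**8, "replicas": 1}
big_cloud = {"store_bytes": 10**15, "replicas": 3, "batch_jobs": 10**4}
session = negotiate(phone_cloudlet, big_cloud)
```

Real capability negotiation (as in TLS cipher suites or SMTP extensions) is richer than a pairwise minimum, but the intersection-plus-limits shape is the core of the idea.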
>> KARSTEN GERLOFF: Well, I could simply not agree more. The internet is built on the idea that it doesn't matter what device you connect to it, because we've defined the way they communicate, so why not simply extend that idea to the cloud? Actually, the more I think about it, the more skeptical I become about whether the cloud is a new paradigm and an entirely new technology, or just a slightly different way of making use of the hardware we have. I'm putting it in a provocative manner, of course, but it's perhaps not as revolutionary as the marketing strategists would like us to believe, so there might be no need to throw overboard all the principles and techniques that we have had good experience with on the general internet.
>> VINTON CERF: Let me say there may be properties of some clouds that are rather different from the things we've used before, for example the resilience, the ability to expand and contract the amount of computing available to someone, the ability to replicate information and distribute it for reliability. So there may be some things that are different, but the idea is that we allow for a big spectrum in a standard way, so that if I have something working on my mobile and it needs to move from the mobile setting to a more robust cloud-like setting, it's transparent and I can make that process go from my mobile to this bigger cloud in order to keep doing whatever it is I'm doing. To make that as transparent as possible requires some common standards and conventions, and I think that's where I would like to see us end up, so that we have not bound ourselves into too fixed a definition of what a cloud is.
>> PRANESH PRAKASH: I had a question for Wilfried. You mentioned the Open Data Protocol in your talk, and I have had an interest in this since I read the first blog post about it. We're doing work on open government data, and open data is a thing that we're looking at, right? However, when I saw the Open Data Protocol, some of it went above my head. On the one hand it seems to be saying that common enough standards like JSON, et cetera, will be used; on the other hand it's mentioning things like Silverlight, so I'm not sure how this entire stack builds into something like the Open Data Protocol. I'm wondering whether, and I quite obviously haven't gone through the extensive documentation, and the protocol itself is developing, there is a mailing list with at least 20, 30 emails a day, but are we specifying too much currently in the Open Data Protocol? And this is a question from someone who is not that technically proficient and has genuine concerns about what we spoke about earlier: about not overspecifying and leaving things open-ended.
>> WILFRIED GROMMEN: I think you should make a clear distinction between the toolkits which can use the protocol as it is and the protocol itself, which is a pure extension of Atom and the AtomPub protocol, so it's purely specified at the protocol level and made available. Now, to make it acceptable in the market, where we see today that the whole concept is whether people can publish their databases and make their data available in a very easy way, and just to make that application live and show what cloud platforms could offer, there are some solutions added, like the Silverlight interpreter, and there are client libraries available for .NET, Java, PHP; a number of them are available today. The second thing about it is the idea that if you have this protocol, a cloud platform like ours can, in fact, expose this, and from that moment on you can have governments or anyone running that kind of application, which shows, say, the density of sleep in one area of the country, and it is, in fact, scalable as an application; it can be used at any kind of level and by any kind of organisation. This just opens up the idea of governments using their data and exposing their data; so for us, the Open Government Data Initiative, which makes all that data run on our platform, it's just a protocol and we made it available.
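For context, OData queries are plain HTTP: system query options such as $filter, $top and $format are part of the protocol itself, while the toolkits and any concrete service root are deployment-specific. The small helper below (the service URL is hypothetical) just assembles such a query URI.

```python
from urllib.parse import urlencode

def odata_query(service_root, entity_set, **options):
    """Build an OData query URI; each keyword becomes a $-prefixed
    system query option ($filter, $top, $format, ...)."""
    params = {"$" + key: value for key, value in options.items()}
    return f"{service_root}/{entity_set}?{urlencode(params)}"

url = odata_query(
    "https://example.org/data.svc",  # hypothetical service root
    "Cities",
    filter="Population gt 100000",
    top=10,
    format="json",
)
```

Because the wire format is just HTTP plus Atom or JSON, any language with an HTTP client can consume the feed, which is the point being made about the protocol versus the toolkits.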
>> AUDIENCE: When we talk about cloud computing being more of the same, I think one of the differences is that the devices that were talking to each other in the past were basically part of essentially one solution, and you could, with a few exceptions, draw a fairly well-defined perimeter around that solution; except for EDI and banking protocols, there weren't that many interoperating computers out there talking to each other. Now it's easier to build integrated solutions: once you expose a public API you have no idea what other people can build using your data or your application. That's why there are going to be, you know, probably thousands or millions of protocols defined by different parties, maybe to interoperate with each other, and we can't have an official standard for each and every one of those, right? Or not?
>> VINTON CERF: It would be hard from an administrative point of view to cope with that. Let me suggest that the typical way of coping with that outcome, should it occur, is a registration process, where you register the protocol, register important parameters about it, and reference documents that describe it, so that there is a codified way of finding out what that protocol is. I think we will probably find, as we have in the past, that some protocols will turn out to be sufficiently general that they are widely adopted and used for a lot of different applications, TCP being an example, while there may also be specialized protocols that don't have a very big footprint, even though they get used because they happen to be well attuned to a particular application. I'm not too scared yet that there will be millions of generally implemented protocols.
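The registration process described here (in the spirit of the IANA registries) amounts to a simple lookup structure. A toy version, with invented names and a hypothetical spec URL:

```python
registry = {}

def register_protocol(name, version, spec_url, parameters):
    """Record a protocol together with its key parameters and a
    pointer to its reference document, so anyone can find out
    later exactly what it is."""
    key = (name, version)
    if key in registry:
        raise ValueError(f"{name} v{version} is already registered")
    registry[key] = {"spec": spec_url, "parameters": parameters}
    return key

register_protocol(
    "virtual-cloud-handshake", "0.1",
    "https://example.org/specs/vch-0.1",  # hypothetical reference document
    ["max_chunk_size", "auth_modes"],
)
```

The value of such a registry is not enforcement but discoverability: even a niche protocol remains identifiable and documented.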
>> I would like to make a comment regarding the need to get interoperability between clouds. I think we're sort of not there yet; it's, like they said, a hallucination, not a vision. If you look at the way it works out on the internet, it's rare that you have a contract of service with one company and then you close your account and ask them to transfer all of your data to another service company, right? It works for health records, insurance, because they want to track your driving habits, but it's kind of a long-term vision. I think there is something to do to solve the problem that people are trying to solve with that interoperability, and it's a simple protocol: it's like the one-click buy thing, but it's the one-click leave kind of thing. I want to get all my data in one click, give me a tar file, a zip file, create a format where I get all my pictures, my email, everything that I have with one particular cloud or social network, and once I get it on a key I can go anywhere I want. So the idea of asking them to cooperate is far from, you know, the reality.
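The "one-click leave" idea is mechanically simple. A sketch with invented names, packing whatever a service holds for one user into a single zip archive:

```python
import io
import json
import zipfile

def export_account(user_data):
    """Pack everything a service holds for one user into a single
    downloadable zip archive, one JSON file per category."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for category, payload in user_data.items():
            zf.writestr(f"{category}.json", json.dumps(payload))
    return buf.getvalue()

archive = export_account({
    "profile": {"name": "alice"},
    "pictures": ["beach.jpg"],
    "messages": [{"to": "bob", "text": "hi"}],
})
```

As the speaker notes, the hard part is not the archive format but getting services to offer the button at all, and agreeing on a layout inside the archive that other services can import.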
>> AUDIENCE: I would like to address a question to Viive and whoever else would care to take this. Should governments get into cloud computing at all? Now, if you're thinking only of distributed computing, then governments can run their own clouds, but we are also talking about cost. Right now, the way I understand it, if cost is the issue, then outsourcing, running it on someone else's servers, is what you're really looking at. If that's the way we understand it, and I'm open to corrections, then the issue of sovereignty and control is not an issue, and beyond the standards for the documents and so forth that actually go into the cloud, how do the standards for how you're accessing the clouds themselves become an important issue for developing countries?
>> VIIVE NÄSLUND: Thank you. I was having trouble hearing. I think the question of whether governments should get into cloud computing is still an open question. I think it depends partly on what current infrastructure the government has in place and, secondly, what exactly it is that the government would want cloud computing to do for it. Is it because we want to access, you know, the virtual computer systems that are offered, with the idea of lowering cost, right, so that we would not need the same infrastructure? Is it because we want to develop new applications with cloud computing to provide better services in the country? I think these questions really need to be answered according to the specific context of that particular country, so, taking into account the large diversity of developing countries, I don't think we should come up with one answer, but really perhaps think more of a framework methodology to evaluate cloud computing and the potential services that it can provide for developing countries. On the second issue, I think you were asking what the impact on access is of having open standards for the cloud?
>> No, my question was just about where, on the one hand, there are protocols such as the one Microsoft is developing for accessing the cloud itself and for how things get federated, right, and on the other hand there are the other standards, such as the document standards, which govern how the files themselves are stored in the cloud. I'm wondering: is the first set of standards of as much concern to governments as the second? Because so far, within something like the coalition on open standards, when we've talked about governance and about governments, we've talked about the second set of standards, right? My question basically is: if it's only a private cloud, if it's only a way of reducing costs across different computing devices, then how do those standards really make a difference, or do they make a difference? I can imagine how there can be other ways in which, because of royalty-free versus RAND concerns, it makes a difference, but does it really make a difference to citizens' rights?
>> KARSTEN GERLOFF: The first reaction I would have is that it's not just private clouds, although of course when we're talking about governments they are thinking about these issues and about ensuring security and cost reduction. But what ideally should be happening is that governments will consider when they can use public clouds as well as private clouds, depending on the service. For example, you see that even the US government is using public clouds for some applications. So with this hybrid approach, I would definitely say that the open standards question becomes extremely important, again, to allow the interoperability involved in moving between different clouds that might be chosen by governments according to what each may offer. I think that should be the approach: try to pick what each might be offering according to what services the government will be providing.
>> I think the question for governments is that they already have quite a number of applications on public clouds, and we will absolutely see hybrid models for governments. There will be data and applications where they decide, for cost reasons and deployment efficiency, that for everything we are doing we go for the public cloud approach, but here, you know, this is our registry, this is in fact the core data of our government, so this is going to be a government private cloud. That's the kind of experience we have today in discussing what governments want. Regarding your question on standards, the push from governments has been obvious in the last five to seven years, for example on document standard formats. From our perspective this was an obvious wish of governments to have file format definitions which are not vendor specific, so I think that is an explicit requirement of governments.
>> I think this is a relevant question. First of all, it seems clear that governments are going to have to interact: if they have cloud systems of their own, they're going to have to interact with other systems elsewhere, with the general public and with the private sector, because of contracting activities and other kinds of things, or because they provide information to the general public. So standards are going to be very important to make sure that all of those interactions with a government cloud, whether it's the government's private cloud or a publicly available cloud the government is using, can take place no matter where they end up migrating their data and their processes. I'm in agreement with the point that hybrid environments are very, very likely. We should also imagine that they may change as time goes on, and it could be that in public cloud implementations there develops enough confidence in segregation, privacy, and protection that it feels more comfortable to put that data in a more publicly available place. I think, again, I could not overemphasize the importance of standardizing APIs, data object structures and formats, and protocols in order to allow that flexibility.
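The point about standardizing APIs can be sketched in code. The following is a minimal, hypothetical illustration, not any vendor's real API: the `CloudStore` interface, the class names, and the in-memory "clouds" are all invented for this example. It shows how a provider-neutral interface lets data migrate between clouds without the migration code knowing which vendor is on either end:

```python
from abc import ABC, abstractmethod


class CloudStore(ABC):
    """A hypothetical provider-neutral storage interface (a standardized API)."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryCloudA(CloudStore):
    """Stands in for one vendor's cloud; a real adapter would call that vendor's API."""

    def __init__(self):
        self._objects = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]


class InMemoryCloudB(InMemoryCloudA):
    """A second 'vendor': same interface, independent storage."""


def migrate(src: CloudStore, dst: CloudStore, keys):
    """Move objects between any two clouds that implement the shared interface."""
    for key in keys:
        dst.put(key, src.get(key))


cloud_a, cloud_b = InMemoryCloudA(), InMemoryCloudB()
cloud_a.put("registry.csv", b"id,name\n1,example\n")
migrate(cloud_a, cloud_b, ["registry.csv"])
assert cloud_b.get("registry.csv") == cloud_a.get("registry.csv")
```

Because `migrate` depends only on the shared interface, a government could swap either end for a different provider's cloud (public or private) without rewriting the migration logic, which is exactly the flexibility the panelist argues standardized APIs buy.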
>> IRENA KUSHMARK: Thank you, Pranesh. I'm Irena Kushmark, and I work for the Open Access Programme. I would like to give an example from one particular sector: academic libraries, research centres, and cultural heritage organizations. We are setting up open access digital repositories with research and cultural content, and more and more of that digital content is being moved to clouds, which means it's not as openly accessible as it was when we were starting this process. For open-source repository software such as Fedora, there is an open technology called DuraCloud, and the way it works, you can store your data in public or private clouds, but this technology enables you to do computing, to share your data, and to provide long-term preservation for your data. So those are examples already available in some sectors.
>> PRANESH PRAKASH: Any other questions? Then I would like to pose one final question and ask the panelists to respond quickly to it. From the discussion, it seems to me that the need for distributed computing, and the need for redundancy, for protecting against loss and for preserving memory, are being pitted against issues of privacy and end-user control, as well as developer control to an extent, right?
So I would like thoughts on how we could move forward: whether this is a real tension, whether these needs are in fact being pitted against each other, and, if they are, how we could move forward.
>> DANIEL DARDELLIER: Well, as I said, we have a running activity, and the consortium is open to people joining activities, so there is work going on standardizing protocols that are used in the cloud today. We see standardization happening and our members are asking for it, so if you want to standardize more things, people should come to us and tell us that. It's a consortium that works toward what people need, so there is no pushing of standards, just a realization that there are standards used by everybody in the cloud, so we have to improve those, and there are technologies used only by the cloud that we are not in any hurry to standardize.
>> JEREMY MALCOLM: I think the way forward we see is on three levels. Previously the world was easy: you had PCs and mobile phones, and now we are talking about the cloud and the Internet of things, and the only way forward is a multi-stakeholder discussion. On standardization, industry can help with regard to the speed of coming together; creating the ecosystem and an industry around this is crucial. The rest we see happening in governmental forums and consumer associations, and I think today we have proof points in a number of countries, or in geopolitical entities like the European Union, where there is dialogue on what to do about security and data protection, also for the cloud. This kind of open discussion, moving toward policies and regulatory frameworks, should happen with all stakeholders around a table.
>> So at the risk of suggesting something that might be like throwing a grenade into the middle of the floor here, one way to stimulate exploration of interoperability is, for example, for the academic community to start trying to move data from one cloud to another. Most of these clouds are widely and freely accessible, the Amazon cloud, the Google cloud, and I guess I don't know about Azure, but maybe you can speak to that. If you have access to the cloud and you're able to put data into it, try to get the data back out in some form that doesn't kill it, and move it to another place. Just trying things like that is going to force all of us who run clouds to be at least aware of the consequences of those experiments. So I think there is a possibility to move forward just by exhibiting what the problems are, in addition to attempting to pursue this in the standards forum as Daniel has suggested. (Off microphone)
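The experiment described here, putting data into one cloud and getting it back out "in some form that doesn't kill it", amounts to a round-trip test through a vendor-neutral interchange format. Below is a minimal sketch in Python, using JSON purely as an example of such a neutral format; the record fields are invented for illustration:

```python
import json


def export_records(records):
    """Serialize records into a vendor-neutral interchange format before
    they leave the first cloud."""
    return json.dumps(records, sort_keys=True)


def import_records(blob):
    """Reconstruct the records on arrival at the second cloud."""
    return json.loads(blob)


original = [{"id": 1, "title": "Dataset A"}, {"id": 2, "title": "Dataset B"}]
blob = export_records(original)      # get the data out of cloud one
restored = import_records(blob)      # load it into cloud two
assert restored == original          # the round trip lost nothing
```

If the final assertion fails for some richer data type (permissions, relationships, provenance metadata), that failure is exactly the kind of interoperability gap the experiment is meant to expose, and a candidate for standardization.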
>> JEREMY MALCOLM: From there (off microphone) so I think we're very much at the beginning of development, but it falls to us as users and developers to provide this. As we develop these ideas, we should be aware that we are shaping the infrastructure for the future, like you were shaping the future in 1973 and presumably after that, so let's think hard and let's be very careful about where we go, but let's not be discouraged. Sometimes the standards set in the past haven't done this, and I'm going to give an example that may sound critical of the W3C, but it's not meant to be. Civil society has been critical of some of the privacy standards because the process didn't involve the community enough, and the IETF has been guilty of developing standards with public policy implications without reaching out to the community enough. I think both of those organizations have moved on since then, but it is very important to remember that standards are not purely technical; technical standards are very rarely value free. So I think that the communities developing standards around the cloud need to actively reach out to the communities that are going to be affected by those standards, through institutions such as the IGF and other specific communities of interest.
>> I would like to echo what was said before about building a real cloud ecosystem, where we don't have separate clouds, whether private or public, but clouds that can talk and interact with each other, and about finding common solutions as much as possible to these issues of data protection and security, especially across multiple jurisdictions. The multi-stakeholder approach provides a space for doing that and for finding common solutions that don't stifle innovation and the progressive development of cloud computing.
>> PRANESH PRAKASH: I did a bit of summing up through the course of events already so I would just like to thank all of our panelists and all the participants and all the people who asked questions for coming here today. Thank you.
(End of workshop)