NINTH ANNUAL MEETING OF THE
INTERNET GOVERNANCE FORUM 2014
"CONNECTING CONTINENTS FOR ENHANCED.
MULTI‑STAKEHOLDER INTERNET GOVERNANCE"
02 SEPTEMBER 2014
INTERNET TECH AND POLICY: PRIVACY, DATA FLOWS AND TRUST
The following is the output of the real‑time captioning taken during the IGF 2014 Istanbul, Turkey, meetings. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.
>> MODERATOR: Welcome, everybody. Welcome to our panel on Internet technology and policy, privacy, data flows and trust.
We are very pleased to have all of you here in the audience today, and I should say not just audience, but participants. One of the things that's going to be very important for the success of our conversation today is that after some introductory remarks from the panelists, who will share their perspectives and areas of expertise, we certainly want to ensure that we have a strong exchange of ideas from the floor: it can be a question, it can be a statement, it can be a comment. There are so many people in this room, as I look around, who have a lot of very important ideas in this area.
So I look forward to your engagement on this. Let me just share a few introductory words about the topic and our goals for this morning, and then a very brief introduction of the panelists, who will do the best job of introducing themselves through their statements. We're here talking about different evolving Internet technologies: cloud, big data, data analytics. They hold the promise of profound benefits, addressing important societal issues in all forms and across different sectors. At the same time, the power of these tools also raises very important societal and legal concerns around privacy, data security, and certainly around issues of jurisdiction and competition, things that people here are quite familiar with.
All stakeholders in the Internet ecosystem have expectations of data protection and privacy in their communications, and businesses, governments, Civil Society, users, all presently are engaged in important dialogues aiming to restore and ensure trust in evolving Internet technologies.
So the participants here today are going to talk about different elements of these dialogues, including certainly encryption, privacy enhancements, rule of law across jurisdictions, and the interplay of innovation and data use, along with the other topics I started off with. All dimensions of this conversation are important, and they all need protecting. We need of course to respect customers' privacy and maintain the trust of users. That's critical. That's paramount to any company that has a long view of its relationship with customers. And likewise, it's important to recognize that, when subject to adequate procedures and oversight, legitimate access to user data by governments for security reasons also serves the Government's role of protecting safety and security. I'm sure we're going to hear some discussion of how, similarly, when subject to adequate oversight and procedures, there are many important private sector data activities for innovation and beneficial services.
So among the topics that we're going to explore on this panel: in an age of rapidly accelerating technology, how do we best detect concerns with privacy or data security, whether originating from Government or the private sector? What kind of procedures and independent oversight are appropriate to resolve or reduce these problems? How do you best apply procedure when the activity transpires in several jurisdictions? And when we do need to balance privacy interests with either Government security or private sector innovation, how should we measure the benefits and costs on both sides of the equation? What are some successful practices that we can look to for examples? These are all subjects that the different people on this panel are going to raise, and fortunately, we have a great depth of experience here.
So just to be brief with my introductions, then we'll jump into things. Next to me is Ana Neves, who is the Director of the Department of Information Society at the FCT in Portugal. She's the Portuguese delegate in several working parties related to the Information Society and research at the OECD, UN, ITU and ICANN, and has participated in the IGF for many years. So welcome, we're glad to have you here.
Next is Mukesh, Mukesh Chulani. He's based here in Istanbul as the program director for IDC Government Insights, covering the Middle East, Turkey, and Africa, and he will bring some good experience on different developments throughout the industry.
Bertrand de la Chapelle is Director of the Internet and Jurisdiction Project, and Bertrand has had several important roles with the IGF. From 2006 to 2010 he was the French Thematic Ambassador and Special Envoy for the Information Society, participating in all WSIS follow‑up activities and Internet Governance processes, including in particular the IGF, and he was also vice‑chair of ICANN's Governmental Advisory Committee.
Next we have Thomas Grob. Roland Doll, who was on the programme for DT, regrettably is not feeling well and wasn't able to travel. Thankfully we have Thomas with us, who is the strategic Director on several legal and policy issues for DT, working on international matters. Thomas, it's great to have you here today. Thank you.
Next, Katitza Rodriguez, International Rights Director for the Electronic Frontier Foundation. She concentrates on comparative policy of international privacy issues, with special emphasis on law enforcement, Government surveillance and cross‑border data flows. Thank you, Katitza, we look forward to your comments.
Next, Joe Alhadeff, Vice President for global public policy and the Chief Privacy Strategist at Oracle as well as the Chair of the ICC digital economy commission.
And finally myself, Eric Loeb. I am the Vice President of international external affairs at AT&T, and I also Co‑Chair the ICC's Internet and telecommunications task force. So this is our group. The audience will get to know you through your interventions as we go through the day.
The structure of the panel today, we want to talk first about some of the challenges under the topic that I've addressed and then importantly ideas on the way forward. After that, we'll turn to time for interventions from the floor, as I said, either questions or comments, and we look forward to that.
With that, if I may ask Ana, would you like to start by sharing some initial remarks on either significant policy or business developments and concerns that are having an impact on user trust in the medium?
>> ANA NEVES: Thank you very much. Well, I have here the main concerns that we are discussing at the level of governments, and for the time being, I am going to underline some of them, and I think that later on I will talk a little bit on how we are trying to cope with these main concerns.
So we know that we are in an era with different cultural frameworks and different values, which leads to different concepts of privacy. Even I myself had a concept of privacy in 2006 that is not the same as the one I have today. If that is true just within myself, I can imagine what is happening all around the world.
The other point is the multiple laws that exist within Europe and across continents. Another point that worries us, and many in Europe, is that the legal framework for protection is outdated, which could potentially slow the innovation process, but that is something I would like to elaborate on a little more later.
And finally, the lack of confidence and trust. What kind of information is being collected? To whom is it passed or sold? Who owns the information? How is personal data being used by organisations? How is individuals' behavior being tracked? These things are really important for cloud computing and the Internet of Things. So I think that we have to really foster discussions with all the stakeholders, and governments have a role here to mediate the discussion between users, enterprises and other stakeholders, to revise the legal framework where possible, and to promote the exchange of best practices and codes of conduct among different stakeholders; I have several ideas here that I would like to share. And finally, I would like to underline one thing that I think is very important: never forget the capacity‑building component. Empowering individuals, raising their awareness of how to deal with their data flows and privacy, and increasing individuals' control over their personal data is really crucial. Thank you.
>> MUKESH CHULANI: Thank you, Ana, thank you, Eric. It's a pleasure to be here and present the perspectives of a market analyst, which I hope will be somewhat different. I cover Government IT procurement and spending patterns for IDC across the Middle East and Africa region. So why is it important to look at Government IT procurement? Because for a lot of countries across the Middle East and Africa region, Government actually sets the tone for IT procurement. Confidence by Government in procuring IT leads to confidence by the private sector. Especially among small and medium businesses, the frameworks, the infrastructure and the examples laid down by Government actually mean something for procurement.
So when we look across the Middle East and Africa region, there's acknowledgment of the value of what IDC would call third platform technologies: cloud, big data, enterprise mobility, and social business to some extent, but we'll leave that aside. There's acknowledgment that these provide value in kick‑starting the economy, improving efficiencies and improving Government service delivery through e‑Government. So if you look at the national ICT plans or e‑Government plans of countries like Turkey, where we are now, or Kenya, or Saudi Arabia, they all acknowledge this is important. But when you look at adoption, it seems that there's a bit of a gap between what they want to do and what is actually happening on the ground.
We did research very recently in the first quarter of 2013 taking a look at adoption rates of what we call third platform technologies, so these key Internet technologies which Eric spoke about earlier. You look at an overall adoption level across all industries, private cloud which is cloud in my own backyard, 22% adoption, public cloud which is cloud provided to me by a third party, 17%, and big data and analytics 7% adoption.
If you look at Government, that goes down: private cloud, 12%; public cloud, 7%; and big data, 4%. So there's a gap between what they want to do and what the actual situation is. And what's holding back uptake? I think there are a few factors which deserve quick mention, and I guess we'll discuss these in greater detail later on. The first is a lack of nuanced conceptual understanding of the technology. What's the difference between software as a service and hosted enterprise application software? What's the difference between virtualization and private cloud? This nuance is sometimes lost, and to some extent technology vendor marketing hasn't clarified things but actually made them worse, because they brand everything smart, they brand everything cloud. It kind of confuses the message, and the capacity‑building role gets mixed up in some of these cases.
There's also limited end user due diligence on the value and the applicability within their current environment. So yes, cloud is great, but what workloads should I migrate to the cloud first? What is mission critical? What is business critical? What should I do now? What should I do later? This due diligence is not happening as quickly as it ought to.
Zooming in on issues of data security, privacy and trust, which Ana alluded to as well: there is a lack of a standardized approach, a lack of national frameworks governing cloud adoption, for instance. What this means is that Government entities are driven by champions, by people brave enough to attempt to implement in spite of the lack of a common framework. What's happening as well is that this requires them to work out, on a one‑off basis, what the acceptable certifications and service level agreements are, rather than having a common framework across the entire Government that says: this is how we, as a Government, ought to approach implementation.
Just to give you an example, the latest audit by the South African auditor general on provincial and local Government outcomes on IT controls found that, at best, only 30% of provincial and local entities in South Africa have an effective, functioning security management control or IT service continuity plan. Only 30% at best. So how do you move forward? Brave leaders, CIOs in Government, can take a step forward, but there's a certain degree of fear that their hands are going to get slapped if they take a wrong step.
And lastly, the issue of trust and data sovereignty. There's the concern of where this data is going to live, the concern of "I don't want to place the data beyond my national borders." That's addressed to some extent by private clouds, but the adoption of public clouds is certainly inhibited, at least as verbally stated, because of this. We are nonetheless quite optimistic about the future. I've been a downer at 9:00 in the morning, but our view of the future is quite positive and we'll get to that later on. Thank you for your attention.
>> ERIC LOEB: I would say that where Mukesh left off is a great place for Bertrand to pick up on cross‑border issues.
>> BERTRAND DE LA CHAPELLE: It's great to be here. As Eric mentioned, I'm the Director of the Internet and Jurisdiction Project, and in a nutshell, its purpose is to address the major tension between the cross‑border nature of the Internet and national jurisdictions. We are in an environment where the technology has been conceived as global from the onset; the system of the Internet is fundamentally global, not so much borderless as cross‑border. Meanwhile, the legal system that exists is fundamentally fragmented. It is based on geographic nation states, and for a lot of us, history has cost basically millions if not hundreds of millions of dead people just to draw the lines very precisely between one country and the next. My law here, your law there. The international system is based on that: non‑intrusion or non‑interference in the affairs of others, the separation of sovereignty. My sovereignty here, your sovereignty there. The only problem is that when we talk about the Internet, the great benefit that the Internet has brought, and not only the Internet but the services built upon it, the platforms, the cloud and all the rest, is that they are technically accessible anywhere in the world, and that's the intention. This is why it supports innovation; this is why somebody who creates a start‑up, which is by the way something I did in the '90s when the Internet was not as developed as it is today, can immediately reach users all around the world.
The problem is that as those activities develop, and as the interactions between people develop across borders, which is the purpose, which is what we want to enable, the more you get geographic criteria attached to the determination of applicable laws. It can be the location of the user, the location of the server, the incorporation of a company; any other intermediary that comes into the way can be a trigger for the application of a particular law.
The problem is also compounded by the fact that as the Internet grows and its adoption grows, the diversity of people who use it also grows. And this means, very naturally, a diversity of norms and references in many aspects that are not always compatible. So if you compile all those elements, we get into a situation where there is a patchwork of national legislations, a patchwork of individual sensitivities, and shared online spaces. This problem is particularly acute for hosting platforms, the Facebooks, YouTubes, Twitters and other companies of this world, and any others that can be in a different country, Weibo, WeChat and others, because they are serving users in all countries, and these users are posting things that may perfectly well be illegal in one of the countries that receive them. And we don't have ‑‑ I will not belabor it now, but we don't have a framework internationally to handle this. The mutual legal assistance treaties are completely inappropriate because they deal only with criminal matters, and sometimes the conduct is not criminal in both countries; that's one of the problems. And the big challenge, and I finish with that, the thing that I would really like to highlight, is this: not to be pedantic, but Immanuel Kant had this notion of the categorical imperative, what is good and what is bad, irrespective of religion or the rest. And to paraphrase heavily, the categorical imperative says something is good if it is scalable, something is bad if it is not scalable; i.e., if I want everybody to behave the way I behave, that's a good thing. If I want nobody else to behave like me, to be a free rider, that's bad behavior.
And this is a key criterion today. The reason I mention it is that we are in a situation where, for lack of coordination, or at least of the discussion and cooperation fostered by places like the IGF and other mechanisms, plus the one I run, the chances are that every single entity, every Government, every company and the rest will take individual decisions in a typical prisoner's dilemma situation, and each decision will look perfectly reasonable, perfectly appropriate, at least in the short term. The problem is that it's a complex dynamic system, and the cumulative effect of all those decisions leads to something everybody wants to avoid: fragmentation, partitioning, the reapplication of the Westphalian system on the Internet. So we are in a situation where cooperation is more important than ever to address those issues. Privacy is a very important one, but issues related to speech are also a very big challenge, and the Internet and Jurisdiction Project is one of the attempts at designing a procedural framework to handle the transborder requests that are now growing between governments and global operators. But I close here.
>> THOMAS GROB: Yes, thank you very much, Bertrand. This was exactly the right introduction to what I was going to say, because coming from an operator based in Germany, a company with very high standards of data protection that are recognized internationally, we also have the concern that we have fragmentation in this space.
Now, of course, high standards also come at a high cost, but I am not going to argue for lowering them; I'm going to say the opposite, that they are actually an investment in trust, because we need this trust. We see in surveys that German citizens especially, who are rather sensitive to this topic, remain worried despite being protected already by very high standards: 31% of Germans regard data security and misuse of data as a great risk, and 74% believe that this risk is going to increase in the future. 28% are very worried that their personal data is passed on by companies without their permission, and again, 74% believe that this risk is going to increase in the future. So despite stepping up the protection on the legal side, we really do have a trust issue here with our customers, and as a company, we need to do everything possible to ensure that this trust will come back.
Now, while we are talking about this legal space here, we also have to acknowledge that it is not only a question of what Internet companies are doing with private data, allowed or not allowed by German rules, but also of what people who simply don't care about the rules are doing. Talking about attacks from cyberspace: personal data is a valued good, as we can see. So as a network operator, we have a very distinct role in doing whatever we can on the network side to protect our customers against those attacks, which essentially try to extract private data.
I'll leave it at this for the moment and then look forward to the discussion.
>> KATITZA RODRIGUEZ: Thank you. So we have been asked to speak for five minutes about the problem, and in the second round about the way forward, the solutions. I work for a Civil Society organisation, so for us, when we talk about trust, part of the question is whether we really can have trust. I want to remind people what we have learned over the last year from the Snowden disclosures about the specific details of how the NSA and its allies have been spying on much of the world's communications. The foreign intelligence agencies of these nations have conducted surveillance at a technical and operational level that spans the entire globe. We have learned about the cooperation and intelligence sharing among these countries, and about a weakness: how material gathered under one country's regime is readily shared with others. We have learned that those agencies are not the only ones, but that many other intelligence agencies are also spying on innocent individuals. We have built an Internet that is insecure, vulnerable to each and every attacker. This last year we have also learned that the NSA has strayed far from its legitimate goal of protecting national security. In fact, we have seen the NSA participating in economic espionage, diplomatic spying, and suspicionless surveillance of entire populations. Even worse, the NSA has also covertly weakened the products and standards that Internet users need to protect themselves against online spying. This culture of collect all information, process all information, exploit all information is not sustainable. The NSA is undermining the encryption tools relied upon by ordinary users, companies and financial institutions, targets and non‑targets alike, as part of a parallel effort that weakens the security of all Internet users, including you. The NSA program Muscular infiltrated the links between the global data centers of technology companies like Google and Yahoo. Many companies have responded to those revelations by encrypting traffic over their internal networks.
The NSA monitors the telephone calls and text messages of many Government officials, and the NSA documents also show that not all governments are clear about their own level of cooperation with the NSA. As one news media report put it, few, if any, elected leaders have any knowledge of the surveillance. And certainly what the U.S. does will affect what other countries do: if the NSA or GCHQ does it, other countries like China will probably do it too. That is a flawed arms race argument, and we will never solve this problem if we keep making it. We have built an insecure Internet for everyone and enabled the observation of everyone's activities, and in that world freedom and liberty are lost. The mere existence of a mass surveillance apparatus, regardless of how it is used, is in itself sufficient to stifle dissent. A citizenry that is aware of always being watched quickly becomes a compliant and fearful one. Self‑censorship is a problem. We have lost trust in the technology, the protocols and the institutions that run the Internet. So we do have a choice: an Internet that is vulnerable to all attackers, or an Internet that is secure for everyone. In the second part I will propose some ideas. Thank you.
>> JOSEPH ALHADEFF: Thank you. I guess I'm going to be the bridge to the second panel, because what I want to do is talk a little bit about the ecosystem we're in and then why that ecosystem creates some of these challenges. If you think about where we are today, we live in an ecosystem, and this is ever more so, of people connected with people, objects connected with objects, and people connected with objects, existing in computer‑aware environments that can avail themselves of services delivered through the cloud, supported by advanced analytics. So that's cloud, that's the Internet of Things, that's big data. Those are the emerging technologies, but they are not silos; they are a unified ecosystem. And they can create significant benefits. There was an example of a hospital in Toronto that treated premature babies, and those babies are very susceptible to fever. Usually those babies are treated in realtime: they have one nurse to every two babies, and the second a baby gets into distress, the nurse comes in and gives whatever treatment is necessary. But what they decided to do is collect the information, because those babies are hooked up to any number of sensors. So they started to collect the information, and they found a completely counter‑intuitive result. 24 hours before a baby spikes a substantial fever, a fever that can be fatal, all of the baby's vital signs stabilize, which is normally a sign that the baby is doing better. No one knows exactly why that happens, but the correlation allowed them to act, because there is a very low contraindication for fever treatment. So they started treating babies for fever the second the vitals stabilized, and they lowered infant mortality, decreased the amount of time the babies were susceptible to the fever and dramatically improved patient outcomes.
That is the potential benefit we get from these technologies, a benefit that is too important to lose.
However, when you think about secondary uses of health data, the challenge we face is that we are removing the person from participation in some of these decisions, because the ability to get consent in these scenarios becomes more challenging. So part of the question becomes: as we go to technologies that operate remotely, that operate at a distance, whether it's sensors in things you don't know exist or other ways in which information is collected in a non‑obvious manner, or if we have previously collected information that might be useful in order to do things like save lives, the problem is that our traditional model is one where someone has been given notice and an opportunity to make a decision related to that notice. Yes, I want you to collect; no, I don't want you to collect. Yes, I want you to use; no, I don't want you to use. These are the limits.
It has been, by any stretch of the imagination, a very imperfect dialogue to date. But now the dialogue is disappearing, because the hallmarks of that dialogue, "I can see the collection of my information," "I understand the use of my information," are disappearing. In fact, our value chains are becoming so complex that it's difficult to think that anyone actually understands how their information is being used across all the participants in that value chain. In many cases these are neutral uses of information: this party is processing the information to support that party, who is doing something else, all of which may be something you know in outline without knowing all the pieces of the puzzle. But sometimes it's things that would surprise you. Sometimes it's things that would surprise and annoy you. Sometimes it's things that would surprise, annoy and actually make you irate. So the question becomes: we have in some ways removed the individual from the equation, and you cannot do that unless there is a compelling societal reason, and if you do, what are you substituting in its place? What is the protection being substituted? What is the oversight? What is the assurance that correct things are being done with your information? And we have been told to hold our solutions for round 2, so that was kind of like the teaser on the news programme, and the substantial story will be coming later.
>> ERIC LOEB: Exactly. Well, as we get into the next phase of this, people can play off each other with solutions. From the first round of comments, as you've heard, and many of the things you're familiar with, there is in the background a context of the range of societal benefits, not just from these technologies, but from the flow of information. And as one person said on a panel I was listening to recently, innovation doesn't know any borders, it doesn't know any boundaries, but we have some jurisdictional challenges that have been brought up. And as we've gone through and talked about the challenges, certainly we've heard about the security attacks and the inadequate investments made to protect against them, and the issues of trust around that. Certainly a strong theme around surveillance practices, both their scope and the adequacy of procedural safeguards, and that's an issue for many jurisdictions to consider. Private sector practices with the use of data and the transparency of those practices, and then of course, as Bertrand just highlighted, the cross‑jurisdictional conflicts.
So as we now get into this next phase of the discussion: unless we want to just sit here and say it's a zero‑sum game of trade‑offs, that these are all important but conflicting priorities, if there are in fact ways forward that can improve the current situation, we want to bring up some different ideas and concrete concepts, and that's really the next focus for this panel. So we'll go down the row again and talk about different policy and stakeholder initiatives, but I really do encourage interventions whenever anyone raises a point that triggers an idea someone else has, and of course after that, as I said, we really welcome participation from the audience as well. Ana, if you would like to start.
>> ANA NEVES: Okay. So I think I'm going to use my slot to present to you what I think is the way forward and some recommendations that we are discussing at the governmental level. I have some here, and I would like to share them with you.
So to surmount all these obstacles that we heard about, I think that we, from the governmental side, have to foster discussions with all the stakeholders and to mediate the discussions between users and enterprises, who too often perceive their interests as being antagonistic. So: to really make the multistakeholder model work.
Another important thing is to identify responsibilities for each party with an interest in the data‑driven economy and clearly assign such responsibilities to an entity or group of entities and establish accountability and enforcement mechanisms.
To revise the legal framework, where possible. I'm Portuguese, I'm from the European Union, so I'm in the middle of the European Union discussions. In the European Union, on the 25th of January 2012, the European Commission presented a data protection law reform package to the Member States: two new pieces of European law, a General Data Protection Regulation and a Directive on data processing for crime prevention and criminal justice. They are intended to repeal and replace the EU Data Protection Directive and the Framework Decision from 2008 on data protection in police and judicial cooperation in criminal matters. So that one was from 2008, and in 2012, four years later, the European Commission presented another package.
Other mechanisms are also under discussion, including a European data protection seal programme that would allow certified organisations to transfer to and receive personal data from other certified organisations irrespective of their location. Governments should promote the exchange of best practices and codes of conduct among different stakeholders through the recognition of international standards; the main stakeholders in this process should share, agree on and follow best practices and policies regarding international conventions, such as those on fundamental rights in international law, by promoting something that is very important, I think: privacy by design. So technological solutions such as apps and services must address privacy and security concerns from the very beginning of their development or implementation.
To promote technological research together with other scientific areas, such as history, culture, sociology, psychology and economics, to better address societal concerns and better understand societies and behaviors. To explore business models for the private sector and individuals, and to have clear privacy and information security policies on who does what with this data. Users and consumers should be aware of the data collected, the information produced and its use.
Governments should also ensure a tailored approach by sector to address sectoral differences and needs, as it depends on whether we are dealing with public health care, energy networks, transport, commerce and trade, national security or environmental preservation. Governments could establish a certification framework for secure and privacy-friendly data management practices in enterprises. The data privacy certificates could be promoted as a seal of quality; enterprises like e-commerce services and online retailers especially could benefit from advertising that their companies' data handling is secure and privacy friendly. The process could be supervised by a specific entity responsible for monitoring data privacy practices and issuing the certificates. Such data management best practices could also play the role of licenses for supervising enterprises developing new data security related services, such as data safe havens, and for the use of techniques such as anonymization, removing personal information that could be used to link data to an identifiable person; aggregation, combining data; and pseudonymization, converting personal information into parameters, including information about the person and their digital identity. Finally, governments should consider fiscal incentives for enterprises which adopt data security and privacy friendly practices, as well as for stimulating the creation and uptake of privacy by design goods and services. And those are my two cents from the Government point of view. Thank you.
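[Editor's note: the anonymization, aggregation and pseudonymization techniques listed above can be illustrated with a minimal Python sketch. This is not from the session; the secret key, email addresses and record layout are invented for illustration, and a real deployment would need proper key management.]

```python
import hmac
import hashlib

# A keyed hash replaces a direct identifier with a stable token, so
# records can still be linked and aggregated without exposing the
# person's identity. Whoever holds both the key and the data can
# re-link, so the key must be stored separately from the dataset.
SECRET_KEY = b"keep-this-out-of-the-dataset"  # illustrative only

def pseudonymize(identifier: str) -> str:
    """Return a stable, non-reversible token for a personal identifier."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Aggregation: totals per pseudonym keep the statistics without names.
records = [("alice@example.org", 2), ("bob@example.org", 5),
           ("alice@example.org", 3)]
totals = {}
for email, purchases in records:
    token = pseudonymize(email)
    totals[token] = totals.get(token, 0) + purchases

assert pseudonymize("alice@example.org") == pseudonymize("alice@example.org")
assert len(totals) == 2  # two distinct people, no raw emails retained
```

The same keyed-hash token is produced for the same person every time, which is what makes linkage (and hence aggregation) possible without storing the raw identifier.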
>> MUKESH CHULANI: Thank you, Ana. Again, I'm going to give an industry skew, a perspective on what technology vendors and end users are actually doing. So we start off with a question: do end users of third platform technology -- cloud, big data, et cetera -- expect security to be a given when they work with these systems? We did a survey early this year. We asked respondents -- these are CIOs globally -- to describe their stance on how security responsibilities should be divided between themselves, their organisations, and cloud service providers. On one end of the scale we asked them: do you believe this is a caveat emptor, buyer-beware situation where all the risk is on your side? 1.5 percent said yes. On the other end of the scale we asked them the opposite, and followed up to find out what they mean. So from an end user perspective, the expectation is that end users deploying cloud services in particular want encryption at some level deployed within their environments. In addition, they're also interested in technologies on premise -- within their facilities, within their control -- that can be used to create and maintain secure connections and monitor the use of cloud technologies. So we are talking here about federated identity and single sign-on solutions, and about host-based security: firewalls, host-based intrusion prevention systems. So from the end user side there is that move. We asked them: in the absence of any Government frameworks within your nation, what do you use to decide whether a cloud provider is safe, quote, unquote, to use at public scale? The answer was: we would use third party certifications, including third party Government certifications. So you may end up with governments across the Middle East and Africa looking at FedRAMP or looking at the UK's G-Cloud as a proxy -- if it's good enough for the United States or for the UK, it's good enough for us.
That standard is sufficiently high, and if the cloud provider is capable of meeting it, that would be okay with us -- with one exception: the data must reside within our national jurisdiction. So it goes back to what Bertrand was saying about fragmentation.
But vendors are not sitting idle either. We have a lot of interaction with vendors. Public cloud will be a $90 billion market in 2016, growing at 37% -- to give you an idea, that is 14 times the overall IT market growth rate. So they're not about to sit and watch that market move away. There's a lot of effort by the technology vendors to actually scale up their security. The vendors who have hyperscale are looking to embed security by design as well; security by design is a big focus. But in addition, they're also looking to partner with other security-focused organisations. So you have, for instance, Amazon Web Services partnering with SafeNet to develop a security appliance, CloudHSM. Or you have a cloud provider -- relatively small, I would say -- FireHost, who puts together security pieces from vendors to assemble something customized and end to end from their perspective.
So from the industry perspective there are certainly moves to make sure that the future for cloud, the future for third platform technologies is safer and thereby encourage greater adoption.
>> BERTRAND DE LA CHAPELLE: I would like to start by piggybacking on Eric's comment about the zero sum game. You know that in game theory you have three possibilities: zero sum games, negative sum games and positive sum games. What I was describing in the first part is typically the emergence of a negative sum game at the moment: the legal competition is actually detrimental for everyone. The problem is that there are not that many alternative options, and a zero sum game in that kind of very complex environment is not really an option, because you're always hovering above or below it. It's not like distributing portions of a cake.
The key question is -- and this goes to the optimistic or not optimistic question earlier -- is there a way to discuss this in something that is more positive, a positive sum game? I do think there is, and the reason I think so is because the issue that we've decided to tackle in the Internet and Jurisdiction Project, just as an illustration or example, is something that is a problem for all actors. In a nutshell: because the international treaties do not exist, and because the MLATs do not function for the kind of issues we're talking about -- i.e., content that is posted in one country and is illegal in another one -- the solution that has been adopted by a lot of countries, law enforcement and public authorities is to send direct requests to the platforms themselves, to the Googles, Facebooks, Twitters and others of this world. And this rise in trans-border requests is something that is a burden for companies, obviously; it is something that is considered unsatisfactory by Civil Society groups, who are worried that private companies are making decisions on sensitive issues regarding freedom of expression or privacy; and it is something that is not satisfactory in most cases for the governments, because, to be frank, some of the requests are not handled or receive no answer, because their own level of, let's say, due process is not what it should be in many cases.
So we are in a situation of tension where the zero sum game that is playing out is that operators are being blocked in some countries, the governments are dissatisfied, and when the interaction does result in a decision, Civil Society is legitimately concerned about the system. That's actually a wonderful situation for cooperation, because everybody has a problem with the current situation. How do you get out of this conundrum? I've been an advocate of multistakeholder approaches since 2000 or 2001, and if I ever needed confirmation that things change when you put people around the same table, this would have been it. In the last two and a half years we've held 13 meetings in more than ten countries; about 70 entities have been participating in one way or another in the process: governments, including India, Brazil, the United States, France, Germany, the Netherlands, Sweden; major platforms like Facebook, Google, Yahoo, Microsoft; Civil Society groups like CDT, CIS in India, FGV; the Council of Europe; the OECD. When you get people together and you make them work progressively, you end up defining what is emerging now, i.e., a due process framework, or an effort at one: something that raises the bar of predictability of procedures, transparency, avenues for redress, and mechanisms for dispute management, because one of the big challenges is that there is no overarching mechanism today. We don't have treaties, and there's no way to arbitrate between a country and a company. This is not how you normally deal with issues: the court in one country cannot decide on a company in another one unless there are boots on the ground, and vice versa, it would be inappropriate to have the law of, for instance, the United States apply worldwide just because the company is incorporated in the United States. So we are in a conundrum, and I take the opportunity to mention that.
The argument that it is very troubling that private companies are making determinations that are quasi-judicial is a valid concern; I share it, and I'm sure a lot of us share it. The key challenge is: if we say it should be the courts that make those decisions, the question becomes, yes, okay, but which court? Is it the court of the country that is making the request? That would mean that if one country says this content is not legal in my country, it should not be visible anywhere -- which makes no sense.
If on the other hand it is the court of the country of incorporation of the platform, it means that the law of that country applies worldwide, which is not acceptable either, even if that is a question being discussed at the moment in the Microsoft case, which a lot of you may have heard of. So in a nutshell, the question is: how do you design a process that makes sure that the requests being made by law enforcement or public authorities are sufficiently complete, with sufficient information and sufficient basis; that there's a format for those requests that raises the bar of due process; that there are mechanisms that collect data to produce transparency reporting; and furthermore, that there are common sets of criteria developed -- and that's what we're doing at the moment -- between the platforms, the governments, the Civil Society groups and international organisations, to make the decision-making predictable and to make sure that there is an element of predictability, an element of understanding of how things are working.
But the most important -- and I'll finish with this -- the most important and most difficult thing is how to handle the situations where the process has been improved, everything is working nicely in such a framework, but there is still a discrepancy between the country that requests something and the company that says: you know what? Actually, no. I'll give you a concrete example. There are situations -- and it is probably pertinent to mention this in this region -- of cross-border global public order issues: something has been posted in one country that triggers a local situation, or hatred, or ethnic clashes and so on. In such cases, it is obvious that each party will want to err on the side of caution. The country will request very early takedown of the content that is illegal in the country, and the platform, very legitimately, will err on its own side of caution and say: I want to make sure that freedom of expression is maintained, and unless there is a real, valid reason to do this, we will not take that down.
In the current situation there is no mechanism for dialogue. There is no red phone. There is no pre-established procedure to discuss whether the concerns are real or not, and whether they can be addressed while respecting human rights. At the moment there is just a tension that is growing, and that triggers additional restrictions in national laws and feeds the fragmentation of this negative sum system.
I'll give you a very personal example from two days ago. Somebody in the hotel room near mine was making noise at 11:30 at night, and I did something I shouldn't have done: after I had knocked on the door a first time to ask them to calm down, which they accepted kindly, an hour later when it was continuing, I knocked again, and then a very sturdy man, pretty aggressive, opened the door, grabbed me and threw me on the ground, which was a little bit unpleasant, I would say. The reason I mention this is because I of course called security, and security came and calmed things down, asked the guy to come in, and I could make a complaint. If there had been nobody to call, nothing to -- not arbitrate, but facilitate the alleviation of the tension -- I don't know what would have happened. Maybe I would be here with a big black eye. We don't have that kind of thing at the international level at the moment. We are in an environment where tensions are growing because people are very different in this space -- it is normal that they're different, it is good that they're different -- and what we need to build, at least in this space, are mechanisms that alleviate tension, that ensure due process and human rights, but that also balance the legitimate concerns of public order and sensitivities in a very delicate manner. So I just wanted to give you an example, because when you get people around the table, they apparently are able to develop such a framework, and I encourage you to come at 11:30 -- I have a flash session on this.
>> ERIC LOEB: That's great, Bertrand, thank you -- very important points on mediating and arbitrating some of these critical challenges through procedural enhancements and due process frameworks. Tom?
>> THOMAS GROB: Again, I think I can build on what my fellow panelists have already said. We do believe that transparency is very important and that we in the private sector also have a role in awareness raising.
Now, in the past, all these security issues have been handled a bit secretively, I would say, and for good reason.
But having seen and having, I think, agreement that we need to work closely together on this, we also have to think about how we can actually provide everyone with the information necessary, and that means also talking about what is actually going on, what cyberattacks are happening and what each and every one can do.
Now, as a company we have to be aware that whenever, for example, we introduce an update to a system, immediately everyone out there will know: oh, there must have been a vulnerability earlier. So what you want to know is what happens then with all the gadgets that do not make the update. For this reason we have a sensing network out there that simulates desktop computers and mobile phones still running the old system, just monitoring what attacks are actually incoming, and those numbers are amazing. So we are going public as much as possible with this: there are websites that allow you to see in real time the findings Deutsche Telekom comes up with. I think it has worked rather well in the past to work with governments on awareness raising in that respect and to work on solutions together.
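[Editor's note: the "sensing network" described above is essentially a network of honeypots. The following is a minimal sketch of the idea, not Deutsche Telekom's actual system; the fake banner, localhost address and logging are invented for illustration.]

```python
import socket
import threading

# A honeypot pretends to be an old, unpatched service. It accepts
# connections, records who knocked, answers with a fake banner, and
# never does anything else -- every contact is by definition a probe.
attempts = []  # log of (ip, port) tuples for each connection attempt

def honeypot(host="127.0.0.1", max_conns=1):
    """Start a one-shot honeypot thread; return the port it listens on."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))          # port 0: let the OS pick a free port
    srv.listen()
    bound_port = srv.getsockname()[1]

    def serve():
        for _ in range(max_conns):
            conn, addr = srv.accept()
            attempts.append(addr)                       # log the probe
            conn.sendall(b"220 legacy-ftp ready\r\n")   # fake banner
            conn.close()
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return bound_port

# Simulated "attacker" probing the honeypot:
port = honeypot()
probe = socket.create_connection(("127.0.0.1", port), timeout=5)
banner = probe.recv(64)
probe.close()
```

Aggregating the `attempts` log across many such sensors is what produces the real-time attack statistics the speaker mentions.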
Now, on the way forward: of course, it has been mentioned already that privacy by design should be a requirement -- is a requirement. In my company we have a board member who is exclusively responsible for data protection, and we also have a data protection board involving all relevant political parties. They look at new products and they really do hard testing, and more often than not you have to go back, redesign or improve the design, and then return to the board before you can actually launch in the market.
Now, as I've said before, German standards are very high and we probably wouldn't need to adhere to them in all of our footprint countries, but we do because we believe that it is necessary to level the playing field not towards the bottom but towards the top. So again, as I have said before, we should in this respect work towards international standards that are orientated towards the high end of the possibilities with regards to data protection.
Last but not least, I think it's also important to keep up the dialogue with the leadership functions here, and for that reason Deutsche Telekom, together with the Munich Security Conference, is now holding the Cyber Security Summit every year -- this November for the third year -- bringing together both political and economic leadership. When I look back, say, ten years, Internet security was something that geeks and nerds would be talking about, and they were called paranoid. Today we are at the point where it actually gets CEO-level attention, and CEOs know how important it is. I think that is an excellent development for the past decade, and if we manage to bring the same awareness to the people on the street and give them the tools to protect themselves, that would be a major achievement.
>> ERIC LOEB: Bertrand, you now have experience dealing with noisy people in other rooms. Thank you, Thomas. Katitza?
>> KATITZA RODRIGUEZ: Thank you. We are talking about trust and we are talking about solutions -- I'm from Civil Society, so I really have to highlight the issues around the revelations -- and I want to propose some solutions to the business sector but also to governments, to see whether we can move forward on this real problem, because there will be no trust if we don't fix the problem. One thing I think is important is a secure Internet, because it's in everyone's interest, and as I was saying previously, this arms race scenario is totally wrong and we need to change that mentality. So: cryptography works. We heard in one of the interviews, the first one, that crypto, properly implemented, works. This is an important lesson. Attackers, whether governments or other kinds of attackers, rely on data that is not encrypted. We have made it too easy for governments to conduct bulk collection. Of course there are many solutions; many could be complicated, it's a complex topic, and you cannot solve it in just one country because other countries will do it, so we have to start changing the way we are doing things. One is that companies should probably rethink this collect-everything business model. I know that would be very difficult, but it is probably a smart decision, not only for data protection but also vis-a-vis Government access to that data.
It's obvious that companies can no longer keep this cooperation with the Government secret, because we now know that it can be leaked. Probably in the past you thought that your cooperation would remain secret; now you are losing business, you are losing money, you are losing clients, your clients are not happy. So probably, as companies and through government policies, you can promote that we should use more encryption: encrypt the backbone, make spying more expensive. We do not know enough about how to protect ourselves against the world's intelligence agencies when it's a targeted attack, but let's work together to make bulk collection expensive. Encrypt the backbone; encrypt the data not only at rest in the cloud but also when it is in transit; push providers of services to offer privacy enhancement by default; secure the important points with encryption; and promote more anonymity tools. Make encryption simpler, like OTR when you chat online. And there are solutions not only in the technical area. We know that the ability to eavesdrop on people's communications confers immense power on those who do it, and unless such power is held in check by rigorous oversight and accountability, it is almost certain to be abused, as we have learned from the leaks.
So last year at the Human Rights Council, many Civil Society groups presented a document that developed how to apply international human rights law in the context of communications surveillance, and we came up with 13 principles, which can be found on the necessaryandproportionate.org website. There is a legal paper that explains the legal roots of all these principles. With these principles, we are trying to explain to governments how existing human rights protections apply in the digital context. We are not asking them to invest a thousand hours in drafting a new treaty; we already have a treaty. We are asking them to enforce it and implement it, and we are explaining how to do that in the context of communications surveillance. And we hope to have companies rally and support these principles -- we have seen very little of that -- because these are human rights obligations that companies can help promote, so that they do not become complicit in illegal surveillance. So I know there are many things to do, but I recommend, whether you are a Government or in the business sector, that you read the necessary and proportionate principles, which explain very clearly how to assess laws that restrict human rights or the right to privacy. If you are designing a new surveillance law or reviewing an existing one, you have to know that these laws have to be necessary, proportionate and adequate, and they have to comply with the principle of legality. So I recommend you read it. It's on the website, the legal background paper is also there, and it answers many questions about Government access to citizens' data. I only have five minutes, so I cannot go further into it, but please visit the website.
>> JOSEPH ALHADEFF: Thank you. I'll be brief so we can get to your participation. I just wanted to bring up a couple of examples of solutions that are actually in process. Remember, when I spoke we talked about the problem of individual participation, especially with previously collected information. There's interesting work going on at the OECD between two of its committees, one on healthcare and one on security and privacy, where they're taking a look at what happens when you have previously collected medical information and a desire to use that information for another medical purpose -- one that wasn't in the notice given at the time the information was collected -- and where it is impracticable to get back to the patient for a new individualized consent for a separate use of the information. The idea was: the purpose is laudable, the purpose is societally beneficial, but because you don't have individual participation -- the person in the decision-making process -- you have to build new safeguards into the system. So one of the things they're looking at is how you would define use-based models for that kind of information, and we have precedents for those concepts: they're called ethical research protocols, and they're currently what's used in medical research. But those protocols are used on a per-inquiry basis, so every time you essentially have a new question, it's a new set of permissions that you're getting.
The concept here is: can you develop use-based models which are replicable, which streamline some of the delay and some of the expense while not shortcutting any of the safety or the protection? You would still be able to say: you need to control access, you need to encrypt the information to keep it secure, you can only share the information in this fashion, the use is still limited to this purpose. So you create a set of control parameters around that information to create assurance that the information is being used in a correct fashion and is probably within the zone of foreseeability of use, but it frees you to use that information in a more cohesive fashion with less administrative burden, while still maintaining the safeguards.
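[Editor's note: the "set of control parameters" described above can be sketched as a small, reusable policy check. This is a hypothetical illustration, not the OECD's model; the purposes, safeguards and recipient names are invented.]

```python
from dataclasses import dataclass

# A use-based model: instead of asking the patient again for every new
# inquiry, one reusable policy states which purposes, safeguards and
# sharing rules apply, and each proposed use is checked against it.

@dataclass(frozen=True)
class UsePolicy:
    allowed_purposes: frozenset      # what the data may be used for
    allowed_recipients: frozenset    # who it may be shared with
    encryption_required: bool = True # safeguard that must be in place

@dataclass(frozen=True)
class UseRequest:
    purpose: str
    recipient: str
    encrypted: bool

def permitted(policy: UsePolicy, request: UseRequest) -> bool:
    """A use is allowed only if purpose, sharing and safeguards all match."""
    return (request.purpose in policy.allowed_purposes
            and request.recipient in policy.allowed_recipients
            and (request.encrypted or not policy.encryption_required))

policy = UsePolicy(
    allowed_purposes=frozenset({"cancer-research", "epidemiology"}),
    allowed_recipients=frozenset({"accredited-lab"}),
)
ok = UseRequest("epidemiology", "accredited-lab", encrypted=True)
bad = UseRequest("marketing", "accredited-lab", encrypted=True)
```

The point of the sketch is that the policy is written once and replayed for every inquiry, which is exactly the streamlining over per-inquiry ethical-protocol permissions that the speaker describes.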
Another thing which is underway goes towards Bertrand's issues. Think of APEC: everything that has a Pacific coast, which means Russia, the Asian countries, the Latin American countries that have a Pacific coast. They have something called cross-border privacy rules. Those of you familiar with EU privacy may be familiar with the concept of binding corporate rules. Essentially, in both cases there are sets of rules that bind a company and its affiliates to a set of privacy practices which are validated, and that binding, validation and vetting by an authority then gives the company the right to transfer information pursuant to those rules across its affiliates.
There's a project going on in APEC which is very promising. The French data protection authority, the ICO, which is the UK's data protection authority, one of the German data protection authorities, and the European Commission, as well as the EU's commissioner on these issues -- Peter Hustinx's data privacy office also has a representative -- are working with the APEC data privacy group to do a mapping between the binding corporate rules and the cross-border privacy rules. That mapping is to see what kind of overlap there is between these instruments, and they've finished what they call a referential: essentially, they took the two documents and tried to do that overlap mapping.
And they figured out there was a fairly large overlap: maybe 75% or so of the requirements are actually fairly similar. So the next step in the process is to determine: if I've validated your compliance with my programme, and that is worth 75% of validating your compliance with this other programme, how do I give you credit for the 75% you've already accomplished, and how do I hold you responsible for the 25% you haven't met? Again, what this does is simplify the administration of working across different jurisdictional formats, because at the end of the day privacy is built within a legal framework, within a set of cultural norms and expectations, and within a set of individual priorities and preferences. It is very difficult to say we are going to figure out how to develop a global standard on this. Companies can develop an operational standard which tries to bridge a lot of these issues, but the more we can come up with ways to map regulations to each other and map compliance paradigms to each other -- not undercutting the level of regulation in any country but, rather, facilitating the exchange of information and removing duplication in filings and requirements -- the more we achieve interoperability of systems at a policy level. That is fairly innovative, because we're getting better at interoperability at a technical level, but we're not very good at interoperability at a policy level.
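[Editor's note: the referential's credit-and-gap logic can be sketched with plain set operations. The requirement names below are invented, and the 75% figure is reproduced only as a worked example, not as the actual content of the BCR/CBPR referential.]

```python
# Model each framework as a set of requirement identifiers; the overlap
# is the compliance you already get credit for, and the set difference
# is the residual work to claim the second certification.
bcr = {"notice", "consent", "security", "access", "onward-transfer",
       "retention", "dpo-appointed", "breach-notification"}
cbpr = {"notice", "consent", "security", "access", "onward-transfer",
        "retention", "accountability-agent"}

covered = bcr & cbpr        # already validated under the other programme
gap_for_bcr = bcr - cbpr    # extra requirements to meet for BCR
overlap_ratio = len(covered) / len(bcr)

assert overlap_ratio == 0.75  # 6 of the 8 BCR requirements overlap
```

A regulator applying the referential would then only need to audit the `gap_for_bcr` items rather than re-validating the full set, which is the administrative simplification the speaker describes.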
So this is a very interesting attempt at developing policy interoperability. I'll make one comment on the positive sum, zero sum point that Bertrand and Eric both mentioned -- the former commissioner for Ontario has done a lot of work in this area also. I think part of the problem is that we continue to use the word balance: we have to balance these things, and the word balance by definition inherently has a winner and a loser. What we have to start talking about instead is optimizing, because there are ways in which you can actually maximize both if you think about it in the correct paradigm. The way I like to put it is one plus one equals three -- call it the new math -- because the whole is in fact greater than the sum of its parts if you think about optimization, rather than less than the sum of its parts. And because we should probably have a provocative statement here, I just wanted to ask Bertrand a question: you thought it was a bad idea to push some of this ethical decision-making to companies -- so what do you think the Spanish court did in the right to be forgotten case?
>> BERTRAND DE LA CHAPELLE: If I may, I didn't mean I think it is a bad thing; I said there are criticisms about the fact that companies have this quasi-judicial role. To answer your question: it was a vindication -- the highest court in Europe is actually saying that they have this role. This is a fascinating turn of events, because at the same time the Court is saying that appeal mechanisms are needed. It's a very interesting question.
>> ERIC LOEB: If I may for a moment -- I'm sure we'll get a question in, but a couple of quick acknowledgments first. Constance Weise is our remote moderator, and I see we have some remote participants; and Erika Mann will be looking after the report from the session. At this moment we have a couple of microphones, and I want to make sure we have some interventions. Constance, why don't we start, if there is a question from one of the remote participants to the panel.
>> CONSTANCE WEISE: Thank you, Eric, can you hear me?
>> ERIC LOEB: No. You can come up, please, Constance.
>> CONSTANCE WEISE: There's a question for Bertrand from Bangladesh, who says, you mention that there is no platform or procedure to resolve cross‑border issues that creates tension and violence. I heard the same thing from you two years ago. What have we achieved through all the discussions in the last two years and are we going anywhere? Thank you.
>> BERTRAND DE LA CHAPELLE: The answer will be very simple. The session that I mentioned, at 11:30 in room 2, will be a presentation of the current state of the Internet and Jurisdiction Project and of its outcome, i.e., the framework that is now being developed by the participants. We also have a workshop on cyberspace fragmentation along national jurisdictions on Thursday at 2:30, so those will be the best answers. At the moment the framework is being organised around the elements that I mentioned, and it would take too long to explain now; the session later today will give a better answer.
>> ERIC LOEB: Okay. May I go to the room here for any comments or questions? We have one up front. Can we have the microphone?
>> AUDIENCE: I just want to ask about the privacy seal.
>> ERIC LOEB: Introductions.
>> Penelope, from Copenhagen, Denmark, the Business Authority. I want to hear about the privacy seal you're discussing at the EU level. How far along are you, and will companies like Facebook and Google, practicing the free model, be able to have a seal like that? And second, for the private companies: what do you think about the idea of a privacy seal?
>> ERIC LOEB: Ana first.
>> ANA NEVES: Thank you for the question. The point is that we have this model in Europe, and we have these U.S. companies, so it's up to Europe and its governments to understand what it means that we have mainly U.S. companies operating in Europe, because they are not going to follow the rules of Europe if you are talking about technology like cloud, et cetera, that doesn't know any boundaries. So we really have to rethink this at the European level but also globally. I don't know whether I should elaborate a little more on this, but the point is that Europe should wake up.
>> ERIC LOEB: Joe?
>> JOSEPH ALHADEFF: Yes, just from a company perspective: a number of companies participate in the safe harbor or already participate in seal programmes, and the APEC process also has seal programmes. From a security certification point of view, a number of companies also obtain seals related to security. The one concern I would have is that some seal programmes try to be technology specific. In other words, they try to say your technology should implement a policy outcome, when policy outcomes are implemented by technology, people and process. You cannot just look at technology and say that's sufficient; you have to look at technology, people and process. So I think holistic seals are the way to go. What we've found in the U.S., and the UK is exploring this as well as the EU, is the question of how you leverage others, because authorities themselves don't usually have the capacity to run a full‑blown seal programme, and sometimes they feel conflicted because it's hard to be both the enforcement operator and the one who gave you the seal; there's a slight conflict of interest in that, perhaps. So part of the question is how you actually create private operators who work under Government supervision in order to administer these seals.
>> ERIC LOEB: And Patrick just raised his hand a couple of moments ago. A couple of points have been raised about the role companies could play, some of the things Katitza mentioned, and you started that work in the communications industry.
>> Yes, my name is Patrick, I come from Stockholm. There was a question about solutions, and as you know, in 2011 we had the UN framework on Protect, Respect and Remedy, and nine global telcos sat together and defined what respect means in our sector when it comes to freedom of expression and privacy. Eighteen months ago, in March of last year, we launched a set of guiding principles, industry best practice on how to respect those rights. Those are available on our homepage, telecomindustrydialogue.org, in seven different languages, and we are here to answer any questions in that regard. AT&T and other operators present during the IGF are actively working on implementing those principles and engaging in dialogue with interested stakeholders. Thank you.
>> ERIC LOEB: Katitza, I wanted to return to you. I know you had a comment before. I'm not sure ‑‑ still? Okay. No? Okay.
Another ‑‑ okay. Thank you.
>> AUDIENCE: Yes, about the fiscal incentives that Ms. Neves suggested earlier: how effective do you think that suggestion is, given that in developing countries we have been giving fiscal incentives to the private sector and we haven't really seen any progress in the services for which we've given those incentives?
>> ERIC LOEB: Any comments on this question of incentives?
>> ANA NEVES: So, about the fiscal incentives. Again, this is at the national level; of course it doesn't work at the European Union and Commission level, it has to be up to each Member State, because fiscal policy is not European policy. So it should be up to each Member State to decide what to do at the national level. Why? Because they want their industry to develop, to be very competitive, and to be advanced in this ecosystem where we have mainly U.S. companies operating. So the fiscal incentives are something that we are thinking about, but of course at the national level. It depends, but it really can stimulate. I don't know whether I answered your question.
>> AUDIENCE: My name is Christopher Soghoian, I work for the ACLU in Washington, D.C. I have a question for the two gentlemen on the panel working for AT&T and Deutsche Telekom. One year ago Der Spiegel revealed that Angela Merkel's telephone calls had been captured with a device created in Germany that has been around for 20 years. This is a widely available surveillance device; you can now build one yourself for about $1,000, and in fact in June of this year Newsweek in the United States revealed that foreign governments were operating these devices in the United States and spying on the phone calls of politicians and policymakers in Washington. To be clear, these are not the NSA; these are other governments, just as other governments spy around the world. My question to the gentlemen from AT&T and Deutsche Telekom is: one year later, what are you doing, one, to warn your customers about these risks, and two, to protect your customers from private interception and from interception by governments other than those of the people being targeted? The cost of interception has become so cheap, and there are protective technologies available, but the carriers have not rolled them out and have not warned people. Why have you not done anything, or what have you done?
>> THOMAS GROB: Okay, I'll go first. I think in Germany the awareness of such practices is very, very high. What we have done is the more interesting part. We have made one proposal, which has been heavily criticized, and which now goes under the heading of the Internet of short distances: we don't see a necessity for information to take a trip around the world if the destination is actually within Germany or within Europe. I'll be happy to discuss this, as I don't see it as a fragmentation of the Internet but merely a redesign of the routing.
The other thing we are now doing is actively encrypting all e‑mail communication for our customers. That is a direct reaction to the awareness that if you don't do it, everything you send on the Internet is effectively like a postcard.
>> And as well, it's a constant process of both strengthening the protection of your networks and working on procedural strengthening of safeguards within the process. Both are efforts that AT&T has been working on, always has, and always will continue to.
>> ERIC LOEB: Do we have any other questions or comments? One final one, then we will ‑‑
>> It's for Joe. Joe, I very much appreciated the discussion, but what I see missing, whether at the OECD, APEC or elsewhere, is the Telecom regulators, because it is a fact that we have telco‑specific rules that may be imposed upon fixed and mobile operators around the world which require many more things than are required of, say, the Googles, Yahoo!s or Facebooks of the world, and those regulators are often missing from this dialogue. In terms of interoperability, I'm working in countries where there are no formal data protection frameworks but the right to privacy might be enshrined in the Constitution, so how do we account for that to create this global map of interoperability?
>> JOSEPH ALHADEFF: Well, I think the EU is an interesting paradigm of that discussion even as we speak, because the data protection regulation is being rewritten in the shadow of the e‑Privacy Directive, a directive more specifically associated with those on the communications side of the issue, and then there are complete regulatory structures outside of either the e‑Privacy Directive or the draft regulation which also need to be looked at. So you're right, the Telecom regulators are not in fact part of those discussions. In all honesty, the APEC process will be a difficult place to do that, because APEC is a fractured organisation in the sense that there's APEC and then there's something called the APEC TEL, which is where the Telecom side of the equation meets, and the two conveniently do not meet in the same city or at the same time, making a meeting of the minds geographically impossible.
So I think what we would look at in APEC is a sequential process: once we figure out how to do it with the privacy regulators, we may then send it over to the organisation that has the Telecom folks, because I do think it is important. Now, some Telecom regulation that has an implication for privacy is not necessarily privacy focused but has a privacy impact; that is going to be much more difficult to deal with. Something like the e‑Privacy Directive is specifically about privacy. Within the EU, I'm pretty sure that what happens when the draft regulation is finished is that the e‑Privacy Directive will be harmonized to the privacy regulation, but that does mean that telcos especially are essentially keeping this dual stack alive until they actually know what to do, because they've got one set of regulatory requirements coming from one side of the equation and a different set morphing on the other, and to date there is in the data protection regulation no grandfathering saying, well, you're already covered by this regulation, so therefore you don't need to worry about it. That would be one of the elegant solutions were it to happen, but as far as I know it is not happening, unless there's been a development I'm not aware of. But yes, it's an issue, and telcos are not the only ones. Financial services companies are regulated with financial services regulation that has privacy impact. Every company that has employees has HR regulation that has privacy impact. So we have an extremely complex web in which we have not just different requirements but different ways of phrasing things. To give you a case in point, the HIPAA regulation in the United States is the regulation related to healthcare privacy, and there is a privacy rule and a security rule. Interestingly enough, these rules were finished one year apart from each other, and they have different definitions of personal information.
So a company required to follow this law must implement both the privacy rule and the security rule, but the rules within the same law have different definitions of what personal information is. Now, if you can't get harmonization right within the same law, we will all be beating our heads against the wall for a long time. The only saving grace in all of this is that those of you in the audience who are involved in privacy have guaranteed job security for some time to come.
>> ERIC LOEB: So with that, I appreciate all of the interventions from the panel and the audience. We are going to prepare a summary of this. I think, as Joe said well, of the efforts to optimize all of the interests here, several of the ideas raised to try to get to a positive sum really came down to the need to work on more mechanisms to arbitrate the differences across jurisdictions: the very interesting example Joe raised of the cooperation between APEC and the Article 29 Working Party, and what Bertrand is working on as well. There was a lot of focus on the need for procedural enhancements, the need to develop independent oversight of mandates, and the need to come up with methods to ensure that the formats of requests are handled well. We didn't talk too much about transparency and transparency reporting; it came up some, but again, it is one of the critical issues.
And of course the need, as Mukesh raised, for active investment in security: to be fully aware of the risk environment out there, to analyze where the at‑risk assets are, and to focus strengthening efforts there.
So we'll capture these points. We thank you very much for your participation today, and I thank all the panelists, our remote moderator and our Rapporteur.
(Session ended at 10:39 a.m.)