IGF 2017 - Day 4 - Room XXVII - WS245 Datafication & Social Justice: What Challenges for Internet Governance?

 

The following are the outputs of the real-time captioning taken during the Twelfth Annual Meeting of the Internet Governance Forum (IGF) in Geneva, Switzerland, from 17 to 21 December 2017. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

Partnering with Didi, which is what Uber in China essentially became. There's a dating service, and when I talk to people about this and say, don't you find it weird that this comes into your dating life, young women in China have actually said to me, oh, no, we think it's fine because, of course, I want a guy to be good‑looking, but I really care about his bank balance. I would like to know that before we date. That is important. Let me just tell you the features: credit history, behavioral habits, what is called fulfillment ability, like the ability to pay off your debt, whether you fulfill contracts, whether you fulfill your obligations, and social networks, like what the U.S. wants to do right now. These are the five points of the pentagram. It is a black box. Surprise, surprise.

The algorithms, of course, and the blacklists they generate are shared very freely with government, and it raises the question of reciprocity. It prevents people from traveling, and it's so finely grained that it determines how much you can spend when you travel, whether you can get on a plane. It gives you between 350 and 950 points that you earn according to behavior. So 600 and above is good. When you get to 600, you can get a loan. When you get to 950, you can do all kinds of amazing things. It is linked to what everybody around you does. Your friend fails to pay off a loan, ding, your credit rating goes down.
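The actual algorithm is, as the speaker says, a black box. Purely as an illustration of the kind of network-linked scoring described here, a score clamped to the 350 to 950 range, thresholds at 600 and 950, points lost when a friend defaults and, as mentioned a little later, points gained for "positive energy" posts, the following is a hypothetical toy model in Python. Every weight and rule in it is invented for illustration and is not the real scheme.

def toy_social_score(base_points, friend_defaults, positive_posts):
    # Hypothetical weights: a friend's unpaid loan costs you points,
    # "positive energy" posts add points (both numbers invented).
    score = base_points - 20 * friend_defaults + 5 * positive_posts
    return max(350, min(950, score))  # clamp to the 350-950 range described above

def perks(score):
    # Thresholds mentioned by the speaker: 600 unlocks loans, 950 is the top tier.
    if score >= 950:
        return "top tier"
    if score >= 600:
        return "eligible for a loan"
    return "restricted"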

It's also really difficult to report on, to investigate and research, because nobody wants to talk about it; their social credit rating will go down the second they discuss it and say anything other than, oh, my God, it's an amazing scheme. It's all about trust and sincerity. It's going to forge a public environment of sincerity, and it will strengthen sincerity in government affairs, and it's about commercial sincerity, social sincerity, and the construction of judicial credibility.

 It's little things. Do you pay your electricity bill on time? It is completely devoid of context. Did you not pay your electricity bill because you had been run over by a truck and were in hospital? It doesn't care, right? Did you use a card twice to go into the subway because you're a mother with a child and you only want to tap one card, or were you actually committing fraud and trying to get a free ride off the system?

 It's completely devoid of context. Something like your shopping habits becomes a measure of character, and the technology director says, of course, we judge people by the types of products they buy. Someone who plays videogames for ten hours a day, for example, would be considered an idle person. Well, that's pretty much all of us living on our computers all day. Someone who frequently buys diapers would probably be considered a parent, so they are likely to have a sense of responsibility. I know gamers who are more responsible than most parents I've met. This puts you in finely grained boxes.

 Sharing what the scheme refers to as positive energy online gets you points. If you say wonderful things about how well the economy is doing or how great the government is doing, it will make your score go up. Reaching 650 points, you can rent a car without leaving a deposit. You can get faster check‑in at hotels. You can get V.I.P. lounges at the airport. You have to produce fewer documents when you apply for a visa to go somewhere. Roger calls it the bastard love child of a loyalty scheme.

 It's all the loyalty schemes that we have to sign up for because we want frequent flyer miles, except it affects everything about your chances, way beyond miles. It's about whether you can rent a house in a particular neighborhood, whether you're trustworthy enough to be a baby‑sitter, those kinds of things. Someone actually said something interesting. He said, look, as someone Chinese, I know everything I do is tracked anyway. Wouldn't I rather be aware of the details of what's being monitored, learn how to abide by them, and actually try to game and work around the system? If I know what's being profiled and tracked, that's better, since I was already subjected to it anyway. Maybe a system like this, which makes the tracking explicit, is actually better. He said, would I rather live in ignorance and hope, wish, or dream that personal privacy still exists and that our ruling bodies respect us enough not to take advantage, or do I want details about how the system works?

 That's just a very, very quick picture of the system. Of course, eventually it will lead to this being gamed. There's already a huge black market in reputation and trust, as we see even in India with, you know, whole black markets of fingerprints, of various ways to game it. There are various ways in which it can be manipulated.

 Also, for those in the room who have been following the net neutrality debate: if your credit scores are low, you have lower Internet speeds. You know, it literally goes down to that level of influence on every single aspect of your life. I'll end by saying, you know, we talk about how our online and offline have merged in ways that, as he says, create an on‑life, and this is the extreme manifestation of an on‑life, where everything you do is seamlessly merged and tracked in a finely grained way to create an overall profile of every single thing you do, literally from cradle to grave.

Careful times ahead.

>> Thank you very much for your description of what's happening over there. Now ‑‑

>> See if I can get back into the country now having said all of this.

>> Thank you for giving me this opportunity. What I'm going to do is try to explain a theory that an academic proposed in 1983. It's Gunther Teubner. According to Teubner, there are four phases of legal development. The first stage is the –

(Audio fading)

The third phase is the responsive phase, and the final phase is the reflexive phase. Basically, the repressive phase is the one where the Monarch was the law. During the autonomous phase there was a rise of constitutions across the world, across multiple centuries, and of independent judiciaries, and the independent judiciary can check the power of the Monarch. The next phase is the responsive phase, when people realize that high‑level constitutional principles are insufficient to regulate the behavior of actors that are very active and can cause harm. If you are a single restaurant in New York City, you are subject to several regulations when it comes to hygiene, sodium, trans‑fats, calories, the temperature of the beverages you are serving, et cetera. There is very detailed rule‑making.

 The other feature of the responsive phase is that the rule‑making is done consultatively. Stakeholders are consulted when the rules are produced, but the overall vision of the legal system is a top‑down system.

 The trouble with the top‑down system is that you can launch an attack on the top and then the whole system stops functioning. During the last phase, which is the reflexive phase, the idea is that you fully take on board that complexity, and you try to engineer self‑regulating and auto‑correcting systems without necessarily having to go up the command‑and‑control chain. I have no training in the law, and all of this didn't make a lot of sense to me when I read it, so let me give you the way I understood it.

 There is a home, and the mother is the Monarch, like in most homes, and the father is the judge, and the project of justice is that the children have to share a cake. During the repressive phase, the mother would cut the cake however she likes, and you would just have to take the slice. You couldn't complain about the slice that you got. During the second phase, which is the autonomous phase, a constitution is produced and stuck on the fridge, and the constitution says everybody is entitled to an equal slice of cake.

 The mother would still cut the cake, and if you were unhappy with your slice of cake, you could appeal to the father, and the father would read the constitution and deliver justice.

 In the next stage it is realized that cutting a cake equally can be done across multiple axes, and if you cut the cake, for example, horizontally, only one child will get all the icing and the other child will get no icing, so they have a meeting at the dinner table and they go through rule‑making. For example, on some cakes there are rosettes, and each child wants an equal number of rosettes in their slice. It's not just a high‑level constitutional principle, but also highly detailed rule‑making in order to deliver justice. The overall process of delivering justice is the same. You still have to go to the father in order to get justice.

 In the final phase, what you really want to do is to eliminate the constitution, to eliminate, according to Teubner, the rule‑making, eliminate the Monarch and the executive and the judiciary, and somehow engineer justice into the system itself through process. The solution there would be that one child cuts the cake, and the other child gets to choose the first piece. So justice is delivered through the system.
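To make the cut-and-choose idea concrete, here is a minimal sketch in Python. The cake model and the two valuation functions (one child caring about icing, the other about rosettes, echoing the example above) are invented purely for illustration.

def cut(cake, cutter_value):
    # The cutter splits the cake at the point where she values the two pieces
    # most nearly equally, so either piece is acceptable to her.
    best = None
    for i in range(1, len(cake)):
        left, right = cake[:i], cake[i:]
        diff = abs(cutter_value(left) - cutter_value(right))
        if best is None or diff < best[0]:
            best = (diff, left, right)
    return best[1], best[2]

def choose(pieces, chooser_value):
    # The chooser simply takes whichever piece she values more.
    return max(pieces, key=chooser_value)

# A cake as a list of slices, each slice carrying (icing, rosettes).
cake = [(3, 0), (1, 2), (0, 1), (2, 2)]
cutter_value = lambda piece: sum(icing for icing, _ in piece)
chooser_value = lambda piece: sum(rosettes for _, rosettes in piece)

pieces = cut(cake, cutter_value)
chosen = choose(pieces, chooser_value)
# Neither a constitution nor a judge is needed: each child ends up with a piece
# she considers (roughly, since this cake is discrete) at least half the value.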

 Now, unfortunately for us in the modern world, one child is Facebook and the other child is an ordinary citizen, and we cannot be purists. We will always need the constitutional fallback. It's as if one child is blind, so the other child cuts the cake, puts a shoe on the plate, and gives it to the blind child. There are reasons why the reflexive solution can definitely fail.

 What the private sector is telling us today is that in the big data moment all the traditional principles that we have talked about, and that are articulated in instruments like the GDPR, are obsolete. We should get rid of them. Big data, they say, is incompatible by design with consent and so on and so forth, so we should just get rid of these principles and accept the pull of the big data moment. But what I would like to submit to you humbly in this room is that we need to reinvent them. We need to reinvent all these principles for the big data age.

 How do you do it? I will give you one example of a reflexive solution, and then I will shut up.

 So in India, 20% of the population is ‑‑ only about 10% of the population speaks English. No matter how detailed the rule‑making, it's very difficult for people to protect their own rights. What is being considered currently is something called a layer of consent brokers.

 Now, the consent brokers are intermediaries. Suppose you go to an insurance portal and you want to get competing bids. The portal uses A.I. and big data to provide bids and gives you a comparative analysis, and then you get to choose.

 Unfortunately, the way the Indian framework is currently envisioned, there can be an infinite number of brokers, and the brokers make money from the data controllers. What we need instead is a small number of consent brokers, so that for a population like India's you have 50 to 100 consent brokers, and there is a one‑to‑one relationship between data subjects and consent brokers. That means you can only choose one consent broker at a time.

 Like those who give you advice on investments: the intermediary or the agent cannot make commission from the investment funds. They can only make commission from the investor. Consent brokers will only be able to charge fees from the data subjects; they cannot charge fees from the data controllers, and, therefore, market mechanisms will be used to ensure that your rights are protected. If you want to withdraw consent from multiple platforms, you just need to get in touch with your consent broker. If the consent broker knows the Equifax database has been breached, the consent broker will act on your behalf, because they would like to keep your business. Otherwise you will take your business to another consent broker, and so on. For each of the principles, if one tries to look for a reflexive solution, you can come up with new ideas, and this is what we are trying to propose in India. However, there should always be the fall‑back, which is that if you are unhappy with the reflexive delivery of justice, then you must be able to approach the courts and you must be able to bring the criminal justice system to bear on the mega‑corporation.
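As a rough sketch of how such a consent broker might be modelled, under the assumptions described in this session (the broker records only consent artifacts, that is, who consented to whom for what; it charges fees only to data subjects, never to data controllers; and it acts for its subjects when a controller like Equifax is breached), here is some illustrative Python. All class and field names are hypothetical and do not represent an actual framework or API.

from dataclasses import dataclass, field

@dataclass
class ConsentArtifact:
    subject_id: str      # pseudonymous reference to the data subject
    controller: str      # e.g. an insurer or a credit bureau
    purpose: str         # what the data may be used for
    active: bool = True

@dataclass
class ConsentBroker:
    name: str
    fee_paid_by: str = "data subject"           # never the data controller
    artifacts: list = field(default_factory=list)

    def grant(self, subject_id, controller, purpose):
        self.artifacts.append(ConsentArtifact(subject_id, controller, purpose))

    def revoke_all(self, subject_id):
        # One request to your broker withdraws consent across every controller.
        for a in self.artifacts:
            if a.subject_id == subject_id:
                a.active = False

    def subjects_affected_by_breach(self, controller):
        # On news of a breach at a controller, the broker identifies and acts
        # for the affected subjects, since its only allegiance is to them.
        return {a.subject_id for a in self.artifacts
                if a.controller == controller and a.active}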

 Thank you.

>> Thank you very much, Neil. For now, we go all the way to Brazil.

>> I'll give you an overview of the ID situation in Brazil and how a unique identifier could connect Brazilians to the government. Digital government services started to emerge, and we have many IDs: a number that connects you with financial services, two national IDs, a driver's license, a work permit. Those are all different numbers. To change the situation, they proposed a unified ID, aggregating the national IDs and cards. There is another question about security and identification. A lot of the proposals talk about the security of the document being ensured by using modern processes, but the text doesn't say too much, and it doesn't say how the data is going to be shared. There are standards to talk about the –

(Audio fading)

It was recently involved in a scandal for sharing personal data, 141 million records, with a broker, a private company that manages credit scoring in the country. It's the court that's also the one now pushing for analyzing social networks for the election

and --

(Audio fading)

This is the body that wants to centralize our IDs, alongside the courts and the federal revenue agency.

 Coming back to credit scoring, I mentioned the court sharing its database. That is located in the country. We also have the ‑‑ we don't know the origin of the database used for the credit scoring. We went to a trade fair, and both of the solutions presented there were about including more people in the same credit scoring.

 He was like, well, for instance, if you are a single woman, you probably get less credit than if you have a husband, in case you go crazy shopping. So then you see the kind of values that are embedded. We have a legal instrument, so we can ask: what kind of data do they have? One person was denied credit, and we took this case and asked why. Not only why, but what was the score and how the score was composed. Then we got an answer, and we didn't even understand it.

 Just to mention before ‑‑ the unified network for the federal revenue, or the court. The credit bureau has changed the credit scoring that it will use, moving to something more related to, you know, financial records.

 You also see innovation, but innovations are also being created in the cities ‑‑ my city has its own particular ones ‑‑ and that leads to a security issue. Banks and transportation systems are more and more asking for biometrics and want to be more secure with biometrics. We have no clarity about the standards, the security measures. We have airlines using photos for facial recognition. That is also ‑‑

(Audio fading)

>> They'll try to understand you before they give you a loan, and they go by negative and positive scoring, both negative and positive entries: not only knowing what you didn't pay, but also everything that you did pay. This is the scale they use to decide whether to give you credit or not. Maybe this is the thing that can improve people's credit.

 There are positive scores, and scales, and social metric groups that ‑‑ about the consent part, I'm not sure this should be the focus, because the original concept sounds like it covers information that ‑‑ it doesn't sound like that to me. It can be online data, and there are a lot of questions about personal data and personal disclosure mostly. That's the main type of situation, and basically it's the kind of data that ‑‑ they end up giving you the amount. They try not to.

>> Before we have a couple of brief responses, are there just a couple of questions for clarification? Let's make it very brief because there are more contributions later. Just one sentence, brief questions. One, two, three. Yes, please.

>> Hi. I'm a PhD student in Brazil, and I would just like to call your attention to two trends that I think we should discuss further here. One is the drive to collect people's fingerprints. I think this is a trend not only for security but also for other purposes. The other is to look at ‑‑ most of this technology begins with solutions and

(Audio fading)

>> The question was about the Chinese social credit scheme: how many Chinese are already in the pilot version, and whether your research gives any indication of the impact on people who are denied access.

(Audio fading)

>> I'll keep it short. The question, specifically, is what happens with the kind of framework you have suggested. It assumes a very informed data subject. Let's say these consent brokers are like any other people, and they start telling you that if you allow a little easier sharing of your data ‑‑ and people do not have that advanced knowledge of what kind of harm the sharing of data could do to them, and they look at that as more important ‑‑ they would go to the broker which is providing (audio cut out) ‑‑ and that's where the collective comes into the picture.

 People in these structures are different. I think, between the collective part, which is very important, and the responsive part, we have to build a good political system, and in these discussions we often completely try to give up the political part of it. That's not what you are doing, but for you the question will be the first part. Therefore, I'm saying the political has to matter even more. Thank you.

>> MODERATOR: Very brief responses?

>> PANELIST: To the gentleman from the ‑‑ let me clarify one correction, which is that consent brokers should not hold any personal information. They only manage consent artifacts: who gives consent to whom, for what.

>> PANELIST: That was my mistake. I did not make that clear. The consent broker cannot make money from the data controllers. They can only make money from the data subjects, so they have no incentive to sell your data. The incentive is to protect rights. And in a good system the consent broker can also be a cooperative: a consortium of savings groups, for example, could be their own consent broker, representing their people. Consent brokers are not necessarily for profit.

 It has to be structured to prevent any allegiance to the data controllers. The only allegiance is to the data subjects.

>> Very quickly. On the response to the gentleman from Brazil: I think it's not just that they don't have an interest in knowing you and actually helping improve credit in the system. I think the problem is the merging of commercial data, which has a sort of local financial incentive, with everything else ‑‑ it's that blurring of lines with the credit score that I find problematic.

 I also said in the beginning that you can see why systems like this are helpful in countries where there is no other rating system, but I think it starts to bleed into the territory of something that actually goes a little too far. There are people who are happy with it. One person I spoke to is a foreigner, and there is no other system that could help him get credit ‑‑ to be known by the system.

 He is very happy with it, but the fact that it's not transparent, I think, is a real problem, because if it's a genuine attempt to actually do just credit rating ‑‑ elsewhere they've got systems now where there are people who can advise you on how to improve your credit rating. When a system is opaque, you don't have the ability to negotiate it or to say that the data is extremely inaccurate. It's that black box that is a concern. I think what we're seeing with a lot of these systems is that they are responding to a genuine market gap or a genuine social gap, but the terms on which you implement them really matter.

 In terms of the numbers, I don't have numbers, but I can help you get them. I feel like they've been doing some research on that; it's really hard to research. There are applications for eight more licenses for people to do more pilots, and the government has put those on hold for now, citing conflicts of interest and questions about the credibility of the organizations.

>> MODERATOR: We have another example, this one from Chile.

>> PARTICIPANT: There's a lack of ‑‑ the whole situation is highly risky. I will look at what the government is doing from a human rights perspective.

 The first case is about a database built by our national electoral office. In order to run the Chilean political process, they have the whole voter register. The voter register is, first, the list of all the people who vote in Chile, and you can search it by either the voter's name or ID number. To show how it works, I went to the web page where it is published and I did a little exercise: I put in the name of our president. Next slide, please.

Then I also put in the name of our recently elected president. Here we have the complete name of our president, with her ID number, her personal address, and the city where she lives.

 The next slide, please.

 The same thing. Next slide. No, the other one. Perfect. Then we have the same information for our recently elected president.

 This is a massive privacy breach for the people of Chile, including our president. It is also a violation of the right to identity, because the register only shows the legal name and sex, without any concern for gender identity.

 You can find the address of every single person here, and you can use this information to do massive and personalized targeting, even around loans. Through this information we can find out other sensitive information. Next slide, please. With the ID number, it is possible to get a birth certificate. Here is the birth certificate of our president. From it we can even find the complete names of her parents. Next slide, please.

 You can also find the marriage certificate of our president: her husband's name, her ID number. Also, we can find more information ‑‑ you can see what the implications of this are.

 The second case I want to explain to you is that in Chile, state institutions must publish every meeting. Because of this, our transparency agency in Chile launched a website. On this web page you can request and search audiences with public authorities and see every single audience between public authorities and citizens.

 Regarding this, a report was published stating that 84.7% of the audience requests do not comply with the law in describing the matters and issues involved, so the question here is: what is the content of those meetings? The situation revealed that the purposes and motivations of these meetings, these audiences, involve several sensitive situations of citizens. The audience requests describe the person's situation: financial problems, racist and homophobic episodes, et cetera. This situation is a problem for everyone who appears on this web page.

 For example, I'm going to tell you about a case where a citizen asked for an audience in these terms: "The purpose of this audience is because I'm a single mother of three kids. My current partner was our only financial support, and he suffered a brain accident that left him with ‑‑ we rent our house, and we are not able to pay the rent. He did not have a contract, so it is impossible to get any insurance payment. I don't have money for my children. I'm desperate. You're the authority of this beautiful town, and I'm begging you to receive me so I can show you some documents and ask for help."

So this public policy exposed the critical situation of citizens without any concern for privacy rights, along with the violation we already mentioned. What does the office say about this situation? It states that it will publish the information, but it is not responsible for the content. Who is to protect our privacy rights in this situation? Nobody knows.

 We believe that it's possible to make a few recommendations here. We cannot conceive of this kind of public policy without data protection. We have to think of protection as the first step in building this kind of public policy. Until now it has been totally the opposite. In Latin America, and in Chile in particular, we start to design public policies from an openness perspective, but without any checks and balances for privacy and data protection.

 The second point: there is not a clear understanding in our government that the personal data they hold does not automatically mean it's public information. They treat personal data held by the state as public, and so they publish it without any concern, check, or balance for privacy rights. Personal data is public only if that information has a public interest, and that has to be checked through guides or other tools. That way we can find a path between transparency and data protection.

 Last but not least, open data policies have to be looked at from a human rights and privacy perspective, to be able to see how we are going to show this information, what the purpose of this information is, and whether this information is going to violate human rights or not. There has to be a dialogue between open government principles and data protection principles. How was this information obtained, and for what purposes? Is it legal? Will its treatment by other actors respect human rights or not?

 In sum, we have to promote participation, but that must also include respect for and promotion of human rights, particularly privacy. Thank you very much.

>> MODERATOR: We have now heard a bunch of voices and perspectives on data collection and data as a form of governance, and the implications for social justice, from research and civil society perspectives. We have two more respondents to this, in a way: from government and from ‑‑ with a perspective on what this means for Internet governance.

>> PANELIST: I'm responsible for the human rights policy department and the cyber department. I would like to share some examples that show how people and governments are thinking about these issues. Some of it might have been shared before, so I apologize for that, but I feel it's an interesting insight into what's happening within government.

 Let me first say that government officials are very aware of the challenges posed by the emergence of data, of the tremendous responsibilities of collecting and using data, and of the implications it could have for policymaking.

 The challenge, and the risk, lies in good intentions. I think sometimes it also lies inside the government: not being able to think things through, and also the very fast speed of development of technology and possibilities, which does not match the very slow pace of thinking and policymaking and lawmaking. It's really a challenge.

 My experience is that most of my colleagues in the government find it a very interesting challenge and struggle with how to deal with it, because government policy and regulations especially are very impactful on people, nationally and internationally.

 Let me illustrate briefly with three different examples. First, we have been working on a new law for the intelligence and security services. It has been accepted by both chambers of parliament, and it will enter into force in May. In the meantime, there will be a referendum on the law, because that has been forced by collecting a lot of signatures in society. There's actually quite an exciting time ahead, to have a referendum on something as complex as the services act.

 The genesis of this law has been a very long process of thinking about how new powers in a new technological society have to be exercised, but also embedded in an oversight structure that is good and human rights respecting and within which the services act. It's really about the fine balance between the mere existence of powers that could breach human rights and still protecting human rights. I can say this has been a very long, very detailed, and very meticulous process in the government.

 Let me just think about the most important thing to say about this. If you have any questions, I'll dive in later on.

 The second thing I would like to share comes from a Dutch advisory council, the WRR. It's a report on big data and using big data for security; it was published in 2016, on big data for security while protecting freedom. This report, put very briefly, and if I were summarizing it in one sentence: you have to assure respect for human rights and values in all phases of the use of big data. It is actually pointed at the Dutch government, and it found the government did quite a good job of regulating the collection of data, but that it hadn't really developed the right framework and measures for the subsequent process, for what you do with the data and the combination of data sets. That is formulated as a shortcoming in the thinking, which is, in my opinion, quite understandable, because the collection seems to be the most important part; it's the collection that gets described in such elaborate ways.

 For the government, it is very important to have the right framework for what you do with data and how you ‑‑ the government has endorsed the advice and is still doing ongoing work on how to deal with it. I'm not directly involved with the team working on this, but I can say that this thinking has been used by the Dutch government, and it really has been used in the policymaking.

 The third instance I would like to point out is another report. This report is about human rights in the digital age, and recently they also published a report called Urgent Upgrade: Protect Public Values. This is also helping the Dutch government by giving ideas for policymaking. There's an ongoing discussion on how to integrate thinking on human rights online into policymaking processes. Very abstract.

 What I really admire about the process is that they do it in a very comprehensive way. They are having discussions within the government and dealing with all kinds of perspectives, ranging from social media to foreign affairs to health. It's intergovernmental and involves stakeholders in the policy on this. It has really surfaced onto the political agenda.

 For government, it's very important to have external reports and external inputs to get things going. Now, there is a lot of thinking, but you need the experts, and often you also need a parliament asking for it, and then things will speed up. Then, maybe referring back to the first example, the law will go into force and it will be evaluated in two years. There's a lot of uncertainty. Will the things that we design as a government, the policy structures, work out? Will they actually work out as we have foreseen, given the super fast speed of development? I think in this field evaluation, and also external input for policymaking purposes, is very important. Maybe that is the main message for government.

Thank you.

>> MODERATOR: Thank you very much. We close with a few thoughts on what all this means for Internet governance. No pressure.

>> PANELIST: Thank you. As we race down to the final minutes of the session, I was asked to link this back to the issues of Internet governance, and there are a few points I would make very quickly, ways that I can see this intersecting. One is that Internet governance is based around a few principles, and one of those is multi‑stakeholderism. We can look back and see a trajectory where Internet governance began in the 1980s and early 1990s with governing the network, and at that time there was a small, pretty homogenous community. As personal devices became increasingly connected, we saw the understanding emerge that more actors needed to be engaged in this. I would say that now we're looking not at securing the network, and not at securing devices, but at the security of individuals.

 I feel, I hope, that this emphasis on individual security will drive further multi‑stakeholder engagement in this process. The second point I wanted to make very quickly is that people who work in this space need to remain engaged with the technical issues, because I would say one of the fundamental changes happening now is the migration to IPv6. It's been a long, long process and still has a long way to go. The migration to IPv6 will have implications for privacy because of how IP addresses identify end devices. I think when Internet governance was first getting off the ground, there was no civil society and there were no privacy advocates engaged, and it didn't work out as well as maybe it could have. I would say now we're looking at a similar situation where it's essential that people who are engaged in these issues are also engaged in the technical issues. I'll just leave it there.
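The panelist only gestures at why IPv6 matters for privacy, so as a hedged illustration of one widely discussed concern: with the modified EUI-64 scheme of RFC 4291, a device's interface identifier can be derived directly from its MAC address, making the device recognizable across networks, which is what the privacy extensions of RFC 4941 were created to mitigate. The sketch below shows that derivation; the sample MAC address is made up.

def eui64_interface_id(mac: str) -> str:
    # Derive a modified EUI-64 interface identifier from a 48-bit MAC address:
    # flip the universal/local bit of the first octet and insert ff:fe in the middle.
    octets = [int(b, 16) for b in mac.split(":")]
    octets[0] ^= 0x02
    eui = octets[:3] + [0xFF, 0xFE] + octets[3:]
    return ":".join(f"{eui[i] << 8 | eui[i + 1]:x}" for i in range(0, 8, 2))

# The same MAC yields the same identifier on every network the device joins,
# which is the tracking concern; RFC 4941 privacy addresses randomize it instead.
print(eui64_interface_id("00:1a:2b:3c:4d:5e"))   # made-up MAC, prints 21a:2bff:fe3c:4d5e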

>> MODERATOR: Thank you very much. This is an issue we could discuss for the rest of the day, which, unfortunately, is not possible. We had to start a bit late because the room was occupied at the beginning, which is unfortunate, but we also just wanted to hear from a wide range of perspectives.

Unfortunately, we have reached the end of our session, but at least for those of you who were on the list of panelists: are there any final thoughts from you on what you've heard?

>> PANELIST: On what we can do looking into the future if possible?

>> PANELIST: One thing I would like to say is that we don't explore alternatives enough. We do a lot of problem solving around what is wrong, what's broken, what's terrible about a system. I think we don't spend as much time on unpacking what problem a system was meant to solve and looking at whether there is a better way to solve it, where lack of credit is an issue or lack of identification is an issue.

 Can we actually spend more time proposing more benign, better, more privacy‑protecting versions of systems for that particular problem, without the externalities, versions that don't have the social justice implications that these carry? I think we often miss the fact that these systems are meant to solve genuine problems for poor people and people who are not in the systems, and that's why people find the systems useful: because otherwise they are invisible to the state. Can we make them visible in ways that are empowering?

>> MODERATOR: Are there brief thoughts on what should be done, in one sentence, a tweet, or something?

>> PANELIST: I think public policy is one. When we start to work, I think it's very important to have all the perspectives on that policy. There is a lot to do to promote good public policy.

>> MODERATOR: This is kind of a stopping point, and I think we probably have to leave it here.

>> PANELIST: If you want to follow up, you are welcome to. Otherwise, enjoy it, and I hope you have a nice Christmas break. Thank you for coming.

(Applause)

(Session concluded at 10:30 a.m.)