IGF 2021 – Day 0 – HIGH LEVEL EXCHANGE PANEL: How to promote inclusive and diverse innovation, investment opportunities and corporate social responsibility in digital technologies?

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> PAUL MITCHELL: Good afternoon. Can you hear me? All right. Let's ‑‑ 

>> JOSH KALLMER: Yes. 

>> PAUL MITCHELL: I apologize for the slightly rocky start here, I guess with the coordination between the technology operators and those of us who are going to be speaking today. But thank you very much, and welcome to this conversation about social responsibility and the Internet. I'm Paul Mitchell, your host for the day for this particular topic. I'll provide a bit of a setup, then ask each of our panelists to introduce themselves, and then we have some prepared questions for everyone to take a shot at. We'll hopefully have some time left in our slot for interaction between the panelists. 

Just as a setup: the digital transition is well underway, albeit at very different rates and in different places. Every sector of society is touched by the drive to digitalize and to realize the benefits that are possible. There's an enormous amount of potential for good, but this requires strong awareness and capacity‑building efforts to meet the challenges and to create technologies that are inclusive and crafted to the real needs of societies in a people‑centric way. Today the panelists are going to discuss Corporate Social Responsibility in the context of this drive for digitalization and the potential to address some of the most intractable challenges facing the world. There are many ways to define Corporate Social Responsibility; for our purposes today, I will start with a basic definition in which Corporate Social Responsibility is the sustainable governance of digital technologies and respect for Human Rights. 

The CSR role is really key to handling the new relationships between labor, workers and technology, and to achieving equity in access among all the different sectors of society. We have an excellent panel today, and I have three questions that I'm going to ask them. First, what does Corporate Social Responsibility in the digital world mean to you, and what are the chances and challenges of Digital Transformation? Second, is it possible to regulate digital technologies without slowing down development and innovation? If so, what should be regulated, and how, to address social responsibility issues? Finally, how should the relationship between business growth, Sustainable Development and the governance of digital technologies be defined? 

I will ask each of you to hold your initial interventions to 3 minutes so that everyone has an opportunity to speak, and then, to close out the session, each panelist will have the opportunity to inform us about any voluntary commitments addressing digital growth and the items being discussed today. I'm going to ask each panelist in turn to introduce themselves, and I'll start with Noel Curran. 

>> NOEL CURRAN: Hi there, Paul. Good to join you. I'm Noel Curran, Director‑General of the European Broadcasting Union. We represent 115 different Public Service Broadcasting media organizations in 57 countries. I'm Irish, and I was previously working in public and commercial media, as Director‑General of the public broadcaster in Ireland. 

>> PAUL MITCHELL: Thank you. 

>> JAN KLEIJSSEN: Good afternoon, Paul. Thank you very much. I'm Jan Kleijssen, Director of Information Society and Action against Crime at the Council of Europe, an organization bringing together 47 European countries plus 5 Observer States, and it caters for some 820 million European citizens. My responsibilities include Internet Governance, data protection, artificial intelligence and cybercrime, to name just a few, and I very much look forward to the discussion. 

>> PAUL MITCHELL: Thank you. I have what I assume are placeholders on my screen, but if we have any other people on the line, could you please unmute yourself and let us know that you're here. 

>> LUCIO ADRIAN RUIZ: I'm Lucio Adrian Ruiz, Secretary of the Dicastery for Communication of the Holy See. We're in charge of the Holy See's communication around the world through its different media, and I come from Rome. 

>> PAUL MITCHELL: Thank you. Do we have anyone else? 

>> MIYA BARATANG: I'm Miya Baratang, founder and CEO of the Women Who Code movement, and we focus on teaching women and girls digital skills to make sure that we bridge the digital gap that exists in the tech space. Thank you. 

>> PAUL MITCHELL: Thank you. Anyone else? If not, I would like to go straight to the questions and the discussion. Jan, perhaps we'll start with you: what does Corporate Social Responsibility in the digital world mean to you, and what are the chances and challenges of Digital Transformation? 

>> JAN KLEIJSSEN: Thank you very much, Paul. I'll try to stay within the 3 minutes. Like you, let me stress that digital transformation can, of course, be a tremendous force for good and offers great possibilities to make the world a more equitable, fairer, safer and also more sustainable place. At the same time, digital transformation can bring considerable risks, and I'll come to that in a moment. As an organization, the Council of Europe, which was founded to defend Human Rights and promote the rule of law and democracy, started to look at this at a fairly early stage. 

Take data protection, for instance: digital transformation brought enormous possibilities to enhance the treatment of data, but of course also the manipulation or abuse of data and the infringement of the right to privacy, which is why 40 years ago the Council of Europe established the world's first treaty on the protection of personal data. Twenty years ago came the first convention against cybercrime, which has since been followed by a number of specific treaties, for instance to fight the abuse and exploitation of children, including child pornography, other forms of abuse and grooming; a more recent treaty deals with the falsification of medical products. These are just some examples of issues that need to be dealt with. 

We're a lawmaking organization, and we try to deal with these challenges through the adoption of common legal standards: binding legal standards, treaties transposed into national law, under which the parties to a treaty help each other in the execution of their obligations and of the treaties' provisions. That's important to stress here. In order to do so, we very much need the private sector. Corporate Social Responsibility, as I understand it, is the awareness and willingness of the private sector to shoulder the responsibilities that come with the development of their products in a way that, from the Council of Europe's point of view, respects Human Rights and the rule of law, and it is essential that companies accept those responsibilities. 

In a number of the treaties I mentioned, the legal text directly addresses the private sector; for instance, the convention on the sexual abuse of children has a couple of provisions that are directly addressed to the private sector. We have a whole series of policy recommendations that do the same. In order to involve the private sector even more in our work, in the law making, the Council of Europe opened up a partnership some five years ago, and currently some 28 major tech and telecom companies from around the world have signed an exchange of letters with the Council of Europe in order to promote Human Rights, the rule of law and democracy through their work with us, and we give them a seat at the table. I'll stop here now, and would be very happy to elaborate further in relation to the more specific questions. Thank you. 

>> PAUL MITCHELL: Thank you. I would like to actually move to Miya Baratang and have your view on this particular topic. 

>> MIYA BARATANG: For me, the UN Sustainable Development Goals focus on addressing Human Rights and the planet's wellbeing. We have 17 goals that already encompass poverty eradication and the promotion of health, women, climate and more. Essentially, what the SDGs are doing is focusing on transforming the world and giving guidance to corporate social investment. This is a good foundation and guide for a better world. We are trying to transform the world, but we aren't doing much to transform the individuals leading the movement and the individuals who are creating and deploying technology. 

The focus on individuals needs to be immediate, so that we shift their minds from being driven by profit as the focal point to one of growth and of addressing the 17 goals already elaborated by the UN. That way we can have a tangible motive: the betterment of all humankind. Sustainable Development and digital technologies are, for all intents and purposes, connected. However, biases as old as human civilization remain inherent in them, and currently we see businesses and the technology they deploy biased with regard to race, gender, disability, even age. Historically and to date, members of dominant groups or nations make biased decisions to the detriment of other groups or nations. Transforming individuals, especially the leaders in the tech space, transforming their mindset, is, I believe, the biggest thing, and it is truly at the core of inclusion. 

That has the power for total societal impact. I come from Africa, a continent of early adopters of technology. The youth of Africa are extremely hungry for inclusive opportunities, not just in words but in deeds. And inclusive business models do not necessarily have to come from governments; they can come directly from business leaders themselves. There are companies currently bigger than nations, with profits bigger than some African countries' fiscal budgets. What we're asking technology creators and big business to do is to focus and to realize that humanity is at the core and needs to be saved, because at the moment, at the rate we're producing technology and AI, people are already scared of losing jobs. 

People are already scared of what's going to happen to them in the future. Technology is not there to create such mindsets; technology is supposed to create opportunities and change people's lives. As a black female, the most significant part of my contribution to bridging the gender gap in tech and building inclusiveness in this space is realizing that it is going to be a big challenge, especially because the industry is still very white and male dominated. My job at the moment has partly been going around encouraging big tech businesses to realize that, essentially, we are their partners and customers. If this is understood, big tech businesses will comprehend that their success in the next 100 years will depend on us buying their products. 

If we're not included, we won't be part of the buying. How do you buy a product that's against you, that's not improving your life in any way but making profits for someone else? I find that very challenging. At the moment, we live in a world where a global generation is extremely conscious of justice; people are taking equity and equality into their own hands, and it is at the forefront. Injustice against humanity and our planet is not being tolerated anymore. The global youth have high standards and are holding people accountable for their actions; we have seen that happening all over the world. Business leaders are becoming more influential at this time in our lives, and many come from the tech space, so we already see where the trends are going, what decisions are being made and what kind of future they hold for us. 

Because of this, we need to bring them to the table and realize that the IGF, and any other movement that's trying to govern sustainability and inclusion, has to have them at the forefront of decision making, because they're the ones building and deploying tech and deciding the future. At the moment, tech is really deciding the future. The model of personal ambitions and profit‑making is slowly moving away; I have seen that happening in World Economic Forum discussions and in how people negotiate business deals now. The language is moving from profit to "how can I make a difference?" For the businesses that speak like that, it will show in their profits and in how loyal we, their customers, will be. They will be inclusive, and they will include all of us as human beings. The businesses and investors that focus on sustainable global impact are the ones I see succeeding in the future. Thank you. 

>> PAUL MITCHELL: Thank you very much. I'll turn to Monsignor Lucio, please. 

>> LUCIO ADRIAN RUIZ: Thank you. About Corporate Social Responsibility: it has been said that for every action there is an equal and opposite reaction. More generally, we can say that for every action there is a reaction, and in our lives we can say that every action has consequences, both positive and negative. Keeping this in mind is fundamental, and for me it is the heart of social responsibility. It must be applied in all situations regarding individuals, society and our common home. We must, therefore, consider the positive and negative consequences of our actions. It is not possible to talk about real progress and real development if part of humanity can be displaced, replaced, confused, and lose the meaning of life. As many know, there are many developments that have resulted in these consequences. It is therefore important to talk about the chances and challenges, the wonder of everything we can contemplate in the digital age. 

There is also the promise of emerging technologies that is not talked about. The great potential that the digital transformation produces is really a beautiful dream, but that is also our challenge: remaining with the dream, being trapped in fascination over achievement, without looking at the negative consequences, without looking for those who are left behind. We have to look at the harm that can be done, and not only at the beautiful things that can be achieved. To really dream big, we must consider every possible effect of our actions and decisions. Today's world is experiencing too much suffering, and also death. Development, nutrition, health, education, and also freedom, privacy, security and peace are not aligned with technological progress; they don't grow at the same pace. For me, social responsibility in the digital world means being accountable by design: looking at all the consequences, the positive ones to promote and the negative ones to avoid, for all of our actions, for today and for tomorrow, for individuals, for society, for the world, our common home. 

>> PAUL MITCHELL: Thank you very much. I really liked the beautiful dream concept and the idea of being accountable by design; I think we may come back to that in a little bit. Moving on to another question, to build out our framework here: is it possible, do you think, to regulate digital technologies without slowing down development and innovation, and if so, what could be regulated and how should it be regulated to address social responsibility concerns? Just going in the same order, I think, if that's okay. Let's start with Noel. 

>> NOEL CURRAN: Start with me? Is that ‑‑ 

>> PAUL MITCHELL: Yes. Yes. 

>> NOEL CURRAN: Yes. In terms of the ‑‑ yes is the answer. You know, I'm in the media business, the broadcasting business, and it has been one of the most innovative areas over the last 20 years in terms of reacting to audience trends. It is also regulated. I think the two are not mutually exclusive, and it is important that we realize that, particularly when it comes to looking at the digital sector. The last thing we want to do is stop innovation in the digital sector. It is a massive benefit to our society. It is bringing a huge amount of positives, and a lot of negatives as well that we really need to be wary of. 

It is bringing huge amounts of positives. The kind of regulation we need to see comes back to basic values, like transparency: understanding better the choices and decisions these platforms make around algorithms, et cetera. It comes down to sharing data, elements like that. It comes down to a key component, findability: who makes the choice when you're looking for information, who makes the choice and designs the algorithm that will present some information to you before other information? What are the decision‑making processes, how do you find trusted sources of news and information, and what are the technological decisions being made behind that? You have a whole range of things like that. I think there are strides being made. We have seen the debate shift in recent times around the responsibilities of those in the technical sector, and we have seen the negative impact of disinformation. 

I think we have seen a lot more focus on what is causing this and how we can stop it. The ground has shifted in some regards around all of this. We have seen it at the European level with the Digital Services Act and the Digital Markets Act; we have seen it with commissions and groups looking at information and disinformation, and we're actively involved in a lot of those. We have seen it at the organizational level, where organizations are trying to work with some of the tech companies and tech giants on what we may achieve, in terms of voluntary measures or regulation, to ensure that the public receives trusted sources of information, and also to ensure that the public is protected against all of those negative downsides that come with some of the innovations and some of the technology that we're seeing. 

I feel that there is a realization of the need for some regulation, and I think we're now at the stage of trying to shape exactly what that is. I say again, this is not about stopping innovation. I do not believe that having the correct type of regulation necessarily stops innovation and development in the technology sector. Paul, I can't hear you. Paul. I'm sorry. 

>> PAUL MITCHELL: I'm sorry. I'm having microphone problems today. I wanted to thank you for the outline there and turn to Miya next. And Miya, could you share your views on the regulation question? 

>> MIYA BARATANG: I agree with the previous speaker. We can't really stifle innovation and creativity. I just think regulations have to be there for ethical reasons, because what happens with technology is that it enables us to solve lots of human problems, and that comes through creativity and innovation and the freedom of the creator to be able to test different algorithms. What we need is to get our governments to really get the infrastructure right and to get data access right, so that the models being tested in the creation of new technologies have a better framework in terms of the research being done and the rollout, and so that people have proper Internet access and are able to participate in the early stages of technology. At the moment, we are still building tools and testing whether they work or not; if you regulate that, we're really going to stifle innovation. 

Technology enables us to live longer. Look, we just came through COVID, and we need lots and lots of research and innovation. With that in mind, if we regulate too much, then we lose the point of the industrial and technological revolution. The challenge at the moment is that it is generating a massive, massive divide: the digital divide, and the divide between the rich and the poor, and it is now entrenching the divide between rich nations and poor nations and who is in charge. I don't think it is regulation that we need, but rather interventions: coming up with good standards, good models and proper governance that is united globally, to make sure that both quantitative and qualitative research from both industry and society is done properly and is improving the real average global human life and the betterment of all of us. The ethics should be right, but whose ethics is it? That is something we all need to discuss and get right as a global community, because at the moment all our ethics are quite interesting. 

I think if we leave technology to the technology deployers and the technology creators to do whatever they want, it is a loss. We need a balance. Now, how that is going to be balanced is something that we need to discuss through standards and better norms, and it will have to evolve with time. It has to be done while making sure that our planet is not damaged in the process of creating technology, and that human life does not suffer more injustice because somebody is creating technology. If they can work together, that would be better. 

>> PAUL MITCHELL: Thank you. 

>> JAN KLEIJSSEN: Thank you very much. Of course, self‑regulation is often advanced as an alternative to regulation. I would like to argue here that in the field of new technologies, where self‑regulation can be a very useful complementary tool, it is not sufficient. Let me perhaps take the example of the current crisis we're in, the pandemic, because the fact that we're meeting online is on one hand a tribute to the power of technology and innovation and, on the other hand, of course, also an example of its limits. It is clear that artificial intelligence, for instance, has greatly enhanced the development of vaccines. The realtime sharing, the extremely rapid analysis of data and the other machine learning processes that contributed to the development of the vaccines have given us tools that we wouldn't have had, say, ten years ago. At the same time, staying with the pandemic: artificial intelligence no doubt is playing a role, and continues to play a role, in the polarization of our societies. 

This is because of the way social media, for instance, and Internet providers channel information: the algorithms that are at work there, the filter bubbles that are created, the lack of access to alternative views that an increasingly large number of people are experiencing. We have seen this with the pandemic, and we have seen it in the overall democratic debate and in the political landscape in an increasing number of countries. The question was how one defines the relationship between regulation and innovation; I would say it is mutually reinforcing. Again, staying with the pandemic, the medical industry is one of the most tightly regulated sectors on the planet. 

Yet, as the vaccines have shown, it is also highly innovative, extremely innovative, and very strict regulation has in no way hampered innovation; it has produced good innovation for the improvement of the situation. Self‑regulation and ethical standards are useful as a source of inspiration, but let's make no mistake: they don't confer rights on users or citizens, they do not impose legal obligations on the companies that use them and, importantly, they also don't offer remedies if something goes wrong. With government regulation, or co‑regulation if you like, with industry very much on board in designing the regulation, one can, I think, create, as for the medical sector, a framework which enhances innovation but does confer rights, does provide for remedies, and does create obligations. 

It also has the additional advantage of creating a level playing field. With self‑regulation, certain companies may be much stricter than others, which may put them at a comparative advantage or disadvantage, according to your point of view. When you have regulation, international regulation, you create a level playing field for companies, which I think is in the interest of their business models and certainly also in the interest of their customers and of citizens. On artificial intelligence, for instance, the Council of Europe has just finished the elements that will go into what we hope will be the world's first treaty on artificial intelligence, on which negotiations will start at the beginning of next year, drafted with Member States and non‑member States and, importantly, industry and Civil Society: a multistakeholder concept very well known from IGF discussions. 

That format is the best guarantee to ensure that innovation will be innovation for good, that the products will be trustworthy, that the processes leading to the products will be transparent, and that the users are not just objects, but subjects with rights and remedies. Thank you. 

>> PAUL MITCHELL: Thank you. That's a great lead‑in to an opportunity to discuss a little more the relationship between business growth and how businesses make decisions about things like self‑regulation or proposing various ethical standards, and how we can create alignment at a global level where it is possible, or, where it may not be possible, how we address the differences that will then be part of the ecosystem we are all trying to work towards together. I'm open to any of the panelists for a little bit of discussion on that. 

>> NOEL CURRAN: I will come in on that briefly, Paul. I go back to something that Miya said earlier on. I think businesses are realizing the benefits of policies and goals that look at issues like sustainability and diversity; not across the board, but a lot of businesses are. Certainly within our own field we know, first off, that these are core to our values, and we also know they're increasingly important to our audience, particularly young audiences. We know we have a long way to go in some of these areas. Again, I agree with Miya: the media industry is too white, too male, too skewed towards particular demographics. 

There is a realization of the importance of these issues, so elements like that are really positive. You mentioned how businesses will make decisions around issues like self‑regulation or not. I think a lot of businesses will opt for self‑regulation as a first option. I would totally agree with Jan that self‑regulation has a role to play in certain areas, including certain areas of media. My experience of dealing with the tech companies, though, is that self‑regulation doesn't work. I was on the E.U. expert group on disinformation. I think some of these larger organizations need to be pushed, and they need to know that regulation will follow if they don't adhere to set goals, sign up and deliver on those. 

In those kinds of areas there is some common agreement, and I think it is now a matter of the governing bodies finding out how we can implement it, in terms of sustainability goals, in terms of diversity, in terms of issues like regulation, and of what is the most practical and effective way of getting real change in those areas. 

>> PAUL MITCHELL: Thank you. Monsignor Adrian Ruiz, I wonder if you have some thoughts on this area? 

>> LUCIO ADRIAN RUIZ: I want to add just a small idea to the argument about regulation and promotion. I think it is necessary to put both things together. Promotion and regulation are not opposites; they must work together and allow us to move in the right direction. No one who really loves, who really wants to live, thinks that controlling or limiting something that could go wrong is a bad thing. For this reason, the question with control and regulation is to define the meaning, the objective, the goal of that regulation, because if the regulation is good, it is something that promotes action more and more. 

It is something that allows us to develop better, and for all the people in the world. We need to ensure that everybody in the world can access the technology, because there are many people in the world who don't have it and who need it, even just for their lives. For me, that's extremely important. They are not opposites; they must work together, because one and the other working together can allow us to really develop something better, and something for everybody. 

>> (Moderator on mute). 

>> NOEL CURRAN: Can't hear you, Paul. 

>> PAUL MITCHELL: I wonder if anyone has thoughts about the role of standardization on this particular topic? No takers. We have talked a little bit about the movement towards Corporate Social Responsibility and the idea that we have self‑regulation and formal regulation happening at the same time. I wonder if you could speak a little bit to where you think the boundary line should be between self‑regulation and regulation by formal regulatory bodies? 

>> JAN KLEIJSSEN: If I may, Paul. 

>> PAUL MITCHELL: Yeah. 

>> JAN KLEIJSSEN: Yeah. I think the boundary line, for us here at the Council of Europe and for its Member States, would lie where the effects of the technology, or rather of its use and the way it is being implemented, have a direct effect on what would be considered real, day‑to‑day Human Rights, or bear on elements of the rule of law or democracy. To give you an example, staying with artificial intelligence as a form of digital transformation: when the state, when governments use artificial intelligence, for instance when law enforcement uses facial recognition, which we know can lead, and has led, to clear cases of discrimination and clear cases of interference with people's individual rights, then there is a clear need to go beyond self‑regulation. When it comes to a lot of other uses of artificial intelligence, perhaps in the music industry, whether you are suggested a Beatles song or a Rolling Stones song does not have such a major impact, although some may argue otherwise; it is perhaps not such a major, direct effect on your basic rights, and self‑regulation may do the trick. 

This applies in all areas where new digital technologies are used, especially by governments and where governments allow private actors to use digital technologies that have a huge impact on the rights of citizens. I'll stay with facial recognition, because I mentioned law enforcement, where it is used by the state, but a lot of private actors also use facial recognition, with, of course, consequences for people's privacy and other rights. When that's the case, I would argue that's where the line is crossed: self‑regulation can be a useful source of inspiration, but it is not sufficient because it simply does not offer enough protection. 

>> PAUL MITCHELL: Thank you. In the last minutes, could you each outline what you plan to do, anything on the drawing board that's new and interesting in this space, and share your thoughts and potential commitments. I'll ask Miya first; take it away. 

>> MIYA BARATANG: I look at the new technologies that are coming up now as a big concern for public health. Most women are users of digital technologies; they're not creators of technology. Many women are seeing psychologists, and young people are so into technology that they're not taking into consideration the effects of technology that are not written about. 

So we look at the benefits, but also at what technology is doing to society and to the health, the emotional health, of young people, and at what women should be doing; women having no jobs in the tech space is the biggest driver for what I do. I will continue making sure that there's enough investment going to women, advocating for women's rights and for the biases coming through AI to be taken into consideration, and working with the big businesses themselves. I firmly believe that the creators of technology don't intentionally want to be biased or racist, or to cause the injustice that's happening out of the use of technology; that's not the intention. 

They want to solve the issues of the world. In the process of solving the problems of the world, other issues are popping up, and that balance is where I fit in, and where we fit in: making sure that history does not repeat itself in a pattern of excluding the same people over and over again. 

>> PAUL MITCHELL: And Monsignor Lucio Adrian Ruiz. 

>> LUCIO ADRIAN RUIZ: Well, if the challenge is how to promote inclusive and diverse technologies, I personally believe that the key is education, and the challenge is personal responsibility as the basis of social responsibility, because, as advocates of freedom, we can use what digital culture can offer, which is wonderful, while also following an ethical path that values the human person, requesting transparency to know exactly what is happening, and demanding legislation in order to provide control and promote creativity. For that reason, on our part, our commitment, I can say, has three steps. First, trying to understand better every day what inclusive and diverse innovation exactly is and what the role of emerging technologies is, in order to share that around the world. Second, training in the application of digital technologies in order to apply them in the promotion of the human being, human dignity and the meaning of life. Third, working as a team to develop thinking in an ethical way, an ethical path for the digital age. 

>> PAUL MITCHELL: Thank you. Noel. 

>> NOEL CURRAN: Yeah. I think for us as well, three things. One, our members will continue to invest in innovation in this area. We have a huge technology and innovation department within the EBU. We're pushing a lot of new innovation and digital services around European news, around detecting fake news, AI, et cetera, so I would definitely commit that we will continue to innovate. We will also commit to continuing to provide trusted sources of news, be it through our own services, be it through what we invest in the training of journalists, fact checking and all of the standards that we apply, because I think that is also critical. 

We also work with others: we're part of the Journalism Trust Initiative with organizations like Reporters Without Borders, and we're part of the Trusted News Initiative which the BBC leads with media organizations around the world, so we will continue to commit to that. And then finally, we will continue to push for regulatory change and for proper governance of the kind of technological and digital environment and innovations that we're seeing. We will continue to push for that at Brussels level, at international level, and at national level. Those are three things we happily commit to. 

>> PAUL MITCHELL: Thank you. And Jan, you have the last word. 

>> JAN KLEIJSSEN: Thank you very much. For us, in the coming year, as mentioned, we'll start negotiations on a legal framework, hopefully a treaty, on artificial intelligence, based on the Council of Europe's standards on Human Rights, democracy and the rule of law, to come up with overall principles, and we'll negotiate that, as mentioned, with business and also Civil Society at the table. We have also started working, and will continue to work, on more specific legal standards on particular applications of artificial intelligence; for instance, work is underway on a possible convention, again a legal treaty, on the responsibilities relating to self‑driving cars. And, very much like Noel, we'll also be looking at the impact of digital technologies on Freedom of Expression in the widest possible sense. These are the three priorities for the coming years. Thank you. 

>> PAUL MITCHELL: Thank you very much. I thank you all for participating today. We're right on time. I hope you all have an excellent IGF this week!