IGF 2021 – Day 3 – Main Session Economic and Social Inclusion and Human Rights

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.



>> COURTNEY RADSCH:  Good morning.  Let's try this again. 

Thank you for joining us today at the Main Session on Economic and Social Inclusion and Human Rights.  My name is Courtney Radsch.  I'm a MAG member and Co‑Chair of the Dynamic Coalition on the Sustainability of Journalism and News Media.  I'm delighted to be here moderating this important main session on one of the most important topics in Internet Governance, as evidenced by the fact that the most workshop proposals, 34%, were actually submitted on this topic, which underscores the importance and centrality of economic rights, social inclusion and Human Rights to Internet Governance.  I will spend a few minutes setting the stage.  We have had a new approach this year at the IGF to hold preparatory phases to really bring us to some outcome‑oriented initiatives here at the IGF.  I will review what happened in the preparatory phase and then we're going to get into a discussion with our panelists, who are really a stellar group of people whose expertise is deep and wide.  We will be joined by Sarah Kiden, technologist and researcher at the Mozilla Foundation; by Jess Kropczynski, associate professor at the University of Cincinnati; by Steve Crown, Vice President and Deputy General Counsel at Microsoft; and Scott Campbell, Senior Human Rights and Technology Officer from the UN Office of the High Commissioner for Human Rights.

Before we meet the panelists joining us from around the world in this hybrid session, I want to talk first of all about some of the trends, opportunities and risks, which is where we'll focus the first part of this conversation, and then some possible governance strategies to address these trends.  I want to invite everyone in the audience, here in Poland as well as everyone who is joining online, to actively participate.  There is someone with a microphone; if you're interested in making a comment or asking a question, we welcome that.  This is your IGF.  Similarly, online I'm joined by a fellow MAG member who has done so much to bring the panel together; she'll moderate the chat online.  We really welcome this opportunity to have an inclusive discussion.  First let me lay out a little bit of the sense of what we want to get at today.  We know that technology is playing a central role in political and Civil Rights, especially in the wake of COVID, with new forms of data collection and analysis giving rise to new opportunities to address the economic and social challenges posed by the pandemic, but also to concerns about excessive surveillance activities by state actors and potential Human Rights abuses by private and state actors.

A thing that came out clearly: we talk about Human Rights mainly as political and Civil Rights, but social and economic rights are very important as well, and they hold the potential to use data for promoting the common good.  So although these are addressed, they're not prominent.  We'll try to mitigate that today.

As we think about the issues around state actors and private companies and this issue of mass surveillance of individual behavior through tracing apps, we have seen the conversation evolve from what can tech do to what are its implications?  There is a recognition that with appropriate safeguards there are great economic and health benefits that can come from Human Rights‑centered use of data and tracing apps.  We need to center the rights to privacy, Freedom of Expression and information, because there is a growing movement that's advocating for the respect of Human Rights in the digital sphere.  The other thing that came out clearly from the preparatory phase is that knowledge is power.  It is vital for the promotion of Human Rights, and we need to protect the ability to use technology, the Internet, and our digital businesses and apps to create and spread knowledge, but we also need to make sure that business practices and the state laws and regulations implemented are also respecting and enhancing that knowledge without fear.

Before getting into the governance strategies, we'll discuss for the first couple of minutes some key questions, and I would invite each speaker first to make remarks of about four minutes on what you see as the primary trends that are most important in terms of economic and social inclusion and Human Rights: what is new that we have seen over the last year, and what you're thinking about from your role or perch.  First I would invite Sarah Kiden from the Mozilla Foundation.

Please, go ahead.

>> SARAH KIDEN: Good morning, good afternoon, good evening, everyone.  Thank you for this opportunity to join my colleagues in this session.

To respond to the question about trends, I want to say something that we have been discussing for the last three years: digital inequalities have always existed, but because of COVID they have now been brought to the fore, and different countries and communities have responded in different ways.  On one hand, you had the group that responded very quickly and took everything online, because people went to work online, shopping online, everything was online; but there was another group that was completely cut off, and they're not cut off from luxuries, they're cut off from basics: education, healthcare, and so on.  It is really sad that three years on, some places are still closed.

In Uganda, schools are still closed.  You won't believe it: there are children that have not attended school for two years.  This is really sad, because you have some people moving on while others are in places where they have to compete with people who have continued with life as usual.

Interestingly though, while all this was happening, we also saw other things come up.  We have seen talk about things that have been around for 20 years or more; there is a nice science fiction book by Neal Stephenson about the metaverse, called Snow Crash, and I highly recommend you read that book.  We have seen innovation in games; games like Second Life are raising over $1 billion in revenue.  We have seen people buying virtual homes, apparently, and these homes are millions of dollars; we have seen virtual goods from Gucci, I think it was Gucci, something like that; people are buying virtual bags for $4,000, $5,000, $6,000.  All of these things are happening.  My worry is, if this is the reality we're choosing to live, what does it mean for people without access to basic services?  How can I create in a developing part of the world, where connectivity gets problematic sometimes, competing with counterparts elsewhere, when you don't have access to all of these services?  Is this even a priority right now?  There are just so many questions I'm asking myself.  With that said, we still have an opportunity to change things right now.  Also, because of the COVID‑19 pandemic we have a use case ‑‑ not that we needed one ‑‑ at least now we have everyday research, numbers, statistics that we can use to advocate and show how harmful digital inequalities are for everyday people.  It shows that if we don't address them, we're not only not respecting Human Rights, but we run the risk of advancing in technology when we have not even solved the problems that we have right now.

Also, we run another risk of having technologies that don't include everyone.

Finally, me being optimistic, I believe it is an opportunity for us to look, to reflect, evaluate, see what's missing, where we went wrong, and what we can do better.

I'll stop here for now.

Thank you very much.

>> COURTNEY RADSCH:  The distinction between luxury goods and basic goods ‑‑ really, a lot of us knew that was happening, but the pandemic, as you rightly note, has just exemplified that differential, and basic connectivity is at the root of enjoying economic and social rights.

I want to turn now to Steve Crown, the Vice President and Deputy General Counsel at Microsoft, one of the world's biggest and wealthiest technology platforms, I think, which works across a lot of different types of businesses.

What have you seen in terms of the rising significance of the issues and public awareness and the key trends that have emerged along with risks since the pandemic?

>> STEVE CROWN: Well, thank you, Courtney.

It is a real delight to participate in this.  I planned to be in‑person and my trip fell through literally in the last week, ten days.

Let me start by just noting something pretty obvious, I'm coming at it from my perspective inside of a major technology company.  As Courtney noted, it is important to lay the stage and understand where we're situated and a little bit about how we got here.

I start from the perspective that some of this is not new; a lot of it is in fact not new, we're just focused on it in a different way.

As a major technology company on the global Internet, we have for a long time been looking at and engaging with the UN Guiding Principles on Business and Human Rights and the Office of the High Commissioner for Human Rights at the UN, where Scott is; there's an ongoing dialogue with the big technology companies on applying the UN Guiding Principles to the technological challenges of the global Internet.  In some ways, this is a continuation.

I do think we have seen a new focus.  I would actually up‑level it a bit.  I think there is a discussion happening that's not always explicit; it is really more about the social contract, the idea of why do we have businesses who are operating the way they do, how should they operate?  Why do companies exist?  Should they be allowed to exist if we see some of the harms that we actually are finding out there?  I take the position that it is actually to our collective benefit that we do have private companies engaging in the way that we do.  I love one of the ways of framing this notion of why companies exist: Professor Colin Mayer at Oxford University has proposed that a corporation's purpose ‑‑ and I do want to keep it at this high level before diving into what we're actually doing and how we approach this ‑‑ is to provide profitable solutions to the problems of people and planet.  It really is the notion that companies are doing it, we should not be confused, for profit, in a way that tries to find economic efficiencies and ways to bring technologies more effectively forward.  But it must be about solving problems, and these are problems facing people and problems facing the planet, which is why we at Microsoft and other companies are so engaged in technology to address global warming.

How should we understand today's concern with social and economic inclusion?  I think we're not going to understand the exact things that brought us from where we were pre‑COVID to COVID.  We can point to COVID, but I don't think that really does a lot to advance our understanding.  I think it is most valuable to look at the dynamics that we're seeing: citizens' expectations that are now clearer.  That's the value of the IGF in part; it is making these expectations explicit.  But then we need thoughtful engagement by companies and by all participants in the ecosystem, NGOs, Human Rights defenders, making sure we get the right questions asked and we get the right range of possible solutions actually identified, not leaving that solely to companies.  We need much more multistakeholder dialogue happening there.

So, a few of the things I would like to note in these opening comments: we're clearly in a world in which there's a renewed focus ‑‑ it has been there before, but I think it is a sharper focus ‑‑ on inclusive social and economic development and the role of technology companies and other corporations, it could be pharmaceuticals, in providing meaningful access to the benefits that companies can bring.

This has to happen on every continent, in every country, and it has to happen in every community, urban and rural.  We see this with the Internet especially, the rural‑urban divide, the Global North and Global South; meaningful access to technology is a major challenge for us, and it has to reach into every business, small and large, and every person, including the billion‑plus people on the planet with disabilities.  There's also a strong piece now of understanding that this is a chance to refocus for a sustainable future, moving from carbon neutrality to carbon negative, the idea of removing carbon, not just slowing the continued emission of carbon into the environment.

There's a big piece responding to COVID; the pharmaceutical companies, of course, have their role, and technology companies have contributed massive amounts of computing resources to allow global engagement and collaboration in research.  I would also point out the incredible need we now see to address education in the time of COVID.  You know, kids who have not been in school have not been socialized the way they would have been.  We at Microsoft are actively engaged in working with non‑profits to see how we can improve digital learning and, as in this session, the notion of hybrid experiences, to make sure that we're not leaving the youngest generation behind as we manage our way through the COVID challenges.  I'll stop by noting this notion of multistakeholder, deeper collaboration and discussion that includes governments and companies, but especially includes those who can represent the voice of those who need their perspective understood and addressed.  We at Microsoft refer to human‑centered activities; in everything we do, we make sure in the company it starts from this notion of how does this address human need?  What problem are we solving?  Do we have human beings at the forefront?  We also need to address these issues of accessibility.  It is that notion of rights‑respecting solutions: we do believe there is a continuing role for technology companies to leverage technology, but it must be directed at empowering people and providing meaningful, deeply desired solutions.

I'll stop there.  I look forward to the conversation with the panel, and with the audience.

Thank you.

>> COURTNEY RADSCH:  Thank you so much, Steve.

You raise many critical issues and really touched on so many of the diverse topics that fall under this category.  It was refreshing to hear you refer to human‑centered activities and human‑centered design.  We often talk about Internet users; let's be clear, we're people, we're citizens, we're humans.  When we talk about people, we're talking about real people; there is no real world versus the virtual world.  There is an idea of a refocus for the future.

You mentioned climate change.  A thing that came out in the preparatory session is the shift in how we talk about these issues and how we approach tech and data: from what am I doing with my data to Human Rights that are always with me, no matter what I'm doing.  How does technology serve humanity versus just putting up safeguards around the edges?

So much to delve into there.  I want to turn to Jess now.

Steve mentioned the importance of profitable solutions to corporations.  The fact is, many of the solutions we need might not be profitable.  Jess, your research focuses on the design and evaluation of technology to support community networks, from planning to emergency management.  Talk to us about what you have seen over the past year as you think about the key policy questions around the rising awareness of these issues, and the risks and opportunities that have emerged since the pandemic.  What are you seeing in this civic tech space?

>> JESS KROPCZYNSKI: Thanks, Courtney.

So I guess I would start off with something you had raised: transparency.  This is really one of the first jumping‑off points that we can think about when it comes to opportunities and ways to garner positive change.

The first thing that the Internet can do is create transparency around certain issues.  While there certainly are haves and have‑nots, and there are people with limited access who are lacking that transparency, for those who are engaged and active we can start to become more transparent about inequity and social and economic injustice.  When it comes to inclusion, you know, it all starts with being aware of what the issues actually are and having nuance around those issues before we can start to make some kind of positive change.  Transparency isn't enough, obviously.  When it comes to all of our decision makers at the policy level, it is important not just for the issue to be transparent ‑‑ you know, that they have access to the information ‑‑ but also that they become aware of some of the nuance around these issues.

There are certainly policymakers that have begun to talk about the digital divide; they have talked about these issues with regard to the pandemic, engaging with the Digital Transformation, trying to make more dollars available for infrastructure, starting to become aware of how different policy should be made based on these issues.  However, at times there are limitations to this.  Our dollars may be being put into bringing Internet into areas that didn't have it previously; however, it is a game of catch‑up, because while we have gigabit cities coming online, simply providing Internet is not going to catch up in time, as new technologies become available and we now have certain levels of expectations.  The game of catching up can be difficult.

There also needs to be awareness around those issues before we can get to true engagement with trying to address some of the challenges.  So having some transparency of where there are the most challenges, the nuance surrounding the challenges, starting to have more awareness around the best ways to tackle things.  What are challenges with the ways that policies have been addressed previously and how can we overcome them and then starting to engage with new and diverse opportunities around addressing previous or persistent challenges.

>> COURTNEY RADSCH:  Thank you very much.

This focus on transparency you mentioned does indeed seem to be one of the developments that we have seen over the past year in terms of really recognizing just how important that is for the accountability and ability of these platforms, these services and technologies to serve the public.

I want to turn now to Scott Campbell, who as you know is the Senior Human Rights and Technology Officer at the Office of the UN High Commissioner for Human Rights.  Let's talk more now about Human Rights.  We have also seen in the past year revelations about some of the major impacts of technology companies: research that has revealed deep harms to Human Rights, to women and children, health harms, addiction, et cetera, and much of this research is held within companies.  But, you know, many of us, including your office, have raised concerns about the negative impact of some of these technologies, both in terms of political discourse, the ability to hold elected officials accountable, and their role in the public sphere.  You have in your office focused on disinformation among other issues.

What are you seeing in terms of the evolution of the policy debate over the past year on this respect, Scott?

>> SCOTT CAMPBELL: Thanks very much, Courtney.  Thank you very much to the other speakers and those involved in the prep session.  I think you have made my job very easy this morning.  I can be pretty brief I think in my opening comments.  There are so many good points made.

I will just try to underscore a few things that have just been said and also from the prep session and try to build on them a little bit.

I think the point just made on transparency is so crucial.  I won't dive into that.

A few other points, just to consider, when we think about why this debate and these concerns are becoming so prominent in societal debates and why there is such increasing concern about these issues.  I think Steve pointed to one of the important factors: a lot of what we're seeing, the types of Human Rights violations and the types of concerns around Human Rights we're seeing, aren't new.  COVID and digital technologies, paired, have both been accelerators of existing inequality and existing discrimination, both within and across societies.  I think we have seen the most vulnerable, the most marginalized being further marginalized, and those who have historically been the most exposed to the economic and social impact of discrimination have also been the most impacted both by coronavirus and by the accelerated use of digital tech in our world.  These same marginalized populations, we have seen with COVID, have the least capacity to resist and to respond also, as I think Sarah spoke to.

Secondly, COVID has made many of these existing inequalities just more visible, just brought them to the surface in our discussion, in our daily lives.  We see the data on this.  That's the other important factor.  We see the data, we see higher death rates among the marginalized, excluded across societies around the globe.  We see higher rates of unemployment of those who have had the most difficulty getting and staying employed.  I think COVID has really just brought a lot of these concerns to the surface.

Thirdly, and I think this also maybe just needs to be underscored, the Human Rights risks we have seen are simply so important: fundamental threats to democracy, in terms of a very important, damaging trend over the last couple of years, since the last IGF ‑‑ a less open, less safe, less inclusive public digital space, less space in that digital square for debate and expression.

I think just the fundamentally grave nature of what's happening has also pushed these questions to the forefront.

Maybe I will ‑‑ I think I'll leave it at that for now.  There are so many good points made, I will leave it at that now and look forward to the discussion.


>> COURTNEY RADSCH:  Thank you so much, Scott.

I want to invite anyone in the room that's interested in asking a question of our panelists to raise their hand and there is a microphone that will go to you.  Similarly to our audience online if you would like to put your hand up in the Zoom room.  We're going to get to that in a moment.

Scott, before I let you off the hook, can you get a little bit more detailed on what your office has seen in terms of the major Human Rights issues that you're focused on with respect to your mandate on Human Rights and Technology?  Specifically, you know, we talk a lot about Freedom of Expression, data privacy ‑‑ not just data privacy, but privacy more generally ‑‑ disinformation.  What are the key trends that you are focused on and working on right now?

>> SCOTT CAMPBELL: Yeah.  Thank you, Courtney.  It goes right to one of our main concerns.

What we have seen, in short, is really a whole new range of opportunities, if you will, new entry points for authoritarian regimes.  We have seen new opportunities and ability to curtail civic space, particularly online, and new opportunities to silence voices of dissent and critical voices.  We have often seen very legitimate public health concerns ‑‑ my own background is in public health and Human Rights, and our High Commissioner, of course, is a medical doctor; we both recognize the very legitimate public health concerns of COVID‑19 and the need to respond, and in some places to restrict rights to get the best response in the public interest.

However, what we have seen, and really it is a trend, is that many of the legitimate public health concerns are being twisted, you know, into methods to exclude people from participation and especially to exclude people from economic opportunities.

There are certainly some restrictions on the civil and political rights that you mentioned: freedom of movement, freedom of assembly.  Intrusions on the right to privacy have gone up, with the use of apps, in some cases to track people, and the abuse of data through invasive use of technology.

Maybe the last thing I'll say in terms of concerns: there is a real risk that many of these short‑term responses to COVID‑19 that do have a damaging impact on Human Rights will be made permanent.  And there is one other important trend we have seen that is very concerning: shutdowns.  I think often when we think of Internet shutdowns, we think of the immediate impact on access to information, on Freedom of Expression, on civil and political rights.  I just want to draw attention to the impact on economic opportunities and social and economic opportunities.

I think Sarah and Steve both spoke about access to education, and, of course, while we're trying as the United Nations to promote connectivity hand‑in‑hand with the private sector and build bridges across the digital divide, the increasing trend in shutdowns goes fundamentally against that.  I think that's really a concerning trend that we have seen, the increase in shutdowns over the last year.


>> COURTNEY RADSCH:  Thank you for highlighting that.  I see a question in the audience.  I will ask the microphone to go here.

Also, maybe to contextualize: today the Committee to Protect Journalists released its annual report on journalists imprisoned for their work, a record 293.  Again, a record.  Tomorrow Maria Ressa and Dmitry Muratov are receiving the Nobel Peace Prize, both journalists, one working in the Philippines, one in Russia.  They're recognized because the Nobel Peace Prize Committee recognized that efforts to safeguard Freedom of Expression are a precondition for democracy and lasting peace, with Maria Ressa looking at the public sphere, particularly in the Philippines, Facebook, and extremism.  We have seen major revelations about the companies that govern our public sphere, and the choices of how they design their platforms ‑‑ as Steve notes, they're corporations, so this is for profit ‑‑ have an impact on the public sphere.  That seems to have been made more clear in the past year as new internal research, particularly from Facebook, now known as Meta, has been revealed.  The research can only be done when you have access to that data, and that picks up on the point that Jess made on transparency: without transparency and access to that data, we can't even know what we don't know.

Before we delve back into that, let's go to the audience.

>> AUDIENCE: Thank you very much from the Gambia IGF.  Thank you to all of the speakers.

I want to get the perspective of the speakers on the Roadmap for Digital Cooperation by the UN, in which all the key action areas are very important, but in terms of ensuring digital inclusion for all, including the most vulnerable, I think if that's not achieved, all the other goals will not be fulfilled.  I would like the perspective from the speakers: how best do you think we can go about, within the world community, getting digital inclusion for all, especially the most vulnerable, which as Sarah said has really affected a lot of marginalized communities?  Thank you.

>> COURTNEY RADSCH:  Thank you.

I will take another question from the online chat, Amir Mokabberi, if you would like to unmute and briefly ask your question.

>> AMIR MOKABBERI:  Thank you very much.

Hello, everyone.  Hello to the distinguished panelists and all dear colleagues.  First of all, I should thank the IGF Secretariat and the Host Country for organizing this well‑organized IGF, and I would also like to thank you all for convening this timely session.

I would like to ask about issues with Human Rights in digital space, in particular UCMs, unilateral coercive measures, in cyberspace, which are highly related to the topic of this valuable session.  As you know, more than three countries are now suffering from this issue.  As all know, the negative effects of unilateral digital sanctions on some nations have become more intensive and more destructive, especially during the COVID‑19 pandemic and other emergencies.  Digital sanctions in many areas ‑‑ investment in ICT infrastructure and technology, sanctions on digital solutions, digital licenses, digital resources like IP addresses, the DNS system and access to networks ‑‑ are key barriers and obstacles to achieving national development goals using ICT.

We believe these unilateral coercive measures constitute Human Rights violations in cyberspace, especially of the right to development, the right to education, the right to business and so on.  Some of our universities have problems accessing scientific data, and some of our Iranian digital entrepreneurs and Iranian applications have been removed from digital stores like Google's and Apple's because of the sanctions on their respective country.  My question is: 

What would be the role and contribution of the United Nations family and the IGF Plus community in addressing this vital issue?  Shouldn't there be norms regarding the prohibition of applying this kind of unjust and unethical digital sanctions against nations, through UN processes or the involvement of other players?

Some countries are using digital sanctions as a pressure tool for achieving their goals.  Non‑discrimination in access to Internet technology, and cyber capacity building for all nations, could be a new norm for having an inclusive cyberspace and an inclusive Internet.

I would like to request that our concern be reflected in the final outcome and final messages of IGF 2021 in Katowice, Poland.

Thank you very much, Madam Chair, for giving me this opportunity to share my words.

Thank you very much.

>> COURTNEY RADSCH:  Thank you for that extensive intervention.  You raised some very interesting points: this is not an inadvertent lack of access; sanctions are a choice to deny connectivity, access, et cetera.  It is of course an interesting dynamic; ultimately it's about state‑sponsored denial, and we can think of other aspects of state Human Rights abuses, including the denial of Internet service, net shutdowns, and of course restrictions on people's ability to freely express themselves online and offline.  I want to go to our panelists to address these two questions: one around digital inclusion, especially for the most vulnerable, and, if you can, the question around sanctions and whether they should be addressed in this framework.

I also want to pull out another aspect, which was the role of the IGF.  That's one of our key policy questions: what is the role of the IGF in promoting these rights?  Can I please ask the panelists to address these two questions, starting with Sarah.

>> SARAH KIDEN: Thank you for the questions that you have raised.

If I may start with providing basic services, I think I'll use the example of education again.

There are so many pieces that have to fit together for this to work, going beyond just, for example, having a good national backbone infrastructure and access to good, maybe even affordable, Internet services; there are many other things that fit into the picture.  For example, do educators actually have the right skills to be able to engage with learners in a way that uses digital tools to support learning?  In many countries, especially developing countries, you have to upskill; you have to learn and relearn things.

There are so many things that fit together.

If I may talk about shutdowns, I will build on what Jess said about awareness: in most cases, it is governments who do the shutdowns.  In Uganda we experienced a shutdown earlier this year.  It was really crazy, a whole Internet shutdown; everybody was cut off.  It is something we have to continuously advocate on, letting governments know what the dangers are.  It is not just that you're cutting people off from services, you're also affecting the network.  Even when services were brought back, some things remained inaccessible: you still can't access Facebook, and some people still struggle to access GitHub.  It is a conversation that we have to continue to have with governments, creating awareness, telling them about the dangers of some of these things.  Yeah.

I think we're just getting started.  I don't think it is something that we can solve in one day.  That's what I would say.  Thank you.

>> COURTNEY RADSCH:  Scott, let's go to you for the questions around sanctions as an aspect of this, but also, you know, the role of the IGF and what role we see the IGF playing in addressing Human Rights violations.

>> SCOTT CAMPBELL: Thank you, Courtney.

If I can also just pick up on the excellent question posed by the speaker from the Gambia IGF, it underscores how important inclusion of the most vulnerable is and what we can do about that, what the approaches are.  What strikes me is how important Human Rights due diligence and conducting impact assessments are to avoiding or mitigating the risk of exclusion being worsened by the use of digital technology.

Here I think there is a role both for companies, as Steve is well aware, and for states.  Right.  I think both need to be thinking much more ‑‑ more than thinking, but embedding in the process ‑‑ in the design of tech products, in their development, in the rollout of tech products and in the monitoring of the use of digital tech ‑‑ assessing the impact of those products on the full range of Human Rights.

In particular, I think back to the question: how is the development and use of these tech products going to either favor or disfavor inclusion?  Inclusion of the most marginalized.  I think that approach ‑‑ having a Human Rights‑based approach, applying Human Rights due diligence ‑‑ will take us a long way toward addressing some of the concerns raised and, you know, addressing the problems of exclusion.  Very quickly, and hopefully we'll come back to this: there's a lot to say about the role of the IGF and what's going on in this space.  There's so much going on in the governance space, a lot by states ‑‑ massive efforts around the world to regulate the online space.  I think in the last couple of years we have seen more than 40 new social media laws, and there are several dozen under consideration, you know, as we speak, and a lot of the governance frameworks and new legislation put Human Rights at risk, with a lot of risk of fragmenting the online space.  There's a lot of work to be done there.  I think the IGF has a critical role to play, both as convener and in helping to set the compass that guides the development of these different governance frameworks. 

Ultimately the IGF is a forum that can push to make the digital space more diverse, more accessible and open, and safer for all.  I think, you know, as Steve and others were saying, putting Human Rights first ‑‑ having a human‑centric, Human Rights‑based approach ‑‑ can take us a good way down that road.


>> COURTNEY RADSCH:  Thank you for that.  Hello?  Can I have my mic?  Thank you so much.

I want to go to Steve.  You mentioned the importance of inclusion, and it is hard to include people under sanctions.  How do you look at sanctions as a Human Rights issue?  You have to apply them as a U.S.‑based company, but how do we think about the different roles of states when it is not just that people don't have access ‑‑ they're being denied access by other countries.

>> STEVE CROWN: Well, fascinating question.  Happy to engage.  Of course, I'm not giving a Microsoft perspective here, it is not something where we have said here is our policy on that.  These will be my reflections on it based upon decades working inside of Microsoft and more than a decade on these very issues.

First is, there is such a thing as international law.  There is such a thing as state‑to‑state diplomacy, there is the United Nations, and one of the things states try to do is influence one another.  At some stage, it is not the role of companies to be making those decisions.  We're not above the law.  We don't make the law.  As we do at Microsoft, we believe in the rule of law, and we're subject to laws.  That doesn't mean we don't lobby for positions and urge what we think are better and worse solutions to challenges, especially in the United States where we're based ‑‑ a lot of work lobbying the U.S. Congress ‑‑ but it is a little bit different when it gets into a U.S. company telling a foreign government and its people what they need to do based upon our position as a private company.

Within that, I would say we have a bias for engagement.  You are not going to find Microsoft saying we wish there were more boycotts of governments and people.

Our mission is to empower every person on the planet to achieve more and we take that literally.  It means every person on the planet, it doesn't matter where you were born, the current challenges that you may be facing with Human Rights today, if there is a way for us to engage responsibly we seek to do that.

I really can't provide an answer and a solution of the sort that I think the question was asking about, other than to say we do believe dialogue and engagement is far better and far more responsible than taking the position that we ought to avoid those who are presenting challenges.  But within that, it has to happen within the international governance environment, and the United Nations has a significant role in that.

I will say, just as we seek to engage and want more inclusion, you may know we're active in combating cyberattacks, whether nation‑state or private, on the Internet infrastructure.  There are lots of reports we put out; we put out one again this week on some major actors that are out there actively looking to interfere with the operation of the Internet.  My final comment will be that on this issue of boycotts and denials of services, there's a tension in that. 

Scott knows this from our many discussions on Human Rights around the United Nations.

Rights are typically in tension; they don't necessarily conflict, but they are in tension.  You don't have an absolute right to privacy, whether in the Universal Declaration or elsewhere; the right is to be free from arbitrary and unlawful interference with it.  There are boundaries to most of these things, and the goal is always to maximize the benefits.  That's one of the challenges and one of the things the UN Guiding Principles push companies to do: how do you make sure that you understand the various potential harms as well as the good things you're trying to achieve, and then develop a responsible approach.  I will pick up on a notion that's come up, and we will talk more about it: transparency.  Actually talking about these things, about the trade‑offs and the considerations that factor into a decision ‑‑ more transparency leads to better decisions, not just in the industry but across the globe.

I'll stop there.

>> COURTNEY RADSCH:  Thank you.  More transparency is helpful.  In addition to sanctions, we have seen how technologies are being used ‑‑ for example, social media technology ‑‑ in conflicts where you have the overthrow of a democratically elected government, for example in Afghanistan, in Myanmar, several places around the world, and there are tech companies that don't necessarily even know where their technologies are going to be used and don't have clear policies on whether, for example, sanctions regimes apply to some of the Internet service platforms.  It is a very interesting issue.  I do want to move us on.  We have a question in the audience I'm going to go to, and then I'll take two in the chat.

>> AUDIENCE: Hello.  Thank you.

Good morning to all.

I represent JAMAR Technology, a user‑centric company, from the corporate side. 

My question is from the corporate side: we do a lot of work around Corporate Social Responsibility, but how can the IGF help companies like us so that we can go back to the communities and help people?  We do a lot of work in the education sector.  Is there a platform for companies that are not as large as Microsoft ‑‑ medium‑sized companies ‑‑ that also want to participate actively in this forum?  Is there a way that the IGF could help companies like us, or partner with companies like us, so that we can go back to the communities and do Corporate Social Responsibility kinds of activities?  I just need your thoughts on that.

Thank you.

>> COURTNEY RADSCH:  Thank you so much for that important question on how to engage with the business community and, I would say, all of the stakeholder communities of different sizes and capacities.  Thank you for that.

I also want to bring in two questions from the chat that were written in full, one from Evelyn who is asking a question to Scott about whether the COVID pandemic worsened the issue of government surveillance or just made it more visible and a topic of political discourse.

I do want to note that the issue of surveillance came up as a significant concern, a limiting factor on some of the beneficial uses of technology and access.  I would like our panelists to delve into this a bit more: how do we protect Human Rights and ensure Human Rights‑compliant and Human Rights‑centered access and inclusion in light of state‑sponsored mass surveillance, private corporate mass surveillance and, of course, the targeted advertising that's the basis of so much of our public sphere.

Before I turn to you, I also want to bring in a question from online, saying if you don't ‑‑ sorry.  Just kidding ‑‑ I'll go first then to Jess on this question.

If we can talk about surveillance.  You work in civic tech, in the public interest.  How does surveillance figure in, and how can we ensure that as we broaden and improve access and inclusivity, we're not also embedding the potential for surveillance into that?

>> JESS KROPCZYNSKI: It is a challenge.

It is a great question.

Some of the things that I work with deal with crisis response.  When it comes to trying to get help during an emergency, at times we can say: I will give the government all of my information; I would like, you know, people to know where I am, what my situation is, and how to best deliver emergency response.  However, on the other side of that, how do you make something available only during that emergency situation and not at other times?  It's a balance.

There are also situations where, when phone lines are down, people turn to social media to post the fact that they might need help during the crisis, and emergency response doesn't necessarily have a good way to see and respond to those posts methodically.  Some of this is human‑centered design work, some is need‑finding work: to understand what's happening when someone is in need, what groups can provide a response, and how we can deliver information to them while also preserving the privacy of individuals.  Especially when it comes to those social media situations: when you're in crisis and you're posting, you're wanting the attention perhaps of a government, a civic entity, and what you're really getting is perhaps the attention of somebody who's able to market something to you based on your location and the words and context that you're using.  We're already giving up a lot.  How we balance that in an appropriate way is a challenge.

We do need people thinking about this from a research perspective.  We have to make sure that it is not only industry coming to the table, because they'll have their own profit‑based perspectives.  That's appropriate.  There need also to be government and other more neutral research parties thinking about these issues.

>> COURTNEY RADSCH:  Thank you so much, Jess.

I would like to turn now to Sarah.  How are you thinking about these issues in the communities that you work with?  What are you seeing as risks posed by surveillance?

>> SARAH KIDEN: Actually I wanted to respond in a different way.

There are things we can do to make our voices heard in terms of surveillance.  I would like to call these acts of resistance.  If I can give an example from earlier, at the start of the year, WhatsApp sent out a notification to our phones telling us that if we didn't accept the new terms by a certain date, they would by default start sharing our data.  People started to move to Signal, Telegram, even others.  You know, my mom moved to Signal, and to this day she calls me using Signal.  She doesn't know much about the difference; she just knows that people are moving.  These acts of resistance can make our voices heard.  Simple things: if everyone decides that we're going to just cover the cameras on our laptops ‑‑ much as one person doing it may not be impactful, if everyone is doing it, then the companies may start to think about something.  I think that's what I would say about the surveillance thing.  Yeah.

>> COURTNEY RADSCH:  Thank you, Sarah.

Let's turn now to, Scott.

How does the Office of the High Commissioner look at this?  There are various reports from Rapporteurs looking at surveillance, and it has been a factor with the COVID tracing apps and the pandemic. 

Talk to us about this.

>> SCOTT CAMPBELL: Thank you for the question.  Our office views this with the utmost concern.  The question posed was whether surveillance has been made more visible by the COVID‑19 pandemic or is actually getting worse, and I think the answer is that it is not an either/or but an and.  We have certainly seen both: a rapid growth in the use of surveillance technologies, and COVID‑19 bringing all of that to the surface.

Surveillance is not new.  I think we have said that a few times during the discussion this morning.

We have seen such growth and increased visibility around it.  Our office has been extremely concerned.  We have published a number of reports ‑‑ we put one out just in September on artificial intelligence and the right to privacy ‑‑ and we have been speaking out pretty regularly about some of the more contentious issues of the day, such as the use of the Pegasus spyware to increase surveillance and tracking.

The High Commissioner for Human Rights has called for a ban on artificial intelligence applications that cannot be used in compliance with international Human Rights law, and has asked for a moratorium on sales of this technology until safeguards are put in place ‑‑ until companies and states have done their due diligence, their homework, to make sure that this technology will not be used to harm Human Rights.  In responding to some of the challenges there, it is important to note that we're not starting from scratch.  Both companies and states have the United Nations Guiding Principles on Business and Human Rights, and these can provide a framework for developing products, services and policies that will assist them in meeting their Human Rights responsibilities.

I think central to that, for both companies and states, is using Human Rights due diligence: governments when thinking about legislating, about how to regulate in this sector, surveillance particularly, and companies when they do the actual development themselves ‑‑ so that they look at the potential Human Rights harms and Human Rights impact of what they're developing and take measures to identify the risks, try to mitigate them, prevent them where possible, and make sure that they have baked in some kind of remedy for abuse of products or for when things do go wrong and people are victims of violations.


>> COURTNEY RADSCH:  Thank you, Scott.

Interesting points.

These are guiding principles.  They are not regulatory requirements or legally binding treaties.  You know, this idea that companies need to mitigate and prevent abuses is also predicated on the idea that the technology is not inherently Human Rights‑abusing.  You have mentioned, for example, the calls for a moratorium on the use of facial recognition ‑‑ I would say also biometric and sentiment analysis ‑‑ and we have seen that the COVID‑19 pandemic provided cover for the rollout of many of these types of pervasive surveillance systems, as have national security issues, and indeed the combination of those, for example at borders, where you see pervasive facial and biometric analysis and no one knows what's happening with that data, what the oversight of it is, or how it is used.

I want to come to Steve.  We saw that Google and Facebook ‑‑ sorry, Google and Apple ‑‑ tried to implement a COVID tracking approach that could be used by governments in coordination with their apps, and they asserted it was privacy‑protecting.  I think you saw very little uptake of that; there is very little trust in the companies and in any sort of rhetorical promises made by the companies.

Could you talk to us about the trust deficit and how that links to these critical issues around Human Rights, around surveillance, ultimately that is linked to this issue of access and wider inclusion.  How are you grappling with that as a company and trying to mitigate the trust deficit?

>> STEVE CROWN: A great set of questions.

I'll start by saying ‑‑ and my sense is this; I was not deeply involved in any of this when Microsoft similarly developed an app in collaboration with the University ‑‑ my sense is that the low uptake of the COVID apps, for example the one Apple and Google had, was less about distrust of the companies than about governments actually wanting more control.  It wasn't so much the trust issue, which I think is a real issue ‑‑ do we trust companies with the data they collect, how they store it, what additional uses it is put to ‑‑ and I will actually shift right away to the notion of use.

My perspective ‑‑ and this is generally the way you will find Microsoft speaking on these issues ‑‑ is that we strongly believe in regulation of uses.  It goes back to the way I started my comments at the beginning of the session, which is almost a social contract.  We genuinely believe that good regulation is exactly what all of us want.  Good regulation ‑‑ and this builds on Scott's notion of due diligence ‑‑ is deeply informed regulation about benefits and harms, understanding the technology, understanding how it can be misused, and regulating that rather than leaving it unaddressed.

The UNGPs ‑‑ it is not a weakness that they're not binding, it is actually their very nature.  These were to be voluntary activities that companies undertook in the absence of regulation to address these Human Rights harms.  I view them as a very strong positive in the way the technology sector has evolved in the last decade. 

One of the things I think is out there ‑‑ it is not so much that in the COVID space we don't trust the companies.  I think governments wanted more access to the data to do more things with it.  There may be epidemiological studies that people want to do that are difficult if everything has to go through the companies.  That's a place where I think we need informed decisions.

On the broader question of surveillance: what we have lost in the last decade ‑‑ not just with COVID ‑‑ is not the idea of secrecy when out in public; it is this notion of what some academics refer to as practical anonymity.  I knew that if I was walking in a public square, somebody might take a photograph of me, and there was no violation of my privacy, but I was still anonymous: the picture was not identified to me, the unique human being with all of the characteristics and all of the things that somebody could learn about me.  It was just a picture of a random person walking in a square.  With technology you are no longer anonymous in public, and that has done a disservice to many of us.  I don't, for example, operate on any social networks out of privacy concerns, because I do think there is value in sharing what I want to share with the people that I choose to share it with, rather than having it be collected, just vacuumed up, and then used for purposes I have no control over.

I think you will find many of us are in favor of a deeply informed, technically credible discussion of particular uses, and that's where we ought to focus attention, rather than thinking that we're going to stop all development by countries across the globe who have a desire to use these technologies.

I don't think it is a net benefit to have only the worst actors developing the most powerful technologies.

>> COURTNEY RADSCH:  Thank you.

That's a perfect transition to the second portion of this important issue, which is possible governance strategies.

Some of the key issues came up in the preparatory session, and we wanted to delve into them as we get into the last half hour here.  On governance strategy: governments are accountable for ensuring that Human Rights are respected, and the private sector has a role, but it is fundamentally the role of states and governments; there is a consensus that governance needs to be inclusive and that there should be some international coordination for society to fight injustices and Human Rights abuses.  I want to draw on a question from the chat as we talk about this idea of inclusiveness, because there is a question in here about how people who are internally displaced by conflicts in their own country can also have access to the Internet ‑‑ so another element of inclusivity.

In addition, one of the things that came out of the preparatory phase is this idea of needing an internationally legally binding instrument on the use of the Internet, digital technologies and data, in accordance with already existing Human Rights frameworks ‑‑ I think partly because there are only guiding principles, commitments and soft, non‑binding law.  I definitely want to get the panelists to respond to whether you agree with this idea and what your thoughts on a legally binding agreement are.  I also want to address other issues that came up from the preparatory phase: the obstacles to anything that has to do with global coordination and regulation, namely national security issues and market competition among private companies ‑‑ national security obviously being primary for many states ‑‑ and how an international framework could even work given those two primary focuses of governments and private companies; and the point that governance should be multistakeholder and representative.

You know, that's very tough when you're talking about how to do an international treaty, when talking about national security, when you talk about, you know, market competition.  How do we make sure that there is multistakeholderism and representativeness?  Simply having stakeholders is not sufficient; there needs to be representativeness within the stakeholder groups.

There was also a focus ‑‑ as we have alluded to, touched on in the discussion already ‑‑ on the idea that data governance has to respect Human Rights and center Human Rights in the person, no matter what they're doing online or how they're interacting with digital technologies; there needs to be a shift to center that, and to think, therefore, about responsible technology.  Is that a promising framework for thinking about how we can ensure that the most vulnerable online and offline are protected, that this is in accordance with Human Rights, and that as digital inclusion proceeds, there is a responsible technology perspective embedded at the center of it?

I want to turn now to our panelists to address those questions.  You can pick among them, and then we'll go to a round of questions online and in the room.  So please prepare those.

First off, I'll go to you, Scott.

>> SCOTT CAMPBELL: Thank you, Courtney.

There is an awful lot to chew on there.  I think maybe I'll start off just by picking up on the last points that were made about trust, the trust deficit, and linking that with transparency and accountability.

I think the question raised, you know, about the Google and Apple efforts to develop a COVID tracing app reflects both a lack of trust in the companies and a broader lack of trust in governments around the globe, a decreasing confidence in governments.

What can be useful here, I think, in building that trust is applying the Human Rights principles that, you know, we have mentioned several times now.  In particular, increasing transparency ‑‑ transparency in algorithmic development, in product development, in how policies and company rules are applied.  Increasing transparency can take us a long way there, along with making sure that when things go wrong, people are held to account.

On the binding treaty question, trying to further develop the international standards and norms ‑‑ I mean, just to note, you know, as you said, Courtney, the UN Guiding Principles on Business and Human Rights are guiding principles; they're not binding.

There has been an effort over the last several years among Member States and Civil Society to transform those guiding principles into a binding international treaty.  We see a lot of resistance there, and Courtney, you actually underscored some of the underlying factors as to why there is resistance to making those more binding.

I guess I would just point in general to the risk that, as efforts are made to increase regulation, we actually move backwards on the inclusivity front rather than forward.  I think the IGF has a very important role to play there by bringing people together, creating a multistakeholder setting where responses can be developed and people can ask questions: where can I find tools and resources about applying a Human Rights‑based approach or integrating it into how we're doing business?  The objective, of course, is to make the digital space more diverse, more accessible, more inclusive ‑‑ I think that's generally what we're looking for ‑‑ and putting Human Rights at the center of those conversations is, again, crucial.

As Steve and others have said, getting the regulation right is just so important in terms of consolidating democracy.  On the other hand, poorly developed regulation may consolidate deeply undemocratic, discriminatory approaches and work against inclusion.  Again, I think the IGF has a really important role there, as we're seeing today, by bringing different voices into the conversation.

I will finish by underscoring again the importance of transparency and accountability in decision making ‑‑ that needs to happen both at the company level and at the state level ‑‑ and of public participation, you know, in these discussions; it is just so fundamental.  Thank you.

>> COURTNEY RADSCH:  Thank you for that.

Who else from the panel would like to jump in on this?  Jess, I see you smile ‑‑ does a legally binding treaty hold promise?  I'll come to Steve, but I would like to hear from Sarah or Jess, from the Civil Society perspective; especially since, you know, Mozilla is a non‑profit organization that has different types of implications.  What do you think?  Does this hold promise?  Is this the right direction?

>> JESS KROPCZYNSKI: It seems like a step in the right direction, but it is important to have many multistakeholder meetings to understand where everybody is coming from when talking about the issues.  It is too often the case that we can all be having a discussion about a topic but we're not necessarily on the same topic, because these are such big issues, and everyone, especially when it comes to policy, has an agenda associated with that.

It seems like we're just getting into this space, and there are going to be a lot of different, you know, sub‑plans and agendas within that.

>> COURTNEY RADSCH:  Excellent point.

Sarah Kiden, what do you think?

>> SARAH KIDEN: I agree.  This is not something that you can leave to one company or one group of people to self‑govern; that means they pick the route that's easiest for them. 

There are groups doing international standards and things like that, so we could use the same words.

I wanted to just add that we have to go beyond a multistakeholder approach ‑‑ we do that through the IGF ‑‑ but outside of that, I think Steve earlier talked about a human‑centered approach to the design of technologies, and we actually have a very big opportunity to leverage skills that our communities have.  In my work, I work with artists, local technicians, creatives, people like that.  Working with them, so that they help deploy the technology, means that you then have technology that's inclusive, that is relevant for the local context.  Yeah.

>> COURTNEY RADSCH:  Thank you.  That's an excellent point; codesign is an interesting way to think about this.

I will skip Steve on the legally binding question for now; we only have a few minutes left.  We have two hands in the audience, and we'll also incorporate a question from the Madagascar hub on whether artificial intelligence plays a significant role in Human Rights and whether it helps protect or infringe them in a certain way.  What I'm hearing in this question is: let's hear a little bit about specifics.  Before I turn that to the panel ‑‑ and I'll maybe call on Steve for that, to get some specifics on how you think about Human Rights with artificial intelligence, both the pros and the cons ‑‑ I want to go here into the audience.

Introduce yourself and keep the question brief.

>> AUDIENCE: From Pakistan.

With the arrival of COVID‑19, use of the Internet became more necessary; many were even forced to use the Internet, bringing this facility into the education system, working from home, et cetera.  During this period we have noticed that cybercrime has increased to a great level as well, as these countries were not well acquainted with the Internet facility and are new to this system. 

Cybercriminals are operating at a transnational level.  The communities are not operating at such a level.  Their governments are not even able to get the data and specific evidence from the different social media providers when it is needed to fight the crime.  What are the suggestions from the IGF and our panelists on this aspect?

>> COURTNEY RADSCH:  We had another question in the room, back here.

As the mic is going back there, I just want to remind the panelists that there is a question about responsible technology and whether it makes sense to be talking about that as a specific dimension.

Now we're back here in the room, please.

>> AUDIENCE: Hello.

I'm from a high school.  I want to ask a very simple question.  In our era of the pandemic, we have to wear masks at school, at work, everywhere else.  The teachers ask that we put them on, and the students often say that this doesn't respect their rights.  Is it their right or is it not?

>> COURTNEY RADSCH:  Thank you, that's not really an Internet Governance question, we will see if any panelist wants to address that.  Thank you for attending the Internet Governance Forum, it is great to see high school students here.

I want to now go back to our panel; we have 10 minutes left.  I will ask for concluding remarks, but specifically if you can address the relevant questions that were asked in the room and in the chat: whether there needs to be more hard law rather than just soft law around surveillance, around cybercrime, around Human Rights protections.  I do want to get you on the record on this.  I think this is really important for helping us move forward with what comes out of the Internet Governance Forum.  I'm going to turn to Steve first; in addition to answering those questions, if you could briefly describe the Human Rights benefits and challenges of artificial intelligence.

>> STEVE CROWN: Well, we only have 10 minutes and we have other panelists.

Very briefly, I want to take up first the question on responsible technology.  That's at the core of what we do at Microsoft.  We have a group, the Office of Responsible AI; a good friend and colleague of mine runs that.  Her formal title is Chief Responsible AI Officer for the company.  This notion of responsibility is at the core of taking Human Rights and human‑centered design seriously. 

Just this morning I was in a meeting that, without exaggerating, I would imagine involved dozens of people.  In these review sessions, such as the one we had, a particular technology developed out of Microsoft research is adopted and adapted by a product group saying, here is a new service we can provide.  We go through a deep challenge of the potential Human Rights issues related to it; when talking about machine learning, questions of bias and fairness come in.  Really, is this something that enhances and empowers, or is it something that, just because we can do it, should we do it?  We have those discussions in the company, and other major companies do that as well.

A challenge, frankly, at the smaller level, at the start‑up level, is that you don't have the luxury of being able to gather people and have those internal discussions.  We have academics on the payroll who help drive and focus these academic discussions of fairness and bias in artificial intelligence, deeply informed computer scientists working with our researchers to make sure that we're getting the most valuable kinds of perspectives, with deep understanding of the technology as it is and where it is developing.

Then, very briefly, knowing we only have a couple of minutes, I will say on the notion of a binding instrument: if you look at the way that the UN operates, we're very strong supporters of the United Nations.  I happen to sit in an office whose umbrella is actually the United Nations and international organizations, even though I'm the Human Rights lead.  We do believe in that process.  I don't know, though, speaking personally, not as a Microsoft statement, that we would end up with a better statement of Human Rights than we have in the Universal Declaration and the International Covenant if we sat down and said we're starting from scratch today, and every country, including those who have the least demonstrated respect for the existing Human Rights regime, is going to draft the new binding rules for us.  I am personally in favor of breathing more life and more meaning into the existing instruments and using the UN as a means of driving that improvement of the Human Rights situation, rather than saying, let's start over, let's do it fresh.

We do favor, as I said, real regulation, binding regulation that is deeply informed.  Without it, it is a distraction of resources to figure out what to do; with clear guidance from democratically instituted regulations that we can follow, we know that we are actually addressing the desires of people who were heard through their representatives.

I will stop there.

>> COURTNEY RADSCH:  Thank you.  A lot to address in concluding moments of final remarks.

Sarah, can you address the question on artificial intelligence and Human Rights?  You have done such important work on gender and AI along with the other questions.  Please go ahead.

>> SARAH KIDEN: Thank you.

So let me just start by mentioning something that Mozilla has been doing for a while around technology and artificial intelligence (I will put them in the same response): trustworthy AI.  Basically, what we have tried to do is move toward artificial intelligence that's helpful rather than harmful to human beings, and there are so many aspects of that.  This includes privacy, fairness, trust, safety, transparency; basically, shifting the conversation from personal behavior, from asking users "you should hide this, you should do that on your phones," to asking for systemic change from companies.  By doing this, we are also trying to hold companies accountable for what they're doing.  There are some projects, a YouTube Regrets campaign, many other campaigns where Mozilla has held some companies to account.  There is one about Venmo, where the data was public by default, I don't know if you knew; through Mozilla's advocacy, they managed to change that, and now it is no longer public by default.

Now, talking about the work we did about three or four years ago with the research Internet Protocol Africa, which was funded by Microsoft (thank you very much): basically we did a landscape study trying to understand gender and artificial intelligence on the African continent, looking at who is in the space, which countries have policies in relation to AI; we even went much further and looked at who is actually coming into computer science and computing degrees, things like that.  I can share the link to the study if anyone is interested.  It was very interesting to see that some of these biases happen because, for example, in terms of gender, there are not many women in the space, and as many panelists said, if you have just a particular group making decisions on behalf of everyone, then the technology will not be inclusive.  I think that's what I'll say.

Thank you very much.

>> COURTNEY RADSCH:  Thank you.  Excellent points; shifting focus from personal behavior to the broader context would be one thing an internationally legally binding treaty could help with.  I will also note a comment in the chat to the effect that there are existing legally binding treaties on Human Rights, as Steve alluded to.

Jess, coming to you on this final round of questions and concluding remarks.  Your thoughts?

>> JESS KROPCZYNSKI: My apologies, my Internet connection has become unstable.

I just want to thank everyone for the great comments and I have learned a lot through this session.  I'm going to pass my remarks on to somebody else.

>> COURTNEY RADSCH:  Okay.  Thank you!  The issue of access and connectivity is live, even in person!

Scott, we're going to transition to you for final remarks on this vast array of questions.  If you could talk to us about your thoughts on responsible tech and on this idea of legally binding treaties: do we need a new one, or are the existing ones enough?

>> SCOTT CAMPBELL: Sure.  I'll be brief as I realize we're just about out of time.  We may lose our interpreters and become less inclusive.  I'll really be quick and just make three very quick points.

I think the issue of artificial intelligence as being both an enhancer of and a potential threat to Human Rights is just fundamental here.  The Secretary‑General last year put out a report on how artificial intelligence can take us forward in meeting the SDGs, enhancing social and economic rights, while at the same time pointing out the risks therein.  I think that's the key.  It is using a Human Rights lens so we can mitigate the potential harms of artificial intelligence and fully benefit from the power of AI in pushing us forward towards the SDGs.

On a binding international instrument:  Legislation, regulation, I think the answer, you know, in short is yes with a comma, but.  Yes, we do need more rules, hard, binding rules out there to regulate this space, but the comma and the but is getting it right is so important.

In January, there will be international discussions to develop a new treaty on cybercrime, for example, to address the very serious issues of cybercrime that have been highlighted by some of the questioners earlier.  Getting that treaty right will be so important.  The room for, and possibility of, moving backwards on Human Rights through the development of international binding treaties that are not carefully crafted is, I think, very real.  Again, I think participation and transparency in the development of new governance frameworks, be they national or international, is so important.

Lastly, on responsible tech: I think it is very useful, of course, to call upon companies to live up to their responsibilities in the use of technology.  I think that's helpful.  But I think it is important to go a little further, to clarify, as Steve was starting to do when talking about responsibility: what does that really mean?  How are we defining what is responsible?  What is ethical?  What is human‑centric, even?  There again, I think Human Rights can give us a framework that is very clear, is legally binding, crosses borders, and reflects the universal values of the UDHR.  But I think it is important to push ourselves to go a little further and define what we mean by responsible behavior, putting Human Rights at the forefront there.

Thanks very much.

>> COURTNEY RADSCH:  Thank you for the very thoughtful comments from all of the panelists and for the audience who have participated online and off, both in person here in Poland and around the world, including our panelists from the Western hemisphere who are literally getting up in the middle of the night to be a part of this discussion because it is so fundamental to Internet Governance, to the Internet Governance Forum and, of course, we're on the eve of Human Rights day.  There is really no better day to have this discussion.

We will be putting together a report.  We have a lot to think about and to digest out of this conversation.  We hit on a lot of topics and have seen just how incredibly diverse, complex and interconnected issues of economic and social inclusion and Human Rights are.

I want to thank you for really fascinating thought provoking session and encourage you to keep engaged over the next year as we bring these policy considerations forward.

Thank you so much.