IGF 2021 – Day 2 – WS #204 Stakeholder Roles for Human Rights Due Diligence

The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.



>> We all live in a digital world. We all need it to be open and safe. We all want to trust.

>> And to be trusted.

>> We all despise control.

>> And desire freedom.

>> We are all united.

>> SARINA PHU: That was great. Hello, everyone. Can you hear me okay? Is everything sounding good? Great. We'll just give it maybe one more minute before we get started. I hope that no one had any issues or any troubles connecting. And I just ask everyone bear with me as it's my first time hosting and moderating at IGF.

Okay. So it looks like we may have nearly everyone here. I'll just give it maybe one or two more minutes to see if we'll have our guests joining us.

Okay. Maybe I'll begin with some quick remarks and get started then. So good morning, good afternoon, good evening. Thank you, all, for joining us here for today's session. My name is Sarina Phu. I'm the Research and Programme Associate at the Global Network Initiative.

So before we begin, I'd like to take a moment to briefly introduce our discussion. The Global Network Initiative is very pleased to be convening this discussion and we're very grateful to the Internet Governance Forum organizers, the Multistakeholder Advisory Group, and our very distinguished panelists for making this happen.

This is a really timely moment to be discussing human rights due diligence. Companies are voluntarily conducting it. More stakeholders are being engaged. Governments are considering mandating it.

GNI has long worked to advance due diligence by tech companies, including through our Principles on Freedom of Expression and Privacy and detailed implementation guidelines.

Over the last decade, GNI has fostered insight and discussion on these matters among our members, including through our unique assessment process, where we've examined member companies' due diligence policies and practices in detail.

More recently, our members have expressed interest in deepening these discussions and making the lessons learned and good practices that we see publicly available in the form of guidance and recommendations.

Last year, we created a new Human Rights Due Diligence Working Group to take this forward, and entered into a partnership with Dunstan's team to engage with us on this work.

In today's discussion, we'll touch on the importance of ensuring that our human rights due diligence work engages key stakeholders, and that regulatory efforts that relate to human rights due diligence facilitate and support collaboration on this work, including through co‑governance initiatives like GNI, rather than unintentionally disrupting or potentially creating barriers to them.

So this workshop will utilize a roundtable format in which our key questions are addressed by all of the participants. And our speakers here today represent key perspectives from government, Civil Society, and Academia. They'll provide short framing comments in response to the questions. As the moderator, I'll actively involve all the participants in the discussion.

So I'll take a moment now and we'll pass it on to our speakers. We have a great lineup today. First is Catherine Bloch Veiberg, Senior Adviser and Responsible Value Chains Programme Manager at the Danish Institute for Human Rights. Mr. Muhammad Hassan Mangi, Director General of the International Cooperation wing at Pakistan's Ministry of Human Rights. Ramiro Alvarez Ugarte of CELE, the Centre for the Study of Freedom of Expression and Access to Information at the University of Palermo. Dunstan Allison‑Hope of BSR, Business for Social Responsibility. And Rene van Eijk, Senior Policy Adviser at the Netherlands' Ministry of Economic Affairs.

I'll give the floor to our speakers to introduce themselves and speak for two to three minutes on the perspective they bring to human rights due diligence. We'll have time for discussion and question & answer at the end.

So with that, maybe let's get started with Dunstan. I know you've been doing a lot of work with human rights due diligence. And could you talk a little bit more about your experience with it and introduce yourself as well?

>> DUNSTAN ALLISON‑HOPE: Great. Thank you, Sarina. Thank you for the introduction. So as Sarina mentioned, I'm with BSR, Business for Social Responsibility. We're a non‑profit organisation that works with companies on human rights issues as well as other topics like climate change and social justice. And very much looking forward to this conversation. Appreciate the invitation.

We've been undertaking human rights assessments with technology companies for the past decade or so. Some of which are publicly available. But most of which are not. We've undertaken many dozens of assessments. And in reflecting for this session, what struck me was the way in which they come in many different shapes and sizes. So we've undertaken assessments of companies entering or leaving or staying in a market. And we've undertaken assessments of companies introducing a new product or service. Or adding a feature to an existing product or service. Might be a merger or an acquisition or a sale. We might look at customer relationships and an entire industry vertical. Sometimes it might be a single customer relationship.

Sometimes we look at content policies for social media platforms. Other times, it's about access to remedy mechanisms. We've undertaken sector‑wide human rights assessments and also assessments for multistakeholder efforts.

So these assessments come in a wide variety of different shapes and sizes. And the timeline available to do them can also vary dramatically. Sometimes the company says we have a decision in a month's time. That's how long you've got to do this assessment. Other times it can take two years or more.

So in reflecting for this session, what struck me was given this diversity of assessments and timelines, how important it is to keep front of mind the essence of a human rights assessment.

And I wrote down three things. First of all, an assessment should be an assessment of impacts on people against all human rights contained in international human rights instruments. And it sounds obvious, but for companies, this emphasis on impacts on people, not impacts on business, is an important shift in mindset.

Second, that engagement with rights holders and stakeholders, especially those at heightened risk of vulnerability and marginalization, is especially important.

Then third, that we need to identify appropriate action that the company should take to address those impacts.

And in that context, there are just a couple of other things I think are important to remember. First of all, human rights assessments are a good starting point to stimulate dialogue and discussion inside companies and create action plans to address human rights issues. They are only one part of a broader system of human rights due diligence. Sometimes we fear an overemphasis on the assessment as distinct from the broader system in which it sits.

And second, I think we're only going to realize human rights in law and in practice if we think of entire systems. Sometimes when we're undertaking a human rights assessment for an individual company, we fear too much emphasis on one company and not enough emphasis on the overall system. I encourage us to think about overall systems and how different companies, different stakeholders, different actors play together.

So with that, looking forward to the discussion and hearing from our next speaker. Sarina, back to you.

>> SARINA PHU: Thank you so much, Dunstan. I thought that was an excellent way to sort of introduce the discussion and frame it and remind us that human rights impact assessments are about the people.

So next we have Mr. Muhammad Hassan Mangi, Director General of the International Cooperation wing at Pakistan's Ministry of Human Rights. Mr. Mangi, could you tell us a little bit more about your experience with human rights due diligence as well as the Pakistan‑specific context?

If you're speaking, I think you may be muted.

>> MUHAMMAD HASSAN MANGI: Can you hear me now? Thank you very much for letting me describe some of the things related with the theme. Thank you, organizers, Catherine and Dunstan. If I don't pronounce the names correctly, please forgive me, because of specific issues with the languages. We had a very good talk from Dunstan Allison‑Hope.

In Pakistan, particularly, the guidelines carry some respect of the United Nations General Assembly. Let me assure you that Pakistan commends IGF for conducting this session. A very important session, indeed, and of high priority, because nowadays we particularly face human rights issues in business and digital issues, online issues, with the workforce involved. States that are signatory and state party to a number of conventions, particularly the core conventions, are responsible, actually. So we appreciate the role of stakeholders, including governments, precisely in human rights due diligence.

How to do it is still a new phenomenon. We will be able to learn from each other's experiences, particularly in our countries. We are growing fast toward that end.

We are demonstrating our government's commitment to the United Nations Guiding Principles by launching our first ever National Action Plan on business and human rights this September, very recently this year. We involved each and every stakeholder: trade unionists at multiple levels, unions and business entities, Chambers of Commerce, and also with the support of the United Nations Development Programme. So it is the first ever National Action Plan in South Asia. You may recognize it.

So now that the National Action Plan has been developed through a considered, multistakeholder process, there are recommendations from both duty bearers and rights holders which have been incorporated in the Action Plan. This Action Plan has been approved at the highest forum of the cabinet, by the government. We disseminated this Action Plan, particularly because human rights due diligence is cross‑cutting, whether it concerns fundamental human rights, because (?) should provide for fundamental human rights.

Beyond that, we have an extended chapter on the nine core human rights conventions, wherein civil liberties, the right to expression, the right to freedom, dignity, and nondiscrimination principles are all there. So human rights and businesses cannot be taken into consideration in isolation. They should be taken together with all the human rights instruments: declarations, conventions, and other treaties. Most states are signatory to and have ratified these treaties and protocols and the United Nations guidelines. In some cases (?) it talks about.

It's imperative that you involve every stakeholder in order to disseminate information, promote awareness, and build some protocols. Because ultimately, you will need to adopt some of the mechanisms wherein you can (?) in terms of the due diligence.

So due diligence must have some mechanisms together: government as a regulator, and also an authorized regulating body to conduct due diligence in an environment with industry communities, with laborers, and with other stakeholders, in order to have oversight and review of human rights and businesses, particularly digital startup things.

So let me take two more minutes, if you allow me. Quickly, I will go through this. The Action Plan is built around eight priority areas, including financial transparency and anti‑discrimination. Very important. Human rights due diligence is one of the areas; it's cross‑cutting also. Also among the areas are (?) occupational health and safety, and access to remedy. That is important, because I talked about the mechanisms that should be in place at various levels to have the due diligence and insight required for such availability of human rights.

Corporate human rights due diligence, as we know, is a key component of the global business and human rights discourse. The National Action Plan includes these areas as I told you: due diligence through a smart mix of measures. This includes exploring opportunities for legislative and policy reform, as well as incentive mechanisms to promote business (?) for human rights. That is also very important for us as well.

Like other countries in the region, Pakistan has a large informal economy (?) which may be multistage or fragmented in nature, with a combination of several formal and informal corporate sectors. Therefore, we create dialogue that is effective for the application of the Guiding Principles and human rights due diligence in this complex environment.

So I will not take more time. HRDD is an ongoing exercise, and we will learn from each other: identify actual and potential adverse human rights impacts, integrate findings from impact assessments across the process, track the effectiveness of measures, and communicate how impacts are being addressed. That is important.

So since the launch of the first ever National Action Plan on business and human rights in South Asia, the Ministry has motivated that, and the Secretary, which should be the mechanism (?). Together with the concentration of the stakeholders, we'll proceed forward. We look forward to the next decade of the Guiding Principles and are committed to our part. Thank you very much.

>> SARINA PHU: Thank you so much. It's really good to hear about Pakistan's National Action Plan on business and human rights. I'm sure for the discussion later, the other speakers and panelists will be interested in speaking more about some of your actions.

So next up, also from Pakistan, we have Mr. Usama Khilji, Executive Director at Bolo Bhi. Can you take a couple of minutes to introduce yourself and speak to your expertise regarding human rights due diligence?

>> USAMA KHILJI: Thank you very much, Sarina. I'm very happy to be a part of this session. I think I also want to congratulate Sarina for doing an excellent job in organizing this session and moderating it. I know it's your first IGF. You've been wonderful.

I just want to start by commending the Pakistani government and the Ministry of Human Rights for setting up their National Action Plan for business and human rights, as articulated by Mr. Mangi just now. I want to speak a bit about Civil Society's role in human rights due diligence. I want to address a few layers, especially when we talk about technology companies. I believe there's a lot of work that has to be done, by global technology companies especially, when it comes to human rights due diligence where they're entering markets. What we've noticed is that a lot of their decisions are motivated by the business and profit motive, so much so that the human rights impacts of their presence in certain countries and territories haven't been taken into account.

Consider technology companies' role in Myanmar, in Ethiopia, and in other places such as India, where we've seen the proliferation of viral misinformation and disinformation. We've seen that in Brazil as well, especially during the COVID‑19 pandemic. We've really noticed the grave human rights impact that technology companies' presence can have.

I think it really speaks to the critical need for technology companies to carry out human rights due diligence. Be transparent when they do so. And involve local Civil Society voices and other voices in these processes. So they understand the market they're entering. They're able to do business in that territory in a more informed manner. And their presence does not really perpetuate further issues or violence. I think that's very important.

The second thing I want to really address is the lessons we've learned from the COVID‑19 pandemic. It's ongoing. The reason we're all speaking on Zoom right now is because of the pandemic; otherwise, we'd all be freezing in Poland. What's very important is that social media companies and technology companies also consider how they may unintentionally be perpetuating inequalities because of their businesses.

And I'm speaking about very vulnerable populations that are impacted by technology. So, for example, if we look at education, what are technology companies doing to make education more accessible to a larger number of students worldwide, right? But also, are they really educating people on how to use their tools in a better manner? I think it's not enough to sell the technology in certain markets. Part of the human rights obligations of these corporations should also be further training in these ways. For example, it would be wonderful for technology hardware and software suppliers to come into countries where there are literacy issues and conduct trainings along with local actors on how to, for example, improve access to education through technology. And that's just one example that I'm giving.

So I think the point I'm really trying to make is that human rights due diligence should not only think about what the negative impact of a technology would be, but also about the positive impact that these technologies and their companies' presence can have in different territories across the globe, and how they can further the cause of human rights and fundamental rights in different territories. Thank you.

>> SARINA PHU: Thanks so much, Usama. Thank you for your kind words at the beginning as well. I wanted to pick up on that last point you made that, you know, human rights impact assessments don't necessarily have to be about the negative aspects or impacts of human rights but also the potential positive impacts as well. I think that's maybe a little bit less touched upon in these discussions.

So maybe next we can go to Ramiro Alvarez Ugarte. Ramiro, I know CELE recently published papers and research on human rights impact assessments.

>> RAMIRO ALVAREZ UGARTE: Thank you, Sarina. It's a pleasure to be here and share this conversation with you. We're a research center of academic researchers. We thought about looking into this business practice about a year ago, and we were happy to do so. We started, in a way, without so much knowledge about it. We were not experts on this; some of us had worked on human rights before, but not from the point of view of the relationship between business and human rights.

We did two reports. One of them is a history of human rights impact assessments as a tool, which is probably less useful and interesting for those of you who have been working on this issue for so long. We thought it was interesting in and of itself but also useful, especially in Latin America, which is our region of focus. Because we felt, and this is the reason we launched ourselves into this research, that a lot of practitioners, especially in Civil Society, NGOs, and activists, were not aware of the tool and its usefulness, especially in the ICT sector.

The second paper was desk research, which probably needs to be complemented by further methodologies. Our desk research tried to address the issue of the use of human rights impact assessments by ICT companies. And we think we got out of that research with three main insights I would like to share with you, because I think they're useful for the conversation moving forward. I also think they're good starting points for further research.

The first of those insights has to do with a general issue which is transparency. We believe there is no (?) transparency in terms of how human rights impact assessments are used. We were sometimes able to access summaries of those reports. We weren't able to find full versions of those reports except in a couple of circumstances.

So there is a first point that seems important to me ‑‑ to us, I should say ‑‑ about the transparency around human rights impact assessments. And we also found a general problem, which I believe we should address: there is a general lack of clarity in terms of how human rights standards can guide conduct, especially in this field, and especially with regard to platform and Internet companies. There's simply not enough knowledge, we believe, in terms of what effects these technologies have on populations when they're deployed. It's very easy to deploy these technologies into populations, but we do not know exactly what the effects of those technologies are.

I think this poses a very substantial challenge to the idea of assessing the impacts of technologies on human rights. Because if we do not have a clear picture of what mechanisms are under way, what their effects are, and if we don't have enough research on those effects, especially research produced in different societies and in different contexts, it's very hard to actually assess the impact that these technologies have on human rights.

And finally, we find that companies are really eager to acknowledge negative impacts when the negative impact is the outcome of a traditional bad actor: for instance, one asking for censorship or requesting them to eliminate certain content that is protected by freedom of expression, and so on. We see a little less eagerness in recognizing themselves as bad actors, as guilty, to put it in a way, of producing a negative impact on human rights. And this, I believe, is related to the previous point, which has to do with the lack of adequate research on the effects these technologies have.

Those are our main insights from those papers. We look forward to deepening and strengthening our research. And we look forward to taking part in the conversation.

>> SARINA PHU: Thank you so much, Ramiro, for that very great overview of the findings from your research. And I'll in a little bit put in the chat a link to the research as well if we can find it.

So next up we have Rene van Eijk from the Government of the Netherlands. Rene, I'm curious about the Digital Services Act, and specifically the component on mandatory human rights due diligence. Could you talk a little bit more about that?

>> RENE VAN EIJK: Yes. Sure. I hope you can hear me well. I had some issues connecting. It seems to be working fine. So first of all let me thank the members of this organizing committee for inviting me to join this panelist discussion. And also the ‑‑ Sarina, you, in particular, for organizing this session. I know it's not easy.

I'll try to keep it brief. I'll talk a little bit about my involvement in the negotiations on the Digital Services Act, sort of an EU‑wide law on content (?) online. For those not so familiar with the legislative proposal, really quickly: it's a legal text that sort of regulates the content moderation obligations that social media platforms such as Facebook or Twitter have as they operate across Europe. It sort of sets standards, in that sense, for companies.

I'd like to hone in on three aspects of the due diligence section which I believe are kind of conducive to the protection and promotion of human rights online. Particularly, freedom of opinion and freedom of expression online.

So first, when it comes to content or speech moderation online, because that's often an issue we talked about in the context of the DSA, when it comes to those rules specifically around removal, I think the discussions boil down to two fundamental questions: who draws the line, and where do we draw the line? I think the DSA provides the right answers to these questions.

As for who draws the line on what is legal or illegal, the DSA does not define what constitutes illegal content or speech online. Instead, it relies on or refers to the definitions that are being used in the relevant EU or national laws. In other words, laws that have been passed by democratically elected governments.

And for the where question, the due diligence requirements that deal with removal decisions by online platforms or social media platforms, such as the so‑called notice and action mechanism (we can talk about that more in a little bit, if you'd like), apply to illegal content only. That is to say, they do not touch upon any other content that may be deemed harmful or undesirable by some but isn't necessarily illegal. It's very important that the scope of those removal obligations remains limited to illegal content.

Second, there are due diligence provisions throughout the DSA that sort of force online platforms, social media platforms, to be transparent to their users about the ads that are being served to those users and the content that is recommended to them. I think this is a really important point; this transparency theme keeps coming back, I notice. It chimes with the April 2021 report of the UN Special Rapporteur, where she explains that the right to develop one's own opinion should not be impeded by any involuntary, non‑consensual manipulation. She sort of equates that non‑consensual manipulation with techniques used through online platforms, through microtargeting.

I think the rules in the DSA about advertising transparency and recommender systems are sort of a welcome step in this respect. They help or enable the end users to inform themselves of how information is presented to them and why it is presented to them. This is meant to set up a protection for the end users' freedom of opinion.

Last, the DSA also removes barriers to due process for end users who disagree with content moderation decisions that are taken by online platforms. So there are a couple of avenues that are being presented to end users such as ourselves. Platforms have to set up internal complaint (?). Member states, such as the Netherlands or France, have to set up (?) for out‑of‑court dispute settlement systems. Users can turn to the out‑of‑court dispute settlement systems if they disagree with a decision made by a platform to remove certain content or carry certain content. All of this doesn't prevent users from going to court if they really feel like their rights have not been respected enough.

So there are a couple of those points that come back in the DSA that are very helpful, I think, for the protection and promotion of fundamental rights like freedom of expression.

There are a couple more, but I'll leave those out for now, and then we can turn to the Q&A session afterwards. Thank you very much, Sarina.

>> SARINA PHU: Thank you so much, Rene. Last but certainly not least we have Catherine from the Danish Institute for Human Rights. Catherine, I know the Danish Institute has published some guidance on human rights impact assessments as well. Would you like to speak a little bit more to that as well?

>> CATHERINE BLOCH VEIBERG: Yes. Thank you for that, Sarina. Thank you for inviting me. Very pleased to be a part of this discussion and in a group with such great speakers and fellow panelists.

So, yes, as mentioned, I work as the Programme Manager for the Responsible Value Chains Programme, which is part of the Human Rights and Business department at the Danish Institute for Human Rights. The Danish Institute for Human Rights has been working on the topic of business and human rights for over 20 years. To begin with, we really focused on what you can call more physical business activities, engaging a lot with extractive companies and with food and beverage companies, helping them to identify, assess, and address their human rights impacts on the ground.

And a couple of years ago, we decided to see how our experience and methodologies around human rights impact assessment of more physical business activities would translate when looking at digital business activities. That resulted in the development of the Guidance on Human Rights Impact Assessment for Digital Business Activities, which was launched last year, in 2020, and which was developed in very close consultation with various stakeholders, many of whom are on the panel and in the room here today.

And through that process, it really became evident to us that one of the most important areas to address when it comes to the digital ecosystem, and where it differs most from the more physical business activities in the challenge of assessing and addressing impacts, is really stakeholder identification and rights holder engagement. That's really where one of the main challenges was.

There are a number of reasons for this. Just to highlight three central differences compared to more physical impacts. First, there is a lack of a clearly identified location for the use of the digital activities, which means that rights holders and stakeholders, more broadly, can be, in theory, anyone, anywhere. So how do you create that context of application? Context is really needed to understand how these digital activities actually have an impact on specific groups at a specific point in time within a specific context.

There's a risk that if we discuss impacts of digital business activities at the theoretical level, or at the level of the digital activity as such, it becomes much too high level and too theoretical. So how do we get it down to the ground? It's really about having an understanding of that context and how a digital application in one context can have a completely different impact on human rights than the same digital application in another context.

The second point is that new technologies and new solutions mean that there are limited lessons to learn from the mistakes and successes of others. The constant development of technology, of course, means that the potential impact of those technologies always develops, and the potential for impact on human rights will also develop. So that really highlights the need to work across the stakeholder groups to understand the complexities of how a new technology or a new application will impact the rights of individuals.

And then, thirdly, impacts can be very difficult to identify. They can basically be invisible, which makes it really difficult to engage with rights holders. That is a worry: understanding the potential impacts at every stage is hard because it's simply not evident what these impacts might be.

Just to give an example: in the context of automated credit risk ratings, the person who is being subjected to that type of credit risk rating might not even know that they have been assessed by an algorithm to determine what risk grading they get in terms of receiving credit. So they don't even know that there is an impact on them. But there is actually an impact taking place.

Speaking to the broader stakeholder roles, we really believe, based on our experience, based on developing the guidance, and based on the dialogue we've had with various stakeholders, that there is really a need to come together to address some of these topics. Because, in our understanding, and as can also be seen from the panel here today, there is a good understanding of the need for different stakeholders to come together with different perspectives: from the state side, from the business side, from academia, et cetera, to discuss some of these things. So that's why we've actually been engaging with the Danish government on the Tech for Democracy Initiative to put in place an Action Coalition working specifically on responsible business, including under that Action Coalition the aim to develop some resources around stakeholder identification and engagement in the context of digital activities.

And we're doing so also in collaboration with GNI, with BSR, and with the B‑tech project. We're also involved in that process.

I think, from our perspective, something to discuss now that we get to the next point for discussion is really: what can we do together to actually make progress here? There must be enough noncontentious, quite basic issues that we can address collectively, so that we can have a very targeted discussion on the topics that require more debate and joint lifting. Thank you.

>> SARINA PHU: Thank you so much, Catherine. I think you're doing my job much better than I am with segueing and transitioning into this next part with that last question.

Maybe I'll pause there and see if any of the other panelists, speakers, want to take a stab at addressing that. Or, you know, if there are any lingering questions you also wanted to raise, wanted to open the floor for that as well.

We also have I think a hand raised and welcome questions from the audience. Collin, would you like to unmute yourself?

>> COLLIN: Hi there. Sure. I'm joining here from cold but beautiful Katowice. I wanted to chime in here because I've actually worked with several of you from different stakeholder perspectives. I'm now representing Ofcom, which I think is an underrepresented perspective here right now: that of an independent regulator, which will soon be taking up duties under the forthcoming UK Online Safety Bill.

From my perspective, Civil Society organisations like CELE and others do a great job of helping to clarify these kinds of impacts. Obviously, DIHR is great at setting the bar for best practices and standards for assessment frameworks. GNI and BSR and others are really great at translating this into business practices or governance models.

But I do think that sometimes there's a risk of talking past each other, even though we're working toward common goals more often than not.

When I was part of a company, implementing human rights impact assessments and trying to coordinate our GNI assessment, coming in with the best of intentions and the best frameworks didn't always land with the operational teams or product teams I was seeking to help inculcate these ideas in. You figure out: what is the cheat sheet, what is the minimum product, the minimum idea, I can transmit to these developers or these salespeople to help them realize the more salient risks?

Then you also have to factor this into existing governance processes, like corporate risk assessment processes. This is something BSR does really well, as well.

I think now, coming at it with my current hat on as a regulator, I thought Rene did a great job of underlining different parts of evolving regulatory frameworks which can be supported by human rights due diligence: things like risk assessment provisions, transparency mechanisms, or complaints, flagging, and reporting mechanisms.

But I still think that there's work to be done in terms of translation or modulation, if you will.

And then also recognizing where the levers or overlaps are in parallel or concurrent evolving regimes that could support these goals. So even if we're trying to maintain the high standards of HRIAs or different impact assessment methodologies, I think the phrase is: don't let the perfect be the enemy of the good. Right? And acknowledging where other research agendas might be able to support the goals. For example, there are many different national agendas coming out around things like media literacy, and maybe that's an interesting lever that could help with human rights due diligence. Or, obviously, transparency would be a good one.

And then, again, I just have to stress: I hope that DIHR continues to be the standard bearer for the high level, the best practice. I think there's also a role for human rights practitioners in developing the floor as well. And that's where regulatory regimes like that of the Online Safety Bill come in, which in its current iteration contains provisions for both risk assessments and impact assessments. I think an underappreciated aspect of the Online Safety Bill is that it will require category 1 services, essentially the very large players, to carry out freedom of expression impact assessments, which some of us will acknowledge is easier said than done. But, yeah, I think that's another critical role that either existing stakeholders, or maybe a new category of human rights practitioners or in‑house practitioners, can play in this collective challenge.

And we've got another hand in Katowice.

>> SARINA PHU: Good. I don't think I can see it. Please feel free to go ahead.

>> Hello. Thank you. I'm Doudy. I wanted to ask about conducting human rights impact assessments, maybe with products. It is actually a bit confusing, because there are a lot of organisations and institutes that include AI in their deployment and rights impact assessments. Which principles are you taking into consideration, or how do you evaluate these?

And the second question would be: how are you conducting these human rights impact assessments in different sectors? I mean, in government and civil society, in imperfect settings. How do you do it altogether, and how does it work? Thank you.

>> SARINA PHU: Just a quick clarifying question. Did you have a specific speaker you wanted to have answer the question? Or is this just more broadly for the panel?

>> Well, I think the person from CELE mentioned something about different stakeholders. So maybe he could answer the second question, and the first question, about the AI principles, could be for anyone.

>> RAMIRO ALVAREZ UGARTE: Sarina, if you like, I can take the second one on different stakeholders. I should clarify that we at CELE are just looking at this; we haven't conducted a human rights impact assessment ourselves. We're trying to create a research line that will look into them more deeply.

But from our point of view, and after reviewing different guides, there is a lot of experience and shared knowledge constructed by practitioners, the people who do human rights impact assessments, some of whom are here, in terms of what a human rights impact assessment is and how it should be conducted.

In that sense, there's a special part of it that has to do with engaging different stakeholders on the ground. So as we see it, and I welcome corrections or additions to this, one of the most fundamental parts of conducting a human rights impact assessment is engaging people on the ground. It's very easy to see if you have, for instance, an extractive industry. You're going to get oil out of somewhere. You can be pretty clear about what your impact will be. You'll have an environmental impact. You will directly affect the lives, maybe the livelihoods, of people living there. And if that is a risk that you may create, then you have to engage them. You have to talk to the people who will be affected. You have to talk to the governments in charge of regulation, and so on.

The challenge we see on the use of this tool for technology is that that is much less clear. But in any case, this idea of involving all relevant stakeholders seems to me to be a central building block of what the tool is. I will say that.

>> DUNSTAN ALLISON‑HOPE: Sarina, I'm happy to add a point. The floor has risen, and we're seeing more traction across more functions within companies than, perhaps, we did five years ago. That's the good news. The thing I would encourage, to keep moving in that direction, is clarity and consistency across all the various places that talk about human rights assessment in some shape or form.

So we have the Digital Services Act. We also have the potential for mandatory human rights due diligence in Europe. We also have the Artificial Intelligence Act. Going back, we have the GDPR and its requirements for the Data Protection Impact Assessment. Lots of different things. We would really encourage conceptual clarity on things like: risk to whom? How do you prioritize the most severe risks? What criteria go into prioritizing risks? How do you determine appropriate action?

There's a risk that we end up with fragmentation across lots of different instruments. Put yourself in the shoes of the company: it's hard to know which concepts to run with, which ones to build into. To the extent that we can have clarity across everything, that would be tremendous. And the UN Guiding Principles are the place to build that clarity from. The point is that we can use them as the foundational text, and the Danish Institute and others are doing a really nice job of translating that into the tech field.

>> SARINA PHU: Thanks, Dunstan. It looks like, Catherine, you raised your hand. Then we can have Rene go afterward.

>> CATHERINE BLOCH VEIBERG: Yeah. I just wanted to support what Dunstan was just saying on the policy coherence piece being quite important.

I think another piece which is interesting and also really important is understanding the full digital ecosystem. Because to a great extent, the emphasis is, maybe rightly so because of what's been happening and the type of cases that have come up, very much on some of the bigger players or bigger actors. But there are also other actors within the digital ecosystem that provide infrastructure, that are intermediaries. All of these actors also have a role to play, and are maybe less in the forefront, but they are still central actors to actually address some of these issues.

So in that context, of course, at the Danish Institute we've worked a bit with Internet infrastructure providers, with top‑level domain registries and registrars. That has been extremely fascinating, because these actors are not necessarily used to thinking of themselves in the context of business and human rights. But it's been a really central exercise in understanding which role they do play in that ecosystem.

So, building upon what I mentioned before about getting different stakeholders together, what we're also working on in connection with the Action Coalition is trying to get a better understanding of that ecosystem: working with BSR and GNI on getting a bit of a mapping going, understanding the different players, what their roles and responsibilities are, and what can be expected of those different roles and responsibilities. Both within tech, and also the actors that enable it or create the framework in which it operates: governments, but also financial institutions and their role in supporting the development of these technologies.

>> MUHAMMAD HASSAN MANGI: I greatly appreciate the other inputs with regard to certain issues. I think, since it's a global situation, and also a business situation, there must be some policies, as has been said by Catherine and others.

Based on the policies, in terms of the UN guidelines, we need to further dissect them into smaller pieces. For instance, SOPs. We must have SOPs in terms of the digital (?) diligence of human rights, in the various sectors of human rights and businesses, vis‑a‑vis the protocols on each and every issue. Some countries must be having certain protocols in terms of health and safety, occupations, and other things influencing economic opportunities; in terms of policies for the workplace and the workers and the trade unions; and how to involve their participation, not only involving them, but having them effectively participating, with opportunities for them.

So: multiple stakeholders, setting recommendations about protocols, SOPs, policies, legislation, actions. And I think by adopting such things, we can have (?) as has been said, from the ground level, from the operational level.

So, to review policy and legislative frameworks, with regulatory advice if there is a need, so as to have a good due diligence mechanism and systems in place. Thank you.

>> SARINA PHU: Thank you, Mr. Mangi. And Rene, you've been waiting then we can have Usama maybe finish us out with the comments.

>> RENE VAN EIJK: Yeah, very quickly, I want to respond to a couple of the comments that have been made. Maybe two points. One point on the UN Guiding Principles on Business and Human Rights. What we've done, as the Council, with the member states deliberating on this legislative (?) at the DSA: we made sure that from now on intermediaries, access providers like Catherine I think briefly talked about, have to take into account these UN Guiding Principles when designing their terms and conditions. It's one way to make sure, systemically speaking, that these sorts of actors or companies are already thinking of the Guiding Principles when designing their services and their technologies.

But a second comment would be on the risk assessment. I think that's something Catherine and Collin from Ofcom, if I'm not mistaken, briefly touched upon. The impact of the technologies is sometimes hard to quantify, and they develop really quickly, as Catherine pointed out. I think the DSA meets that somewhat by subjecting the biggest online platforms, we call them VLOPs, it's a bit of a stupid acronym but that's the acronym being used, to risk assessments, whereby they have to actively check how their services are used, and also what kind of impact the new technologies they roll out have on fundamental rights, including the right to privacy. And those are audited every year.

And the guidelines on mitigating those risks are developed by the European Commission and member states. I think there is a role for stakeholders like non‑profit bodies, but also human rights facilitators, to provide input on those guidelines. I think that's a very important role they could play. Thank you.

>> SARINA PHU: Thanks, Rene. Usama, please go ahead. Just to remind you ‑‑

>> USAMA KHILJI: Thank you. I know we're at the last two minutes. I wanted to quickly add that we've been speaking a lot about companies and about Civil Society in the private sector. I think it's also very important to think of the governments' roles especially when it comes to, say, export controls of technologies that may be harmful to human rights.

So we've seen, for example, the export of surveillance technologies by a lot of companies in western democracies that would not allow such activities by those companies to take place within their own countries, but are happy to allow them to export such surveillance technologies, which really violate the fundamental rights to freedom of speech and privacy. And I think that's also something that needs to be worked on a lot more, and these discussions need to be had under the UN Guiding Principles.

>> SARINA PHU: Thank you, Usama. That's a really important intervention.

We have one minute left, so I just wanted to take that time to thank everyone for your excellent participation. This is a topic on which we'll have many more conversations, I'm sure. I wanted to especially thank all the speakers, particularly for dealing with my organizational skills, as well as everyone who's attending IGF virtually or in Poland, and for all of the support that IGF has provided.

Thank you, and please take care.

>> USAMA KHILJI: Thank you.


>> Bye, everybody.