IGF 2022 Day 0 Event #91 Global Peace Tech Atlas

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> MICHELE GIOVANARDI: Welcome. We can't see your face. We have a few online participants as well. I think those who are here are curious to learn more about what Peace Tech is, which is a trending term in the peacebuilding field right now. I see Uma shared a document. I will introduce the two speakers here on screen in a second.

But before that, I would also like to thank the IGF for this opportunity to speak about this. It will be a rather informal session, so it will be nice to engage with you in Addis Ababa as well. We don't see you, so feel free to take the microphone and speak to us, because the most important part is to get questions from the audience and engage. I will start with Global Peace Tech and the Global Peace Tech Atlas. I'm Michele Giovanardi. This is a recent initiative that we launched here in Florence at the School of Transnational Governance, and we have two core partners: the Governance Lab at New York University, represented here by Uma (I don't know if you can see her), and the Institute of Social Ethics at the University of Lucerne, represented by Evelyne Tauchnitz, who should be in the room. I don't know if you can hear us. If you can, hi.

If you agree, I will give you a brief introduction to what the Global Peace Tech Hub is and what Peace Tech is.

And I will turn it over to Uma.  Sorry.

There's a comment from the audience: if you would like to use captions, please use the link just posted in the chat.

So there we go, I will share a few slides with you. Please feel free to interrupt me; otherwise, I will feel very lonely here on the screen.

So yeah. Kalypso also sends her regards; she couldn't be here with us today.

Global Peace Tech. Technology is in the news for bad reasons today, and we know that technology can be weaponized in different ways.

In the '90s there were big hopes of using the Internet as a democratizing tool that would drive peace and social cohesion, but we soon realized that wasn't the case, with the scandals about misuse of data from social media, the spread of fake news, misinformation and disinformation, the use of deepfakes, and the use of the Internet by groups, for instance, to recruit for organized terrorist attacks.

So we were quite hopeful, and it turned out badly. We are quite aware of that at the Internet Governance Forum, but that's not the case only for the Internet. We can see a similar trend in other emerging technologies: in AI, in blockchain, in space technology.

So we see this common trend of tech weaponization, and it can feel like something that is unavoidable; there is quite a lot of interest in this. Technology is not the driver of peaceful globalization, but yet another thing that divides us. As Mark Leonard argues in his book "The Age of Unpeace," connectivity, instead of bringing us together, is in fact tearing us apart. Is this the end of the story? We don't think so.

We think that technology can also be used for peaceful purposes, at many different levels. Technology has been used for peace and peacebuilding quite effectively in the last 10, 15, 20 years, and we realized many organizations are labeling themselves as peace tech organizations. They run quite a lot of projects around the world, in the Global South as well, to achieve peacebuilding purposes. So that's something that is already there and happening.

What's the problem? The problem is that for any given technology, you can achieve a lot in terms of peacebuilding, but you also have risks and dangers. So we try to capture this dynamic in what we call the tipping model: the idea that technology starts with a positive purpose and is then used and misused to achieve something negative. And we reflect on how we can make it tip back, or prevent technology from tipping to the wrong side.

And for any kind of technology, you have this duality. I was talking about the hopes of the Internet and online information, but there are also the opportunities of blockchain and digital identities for migration, and other applications I can talk about, versus privacy and data ownership issues. We have a chance for empathy building, telepresence, and trust building with social media, but social media also becomes a vehicle for deepfakes, hate speech, and online violence that turns into physical violence. We have apps and data for early warning and response systems, but they are also used for surveillance: surveillance capitalism. We have the Internet of Things, but this also comes with increased vulnerability to cyber attacks.

You see this balance. So what do we do? How do we make the model tip to the positive side? It's a regulation challenge, of course, but it's also a challenge that goes beyond regulation and policy innovation towards promoting a different culture of how these means and technologies are used. And at the end of the day, it has to do with shifting investments from military technologies towards leveraging technology for good.

Before we go into that, you might wonder what Peace Tech is; maybe I will leave this to Uma. This is a very scary slide with a lot of examples, but it's something I can also share with you if you are interested. We are putting together a Global Peace Tech Atlas with case studies and initiatives using tech for good and for global peace. This is something we will publish in the coming months. Our team worked on a big mapping, a physical map that we'll show you now, with different cases around the world, and we are also crowdsourcing this information, asking different think tanks and research institutes to contribute to the mapping.

So Peace Tech is technology used for peace. We tried to define Peace Tech, and there are many definitions out there. The definition that we give is the following: a field of analysis applied to all processes connecting local and global practices, aimed at achieving social and political peace through the responsible use of frontier technologies.

So our definition is an inclusive one. It's not just about the Internet but about all kinds of frontier technologies, disruptive technologies, that connect people at the local and global level. For us, Global Peace Tech is a new field where we try to bring together global affairs and global studies with more practical work in the peace field. And we try to study this from a global perspective, because the use of technology for peace raises a lot of questions: who are the actors involved? Questions about power concentration and different levels of governance.

I will try to be sure not to take too much space from the other speakers, but of course this is an interdisciplinary endeavor, and we thought about it in three steps. The first step was the mapping: we tried to get a bird's-eye view of what is going on in the field of peace tech. We did that in collaboration with the Governance Lab at New York University, and it was both a topic mapping and a mapping of Peace Tech initiatives, thanks to Lucia, who is on screen here, working at the European University Institute on different projects related to tech and governance, tech and policy.

The second step, after this first one, will be an assessment of these peace technologies, to understand what works and what doesn't, and the third step will be policy recommendations. So today we are looking at the Atlas as the first step. I will now give the floor to Uma to bring us into the mapping of Peace Tech as a topic, and then maybe to Lucia, if she can connect and share, or I can also share the mapping that we have done and a few thoughts on how we developed it.

And then I will take back the floor for a couple of words on steps two and three, and I will give the floor to Evelyne for a reflection on ethics.

I was talking about responsible use, and I will close with this: you might wonder, what is responsible use? What is this word "responsible"? You can interpret this word in different ways. We have to agree on basic core principles, which for Evelyne, I know, means human rights, a framework that is already there, and mainstream these into the design of technological applications, with principles and questions that the designers, or the developers, embed into their design process.

So let's start. I will give the floor to Uma for the mapping. Please tell me if you can share; otherwise, I will try to share the slides for you.

>> UMA KALKAR:  I don't have the screen share, but I sent over the slides.

>> MICHELE GIOVANARDI:  Okay good.

>> UMA KALKAR:  So ‑‑ I got it.  I got it.

Okay. Well, all right. Let's let it load up, and hopefully everyone can see. Thank you again, Michele, for this very good introduction to what Peace Tech is and to our topic mapping. I will walk through a little of what we did as a group at the Governance Lab. One of our research methodologies is creating a topic map, which is a very broad and rapid scan of the field: what is out there, what exists, what doesn't exist, in order to figure out the cardinal directions of a really big topic like peace tech, and then figuring out who is already working in these spaces, what applications exist, and also what is not there, in order to gauge where research can take place and to allow for a taxonomy for further review.

And it really paved the way for going from a topic map to a very active actor mapping. I will spare you the details of our methodology. As Michele was saying, we looked at Peace Tech as technology that is there to support positive peace over long- and short-term horizons. Most instances of Peace Tech are dual use, so they have the potential to do both harm and good.

So we really look at the institutional and governance structures needed to steer them towards the good rather than the bad. This is extremely important with current conflicts and the way that technology is being used and manipulated to egg on conflict. From this, we first looked at the difference between military tech and Peace Tech, and we found a distinction in how technology is discussed in the literature: for the most part, people discuss the negative or more offensive aspects of technology when it's used in a conflict setting, rather than how it can enable peace, conflict resolution, and peacebuilding.

And from there, we looked at four factors. One, what are the technologies being used for peace, and by whom? Two, what are the current use cases and practical applications? Three, we created a classification of the most recent spread of different technologies across six different fields, which I will get to in a second. And four, we considered the challenges and further considerations for peace researchers.

Going from there, what we found within the space was that one way of categorizing it was to look at the physical tech tools and then the digital ones, with the caveat that these two have a lot of blending and overlap. From there, we looked at the use cases across different categories and gave some practical, tangible examples, as shown here.

From there, we looked at six categories of Peace Tech. These categories were chosen because they kept coming up as natural headings when we were doing our research. We looked at different tools and techniques, such as the conflict change map, and AI algorithms used to monitor migration flows and to check whether news is verified or is more on the side of disinformation. We also looked at these tools in the context of the Russia-Ukraine conflict and how they are being deployed right now. And from what has already been done or is being done, we looked at the challenges and risks that are emerging that policymakers and researchers need to consider.

One of the primary ones is the potential for tech to be used as an instrument of good, to build relationships between different groups of people, or for bad, to sow seeds of misinformation and distrust. We looked at the way that data in itself can be weaponized in the wrong hands, and we looked at what current international regulations are doing to help promote peace tech; we found a lot of outdatedness in current systems that needs to be addressed for the 21st century. We also looked at the private sector creating its own rules to govern its technologies, and at the risks and the merits of that.

We looked at this whole gamut of considerations, and it showed us that we really need to look at the governance of these technologies and have it meet the existing demand in order to address the dual use of Peace Tech tools.

And from here we created this topic map. You can see it on the Global Peace Tech Hub website. The next step is to learn which areas we missed. Where should we put more focus? What are the things that aren't on this map but should be? That will let us go into our next stage of building this next-generation research agenda, to get a holistic and clear understanding of where Peace Tech is now and where it is going.

With that, I will pass the floor back over to Michele.  Thank you so much.

>> MICHELE GIOVANARDI: Thank you so much for your overview. Actually, I probably haven't introduced Uma: she's a research assistant at the Governance Lab, and she works on this with Stefaan Verhulst, who is a co-founder of the Governance Lab. She is also involved in a youth-led nonprofit organization that helps 16-, 17-, and 18-year-olds understand how to vote, when to vote, and why to vote. So it's something you might want to explore.

And she mentioned the website of this initiative: globalpeacetech.org. You can access more information about the session there, and it's also in the chat.

So just a few words on this mapping that Uma was referring to. I will share my screen for a second and show you the physical map that we did, before leaving the floor to Evelyne.

So as you can see here, the Global Peace Tech map is a physical map that everybody can access. When we had the choice of what tool to use, we opted for this one because it's very easy to use and everybody can easily add a Peace Tech initiative to the map. The idea of the map is to get an overview of what is going on out there, but of course we want to build on this and expand it to include more and more initiatives. The way we did it is through the Master in Transnational Governance that we have at the EUI in Florence: we identified students from different parts of the world, including China and Asia, North America, South America, Africa, and Europe, so we tried to cover a lot of countries and to overcome the language barrier, and we started to map these initiatives by technology and also by type of peace. As I was saying before, we are not excluding any kind of technology, but we also created a categorization of peace: peace in the short term and peace in the long term. We have both short-term uses of technology to promote peacebuilding and peacekeeping, and long-term peace, what is sometimes called positive peace, and we use some pillars of positive peace. So here in the map you can get this overview. This is a starting point; it's not complete. You can find good examples of what we mean by peace tech, and you can click on this plus button and insert the location, the link, and a description of an initiative. The entry will need to be approved, to filter against spam.
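The submit-then-approve flow just described (anyone adds a pin with a location, link, and description via the plus button, and a moderator approves it before it appears, to filter out spam) can be sketched roughly as below. This is a minimal illustration only, not the actual map tool's code: all names (PeaceTechEntry, PeaceTechMap, submit, approve, visible) and the example entry are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class PeaceTechEntry:
        """One crowd-sourced pin on the map; field names are hypothetical."""
        name: str
        location: str            # where the initiative operates
        link: str                # URL of the initiative
        description: str
        technology: str          # e.g. "AI", "blockchain", "social media"
        peace_type: str          # "short-term" or "long-term" (positive peace)
        approved: bool = False   # submissions start unapproved, to filter spam

    class PeaceTechMap:
        """Minimal sketch of a crowd-sourced map with moderation."""

        def __init__(self) -> None:
            self.entries: list[PeaceTechEntry] = []

        def submit(self, entry: PeaceTechEntry) -> None:
            # Anyone can add an initiative (the "plus button" step);
            # it stays hidden until a moderator approves it.
            entry.approved = False
            self.entries.append(entry)

        def approve(self, name: str) -> None:
            # A moderator reviews a submission and publishes it.
            for entry in self.entries:
                if entry.name == name:
                    entry.approved = True

        def visible(self) -> list[PeaceTechEntry]:
            # Only approved entries appear on the public map.
            return [e for e in self.entries if e.approved]

    # Usage: submit one (hypothetical) initiative, approve it, list what is public.
    atlas = PeaceTechMap()
    atlas.submit(PeaceTechEntry(
        name="Example early-warning SMS network",
        location="Addis Ababa",
        link="https://example.org",
        description="Community SMS alerts for local tensions",
        technology="mobile",
        peace_type="short-term",
    ))
    atlas.approve("Example early-warning SMS network")
    print([e.name for e in atlas.visible()])  # ['Example early-warning SMS network']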

If people don't know what Peace Tech is, they can immediately have a first contact and a first example. And now I will give the floor to Evelyne to talk a little more about Peace Tech in relation to ethics and human rights. Dr. Evelyne Tauchnitz is a senior research fellow at the Institute of Social Ethics at the University of Lucerne. So Evelyne, I will leave the floor to you. I hope you are there. Now I can see you. Wow! We made great progress. Okay. Great!

I cannot hear you, though.  Yeah.

>> EVELYNE TAUCHNITZ: Thank you very much, Michele, for that great introduction and also for the peace tech mapping, and thank you, Uma, for showing what has been done in the mapping. I will not go deeper into that, but I want to look more at the ethics and human rights perspective on what we mean by Peace Tech. So that would probably fit in the second or third step that Michele mentioned: the first is the mapping, then analyzing, and then the question about governance, about what we should be doing.

So I'm going to start with a quick overview. I will first talk very quickly about technology and peace, how they are intertwined, and about defining peace. And if we define peace, we should also define what violence is.

I will talk about freedom as an expression of peace, and about human rights. What is peacebuilding? That is a big question. What is Peace Tech? Everybody talks about it, but it's not really clear. I will mention some governance strategies and then the summary: what should Peace Tech aim for? And I say "should" because I'm from ethics. My background is political science, and I also worked in human rights law, but right now I'm in ethics, and it's really the question: we see what is out there, as with the mapping; we see these different initiatives; we also know the risks, but there are certain risks that haven't been mentioned, and I will get to those. We should be asking questions about the how and the what.

Yes, technology and peace are very closely related, because peace, well, it's a very elastic concept. People talk peace, but they often make war. So we have to think about what we understand by peace and war.

Technology reallocates power relationships in society, and it really depends on how we define peace, and on what we consider the building blocks of peace, when we want to distinguish between peaceful and violent uses of technology.

So the titles are not visible.

Anyway, it's about defining peace; that's that slide. The definition given by Johan Galtung: peace is the absence of violence. Negative peace is the absence of direct violence, the kind that's observable from person to person, that you could actually record with a video camera, for example.

Positive peace is the absence of structural and cultural violence.

Structural violence is embedded in the structure. It means, for example, inequality in socioeconomic relationships, things that are not directly attributable to one person. It's difficult to pinpoint who is responsible for it, but there is still some form of violence, and it's connected to cultural violence, which legitimizes it. A prominent example is gender discrimination.

So if, for example, women are not allowed to participate in political life, that's a type of structural violence that's legitimized through cultural violence, like saying women should be doing this or should not be doing that.

So we can say peace is the absence of direct, structural, and cultural violence.

And some observations: these forms of violence are not separate; they tend to occur together. There is always some sort of violence, there are daily victims of violence, so to say. We will never have completely peaceful societies; peace will always remain a vision. Still, it's very important to have that vision, because we want to know where we are going. We want a compass. We need a direction. If we have governance strategies, we should know where we are heading, and where we want to be heading.

So peace can provide us with that vision. Not the only one, but an important one, I believe. So if we define peace as the absence of violence, we should be thinking about what violence is. Galtung defines it as the cause of the difference between the potential and the actual.
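Schematically, Galtung's definition can be written as the following sketch (the symbols P and A are shorthand introduced here for illustration, not Galtung's own notation):

    \[
    \text{violence is present} \iff A < P \ \text{and the gap } (P - A) \text{ is avoidable,}
    \]

where \(P\) is the potential realization (the life a person could live) and \(A\) is the actual realization (the life they in fact live).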

So the potential is the person I could be, and the circumstances I could be living in if there were no violence. Take women who are discriminated against: maybe a girl cannot go to school because she experiences cultural and structural violence. So she might not reach her full potential; even if she's extremely smart, she might end up with no job at all, or with something that doesn't match her potential.

It's structural violence, but it's still there. Another prominent example: if there's no good health system, a person may not enjoy the same health as if services were available. That's also a kind of violence. Think about what peace would mean if we had absolute positive peace. Peace is a vision, but if it existed, hypothetically, there would be no difference between the potential and the actual. I could choose the life I want to live, but there's an important but: I would still need to respect the equal freedom of others, and that's extremely important if we talk about peace and war. It's not a freedom where I do whatever I want and take power and so on; it's responsible freedom, respecting the rights of others. Let's go on. So under absolute peace, people would be able to realize their full potential. Freedom has a negative and a positive aspect, and they relate closely to peace. Negative freedom is the omission of actions that restrict freedom. Positive freedom is what I'm actually free to do, and it depends, like positive peace, on possibilities and capabilities: if I don't have the possibility, due to structural violence, then I'm not free to do something. Briefly as well: possibilities are mainly defined by the structural and cultural violence in society, and capabilities are not equally distributed in society.

But really, freedom is also closely connected to responsibility, because, as I said, the limit of my freedom is the freedom of others.

What's also key when we talk about peace is that even if there were peace, in the sense of structural or cultural peace, some people might still not have the capacities to make full use of the opportunities provided to them, because they may be sick, or old, or children. They don't just need freedom; they need care. They need solidarity, not necessarily to reach their full potential, but at least to live a life in solidarity and human dignity.

I would advocate that we really do want human dignity as an unconditional baseline, and human dignity is a core value of both peace and human rights.

So Peace Tech should promote peace not just for the powerful but for everyone, including those who need special support and assistance.

Human rights: as I said, the values of freedom and human dignity are shared by peace and human rights, and that's no coincidence. That's grounded historically, after the Second World War, and also conceptually, because it's very difficult to imagine peace without human rights, and vice versa: when we have war, human rights are mostly not respected anymore. All human beings are equal in freedom and dignity. I would advocate for human rights as an ethical minimum standard for peace and Peace Tech, and for creating the socioeconomic conditions that allow the weak, the people who do not have power, to live a life in human dignity. And Peace Tech should empower marginalized groups and give them the freedom to make their own choices.

It should not be Peace Tech from above; it should be bottom-up and participatory.

Let's see how much time I have. Human rights are not only moral norms but also legal ones. That's the point: there are certain mechanisms in place, and they are universally recognized, even if not always implemented. But there are certain mechanisms; we don't have to start from scratch, and we can already build on existing networks.

Outlook: governance strategies. Why is governance so important? I think I should talk about it because we are here at the Internet Governance Forum; that makes sense. So let's look at it.

Governance is very important, obviously, both for promoting the good and for addressing the risks. And that's my personal research question: what governance strategies should be adopted to promote and safeguard peace and human well-being in the digital age?

First, we can promote the good things, the Peace Tech that's been mentioned before; it's not only about looking at the risks, but also about mapping what kinds of good initiatives are out there that are using technology to promote peace. It might also make sense to simply ban certain technologies, or at least put a moratorium on them until they can be properly regulated. Many technologies are difficult to regulate, either because policymakers don't yet have the knowledge, or because the technologies create facts very quickly by themselves; technology develops quicker than regulation. So it might make sense to declare a moratorium and say, okay, their application is not allowed until we have regulated them properly. Or simply regulate them.

Given the dual use of technology, self-regulation might work in some cases, for example for platforms, and economic and financial incentives are also worth a try.

Nudging has not been explored very much. If it worked, that would be awesome, because it would not provide incentives but would push people to do the right thing by altering the environment. The most common example: when you enter a shop, before you go out, there are always the candies. While you are in the waiting line, the candies are there. Nobody pushes you to take the candies, but people who have children know it's quite likely that the child will take them, not because anybody pushes them to, but because they are placed in a way that makes that the most likely outcome. So we could be thinking about how to use technologies to nudge peace. Why not?

And legally binding norms. At least for the most pressing problems, it might even be worth thinking about whether we should build on existing norms, like the existing human rights, or whether we want new norms. A new U.N. convention: do we need that? Or a middle way could be a new international agreement on how to interpret human rights, for example.

Because different actors in different contexts understand differently what it means to apply human rights online. So, what should Peace Tech aim for? Reduce all forms of violence. Political rights are key for peace; without them, it's difficult to have situations of peace. Create equal opportunities for all to realize their full potential. Human dignity as an unconditional baseline; human dignity is key. And ideally, empower marginalized communities to make their own choices, with a bottom-up approach.

That's nearly the last one. There is an ethical assessment that we could be thinking of for Peace Tech: technologies are used by and for persons, and they raise certain questions. So if something goes wrong, somebody has to be responsible.

And, well, that's really based on the three main ethical strands.

>> MICHELE GIOVANARDI:  We have five minutes left.

>> EVELYNE TAUCHNITZ: For what purposes are the technologies developed? If a technology is developed for military purposes, it's difficult to combine with the human rights perspective. Are they in accordance with human rights norms? And what are the consequences of certain technologies being applied? Also, thank you, Michele, for your patience.

>> MICHELE GIOVANARDI: Of course; the patience was for the technology at the beginning of the session, not for her session. Thank you for this overview of the ethical minimum standards and for giving us this glimpse of the governance options, the governance idea. I think it's very important for us to keep this at the core of our reflection; it's the end game of this whole initiative: how do we govern this? What can we do with regulation? So we think about policy innovation while being aware that it's not the end of the story. Regulation is not a magic wand; you also need to promote a culture in which this regulation can be applied.

I wanted to check if there are any questions from the audience there, or from the audience online. Yeah, there's a question.

Sorry, we talked too much. We didn't leave enough space for you.

>> AUDIENCE MEMBER: Okay. Hi there. My name is Aliska, and I work as a global freedom of expression lead. This is a fascinating presentation, all of them, and a great opportunity for us to become more familiar with this great project, since we have been working on platform accountability and content governance in times of crisis. I do have several questions; I will try to keep it brief in case someone else also has questions. When you opened your presentation, and I'm very happy that the last speaker elaborated on it, there was a long list of different technologies and issues, and at some point I felt that maybe, and I know it was probably just an example of all the issues on the list, it was mixing together the weaponization of tech with the weaponization of information. And I'm happy that the question of bans and moratoriums on certain technologies was brought up, because these technologies are very different in nature. What will apply to content moderation algorithms, and what demands we can make there in order to positively impact peace in the future, is definitely different from, say, Pegasus, which is something that should be outlawed or strictly regulated. That brings me to a second point: what exactly is Peace Tech?

To me, if we put aside concrete examples that are used for military purposes, then we also end up, based on your presentation, with recommender systems and the way we develop algorithms, and these are an integral part of the business model of these platforms.

Indeed, there is harm, societal harm, or systemic risks, that is much more dangerous in times of instability and crisis. But at the same time, perhaps there might be a more systemic approach to regulating them, and those examples now exist in different jurisdictions, for instance in the EU. So I wonder whether you factor this in: when disinformation spreads online due to certain technologies, the harm is not only societal first and foremost, but the consequences are much worse in open war and conflict.

And since we are in Ethiopia: are you looking into issues around Internet shutdowns, which have far-reaching consequences? And then maybe the last point would be, and I'm not exactly sure how useful it is, I mean, I totally understand the point, but we spent a lot of time, especially in Brussels, trying not to mix human rights with ethics. As much as I understand, this may be a misunderstanding and we don't have more time to go into it, but human rights are legally binding standards that are also, either through positive obligations or through the United Nations Guiding Principles, in one way or another binding for private actors.

And this is pretty much why we have the set of due diligence safeguards, and they are extremely relevant during times of crisis. I would be interested to know whether the project takes this direction. I'm sorry, that was too long. Thank you for the presentation.

>> MICHELE GIOVANARDI: Thank you for the questions, which are very relevant. I don't know if we can go a bit over time in answering, because it's already 4:20, but we also started 10 or 15 minutes late. Maybe we can start to answer, and of course, Uma and the other speakers, feel free to add to my answer. I will try to be very brief, but hopefully we can continue this conversation in other spaces or at another time, but ‑‑

>> AUDIENCE MEMBER: Can I stop you? In case there are other questions, I would collect a few more first, just in case. So better to collect all the questions.

>> MICHELE GIOVANARDI: Can we go a bit over time?

>> AUDIENCE MEMBER: Yes.

>> AUDIENCE MEMBER: Just one small question: George, from the Frye University of Brazil. I want to ask if the project includes impact assessments of these technologies, because maybe I missed it. If you could say something about that. Thank you very much.

>> MICHELE GIOVANARDI:  Other questions?

>> MODERATOR: Other questions from the audience?  No, so you can move on with your short answers.

>> MICHELE GIOVANARDI:  Yes.

>> MODERATOR: Maybe somebody from the online participants?

>> MICHELE GIOVANARDI:  Anybody from online that wants to jump in?

I don't think that's the case. So feel free to reply as well. What I would say very briefly is that Peace Tech is a field under definition. We think there are challenges specific to each technology, but the idea is that the use of these technologies for war or for peace, the social impact of these technologies, raises common challenges that need to be studied from a global affairs or IR perspective. Many of these questions are interconnected and, in our opinion, need to be studied together. This is why we try to shape it as a Global Peace Tech Hub. And if you want more detail about what kind of questions we are thinking of when we say that, you can go to the framework paper that we published; I will share it and provide the link. It also has five different examples of areas, with a set of questions on what we can study and how we can study this from a global affairs perspective. So that's about the approach. But is the definition settled? You asked, what is Peace Tech? We had a conference last Monday and Tuesday, and we spent hours discussing what Peace Tech is, and each organization came with its own idea.

Of course, there is the definition that I provided before; that is ours, but it's not set in stone. It's something that we are discussing and currently developing, and of course it includes what you were talking about: this idea of embedding peace by design when we think of technological applications for peace. And having a more systemic approach to regulation is obviously part of it.

I know it's an incomplete answer. On impact assessment, this is part two. We started in December and committed first to the mapping; the next step, as you can see, is an assessment, and the third part goes to the policy recommendations. When we do the assessment, we want to evaluate these use cases and how peace is affected by technology. And the third step is tech regulation: what legal and normative framework we can have to minimize the risks and enhance the opportunities that these technologies offer.

I realize this is an incomplete answer, but again, it's impossible; I wanted to say more. But basically, I will point you to the paper that we wrote with Professor Kalypso Nicolaidis, where we talk about a common approach to common questions that are affecting our time and need to be taken into consideration together, not in silos, in order to have more peaceful societies, societies that leverage technology for good. That's, I hope, a way of answering this question, but I leave it also to Uma, and to Andrea on the human rights point.

>> UMA KALKAR: I can answer a couple of questions from our first questioner, really looking at the use of technology to fuel conflict. That was found in the Peace Tech map as the dual-use nature of the technology, and so something that needs to be considered at times is what governance methods are in place when those uses blend and the technology that is there is being manipulated.

There is also the question we looked at of how the technology is accessed and what the results of that are. We looked a bit at the existing legislation and standardization methods, and one of the big things we found is that it was not standardized. It's really difficult to track impact, because there's really no metric existing yet; however, we are doing more research specifically into how we can do an impact assessment. Thank you for that.

>> EVELYNE TAUCHNITZ: I would like to add something to the question on human rights. I think that was a really important one, because human rights are legal norms, yes, completely, but they can also be morally justified; we can even talk of an ethics of human rights. And I think that's important precisely because so many companies try to avoid talking about human rights and instead have their internal ethical guidelines, or would rather go for different ethical guidelines, precisely because human rights are legally binding. If we say human rights are not only legal norms but also act as an ethical standard, we are linking up the moral and the legal.

So moral norms that we think can be justified should translate into legally binding law. They don't always overlap: you have moral norms that are not legally binding, and you have legal norms that, from an ethical perspective, cannot be justified. If we take a look at the structural violence that we are experiencing, it would really be important that we try to find the overlap between legal norms and moral norms that can be ethically justified.

And I think there are a few norms that are already universally accepted and that can act as a framework we can build on, because if we start from scratch, it will be very complicated. It's unlikely that we are going to find a new political window of opportunity, so to say, like there was after the Second World War. And even if we agreed on a new set of ethical guidelines or other norms, they would maybe not be as good as the human rights norms; they might be softer and not, how to say, as much to the point, I would argue. So yes, human rights are legally binding, but that's exactly why we should consider them as ethical guidelines, and not leave the space open to companies just saying, okay, we think this is ethical, and other companies saying that is ethical, and we have a huge discussion going on forever.

>> MICHELE GIOVANARDI: Do you want to add something, Andrea?

>> ANDREA RENDA: It was very, very useful to take into account the variety of intersections existing between peace and tech. Of course, we discussed disinformation; we discussed how Access Now has worked on this a lot. I think now that we have broadened the concept so much, the further step is trying to narrow down the focus again, because of course everything has to do with conflict. It has to do with conflict, but is conflict strictly linked to peace and war?

So that is kind of an additional exercise that, at least personally, I would like to reflect on a bit further.

>> MICHELE GIOVANARDI: Thank you, Andrea. As I said, this is a new project that started in December, and hopefully it will go on for quite a while. The results of this discussion will be put together on paper in the Global Peace Tech Atlas that we are discussing today. There are two publications, about the mapping and about the vision of the Global Peace Tech Hub, and both can be found at globalpeacetech.org. But the idea is also to collect, for the Peace Tech Atlas, contributions from all of these different organizations working on peace, including Access Now, and working on tech for good. So the Global Peace Tech Atlas will have, first, a concept of what Peace Tech and Global Peace Tech are and all the questions this raises in global affairs and governance; second, a mapping of the topic, of everything related to peace tech; and third, a wide range of case studies and examples of initiatives, with some contributions that go more in depth into some of these initiatives.

This is of course the mapping, and it needs to be broad because it's the mapping. Andrea was talking about narrowing it down; I would say that's the next step. From this broad overview, we will try to make sense of everything that is Peace Tech. That also includes misinformation and disinformation: we have the European Digital Media Observatory in Europe that is fighting misinformation and disinformation, and it can be considered part of Peace Tech. Organizations that have done peacebuilding, as part of their digital peacebuilding strategies, have constantly fought disinformation and tried to create narratives that counter it. So that is part of it, as are other things: digital identities, for instance, as we were mentioning, and early warning systems. I will just point to this Global Peace Tech Atlas that we are putting together as a starting point for reflection. We won't have answers, but we will have a clear overview of everything that is out there, of where people are talking about Peace Tech, and of what questions this raises for scholars of global affairs and for regulation. The faces that you see here are not the only faces of the initiative. We are speaking for ourselves, but of course this is a very diverse group of people, and each one will answer differently.

My suggestion is to keep this conversation active, to discover who the other participants in the initiative are, and to join it: to join the hub and discuss with us how we can use and regulate technology to achieve peace. So thank you so much for attending this session on the Global Peace Tech Atlas, and I wish all of you a good continuation of the Internet Governance Forum, which is starting today.

So you have a great week ahead of you.  Bye and have a great afternoon and see you soon.