IGF 2019 – Day 3 – Raum V – WS #159 Towards a Human Rights-Centered Cybersecurity Training

The following are the outputs of the real-time captioning taken during the Fourteenth Annual Meeting of the Internet Governance Forum (IGF) in Berlin, Germany, from 25 to 29 November 2019. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 



>> JULIA SCHUERTZE:  All right.  Hello, everyone.  Welcome to this workshop.  My name is Julia Schuertze, and with me is Kate Saslow.  We would like to do a workshop with you today that will be interactive as well; at some point you might want to get closer to the stage.  Our topic today is human rights-centered cybersecurity training.

A word on why we set up this workshop: in our work at the think tank, we work on international cybersecurity policy, where there is a lot of concentration on the cybersecurity of the state.  However, states also use cyberspace for a lot of different reasons, to pursue foreign policy and security goals that are not necessarily good for the IT security of people and citizens, and they may also use surveillance tactics.  This workshop wants to concentrate on the cybersecurity of the people.

We have basically three goals.  One I already said: we really want to focus on the safety and security of people, and some people would maybe rather use the word digital security instead of cybersecurity, shifting away from this national security focus.

Then we really want to increase awareness that to achieve this, we have to look at specific cases.  We have to look at individual situations and take their threat landscape into account.  So we will have a case, which you already have in front of you, that we will work through.  Then we will identify some challenges and some best practices on how to work together in an insecure, challenging environment.

And we will do this from different perspectives.  The people here on stage, who will start with an input before you can chime in and help us deal with these challenges, work on global development, on IT security specifically, on incident response, or as journalists.  To get at this problem you have to look at different perspectives, and we hope you bring in even more perspectives to this case.

So, the agenda for the next 90 minutes: introductions are done.  Next I am handing over the stage to Daniel Mossbrucker.  Daniel is a journalist and also a digital security trainer.  He worked at Reporters Without Borders, and he is doing his PhD looking specifically at what impact surveillance has on journalism.  For today, he developed a threat modeling guide for project managers who work in development contexts and who need to identify, from a nontechnical level, first of all, what the threats are.  He brought a case for us to work on.

With that, I'm handing it over to Daniel to explain a little bit better on what the case is about.

>> DANIEL MOSSBRUCKER:  Cool.  Yeah.  Thank you, Julia, for this nice introduction.  Thanks for having me, and thanks for the chance to present this use case, which you will all work on in the next 90 minutes.  DW Akademie is part of the broadcaster DW, and DW Akademie works on media freedom and media independence with a focus on Africa and Asia.  I'm part of a project there as a consultant in which we try to deal with digital security threats, because there are a lot of trainings out there, and some people, in developing countries but also in countries like Germany, are already complaining that they're overtrained in digital security.  They had a lot of trainings, but the trainings don't really change behavior.  That's the problem we're trying to address in our project.

Okay.  So what are the more strategic and structural changes that need to be made?

One part of this was the development of an interactive guide for civil society project managers.  It's not yet published, but it will be published digitally.  If you want to receive it, write me an informal email with the subject line IGF.  I'll put you in a folder and we'll let you know once it's out.  This is already the headline, so, yeah, please let me know if you're interested.

That comes from our project, and for that project we did a lot of research.  Of course, DW Akademie is quite experienced when it comes to digital security threats.  What I brought today is the stereotypical example that organizations like DW Akademie, but also others, have to work with.

So the case scenario for today is the following.  You have a situation in which you have a funding entity that does not do much more than just give money, giving money to a development cooperation organization.  And this development cooperation organization sets up a project and has a project plan.  In our example, the project is setting up a digital human rights lab in a country of the global south, working together with local partners.  It is quite common that it's like that: you have a development cooperation organization basically giving money, sharing knowledge, and always working with local partners to strengthen these local partners.  The interesting challenge, and the details will follow soon, is that from a security perspective we have a situation with at least four different entities that work together on a regular basis, which means every day.  So they remain in their own organizations, but they have to create a temporary environment in which they can communicate and work safely.

So the need is a temporary IT only for the project, while all the organizations already have their own IT infrastructure.  Everybody remains individual, but they have to work together in a secure way.  And the challenge we are facing is that the funding organization, the blue one here, is a big entity with one thousand plus employees.  Everybody who works in such a big entity knows that it's kind of slow sometimes, there are a lot of rules, and of course this organization has its own IT department.  A change of the IT infrastructure is always an issue.  It's possible, but it's an issue: you have to lobby for it internally, and it just takes time.

Plus, when we look into the project, every digital cooperation with, let's say, a foreign entity, for example the local partners, can be considered a security risk.  So the IT department will never just say, yeah, sure, let them in, because for the funding organization this can be considered a security risk.

And why don't they let them in?  Because the local NGOs have several structural problems.  First of all, mostly they don't even have an IT department, because they're small organizations, 20 or 30 people at a maximum.  Or, if there's a real need, say the mail service isn't running anymore, they hire a freelancer on an hourly basis to do the work for them.  Due to limited resources and for practicability, they rely on free services like Google Drive, Slack, WhatsApp, and Gmail, which are free at least when it comes to money.

Number three, awareness is relatively low, although I have to admit this is mostly true for employees everywhere, not only in the local NGOs.  Right.  So now to our case: the risk matrix.

So the need is a temporary IT for the project that works.  And, of course, this is quite a big thing.  When I do these projects, it takes like weeks to run a real threat modeling process, but for our workshop we narrowed it down a little.  So the risk matrix could look as follows.  There are five assets we need to secure: communication within the team and with external partners, and by partners I mean the local NGOs; internet research in terms of online behavior, so everything we do on our computers and smartphones while researching; project data, stored and edited probably in a cloud, because we have a temporary project running; bank account data for project-related transactions; and the physical integrity of devices like smartphones.

And these assets are at risk because we have certain adversaries, which can be grouped as follows.  Government-related agencies of the country of the global south; we do not consider agencies of the funding country, because as they're funding, if they want to know what's going on, they would probably just ask.  ISPs of the organizations, which, speaking for all the techies in the room, is the IT layer.  Third-party service providers like Google or Facebook, at the application layer.  And, relatively low sophisticated compared to these others, online criminals who are just interested in, I don't know, hacking bank accounts and making some money.
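The risk matrix Daniel describes, five assets crossed with four adversary groups, can be sketched as a simple data structure that a workshop group fills in together.  A minimal, hedged illustration: the asset and adversary names follow the case handout, while the severity scores are invented placeholders, not part of the original case.

```python
# Threat-modeling grid for the case: assets crossed with adversaries.
# Severity scores (1 = low concern, 3 = high) are illustrative placeholders only.
ASSETS = [
    "team and partner communication",
    "internet research / online behavior",
    "project data storage (cloud)",
    "bank account data",
    "physical integrity of devices",
]
ADVERSARIES = [
    "government-related agencies",
    "ISPs of the organizations",
    "third-party service providers",
    "low-sophistication online criminals",
]

def build_matrix(default_score: int = 1) -> dict:
    """Return a {(asset, adversary): score} grid to fill in during a workshop."""
    return {(a, adv): default_score for a in ASSETS for adv in ADVERSARIES}

def top_risks(matrix: dict, threshold: int = 3) -> list:
    """List the asset/adversary pairs rated at or above the threshold."""
    return [pair for pair, score in matrix.items() if score >= threshold]

matrix = build_matrix()
# Example judgment from the case: criminals mainly threaten the bank account.
matrix[("bank account data", "low-sophistication online criminals")] = 3
```

The point of the structure is only to force the group to consider every pairing explicitly, which is what the multi-week threat-modeling process does at length.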

Right.  This might seem quite complex now; I think it is.  But Julia has thankfully printed this case on a sheet of paper, and we will distribute it.  Thanks for your attention for now, and thanks already for your input.  I'm very sure that this will help us a lot.  Thank you.

>> JULIA SCHUERTZE: Yeah, the case studies are printed in front of you.  If someone doesn't have one sitting in front of them, come find us, we have more up here.  As you can see, we invited different speakers.  They come from different professional backgrounds and have had experiences similar to the case Daniel presented.  Each of them has prepared a unique challenge and a solution to this case, or to potential problems with digital rights such as freedom of expression or privacy.

The solutions they present can be kept in mind as best practices to use during the brainstorming session that follows.  I'll tell you more then.

And the goal of these best practices or the goal of their solutions is to foster and protect rights such as right to freedom of expression and right to privacy.

So without further ado, I'll give the stage to our first speaker.  Farhan Janjua is a digital rights practitioner, journalist, and consultant, and he also does security trainings.

>> AUDIENCE:  (Off Microphone)

>> JULIA SCHUERTZE:  Yes.  It is only one page.  We'll discuss it more during the brainstorming session.  If you have any questions, we'll address them then.  It's a one-pager, though; you're not missing a second page.

>> AUDIENCE:  (Off Microphone)

>> JULIA SCHUERTZE:  Yeah.  If there are questions, we'll collaborate during the session.

>> FARHAN JANJUA:  Thank you, hi, everyone.  I'm Farhan Janjua.  Thank you for the introduction.  Considering the case in point, which you already have in front of you, I tried to identify some of the challenges given the context that I come from.  I'm a Pakistani journalist.  I work with local journalists and activists and also organizations.  When I looked at the challenge, I had some ideas that are extremely relevant to it.  I will list them and then try to think of solutions from my point of view.  When it comes to the NGOs, the funding body, or the international NGOs, there really is a lack of trust in countries like ours.

That is also thanks to some pretty big scandals we have had with international NGOs.  So the first challenge is that the local partners you'll be working with will come and ask you: how should we trust you and your proprietary software?  Are you telling us you're more secure than Google?  You would need to do some work with them, and you will need to do some convincing.  Two scandals from Pakistan I would like to quickly highlight.  One was the operation to catch Osama Bin Laden.  It was successful, but they used a polio vaccination drive as cover, and that ended up jeopardizing the whole polio program in the country; it became a health scandal.  Another one was an executive from the Save the Children NGO who had access to data that he shouldn't have had, and he is now being charged with molesting 30-plus children.  After all these scandals, and the media coverage everywhere, there's really a lack of (?) when it comes to trust.  This is also true for the people who work with NGOs, who really want to make a difference and change something but are hesitant when it comes to the sudden introduction of new ideas, because they're like, why can't we just continue using our Google mail or Slack or whatever?  This is one of the challenges that I identified.

The next one: even if you convince them that, okay, maybe we should have custom email services, for example, or proprietary software, you need to train every organization, and there's a lack of training when it comes to the local organizations and the staff who work there.  That's another challenge that I identified.

And then there's also the lack of awareness of why privacy matters, because the question is usually followed by something like: we have nothing to hide, so why should we encrypt our communication?  This is really a challenge, because when you hear statements like this coming from really progressive or liberal political leaders, you end up banging your head against the wall, because if the progressive people are saying this, there's no hope really.

That leads to another thing, because they would be like, maybe something fishy is going on.  Why are they asking us to encrypt or hide our communication?  What do they want us to do?  Yeah, that's another issue.

Then, of course, there's the lack of secure communication devices.  Given the attitudes among the activists, journalists, and people who work with these organizations, they often end up with insecure devices, and it's really because of the context I mentioned before.

Given these points, I would like to mention some solutions that I propose.  The first one: unless it's a really sensitive project, where the big organizations such as Google or Slack could really be your adversaries, I think we should find a way to make do with some of these existing tools.  Although I'm pretty sure Daniel might disagree with me.  Given the limited resources, unless the project deals with really sensitive information that you really can't trust Google with, you shouldn't focus on developing your own infrastructure.

Moving on to the next point: the big organizations, as my colleague mentioned, including the donor organization that may be based in the so-called global north, if they have a lot of data to protect, really have to find a way to give access to the NGOs that work on site in these countries using that data.  One of the ways could be to create APIs, and through those APIs give access to only the limited amount of data that is really necessary for the project.

The next solution that I propose concerns training, because we really need more workshops and trainings on why privacy matters.  We're really talking about the absolute basics, and sometimes it takes an example to show that yes, it matters.  How I try to do it: I give them an example.  How would you feel if you were sharing absolutely intimate chat messages with somebody, and a third individual who you don't even know had access to all that, including your intimate photos?  Then they're like, oh yeah, we wouldn't like our nudes ending up in the wrong hands.  This isn't the best example, but we work with what we can work with.  We need those trainings.

We can also focus on individual trainings depending on the specific needs of the project, because depending on the project and the organizations and individuals we're working with, we would obviously need to develop individualized digital security solutions.  And then, of course, comes encryption, because that's really basic: the least we can do is encrypt the data that we're working with, along with the communications.

Then, again, more of the basic solutions, including password security.  I'm sure you might be wondering, this is really basic, we already know this.  But believe me when I say this: when you work with NGOs and activists on the ground, this really is something that can make a huge difference, because most people won't even realize how important a safe password and password security can be.  This is followed by two-factor authentication.  These are some of the basic things we can work with, and then we can improvise depending on the project.
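The two-factor authentication Farhan recommends, in its common authenticator-app form, boils down to TOTP as specified in RFC 6238, which can be implemented with the Python standard library alone.  A sketch to show there is no magic involved; the base32 secret in the test is the public RFC test key, not a real credential.

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """Compute an RFC 6238 TOTP code (HMAC-SHA-1 variant).

    secret_b32: shared secret, base32-encoded as authenticator apps expect.
    for_time:   Unix timestamp; defaults to the current time.
    """
    key = base64.b32decode(secret_b32, casefold=True)
    now = time.time() if for_time is None else for_time
    counter = int(now // step)                      # 30-second time window
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because both sides only need the shared secret and a roughly synchronized clock, this is why a one-time code from a phone app works offline.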

That's it from me.  Now I'd like to hand over to my colleagues.

>> JULIA SCHUERTZE:  Thank you very much.  I'll introduce you quickly.  This is Chris Kubecka.  She is an IT security professional, a hacker, a cybersecurity researcher, and she'll also give her unique challenges and solutions to the case presented.

>> CHRIS KUBECKA:  Thank you so much.  What I base these on are actual real-world examples and things I've used in the past for various reasons.  When we're dealing with journalists, you have to think about the safety of their sources as well, because if there's poor cybersecurity at a journalist or an NGO, and someone, per chance, is reporting something which is a safety hazard or a human rights violation, that particular source could then be exposed.

One of the premises behind the solutions I was posing is that security is hard.  I'll give you a good example.  I have a fleece over there from the National Security Agency, because two years ago I found an exploitable encryption flaw on NSA servers.  If the NSA has a problem ensuring proper encryption, journalists and NGOs are also going to have problems of this type.  So we need to make things as easy as possible so that they actually get used.  One of the things I suggest is one-page, maximum two-page, easy-to-understand guides, including a guide showing the privacy risks of using different types of free services.

Because those services are going to be used: they're low cost or no cost, and a lot more people know how to use them than something more exotic and proprietary.  When you're talking about money, that's a big topic for most people and also for organizations.  It's a primary target for, say, your regular run-of-the-mill criminal who's wearing a trench coat and a hat.  One of the things we can do is use secure operating systems like something called Tails: any time there are banking transactions, you use that operating system, and it does not save any of the settings.  It's only used for financial transactions.

When we're trying to set up good security, one of the things we can do is look to some of our computer emergency response teams.  Luxembourg has a great civilian one called CIRCL, part of "security made in Luxembourg."  They offer assistance with free training if you happen to be a constituent of Luxembourg, and they will place dummy sensors to alert on various project networks.  In addition to that, they have a hardware project based on a Raspberry Pi for when you receive a bunch of data from a source on a USB stick and want to make sure you don't get infected by that source.  You plug your USB in and you plug the source's USB in, and it strips everything, including metadata, which can be used to identify sources.  It strips that all off, including from images, and you get a clean piece of data that contains exactly what you want.
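The metadata-stripping idea behind that Raspberry Pi sanitizer can be illustrated in a few lines.  This is a hedged sketch for PNG files only: a real sanitizer handles many formats and threat types, while this merely drops ancillary PNG chunks such as tEXt, which can carry author names and other identifying information.

```python
import struct
import zlib

PNG_SIG = b"\x89PNG\r\n\x1a\n"
# Chunks required to render the image; everything else (tEXt, eXIf, ...) is dropped.
KEEP = {b"IHDR", b"PLTE", b"IDAT", b"IEND"}

def chunk(ctype: bytes, data: bytes) -> bytes:
    """Serialize one PNG chunk: length, type, data, CRC over type+data."""
    return struct.pack(">I", len(data)) + ctype + data + struct.pack(">I", zlib.crc32(ctype + data))

def strip_png_metadata(png: bytes) -> bytes:
    """Return the PNG with all ancillary (metadata-bearing) chunks removed."""
    assert png[:8] == PNG_SIG, "not a PNG file"
    out = [PNG_SIG]
    pos = 8
    while pos < len(png):
        length = struct.unpack(">I", png[pos:pos + 4])[0]
        ctype = png[pos + 4:pos + 8]
        end = pos + 12 + length          # 4 length + 4 type + data + 4 CRC
        if ctype in KEEP:
            out.append(png[pos:end])
        pos = end
    return b"".join(out)
```

The hardware project goes further (file-type conversion, malware-oriented transforms), but the principle is the same: rebuild the file from only the parts you can account for.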

One thing I'm borrowing from the corporate world: when I was with the Saudi Aramco family, we had partners with a higher risk profile, much higher risk with their data coming in to us, versus us, where we had lots of money to implement security.

So what we did is we treated them as a red or untrusted connection.  So in order to get anything in to our networks and into our project, they had to go through several layers of security, such as a virus scanner and so forth.

Another thing to be aware of: when we surf the internet, we can be identified based on our searches and our DNS lookups, and our devices can be fingerprinted using something called a user agent string.  There is free technology, Squid, for example, that can mask your user agent string and the fingerprinting of your devices.  In many cases that's a good alternative to using, for example, the Tor network, because a lot of different governments and police departments around the world run exit nodes on the Tor network to try to monitor the traffic.
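What a proxy-level rule like Squid's `request_header_replace` does can also be approximated per application.  A minimal sketch with Python's standard library, replacing the default `Python-urllib/x.y` User-Agent, which fingerprints the client, with a generic one; the generic string below is an arbitrary example, not a recommendation.

```python
import urllib.request

# Arbitrary, widely shared browser string; the point is that every client in
# the project presents the same value instead of a fingerprintable default.
GENERIC_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def anonymized_request(url: str) -> urllib.request.Request:
    """Build a request whose User-Agent does not identify our tooling."""
    return urllib.request.Request(url, headers={"User-Agent": GENERIC_UA})
```

A shared proxy is still the more robust choice, since it normalizes traffic from every application and device at once rather than relying on each tool being configured correctly.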

Another thing can be leveraging projects like Let's Encrypt: set up an automatic encryption system based on Let's Encrypt, hosted, for example, in Canada, rather than putting data up in Google Cloud, which could be looked at by the US government under the Patriot Act; they might have to hand over things you may not want handed over, which could expose sources.  Then something as simple as virus protection: I've seen it missing with journalists, a huge number of NGOs, and even some government systems.  Because the case study has money, virus protection could easily be purchased in bulk for every party taking part in this.

Now, a separate smartphone for secure communications: this is quite important.  Earlier this year my smartphone and secure communications were actually broken into by a nation state, because I was working with a journalist on a story about my work negotiating with, and then getting rid of, a particular threat dealing with a cell of ISIS.  The particular nation state wasn't pleased about the things about to come out.  My stuff got hacked; the journalist's stuff got hacked.  So I took a page from someone I met who works for the NSA.  They have a separate cell phone with no SIM card which can only be turned on in a very controlled manner.  In this way it's extremely separate: you don't surf the web on it, you don't do banking transactions on it.  It is only for that particular secure communication, and that device happens to be encrypted.

Another thing we have to be concerned about: when people are travelling around, either to different parts of a particular country or internationally, they're going to want to charge their devices, and they're going to want to talk about things and do things.  Borrowing a page from Saudi Aramco, where several of our corporate officers had been assassinated or survived assassination attempts: we issued every traveller a travel pack with a mostly pictorial travel information booklet, advising, for example, that when you happen to be in a taxi cab, many of these are recorded with audio and/or video, so what you're saying is not private; be aware of your surroundings.  In addition to that, we issued power packs so people didn't just plug into an airport phone charger, which may or may not be safe.

Another thing we did was issue something called a USB condom, which allows only power transfer and no data transfer at all.  In this way the smartphone doesn't get infected, bring things back, and expose information.

So with that, I will hand over to the next speaker.  And thank you very much.

>> JULIA SCHUERTZE:  Next we have Gbenga Sesan the executive director of the Paradigm Initiative.  Based in Nigeria, Gbenga will share his perspective.

>> GBENGA SESAN:  It's a one-page case study, which made it easy.  The first thing that jumped out at me when I read this was the question of capacity.  I like the way you described how the funding organization just wants to give money.  I'm sure you sensed the reaction in the room, because it sounds familiar.  There are people who just want to give money, but on the other hand there are people who want to get the job done.  And that means if there is a gap between the expectation and the capacity in terms of what you can do, that is a challenge, because many times we have conversations between people who give resources and people getting stuff done on the ground, and at the end of the day you write a report and there's a mismatch.  It's typically not because you don't want to get it done; it's mostly because the capacity for that particular requirement may not be there.

I'll speak to what I think may be a solution in a second.  The second thing is priority.  I talk a lot about the priority mismatch between funding organizations and actors on the ground.  Many times there are things that are sexy: it's great, it's new, it's AI, everyone wants to fund it.  But is that exactly what the organization on the ground wants to work on?  So there's the question of whether this is really what the partner needs right now.

The interesting thing about this case study is that Paradigm Initiative has a lab that commences in February 2020, so this is kind of a real experience for me.  The interesting thing is we're not talking to a funder, so we don't have that mismatch problem just yet.  The third thing that jumped out at me from the case study is the legacy tech versus new tech conundrum.  The organization that's been around for a thousand and five years, that's an exaggeration, believes its legacy tech is secure.  Most new organizations don't look for what's best; the first thing is what's available.  What do we know?  What can we use?  As they grow, they get the capacity to use more sophisticated things.  But that is a challenge, because you have this legacy tech that you're used to, and this organization has this new cool tool that they have just become aware of.

So, solutions and suggestions, because I know there are experts in the room who can't wait to share their own solutions.  The first: if you look at the challenge of capacity, I think we need to begin to think of going beyond project support, which is typically what, in this case, the funding organization wants in the lab.  My suspicion is that the project you expect to be presented by the actor on the ground has everything but (?) capacity.  In many cases, things like (?) staff are not included in the project support.  I think we need to go a bit beyond that, and I'm glad that many funders across the world are beginning to think of that: not only thinking of the sustainability of the project but the sustainability of the organization itself.  Because if the organization dies, then the project is dead.  That's the reality.

Second is to match available support with demonstrated need.  There are people who really want to build labs; there are people who just want USB condoms.  So match support with need.  In this case, for this organization in this global south country, is this lab exactly what they want to work on?  If not, the only thing that can come from a terrible handshake is (?).  In terms of legacy versus new tech: if this is the platform that you use as an institution that's been working for a while, and this is the platform that the others use, is it possible to have a handshake between both platforms?  It's not always possible, obviously.

In an ideal world, everybody expects that every phone charger will work with every phone, whether it's an iPhone or an Android phone.  But the handshake isn't always there; there is a gap between what you use and what I use.  So the question is, what are we gunning for?  If we're gunning for security, the tech you're using should be, one, secure; two, easy to use, which is not quite legacy's strength; and three, fast.  The question to ask is, what is the tool that offers these three attributes and is within the range of what this organization can use?

I can (Off Microphone)

>> JULIA SCHUERTZE:  All right.  Thank you so much.  Now that we have some challenges, I think the speakers really brought out some of the individual challenges, organizational challenges, IT security challenges specifically around the setup, and at the end also project setup challenges, and then different solutions.  What we would like to do now is get your perspective in, because we know there are lots of people in the room who have their own experience working on these things and their own expertise.  We will now have overall 30 minutes for brainstorming, and we'll do this step by step.

The first round is to think about which other best practices you think should be considered to protect, and now we focus specifically on this, privacy and freedom of expression in this case.  So first find a partner to discuss this.  The case is distributed; if you need a copy, come to the front.  Before we start brainstorming and before I set the clock, are there any comprehension questions about the case that you need answered before you can think about this?  Now would be the time to ask them.  No?  Okay.

>> For quick background, we heard different perspectives from the three speakers.  If your solutions come from different backgrounds, mention that as well.  This is meant to be broad and inclusive; it doesn't need to be exclusively from a tech background.  Be creative.

>> JULIA SCHUERTZE:  Okay.  Yes, please find a partner and brainstorm first on the question: which best practices should we consider in this case to protect privacy and freedom of expression?  You may need to move if you don't have a neighbor.  Let's go.

(Group activity)

>> JULIA SCHUERTZE:  You've got one minute left, one minute in the pair.  All right.  Thank you so far.  Please listen to me one second.  Now please find a second group.  I'm sure you brainstormed lots of best practices.  The next challenge is to pick the most urgent one.  Meet with another group, share some of the best practices, decide which one you should most urgently implement in this case, and nail that one down.

(Group activity)

>> JULIA SCHUERTZE:  Okay.  So, last step, everyone.  Since you are already in quite big groups, nearly eight people, I would say we skip the step of merging again.  But I will give you five minutes to think about what you would like to present to the others and to figure out who will present the best practice or the challenge you were working on.  So you get five minutes.

(Group activity)

>> JULIA SCHUERTZE:  All right.  Thank you very much for participating in the group work, which we will now finish.  For transcription reasons, it's best if one or two representatives of your group actually come to the front and present.  The good thing is we're not pressed for time: we still have an easy half an hour where you can present your best practice, the other people in the room can raise questions, and we can discuss a little bit.  Thank you for participating in the group work and for feeling a little stressed out; now we have enough time.

So looks like the first group here would like to start.  Yes.

(Laughter) Because the transcription for the online participants works better.  Thank you.

>> AUDIENCE:  Don't take anything I say for granted, because we didn't have much time.  In terms of communication within the team, the first aspect, we thought of a lot of things.  We certainly agreed that we need a decentralized service that you can set up yourself if needed.  What we came up with in the end, because of the broad support and the broad client choice that you have, is to set up an XMPP server, which is basically quite easy to install, and on top of that use Off-the-Record (OTR) encryption, and get it working with your partners and maybe sources.  Just a small side note: we agreed that the most crucial thing is the integrity of devices, but if you face that threat level, then a group of ten people who sat down at a workshop is not the best source of a solution.  So we refrained from thinking of a solution for that.

>> JULIA SCHUERTZE:  Could you offer ‑‑

>> AUDIENCE:  When you say "we," do you mean external or internal?  Are you the organization, or do you support them?  And how well do you know the organization?

>> AUDIENCE:  I would consider it the organization itself, not external ‑‑ we, the organization.  You would need knowledge, but it's technical knowledge that is easily acquired, and the server is easy to install.  There are truckloads of tutorials on how to do that.

>> JULIA SCHUERTZE:  Thank you.  Any more questions on this?  Thank you.  Very good ideas.  Very technical.  I need to look up one thing afterwards again.  So now this group, please.

>> AUDIENCE:  I represent the group.  If I move left, you bring me right.

>> JULIA SCHUERTZE:  Could you say your name and your organization, so we get to know each other?

>> AUDIENCE:  I'm Saddam, a fellow of the Germany scholarship program and a freelance journalist.  For our scenario we thought of four key points, which I will mention before going into the details we discussed.  First, a threat model for the funded organization, to determine what exactly the threat scenario is.  Then we considered four aspects: account security, device security, encryption, and anonymization.

We decided to do a safe-tech audit and check their internet setup to improve its security.  For devices: downloading software updates, making sure all devices are up to date and properly licensed, checking for malware, making a safe-use policy for the organization and making sure it's enforced for all staff, and training the staff on digital security.  For encryption: use end‑to‑end encrypted apps or tools for communication within the organization, encrypt data at rest and in transit, and provide anonymity tools and anonymous browsing to all employees.  Finally, on bank account security ‑‑ to protect the organization's bank account from criminal hackers ‑‑ we advise using Tails for sensitive operations.  And I forgot to mention, for account security: ensure the use of two‑factor authentication for all online accounts.  Thank you.
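The two‑factor authentication recommendation usually means one-time codes from an authenticator app. As a hedged illustration of what those apps compute, here is a minimal stdlib-only sketch of the HOTP algorithm from RFC 4226 and its time-based variant from RFC 6238; real deployments should use a maintained library rather than this sketch.

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password (SHA-1, dynamic truncation)."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # low nibble picks the window
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30, now=None) -> str:
    """RFC 6238 time-based variant: the counter is the current 30 s window."""
    t = int((time.time() if now is None else now) // period)
    return hotp(secret, t)
```

Because server and phone derive the same code from a shared secret and the clock, a phished password alone is not enough to log in.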

>> JULIA SCHUERTZE:  Thank you.  Questions or feedback on those measures?  Now, I'd say, the group over there ‑‑

>> AUDIENCE:  I was told to present.


>> AUDIENCE:  I'm Karen Reilly, with status 404 right now.  I've written a lot of grants for some of the tools that get mentioned here.  We focused a lot on processes.  So basically starting out with threat modeling and adversary modeling, to understand exactly who you're protecting things against and what jurisdictions you're worried about.  It would also help to work with organizations that do security audits, particularly NGOs like Internews and others that have a framework for evaluating a small NGO's security stance.

And doing the low-effort, high-impact mitigations ‑‑ basically the things everyday citizens should be doing for their own digital security, like two‑factor authentication and good password management.  Not recycling any passwords, if people do nothing else; that helps particularly against the threat of low-level criminal organizations and should forestall a lot of those things.
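One concrete piece of password management is checking whether a password has already appeared in a breach without ever sending the password anywhere. The sketch below mirrors the k-anonymity scheme used by breach-lookup services such as Have I Been Pwned: only a five-character hash prefix leaves the device, and the comparison happens locally. The network fetch of the suffix list is deliberately omitted here.

```python
import hashlib

def range_query_parts(password: str) -> tuple[str, str]:
    """Split a password's SHA-1 into the 5-char prefix that is sent to
    the lookup service and the suffix that is kept locally, so the
    service never sees the full hash, let alone the password."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def is_breached(password: str, suffixes_for_prefix: set[str]) -> bool:
    """Compare the local suffix against the suffix list the service
    returned for the prefix; a match means the password is breached."""
    _, suffix = range_query_parts(password)
    return suffix in suffixes_for_prefix
```

This is why "don't recycle passwords" scales: one breached reuse makes every account sharing that password a target.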

In terms of device security: that matters particularly if you're doing human rights work where you have to, say, go to a prison and hand your mobile device over to the security services of a country.  That phone is going to be with them for an hour, and they can do who knows what with it.  That shouldn't be your primary device.  Along those lines, there will always be departments in organizations.  So: compartmentalization and access control.  Not everybody needs access to everything.
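"Not everybody needs access to everything" is, in practice, a deny-by-default access control list. A minimal sketch follows; the role and resource names are hypothetical examples for a small NGO, not part of the workshop's case.

```python
# Hypothetical roles and the resources they are allowed to touch.
ACL = {
    "finance": {"bank-account", "donor-database"},
    "editorial": {"source-contacts", "draft-articles"},
    "it": {"server-config"},
}

def can_access(roles: set[str], resource: str, acl=ACL) -> bool:
    """Deny by default: grant access only if some role explicitly
    lists the resource.  Unknown roles and resources are denied."""
    return any(resource in acl.get(role, set()) for role in roles)
```

The design choice here is the default: a compromised editorial account should not expose the donor database, and that only holds if absence from the list means denial.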

Also, usable tech: if there are alternatives to big platforms, they have to be usable, so people won't be running a WhatsApp group on the side because your Mattermost is not configured correctly.  Particularly with the freelance case: in a lot of NGOs, a freelancer comes, sets something up, doesn't document it, and then you have data who knows where, and maybe it gets lost or you don't have access to it.  So do a little bit of project management: everybody who comes on and does tech work should document it, with the expectation that they will have to hand it over to somebody else.  And if you're going to outsource things, if you're going to put things in the cloud, you should be aware of the data processing agreements, so you know whether there are copies of sensitive data on devices that you don't control.

I think that was it.

>> JULIA SCHUERTZE:  Thank you.  Any questions there?  Very thorough.  I think we're going to put together a very good guide after this.  So the next group I think there's one in the very back, yes.  And then ‑‑

>> AUDIENCE:  I'm Christoph.  I'm here on my own.  You put some good things together, and we have already heard some of them.  But what we also mentioned is training for the people we're working with, so they know how to deal with the software they use, and spreading the skills a little bit.  Then, when somebody is unavailable, you won't have to wait five days for your server to be updated or something like that; somebody else jumps in and does it.  We're also considering splitting the functions of devices, as we already heard in the lecture: one device for one function, and another device for the next function.  Yeah.  That's it, I think.

>> JULIA SCHUERTZE:  Any other questions?  Okay.  Last but not least, there's a group over there.

>> AUDIENCE:  Thank you so much.  I just want to share a best practice from academic experience.  We are working on GIS mapping for water treatment, and there have been many phishing and cybercrime attempts on the database.  Our practice is a structured VPN model: people are verified through VPN access to the database.  The second level of security is that the user is verified by email address, receiving a specific code via email, and an observer can check whether the person with the correct code is the one monitoring the water treatment data or not.  So: first, VPN access as the base practice; second, a verification code as the next level; and a live observer who checks whether the correct user and editor have access to the database or not.  There is a lot of phishing against this kind of strategic infrastructure.

>> AUDIENCE:  (Off Microphone)

>> AUDIENCE:  VPN services verified with the academic institution.

>> AUDIENCE:  (Off Microphone)

>> AUDIENCE:  Yes, a Cisco system.  It was updated just one year ago; before that there were many gaps.  But we have some problems with compatibility between the first level of security and the second level.  Thank you.
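The emailed-code step this group describes can be sketched with the standard library. This is an illustrative design only ‑‑ the key, email address, and time-to-live are hypothetical ‑‑ but it shows the two properties the group wants: a code is bound to one account, and it expires.

```python
import hashlib
import hmac

def issue_code(server_key: bytes, email: str, issued_at: float) -> str:
    """Derive a short verification code bound to the account and the
    issue time, so a leaked code is useless for another user or for
    a later login attempt."""
    msg = f"{email}|{int(issued_at)}".encode("utf-8")
    mac = hmac.new(server_key, msg, hashlib.sha256).hexdigest()
    return mac[:8].upper()

def verify_code(server_key: bytes, email: str, issued_at: float,
                presented: str, now: float, ttl: float = 600) -> bool:
    """Reject expired codes, then compare in constant time."""
    if now - issued_at > ttl:
        return False
    expected = issue_code(server_key, email, issued_at)
    return hmac.compare_digest(expected, presented.upper())
```

A real deployment would store a random code server-side instead of deriving it, but the expiry and per-account binding are the parts that stop the phishing the speaker describes.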

>> AUDIENCE:  Thank you for that.  First of all, I really appreciate it.  My name is Mois.  We discussed best practices on how to work together ‑‑ email, privacy with email, and how we work.  We discussed the problem of whether data is server‑based or cloud‑based; if it's cloud‑based, it's more critical, because if someone wants the data, everything is in one place, and that's not good for the security infrastructure.  But the main part was that it's always important to provide secure conditions for learning, and to consider intercultural aspects, because everyone has to adapt to changes in software and so on.  It's always important that there are enough resources and learning capacity, room to make mistakes, and possibilities to learn together.  Because if the team doesn't learn together, then it's not possible to protect privacy and freedom of expression.  Thank you.

>> JULIA SCHUERTZE:  Thank you so much.  Yeah, we were busy writing here, because I think this makes a really good guide, and we'll definitely provide thorough documentation of all your results afterwards.  It will be linked on the workshop page afterwards.

Since we still have ten minutes, I will now ask for feedback from our experts here on stage.  Then I will turn it back to you as well for final comments and thoughts after this workshop.

So for you, Daniel, but also Gbenga and Farhan and Chris: since you have been on the ground, what do you think about some of these solutions and best practices ‑‑ how realistic are they?  Are there other things we might not have touched on yet that are vital to implementing any of these IT security and digital security solutions?  I will start over here with Daniel, and then maybe do a round.

>> DANIEL MOSSBRUCKER:  Yeah, as you said, it's a great toolbox, let me put it that way.  I can only say, from my training perspective working with people, that the crucial point is mostly not the technology itself, because we do have plenty of secure technology out there.  The question is: do people want to use it?  I was very thankful especially for the last point, about intercultural learning.  I cannot stress enough how much I learned for that threat modeling guide when we did workshops in countries I had honestly never been to before, like Ghana or Uganda.  I sought input and just realized: Daniel, this European perspective does not fit.

And if, for example, the organization I'm consulting for at the moment, DW Akademie, really wants to make a difference in digital security, there needs to be a common understanding across these cultural gaps that do exist.  And I think, apart from all the technology mentioned here, which is all good, this is the most crucial point.  Just one example: the Tor browser.  Yes, having it might be better than not having it, but it's a matter of fact that no one really uses it, in certain countries at least.  We have to think about, okay, what are the mechanisms, what are the processes to change this.

>> JULIA SCHUERTZE:  Thank you.  Farhan?

>> FARHAN JANJUA:  So one of the suggestions ‑‑ I think it was by the second group ‑‑ was that we should also make sure, because hopefully the donors have money for it, that the devices and the software are licensed.  Because now that I think of it, using pirated software is quite common, since not many people can afford licensed operating systems.  And what can you do when you don't even have a licensed operating system, and the one you're using was downloaded over the internet, and God only knows what it brings with it?

If you can, then, this is basic: help them with licensed software and devices, and really emphasize the need for it.  I quite liked this feedback, along with what everybody else from the audience said; it was very relevant, and I can see it being used in the context of what I talked about.  So, yeah, I think it was a good suggestion.

>> JULIA SCHUERTZE:  Thank you.  Chris.

>> CHRIS KUBECKA:  So one of the groups discussed doing an audit, seeing what sort of weaknesses and vulnerabilities might exist, and then trying to correct those.  It's important to consider doing those audits continually, at least once a year if possible.  In addition, another group mentioned using VPN hardware.  It's very important to audit the configuration of that hardware, because ‑‑ one of the things I find, and the internet always yields loveliness for me, because I do identify as a hacker ‑‑ one of the things the internet yields a lot of is VPN systems, such as Cisco VPNs, that were put in but never configured.  I won't mention the political party from the US that never turned on encryption or authentication on their VPN system, which means it's not a VPN system.  Right?

So you have to be aware of those things.  You also have to be aware that the hardware itself is going to need to be updated.  So if you have a Cisco or whatever version, you actually have to still keep updating those particular things.
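The failure Chris describes ‑‑ a "VPN" with encryption or authentication switched off, running on stale firmware ‑‑ lends itself to an automated configuration check. The sketch below is illustrative only; the key names are hypothetical and a real audit would read the appliance's actual configuration format.

```python
# Settings that must hold for the tunnel to deserve the name "VPN".
REQUIRED = {"encryption_enabled": True, "authentication_enabled": True}

def audit_vpn_config(config: dict) -> list[str]:
    """Return a list of findings; an empty list means the basic
    checks pass.  Missing keys count as failures (deny by default)."""
    findings = []
    for key, wanted in REQUIRED.items():
        if config.get(key) != wanted:
            findings.append(f"{key} should be {wanted}")
    # Tie the firmware check to the yearly audit cycle mentioned above.
    if config.get("firmware_age_days", 0) > 365:
        findings.append("firmware older than the yearly audit cycle")
    return findings
```

Running such a check on every audit cycle turns "we assumed it was configured" into an explicit, repeatable test.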

Another thing I find a whole bunch of times, on email systems or security appliances like a VPN and so forth: they might have encryption turned on, but they're using a vendor demo certificate.  For example, up until at least this April, I found that the Saudi Ministry of Foreign Affairs, which is their intelligence arm, was using a Cisco demo certificate on their email system that was supposedly secure ‑‑ which means a whole bunch of devices are using the same one.  And I find this all the time.  So I make a joke in my last book: if anybody wonders how Turkish intelligence could figure out the moves of Saudi intelligence when it came to Khashoggi ‑‑ their entire email system was assumed to be secure, but it was not.  Pay attention to these different things.
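A first-pass screen for the demo-certificate problem can be a simple heuristic over certificate name fields. This is a rough triage sketch, not a real scanner: the marker words are illustrative assumptions, and anything flagged would still need manual review.

```python
# Words that commonly appear in evaluation certificates shipped with
# appliances.  Illustrative list, not exhaustive.
DEMO_MARKERS = ("demo", "evaluation", "default cert", "do not trust")

def looks_like_vendor_demo_cert(subject_cn: str, issuer_cn: str,
                                self_signed: bool) -> bool:
    """Flag certificates that look like they shipped for evaluation.

    Vendor demo certificates ship with identical private keys on
    every appliance, so anyone with the same appliance can decrypt
    or impersonate the 'secure' service."""
    names = (subject_cn + " " + issuer_cn).lower()
    return self_signed or any(marker in names for marker in DEMO_MARKERS)
```

The underlying point stands independent of the heuristic: encryption with a key everyone else also has is not encryption against them.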

Another thing: be aware.  I told this group over here, when we're dealing with the Tor browser, sometimes it's not the best thing to use, because you have to take a few different things into account.  Will you look like a blip, an anomaly, that an access provider like an ISP might see and then try to track?

Another thing to consider: if you have something called JavaScript enabled in that Tor browser, I can hook that browser using a very simple tool called BeEF, which is in (?).  That's something that is now more and more commonplace.  For example, one of the ways the NSO Group was able to hack into brand new, updated iPhones is that the Safari browser runs JavaScript by default and it's difficult to turn off.  Be aware of these different things: to use Tor, you have to read the warnings and make sure your window is not full screen, because all of a sudden that telemetry data can be broadcast and fingerprint your exact system.  There are some inherent risks to using some of these things that everyone needs to be a bit more aware of.
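The window-size warning refers to fingerprinting: an exact, unusual window resolution can identify a user across sites. Tor Browser counters this with letterboxing, rounding the size the page sees down to coarse buckets. The sketch below illustrates the idea; the 200 by 100 pixel steps are an assumption for illustration, not a statement of Tor Browser's exact values.

```python
def letterbox(width: int, height: int,
              step_w: int = 200, step_h: int = 100) -> tuple[int, int]:
    """Round the reported window size down to a coarse bucket, in the
    spirit of Tor Browser's letterboxing, so an exact resolution
    cannot serve as a fingerprint.  Never report less than one step."""
    return (max(step_w, width - width % step_w),
            max(step_h, height - height % step_h))
```

With bucketing, a 1366 by 768 laptop and a 1280 by 720 laptop report the same size, shrinking the pool of distinguishable users, which is exactly why maximizing the window undoes the protection.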

Don't automatically make certain assumptions.  Thank you.