The following are the outputs of the captioning taken during an IGF virtual intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
>> I think we can start.
>> Brilliant. Hello, everybody. Welcome to this town hall session of the Internet Governance Forum 2021. My name is Kate Russell, and I'm very pleased to be hosting this session tackling the very complex issues surrounding the risks of human augmentation.
That term might make you think of cyborgs or Steve Austin, the bionic man, but it's an innovation that has been part of the human story for decades now.
Whether it's prosthetic limbs, hearing aids or pacemakers, there are already many ways that we turn to technology to fix our bodies.
And as we look to the future, it's not inconceivable that technological enhancements might be developed to help us become more than human, to see further, move faster, lift heavier weights or even learn more effectively.
It might sound like the stuff of science fiction, but having reported on technology since 1995, when less than 1% of the planet was online and home computers were still something of a novelty, I can confirm that the only constant is rapid change.
And with every new innovation the world invents comes a new risk that needs to be mitigated. We'll have a multistakeholder discussion about the threats posed by human augmentation devices, both existing ones and whatever might come in the future. What are they? And we'll share some ideas about how these threats might be addressed at an international level.
We are taking questions from the floor both online and from those attending live in Poland.
Hello to you, by the way, if you're actually in Poland. My colleague Arnaud is there with you ready to pass on your questions.
Arnaud, give them all a wave so they all know who you are, so you can track them down and also take part in the discussion.
So let's meet our stakeholders.
Joining me today we have Marcelo Araujo, who's a professor at the Federal University of Rio de Janeiro.
Wojciech Paprota, founder and CEO of Walletmor.
Ilya Chekh, who is CEO of Motorica, which makes prosthetic upper limbs; and Jorge, who makes all types of robotics and is based in Chile.
Tristan Vouga is cofounder of Twiice, a company that makes medical exoskeletons.
And Marco Preuss, who's head of Kaspersky's research and analysis team in Europe.
Also joining us a little later on will be a member of the European parliament from Greece and also chairman of the European parliament committee on the future of technology and science.
So, a well-qualified set of panelists to talk about this topic, and I'd like to start by asking each of you for your opening statements, to share your thoughts on the main threats of human augmentation as a phenomenon and how you see the world mitigating the risks.
Let's go in the order I introduced you.
So, Marcelo, the floor is yours. Don't forget to unmute yourselves, everybody.
>> Thank you very much for your very kind introduction. There are so many different kinds of technologies for the purpose of human augmentation, and people can have very strong opinions about this. Right now, during the pandemic, we see how people react to it, because vaccines work as a kind of human augmentation, a kind of human enhancement that makes our immune system better adapted to address a new kind of environment.
But then we see, too, how people react to this kind of modification of our bodies through vaccines, and I think the only way to address this is to have the kind of conversation we're going to have here with our colleagues. Thank you.
>> Thank you so much. So, hopefully, we will begin to thrash out some of those ideas as we go along.
Wojciech, over to you.
>> Implants, smart implants, are a great example of body augmentation because, in theory and also in practice, they add a sixth sense to our bodies: we can interact with certain machines and technologies thanks to implants. With the payment implants we make, the reason we decided to pursue this path is that smart implants, when it comes to payments, break the inverse relationship between security and convenience. Up until this point, users had to choose between the two. If they wanted a super convenient way of authorizing activities such as payments, the most convenient way would be biometrics, whether it's fingerprint scanning, eye scanning or face ID scanning. But let's imagine that part of your identity gets stolen, scanned by a third party: it's not possible to change it. You cannot change your face in a significant enough manner that it becomes a different one, so that is very insecure.
On the other hand, we have devices that are secure: smartwatches, even a payment card, which can be locked whenever there's a problem you're facing as a customer, as a user, and you can get a new one. You will not get a new face, you'll not get a new finger, and one day you'll definitely run out of them. But implants are the kind of device that is already embedded in your body, which makes them super convenient, just like biometrics, and they're super secure, because whenever there's a problem with a particular token or a particular application, for example payments, you can revoke it, change it and reprogram it, just like you do with a standard device. I believe that's the biggest advantage of smart implants, not only in the payment industry but in general in the digital identity industry. Thank you.
>> Thank you. Fascinating, and I'm definitely interested to hear more about that as we go along as well.
Ilya, over to you.
>> Yes, thank you. There's one interesting example of augmentation I like to use. When I speak at a conference about human augmentation, about our work on assistive devices, I often talk about plastic surgery. After the Second World War it was just a medical procedure. We had no ideas about making your body better than it is.
But within 10 or 20 years, plastic surgery became a simple medical procedure where you can just change your nose, change your ears, any part of your body. This is a great example of how augmentation transfers from medical usage, for example for disabled people, to helping people enhance themselves, and I think it's a great example for predicting how human augmentation will be used in the future.
For now we use prosthetics only for people who have lost limbs, but in the future, when it becomes a simple surgery, a simple operation, we'll see that humans will just change their hands for ones with a built-in payment card, smartphone functionality and all these gadgets inside the prosthesis. Those are my first words on this.
>> Yes, it's an interesting topic, and over the last couple of Paralympics, I think, we have already seen evidence of prosthetics actually enhancing human performance in the right hands. We'll hear more about that later on. Jorge, over to you?
>> Hi. We have already been working on a wheelchair that is controlled by a brain-computer interface, and I think that is a great example of human augmentation. Human augmentation is growing quickly in our society, so we need to act fast in order to make things work right from the start. We have to be careful and work together so that nothing goes out of control, and that's it.
>> Absolutely, very succinct, and we're going to talk a lot about the security ‑‑ additional security risks around that.
Before we do that, let's speak with Tristan.
>> Yeah, well, we believe the essence of human life is to use all our senses, you know, to explore and experience the world around us, but this particular mission is made much more difficult for the roughly 65 million people who live in a wheelchair every day, and our mission as a company is to help people reconnect with their community and reconnect with their vocation by giving them back access to life.
As a company we engineer human-machine interaction, and our first product is a powered exoskeleton that lets people stand up and walk again.
We believe human augmentation is everywhere, from the chair you might be sitting on right now to the glasses some of you are wearing, but we're also convinced that technology can only be ethical if it's meant to reduce inequalities between humans, as opposed to increasing them.
>> Absolutely, and, you know, we are going to get on to the question of the potential digital divide, as it stands at the moment, in a little while, but now let's hear from Marco Preuss.
>> Let's be honest: with human augmentation we're not just talking about some technologies, some hype, some niche or whatever. We're talking about one important aspect of the future of mankind.
It is already around us, and we'll see more and more developments, so we have different problems and challenges to solve right now, today. We have to address technical challenges and, on the other side, challenges in terms of society. We need more agreements, not only standardization but real exchange. We need policies and regulations, not in terms of limiting things but to shape this new world, this technology, which will be a central, integrated part of future humanity.
>> Absolutely. Very good points all of you have raised, and as I said, we'll hear from a Member of the European Parliament from Greece. She's actually in another session, and she's going to dash straight over to us.
One of the things we want to get from today's session is to come up with some ideas, to put some thoughts down on paper, about how we can protect and regulate while also encouraging innovation, without doing what we have done with some other technological innovations over the decades: letting them go a bit wild on their own at first, and then going, oh, goodness, where are we now? How do we control this? That is what I really want to drill into today with all of you. Human augmentation, the field that focuses on creating cognitive and physical improvements as an integral part of the human body, is one of the most significant technological trends today.
We're already seeing a wide range of practical applications, as we've discussed a little bit already, being deployed across everyday life, education and industry. Exoskeletons for fire and rescue operators or bioprinting of organs are just a couple of examples.
People are generally happy to see this technology used for the good of humanity, and research has been done on that, but there are also some real concerns about how this kind of innovation might be used in the hands of bad actors and what risks we open everybody up to when it comes to cybersecurity, because, as we all know, every new innovation comes with a whole new herd of cybercriminals who want to take advantage of it.
I would like to start with a question to Wojciech, whose company manufactures a chip implant that lets people pay by waving their hand over a terminal, and the security implications of having something that is now part of your body and can't easily be changed.
The line between physical and digital technologies is blurring all the time in this connected world, so how do we ensure human augmentation is used safely, and how can we make sure digital technologies are not developed and used for harmful purposes, Wojciech?
>> Okay, let's start with the most important aspect: it cannot be forced on anyone. When it comes to the human aspects, I agree that firstly we need to understand what technology a certain company offering this solution to us is using. In the case of our technology, it's NFC, which stands for near-field communication. The first thing that probably comes to mind when talking about implants and potential threats and issues with cybersecurity in our case is the possibility of being spied on, tracked or monitored, but it takes just a couple of minutes to learn that the technology works only in the near field, as the name suggests.
Whenever we're talking about the potential threat of spying on someone or monitoring someone's location, we learn that a reader can pick up the chip only from a few centimeters away, so that's very important when we're talking in general about potential misuses of our technology.
When it comes to the other concerns that have been raised related to cybersecurity: of course, this is a very new product, so for now we're using proven technologies, combining our strengths with our major partner, a major payments company, and the cybersecurity solutions, such as the encryption methods securing those transactions, are top notch. Why? Because we're working with the best people and the best companies in the market. And when it comes to evolving the technology to ultimately merge with our physical identity, we have to make sure that identity is protected at multiple stages and with multiple factors.
What we certainly shouldn't do is put all the valuable information in the implant. The implant should only act as a gateway from any other device to a super secure, multi-protected ecosystem, so the information can only be accessed through it, not edited or hacked. That's what we're actually working on: establishing multiple, not even two-step but three-step, authorization to make sure the data stored in the cloud-based system is super safe, unhackable, and can only be changed or edited by an authorized user, not by just anyone accessing the internet. That's how we make sure the data is secure.
And if there's anyone on this panel who's willing to invent any implantable technology, as a person who is very much in this field, I definitely encourage you to do everything you can to keep the valuable information in the ecosystem, because there is a lot of potential to make sure it's protected there, and as long as that protection exists we should take advantage of it rather than of this little outlet, which is the implant.
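[Editor's note: the "implant as gateway" model described above, where the chip holds only a revocable token and the sensitive data stays in a protected backend, can be sketched roughly as follows. All names here are hypothetical illustrations, not Walletmor's actual system.]

```python
import secrets

class Vault:
    """Server-side store; the implant never holds the real credentials."""

    def __init__(self):
        self._tokens = {}  # token -> account id

    def issue_token(self, account_id):
        # The token is random and meaningless on its own: stealing it
        # from the chip reveals nothing about the underlying account.
        token = secrets.token_hex(16)
        self._tokens[token] = account_id
        return token

    def revoke(self, token):
        # Unlike a fingerprint or a face, a compromised token can
        # simply be revoked and reissued.
        self._tokens.pop(token, None)

    def resolve(self, token):
        # Payment terminals go through the vault; the implant is only
        # the key for the lock, never the safe itself.
        return self._tokens.get(token)

vault = Vault()
implant_token = vault.issue_token("acct-42")
assert vault.resolve(implant_token) == "acct-42"  # payment authorized

vault.revoke(implant_token)                       # token compromised
assert vault.resolve(implant_token) is None       # old token now useless

new_token = vault.issue_token("acct-42")          # identity unchanged
assert vault.resolve(new_token) == "acct-42"
```

This is the design choice Wojciech contrasts with biometrics: the secret in your body is replaceable, while your face is not.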
>> I'm glad you clarified that point of view; it was what I was going to ask about. I remember being at DEF CON, which is a hacker conference, a few years back, and I actually had my credit card scanned because you're crushed into a lift. Obviously not with COVID at the moment, I know everyone is spaced out, but there are lots of instances in life where we are crushed together, and that would be my fear: how do we protect ourselves from people getting close enough to us to actually get information?
But the point is that the implant becomes the key for the lock rather than the safe itself and everything that's in it.
Okay, let's move on. We'll come back to more on that in a while, and don't forget, if you are watching either online or live and you have any comments or questions that you would like to contribute, then do let us know.
Marco, I have worked with you at Kaspersky for several years now on human augmentation, and it is very easy to get blown away by the possibilities. We saw Tristan's video, where somebody who is physically disabled is able to walk up a mountain. It's easy to get very excited when you meet the inspirational people that I've met during the discussions with you, but with opportunities come risks and uncertainties, and as the technology becomes more mainstream, so will the cybersecurity issues.
What about the existing digital threats that can affect augmented devices today? Oh, are you on mute?
>> Well, it's a huge topic; let's break it down a bit. To be honest, with what Wojciech just said before, especially in payment systems we have certain standardizations, and that's good, and I think he took that into account.
In many other fields we have the problem that you have a focus on a certain field which may not be as secure, and, therefore, you're probably lacking or there is a potential of lacking certain necessary steps in terms of I.T. security but also in terms of privacy.
Just having a look at previous trends, take the topic of IoT, for example. Before we had policies and so on, every time a new technology dropped on the market it ran completely wild and uncontrolled, and we got a lot of problems of every kind you can imagine. We had attacks, we had malware, ransomware, so many things which had a huge impact, and they're still going on, to be honest. The problem, as I see it, is that as technology gets closer and closer to the human body, we're talking about a far more sensitive area; therefore, we have to take far more steps to make it safe and protected, and really take the lessons we have learned over the past decades from so many different technologies, adopt them and develop them further to make these technologies better. We've already seen proof-of-concept attacks on smaller devices, on smaller implants; pacemakers got hacked, and so on. These are known, these are documented, but of course it was a small group of people who had them. Just imagine this on a global scale, or just a country scale or a city scale. It doesn't have to be that big, but as long as you have enough devices which are vulnerable, or can be attacked, or criminals can somehow do something with them, then you have a problem, and we should do our best not to let this become a reality.
>> What are some of the good practices that you could advise people to follow to address these problems, and to make sure they address them ahead of production, ahead of products becoming mainstream? Because this is the time to do it, right?
>> It's exactly that, exactly what you mentioned. The first step is not thinking about security once the product is already done and ready to ship, but earlier, in the development process, as early as possible, taking into consideration what problems may appear and what we can do about them. A very simple example where we're still lacking in many devices and aspects of this area is simple update functionality. We ship products with all this amazing functionality, these standards, these technologies, but if something happens, you have no chance of fixing it later on. Simple mistakes.
Think about it in the first place, when you're in development, and if you don't have the expertise, talk to the people who do. There are many IT security experts out there, like me, for example, but also others. It's a huge community, a huge global community. You will find someone near you. Talk with them, involve them, and ensure that these technologies get secured.
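[Editor's note: the update functionality Marco highlights, being able to patch a device securely after it ships, can be sketched like this. The scheme below is purely illustrative and uses a shared-key MAC for brevity; a real device would verify an asymmetric signature (e.g. Ed25519) so the signing key never leaves the vendor.]

```python
import hashlib
import hmac

# Hypothetical shared secret provisioned into the device at manufacture.
VENDOR_KEY = b"hypothetical-shared-secret"

def sign_update(firmware):
    """Vendor side: tag the firmware image before distributing it."""
    digest = hashlib.sha256(firmware).digest()
    return hmac.new(VENDOR_KEY, digest, hashlib.sha256).digest()

def apply_update(firmware, tag):
    """Device side: install only if the tag verifies; reject tampering."""
    digest = hashlib.sha256(firmware).digest()
    expected = hmac.new(VENDOR_KEY, digest, hashlib.sha256).digest()
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(tag, expected)

image = b"patched-firmware-v2"
tag = sign_update(image)
assert apply_update(image, tag)             # genuine update installs
assert not apply_update(b"malicious", tag)  # tampered image is rejected
```

The point of building this in before shipping is exactly Marco's: a device with no verified update path cannot be fixed once a vulnerability like the pacemaker hacks he mentions is found.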
>> There are also a lot of penetration testers, pen testers in shorthand. If you're not familiar with that term, they are ethical hackers who you can ask to try and hack into your system, so they can find the holes for you.
We're starting to get questions, but I'm going to hold them because we have 10 minutes of questions a little later on.
Ilya, your business currently makes prosthetic limbs for amputees. Do you think that human augmentation and enhancement should be restricted to health and medical use, or is there an argument for expanding it into recreational and even business use?
>> I think it's impossible to restrict new technologies. These are instruments, and they will be used in different kinds of situations in the future. It's impossible to restrict technologies; I don't believe in that.
In fact, in our company we receive about 300 to 400 new clients per month, and of those I think 5 or 10 are healthy people who call us and ask whether we can cut off their hand and fit a prosthetic device in place of the healthy one. We say to those people: not now; maybe in 20 years it may be an option.
>> There's a huge ethical issue around that, because where there are good players, there are always bad players as well. You're a very upstanding company, but at some point those people may find somebody who will just do it for them. The ethical issues around that are huge. Do you think we need to develop commonly agreed standards that transcend borders, or do we just leave it up to industry to manage itself?
>> Yes, it's a great ethical question. Let me give an example; I like examples. We had one case with one of our prosthesis clients who had a fight with some person, and in this fight his prosthetic arm was broken. How should the police, for example, handle such a case in the future? In some cases you break someone's device, or you break someone's hand, because for people with disabilities the prosthesis is their hand; it's not a device, it's not a smartphone, it's part of their body. So I think, of course, we have to address security issues in the market and have a development process for that, but we also have to address ethical issues during development, because for now society isn't ready for such technologies and such questions. There are no precedents for deciding whether a device is part of a person. I think it's an interesting question not only for prostheses but for exoskeletons too: is it part of the body of the person who uses the exoskeleton, or is it just like a car, a device? I think we should ask these questions at conferences, discuss them and make some rules, some new ethics maybe, for this part of the new future.
>> Yes, it's a really interesting point, actually. I hadn't really thought of the impact of somebody damaging their prosthetic limb; also, I would imagine getting hit with a prosthetic hand could potentially be a lot more damaging to whoever was being hit by it as well. Some interesting questions raised.
Jorge, robotics has been your business across health, education and industry. From your wide range of experience, do you think human augmentation and enhancement should be restricted to health and medical use or expanded?
>> Okay. Since the beginning of humanity, humans have evolved physically and mentally in different dimensions. Human beings have expanded their capacities and perfected themselves, and in this way have also adapted to their environment.
I think improvement and human development in medical or health uses should be a priority, but it's not something exclusive to human development. With an enormous number of current needs driving development, we run the risk of generating a bias in innovation in the future, so international organizations and governments should favor knowledge or research that is helpful to humanity.
>> Yes, yeah, you're right. In some respects, the wheel could be considered a type of augmentation of the human condition, at the very least.
Do you think that we need to develop some of those agreed standards, you know, international standards that everyone should abide by, or do you think we should just let innovators innovate?
>> I think at first it's important to educate authorities, and also in universities, so that future development considers humanity and nature as its basis. It is possible to think about standards, especially in aspects where people are affected by technology, without losing human rights in matters of freedom, okay? So that people can enjoy the freedom it brings, human augmentation must be protected against cyberthreats and governed by certain principles, and this in turn will promote innovation and stimulate the industry, which will without a doubt benefit society. I think that is the challenge that we have, yeah.
>> Educate, then legislate. I hear you; that's an approach I agree with wholeheartedly.
Tristan, same question to you. Medical exoskeletons: you help people walk again. Do you think the technology should be limited to medical use?
>> In fact, we really believe augmentation is acceptable and even desirable in many cases, and that, you know, the primary concern is accessibility. It's not a matter of whether the field of use is leisure or not; it's whether you're introducing inequalities. That's why we innovated on the business model to make our devices as accessible as possible: we have a pay-per-use model, so our clients can book our devices for just one session of an hour at a rate of $150 per hour, which is a lot more accessible than the $150,000 they would have needed to pay to buy a device like ours.
In general, I would say that as a company we have really strong ethical guidelines in our developments, and we only develop applications that are certain to reduce inequalities. While we can't prevent others from misusing technology, we can definitely educate people about the risks of misuse, and that's along the same lines as what Ilya was saying before.
I think it's not our role as technology makers or industry players to self‑regulate. We've seen enough cases where self‑regulation has failed. I think our duty is really to raise awareness to educate the general public about potential pitfalls of technology and that we're developing.
It's up to the general public and the policymakers to understand the stakes at hand and create legal frameworks that help people really benefit from our technology. And I would say it's a question of incentives, you know. If a company is structured with only profit in mind, then it becomes difficult to stick to these ethical guidelines, and that's why we as a company have the ambition to involve our users in the capital structure of the company, so that we really envision having a financially stable, viable company that benefits our users eventually.
>> That brings us to the next element of the discussion. We've talked already about the topic of regulation and where responsibility for oversight might lie, but I would like to broaden the discussion to include stakeholder roles and responsibilities.
What does the landscape for human augmentation look like in the future, and how do we ensure inclusion and accessibility so we're not kick-starting the next big digital divide? Marcelo, what do you envision in the research sector?
>> I think there's a lot of discussion about the ethics of human enhancement, and one thing is clear: there is no general theory to assess the ethics of human enhancement, because there are many different methods of enhancement: pharmaceuticals, drugs, high-tech prosthetic limbs, wearable devices, genetic engineering. So there are different means of augmentation, but then, too, there are many different targets for augmentation: our cognitive capacities, our physical strength, and now our immune system, which is the object of augmentation through vaccination.
For every combination of method of enhancement and target we see lots of different ethical questions. I think many of the questions turn on the role of the state: should the state prohibit this technology and regulate it? But then, too, we should think about whether the state might be expected to provide the technology if it promotes autonomy and equality. This is exactly what's happening now with vaccination, even though, as I said at the beginning, people can have very strong opinions about this technology.
I don't think there's any general ethical theory for the assessment of this topic. We have to bear in mind what kind of combination we want to assess; but then, too, regardless of which theory is deployed, we have to listen to people. I think one further development of this debate, more recently, is that we have realized that people have very, very different perceptions of human augmentation in different countries. This was made very clear by the SIENNA project, a project I had the opportunity to take part in. It was funded by the EU.
But then, too, last year Kaspersky published a report which also made it clear that people in different countries can have very, very different perceptions of cognitive enhancements or enhancements by means of CRISPR; that must be regulated, of course.
These perceptions vary geographically, but, and this has been an interesting topic for me for a long time now, perceptions can also change over time. There has been a lot of discussion about human enhancement, but not much discussion about the history of human enhancement. We think of enhancement as something that's either happening now or going to happen in the future, but as a matter of fact, even though the expression "human enhancement" was not used in the past, there was discussion of human enhancement after the First World War. There was a huge debate, in Germany particularly, where governments thought they might make former soldiers more productive by producing high-tech, new kinds of prosthetic limbs; even from our perspective today, the prosthetic limbs produced at the time look impressive.
And after the Second World War there was enhancement by means of amphetamines. I'm now exploring, together with a colleague, a professor, this unexplored history of human enhancement.
>> Yeah, you just pretty much described the plot line of the bionic man movies; it's all in there too.
It's great to know Eva is going to be with us, and we'll come to you in a moment, Eva. But Marco, what do you think is the main role of private businesses and the technical community in this discussion?
>> I think Tristan already made it quite clear; it's a very nice example, to be honest. Working with high standards towards society, and sustainability as well; these are not mutually exclusive.
I think with these new questions it's about not having the main focus just on money, whatsoever, but educating society, bringing up the discussions, explaining the pitfalls and working with higher values, higher standards. That goes along especially with this technology.
>> What do we have today, and what steps should be undertaken on this level in the next 5 to 10 years, do you think?
>> I think exactly these points we have to develop further: the accessibility we have, and the enablement of responsibilities and rights in human terms, which is again a topic where regulation and policies come in.
I mean, take the example of the security policy we created for augmented people, for third parties working for us or with us. This is like a first step, an example. This is the way we have to go over the next years to ensure that we have all of these topics covered in the right way: not, as I said before, limiting, not making things stricter, but shaping them and enabling accessibility to avoid the potential problems we may have in terms of inequality, or other problems which may divide society. Everything we do on a social level will also have a huge impact further down on the technological level, in terms of how attacks may evolve.
A simple example, with just a few more seconds, to make this a bit clearer.
Having, for example, technologies which are just cheap and available everywhere, but which we pay for with our privacy, our data, our personal lives, is maybe the wrong way. On the other hand, we have this huge risk, which was also shown in our survey: what if only wealthy people have access to certain services? Then we have a split in the technology. This may lead to different criminal activities, like theft, ransom and so on, and we also have to think about this from the full angle, at full scale. It's not just one topic; we have to address the whole topic in order to ensure security and protection at all levels.
>> I'm glad you mentioned the, you know, potential for another digital divide. The other area that concerns me is, obviously, protecting the vulnerable.
You know, we think about ‑‑ in Japan at the moment, they are actually using exoskeletons to help elderly people keep working in heavy lifting environments longer. Well, what if they don't want to keep working in heavy lifting environments longer, and you're in a position where you're pressured by your work to use this equipment? These are all the kinds of questions that I want us to be thinking about.
Now it's my great pleasure to welcome a Member of the European Parliament from Greece, Eva Kaili.
Thank you for joining us. As well as being an MEP, you're part of the European Parliament's policy committee, and you bring a unique perspective on this environment. Could you explain the key issues and how you would see them tackled?
>> Yes, thank you so much. It's a very timely event, so thank you for inviting me.
And I'm happy to be with IGF, and I'm happy to be with you.
I've been working on emerging technologies, and I think this is not so different from what we've been discussing about how to legislate for the digital economy. We need to make sure that our principles and fundamental values will be respected in the use of these technologies, which present different challenges ‑‑ and can actually present extreme challenges. With great innovation comes great responsibility: these technologies can bring a lot of benefits to the health sector, but at the same time we understand that they can give rise to more biohacking. During this pandemic we are also realizing that technologies can bring solutions, but the challenges go beyond borders, and we need to join forces to achieve resilience and defence against these threats.
As more companies start producing these tools, we will, of course, be able to understand the extent of these risks and fears, and we will be able to adapt our current legislation so that people can trust it without compromising privacy, fundamental rights or principles. To have the benefit of these technologies, we need to ensure that we understand how their use could not, at least deliberately, harm people or society, and we need, of course, to constantly assess these technologies and how they affect us.
I will give one example that is really raising concerns for us, and I think it's going to be perhaps the fifth industrial revolution: the development of implantable brain interfaces to treat serious brain diseases in the short term, with the longer‑term goal of human enhancement, whatever this means. It presents challenges, risks and concerns to citizens, because these technologies are made by humans, and we understand that they are not perfect. We might love to have the possibility of expanding our memory capacity, or of learning languages super fast, but this goes vice versa: we could be influenced, manipulated and controlled by external sources and potential threats. So it's really important to develop specific rules for their use, and to make sure that people will have affordable access to these technologies and will immediately feel safe using them.
Since we talk again about people in the loop having control of their data and of the technology, we are constantly developing ethical frameworks for all these technologies. Now AI is on the table, and it's the convergence with other technologies that leads us to the topic of today's discussion. Sometimes we say it's better to be late, so as not to prevent innovation, but in this regard I think we need important safeguards. Even though we're discussing something that looks like science fiction, we already have the potential to use these technologies today, and I think that really should ring alarm bells for us to move fast, and we should not just allow a self‑regulation approach. I think we should do more, and that's why European politicians are working with like‑minded collaborators to set standards that would at least respect what I mentioned, and to establish alliances and guidelines. Thank you.
>> We've heard a range of opinions on the level of involvement that governments should have in regulation. There is an argument about stifling innovation, but for me one of the really important issues is inclusion, and I know that digital inclusion and respect for human rights are very high on your agenda, Eva. What are the responsibilities of governments and civil society in laying out a roadmap to digital inclusion in this area?
>> Well, it depends. When we're discussing inclusion, it means we should avoid exclusion from access to services and benefits. When you talk on a global scale about what exclusion could mean, you can look at China and see that you can be excluded from a basic income, from the main services, from freedom of movement, from privacy and everything that we take for granted in Europe. So I know it might sound difficult to define, but my answer to your question would be that we need a principle‑based approach: we need to find the methodology to ensure that high‑risk applications and technologies follow stricter regulations and rules. This could cause more friction, I agree, but to a larger extent in specific sectors. Everybody thought that GDPR would stifle innovation. It didn't; we have more legal certainty in Europe, and Europe is a huge market. These technologies, again, cross borders, and I hope and feel this is the role for Europe to play: to set the ethical framework and quality of life around these technologies, alongside what humans do already. And, of course, publicly accessible human rights impact assessments for all the uses of these exponential technologies should be carried out and made available to everyone, so that we are able to update our safeguards.
>> Marcelo, is there anything in your academic and research experience that you can bring to the table in terms of what would be a good balance to ensure inclusion, but also ensure you're not stifling innovation with too much regulation?
>> Yeah, it's a real challenge how to benefit from new technologies without stifling them, I should say. I think one special challenge is how to regulate across borders. It was mentioned that different countries have very different perceptions of human augmentation. Chances are, if you regulate some new technology very, very strictly in one country, the upper middle class in that country will cross the border in order to get that kind of service somewhere else. This is already the case with genetic technologies: it may be prohibited in some countries ‑‑ it happens with abortion, but it also happens with genetic engineering or selection. You can have it very strictly regulated, but that will not prevent rich people, or people from other cultures, from going elsewhere and getting the service. For this reason, there has to be some kind of international agreement on how to deal with these cases, but that is, of course, a huge challenge.
>> Yeah, a challenge for every industry, in fact.
Tristan, your exoskeletons are clearly, from your video, a huge life enhancer for people who have lost the use of their limbs to some extent. How would you like to see inclusion regulated within this industry, to make sure technology like that is available to people who perhaps can't afford to go to a mountain for a spot of skiing?
>> Skiing is more an example to show how far the technology can go; it's not the way we envision our product being used daily. We really care about reducing the impact of secondary health issues on people's daily living, and that means bowel function, bladder function ‑‑ these are essential parts of our daily routine that we need access to. I think it's a fundamental right to have the capacity to control your bowel and your bladder, and not to have your bones become brittle or break as soon as you sit for 5 minutes in your chair. Technology like the one we're developing is helping reduce inequalities in this way: we're helping people reduce the impact and burden of physical disability in their daily life. And I think access to healthcare was made possible by innovations that were not technical but institutional, and this is the kind of innovation we need to think of, to look for and to foster. I think it's the same in terms of preventing risks. Consider the business models governing the major companies that are providing services ‑‑ I'm talking about Facebook, for instance. It's the business model of Facebook which is harming people, incentivizing the company to harvest the data and private information of people and sell that data to third parties. That business model is harmful, and I think we have to innovate in technology and also in business model approaches. There is a great deal of new ways of thinking about how to provide technology to people and how to finance it in a way that keeps inequalities bearable for society.
>> Well, we have 3 minutes left, time really does fly, doesn't it?
So, Marco, your question is a little off‑topic, and since we haven't had any interactions from the floor, let's finish very briefly. I would like to ask each of you to make a closing statement, just in a few words: how can we make use of digital technologies to promote more equitable and peaceful societies that are inclusive, resilient and sustainable?
Wojciech, in one sentence, what are the most important issues around this?
>> I believe, in our case, speaking from a worldwide perspective, it's also to encourage companies and state organizations ‑‑ in other words, institutions from the government, any agency, or even the European Parliament ‑‑ to digitize as much as possible, because I believe that digitalization is the way forward in terms of cybersecurity, in terms of identity, because that's something that cannot be lost if properly captured and protected.
>> Great, I'm going to stop you there. Jorge, do you have a closing thought in a few words?
>> Yes, we work under the sustainable development objectives of the United Nations, developing technologies that meet the needs of the person without compromising ‑‑ we think of STEM ‑‑ science, technology, engineering and math ‑‑ plus nature, art and the human ‑‑
>> Safely working, serving the community, together with Kaspersky.
>> Ilya, same to you.
>> Yeah, I think the main point is that technology has to be available to everyone. I think that's the primary issue for making a more peaceful society in the future ‑‑ a more augmented society.
>> Perfect, Tristan?
>> I'd like to see more people involved across the board of all stakeholders. We are going to step into a future that is radically different from now. I'd love to see artists and policymakers step into that future before the technologists do, and I'd highly encourage people from across the board to build that future, not technologists first.
>> Great, Marcelo?
>> I do believe in the power of new technologies to promote dialog and mutual understanding across borders, and I think we are going to need a lot of that to face the new challenges of the near future, such as new pandemics and climate change. I think new technologies are a force for good. Thank you.
>> I agree. Marco?
>> I think the future can only be shaped by all of us together, and also in terms of technologies ‑‑ not excluding technologies but including them ‑‑ because it is clear from today that technologies are essential for the future.
>> Brilliant. And Eva, the final few words to you. We're in overtime right now.
>> Any final thoughts? Okay. I guess, as we are now at 5:30, Eva has probably rushed off to another meeting, I shouldn't wonder. That is, unfortunately, all we have time for. Many thanks to our stakeholders for sharing their time and perspectives, and to Kaspersky.
Thank you for listening; I hope it gave you some food for thought. Let's keep talking. I think we can all agree that the possibilities of human augmentation are incredibly exciting, but we have to remain vigilant, especially about cybersecurity and the ethical risks around embedding technology into our actual bodies. We need to continue to address the increasing need for global cybersecurity regulation for human augmentation, through conversations just like this one.
Enjoy the rest of your sessions here at the IGF. Thank you for attending and have a good day.