
IGF 2020 - Day 7 - OF42 Personal Sovereignty: Digital Trust in the Algorithmic Age

The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 



>> MOIRA PATTERSON: I'm going to get started now, as we are just a few minutes into our slot. So, let's get started. Hello, everybody. My name is Moira Patterson. I'm the Global Market Affairs and Community Engagement Director for the IEEE Standards Association. I'll be moderating the Open Forum today and I'm very happy that you all joined us.

IEEE is the world's largest technical professional organization dedicated to advancing technology for the benefit of humanity.  And we are honored to be here with you today as part of this IGF event.

First, I want to thank the IGF organizers for putting together this timely and impactful IGF under difficult circumstances, no doubt. So, we are very happy to be part of it as well.

I also want to thank the IEEE team supporting this session for all the great support and input. So we would like to welcome you and thank you for joining us today. In a few minutes we'll start this session, but I have a few housekeeping items to mention before we get started.

So first off, all attendees will be muted, and we will be taking questions using the chat and Q&A features in your Zoom application, so click on the relevant buttons in the bottom bar of your interface. Also, the Open Forum is being recorded and will be available on the IGF events page; it should be available as soon as the session ends.

We hope this session will be interactive. We know we have a lot of great attendees who have highly relevant experiences and insights to share, or questions to ask, from different environments, and we encourage such interactions.

So please, again, use the Q&A function throughout this session, and we'll try to answer as many questions, or have as many discussion points, as we can accommodate.

So, to learn a bit more about our audience today, and also to get familiar with the chat function, we just want to put a question in the chat and we ask all of you to type in your answers.  So the question is, what stakeholder group do you represent?  So industry, government, Civil Society, what type of organization are you with?  And what country are you joining us from today?

Just give your answer and we'll read a few of them so we can get a sense of who is on the call with us today.

I am seeing we have someone from Australia today.  Ontario, I believe.  Thank you and welcome.

>> SALMA ABBASI:  And there is another gentleman from Consueloa.

>> MOIRA PATTERSON: Civil Society, et cetera.  U.K.

One of our speakers is from the U.K. but we'll get to that in just a minute.  Excellent.  Thank you for sharing.  And there I see someone from Kenya, Civil Society as well.  Excellent.  Well, thank you all.

As I said, we have one speaker, Dr. Salma in the U.K., and John and I are both based in the U.S.

So we have people from around the world here and we are happy to be here. So let's get started. Today we have two other distinguished Panelists, and I would like to give a brief introduction before we start the discussions. First off we have Dr. Salma Abbasi. She is a technologist, philanthropist and social activist with a proven track record as a visionary leader with dynamic drive and a positive outlook, and she motivates professionals and Civil Society. She is the CEO of the e‑Worldwide Group, an international company focusing on development and security. She is also a former Senior Vice President of Technology and has over 35 years of experience in the fields of technology, innovation for sustainable development and policy development for economic empowerment. Also with us is John Havens. He is Executive Director of the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. He helped define guidelines for beneficial relationships between humans and their increasingly clever machine counterparts. His mission is to educate, train and empower people involved with developing these future technologies in order to implement ethically aligned design and improve outcomes for all. In addition to that, John is also Executive Director of the Council on Extended Intelligence, an interdisciplinary group of experts founded by the IEEE Standards Association and MIT. They champion responsible participant design, data agency and metrics of economic prosperity that prioritize people and the planet over profit and productivity. So you see, these are two very highly accomplished expert participants. I myself am Moira Patterson, and I'm with the IEEE Standards Association. I oversee some key initiatives related to digital inclusion, identity and data governance, as well as looking at children's online and offline experiences.
I also manage strategic initiatives and policy engagements with a focus on Europe and Africa, and engage in capacity‑building activities and strategic engagement with national standardization organizations.

So this is the team you will be hearing from today. And I also want to say that while we are here representing IEEE for this session, the views expressed during the discussion are our own as speakers and not necessarily those of IEEE. Keep that in mind.

And with that, I do want to again thank you for being part of this discussion, and get started with the context. So, in the wake of COVID‑19, a dialogue, and we would say an erroneous dialogue, has emerged about the tensions between privacy and protection in addressing the COVID‑19 pandemic.

So there is an assumption that people need to give up information such as location, biometric or medical data, and that this very sensitive information is really needed to address and combat the pandemic.

But the concern is that key Human Rights can be violated in such a context. So what we really want to discuss is: is there really a tension, and how do we address and balance these needs?

So to get the dialogue started, we will be talking about three things under this topic, and then take questions on each of them. I'll start with the first theme, where we do want to stay with this line of thinking. During COVID‑19, we have seen how digital technologies have been a lifeline for many people. Many of you have experienced this, and we are all here at IGF through this very technology, so I think we see it in a very real way.

So, people have used digital technology as a lifeline around the world. First off, I want to talk about what key messages we have relating to digital technologies, personal digital identities and data governance. From my point of view, ensuring that the conditions of online access enable and preserve our personal agency and dignity in ways that empower people is very important. In the offline, physical world, we have levels of agency and trust that are assumed, that are foundational; those are somewhat disappearing, or can disappear, in the online world, where data can be combined and analyzed in new and potentially invasive ways.

And in IEEE we have a number of activities where we are looking at managing exactly that concern through different governance guidelines and things of that nature. So I want to ask Dr. Salma, how do we reset and address the concerns around surveillance post‑COVID‑19 in the context of sovereignty, identity and data governance?

>> SALMA ABBASI: Thank you very much for inviting me to join this excellent Panel and raise this very relevant question in today's times.

I think that there is no doubt that globally there has been a huge acceleration in how technologies are being used by governments to support us during the mitigation and response to COVID‑19.

Many governments, in Europe in particular, Belgium, Austria, Italy, the U.K. and Germany, are gathering our data through their track and trace systems. Some governments are actually doing this in a very unethical manner; there are examples from Ecuador and Israel where data has been taken. Your medical records, your own personal situation with COVID, are being shared with people in the area where you live so that they can avoid you and the potential threat of spreading the disease.

There is a huge debate at the moment on how is this actually invading your privacy and the doctor/patient ethical Code of Conduct in that space?  What I see is that there is a balance and a conflict here between how much data should be given to address the pandemic and how much data should be protected because we are moving into a very dangerous area of surveillance.

So the question is, how much surveillance is acceptable?  How much privacy do we need to keep for ourselves and how much needs to be given away due to public safety and health concerns?

So policymakers need to recognize that technology and surveillance systems are constructed by the Private Sector. They have their own ethics, their own bias issues, all sorts of conflicts going on, which may not necessarily match what we as private citizens believe to be acceptable. And unintentional errors also occur, which impact us as well. There are all sorts of examples where we will acknowledge that data is being collected by these big giants of the world, tech companies like Google, Facebook, Apple and Amazon, all forming an omnipresent online surveillance architecture. John will probably talk about this as well.

With this, as you know, more and more data is bubbling up every day because of the Internet boom but also because of the pandemic.  Smartphone devices are also being used to gather lots and lots of data and we are unable to really understand exactly how our data is being used.

So it is important for us to build a technical task force that really understands how the data is being collected and how the data is being used, in a more accountable and transparent manner. And we have to actually reset and understand, after the pandemic is over, what the right things to do are, how we protect ourselves, and how states should actually work in this space.

There is a danger, with all the data that is out there, that our behaviors, our attitudes, our moods, our preferences and our social ties are actually being analyzed every day through AI. And I know the work that the IEEE team has been doing on ethically aligned design is helping to bring back ethics in that space. An example I'd like to give is the Government of Estonia. I think that's a gold standard we should look to in the future, because after they had a huge cyberattack in 2007, the government set up a huge infrastructure to protect its security, but in a very transparent way where people knew what data was being collected, how it was going to be used, how it was going to be protected, and which agency was going to use it. So it wasn't in the Private Sector's hands; it was with the government. And that is a very important aspect that we need to address.

I think we have to accept that in the long term as life continues, that there will be need for more data to be collected by governments to keep us safe in the future.

This is just the beginning. Many pandemics are expected in the future. What we need to do is take hold of our data in such a way that we develop a mechanism where we actually own our data. We need to take back the responsibility rather than blindly saying, click, yes, take it. There needs to be ownership and a recognition that we own our data. I think we have to realize that the responsibility isn't just in the hands of the government. It's also in ours.

And there are many, many things that I believe we need to be able to do, but the first is that we need to have some serious discussions with the Private Sector that is collecting our data, and create the kinds of policies and systems that actually allow the individual to create terms of reference for their data: where they decide who has their data, who they are allowing to share that data, when that data will expire, and when it will not be shared. It's almost like the terms of reference we put in place in contracts. And I believe the Estonian government model is a good one to study to take this forward.
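Dr. Abbasi's idea of individual "terms of reference" for data could be sketched as a machine-readable policy the person holds. The following is a minimal, purely illustrative sketch; the field names and grantee labels are invented for this example and do not reflect any real system:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DataGrant:
    """One entry in a person's 'terms of reference' for their data."""
    grantee: str      # who is allowed to hold the data (hypothetical label)
    category: str     # e.g. "location", "medical"
    may_share: bool   # is onward sharing permitted?
    expires: date     # when the permission lapses

def is_permitted(grants, grantee, category, on_day):
    """Check whether a requested use matches an unexpired grant."""
    return any(
        g.grantee == grantee and g.category == category and on_day <= g.expires
        for g in grants
    )

# The individual decides: a health agency may hold medical data until mid-2021,
# with no onward sharing.
grants = [DataGrant("health-agency", "medical", may_share=False,
                    expires=date(2021, 6, 30))]

print(is_permitted(grants, "health-agency", "medical", date(2021, 1, 1)))  # True
print(is_permitted(grants, "ad-network", "medical", date(2021, 1, 1)))     # False
```

The point of the sketch is only that "who, what, and until when" can be expressed as data the individual controls, rather than buried in a provider's Terms and Conditions.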

The question was about surveillance post‑COVID. Look at how data has been misused by companies like Facebook. There is a wonderful example, an actual case, of a gentleman who wanted to sell his own data on eBay. He put 10 years of his data on eBay to sell, and Facebook stopped him and said, wait a minute, our Terms and Conditions say you can't sell your data. You gave it to us. So that is a ridiculous example of what happened. But on the flip side, a U.S. court has actually stopped, fined and sued Facebook for using the data people were putting on Facebook for its AI, and for testing its programs, without their permission.

So the law is trying to catch up. The recent work going on in Europe, in the EU, is also something we'll talk about later. But the first thing I'd like to say is that we need to own our data, and governments need to set policies that actually regain trust with the population. Thank you.

>> MOIRA PATTERSON: Thank you, Salma. And I know John is very passionate about exactly this topic of Data Sovereignty. So John, why don't you share with us why the emergence of Data Sovereignty is so essential, especially in the algorithmic age.

>> JOHN HAVENS: Just to build off of Salma Abbasi, who is a dear friend and brilliant: the phrase "own your data" is a big term, and I completely agree with you, Salma. I think one aspect of ownership, and this is such a beautiful time for all humans to think about it, is: what does it really mean to own your data anyway?

And I want to start metaphorically, because I find that helps my friends who aren't in technology. I know this audience is in technology, but I'm starting here to make the metaphor work before getting more technical.

In one sense, what is the access to the data that you create? And how would you even begin to understand why it's important? So let me give you an example: our health data. Two books ago, I wrote a book called Hacking Happiness, where I did a lot of research into the quantified self movement. These are geeks like me. They wear a Fitbit, they measure their sleep, they measure how many steps they take. They measure this and that. And people would sometimes say to them, why are you measuring everything? Doesn't that take away from your life? And their answer, their mentality, was more: I'm not going to do it every day for the rest of my life. I'm taking a measure of the data that represents my actions and behaviors, which in aggregate really shows the world, at least the digital world, who I am.

In one sense, what we have experienced as humans right now is a situation where other people, because of advertising, the really good opportunities brought by advertising, and governments in terms of safety, literally know more about us than we know about ourselves. Some people can say this is good or bad or whatever else. But IEEE, and me included, although I'm speaking for myself here, I don't like to come from a place of fear. It doesn't get work done.

Advancing technology for humanity, the tagline of IEEE, is how we should frame digital Data Sovereignty. Let's make it about an opportunity first. So one aspect of owning your data is owning the narrative, the understanding, around what it would mean if we had the tools, technologies and policies to do what every person on this call would like to be able to do. When you go to your doctor, let's say you have a child who is sick with something pretty severe, not to be negative, but whatever it is. And you have to see a bunch of different doctors. What a gift it would be, and I can speak for America: right now, today, I have to go to my doctor, and then a surgeon, and then a specialist, and this person, and that's of course if I have access to these amazing people, so I want to be conscious of that. But in general, they all give me this: a piece of paper, right? Cool. Thanks. But what could they give? Now there is the opportunity to give me digital data about those visits.

In aggregate, that means I can start to have what is called portability of my data, which means that instead of a physical piece of paper, the actual data about my child, in that example, starts to be grouped together. Now, all the while, I'm still paying my doctor. I'm happy to pay my doctor, happy to pay my insurance; they deserve that money very much as caregivers. But I have a semblance of control over my child's data, because at the end of the day, I care most about my child. A lot of times, though, conversations about data get kind of hard to understand, even for me, and I have been in this space for 10 years. Data is a big word. Let's make it more personal, because this is how it will work. The way we take ownership of our data is to own the narrative first and understand why this is so important. Your health data: you have to know that when your doctor or your surgeon or trusted whomever gives you data, you can share it with someone easily.
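The portability John describes, each provider handing over a digital record that the patient can aggregate into one history, can be sketched in a few lines. The provider names and record fields below are hypothetical, invented only to illustrate the idea:

```python
# Hypothetical visit records from separate providers, as the patient
# might receive them digitally instead of on paper.
doctor_record  = {"provider": "family-doctor", "date": "2020-09-01",
                  "note": "referral to surgeon"}
surgeon_record = {"provider": "surgeon", "date": "2020-09-15",
                  "note": "pre-op assessment"}

def merge_records(*records):
    """Aggregate per-provider records into one patient-held, portable history,
    ordered chronologically (ISO dates sort correctly as strings)."""
    return sorted(records, key=lambda r: r["date"])

history = merge_records(surgeon_record, doctor_record)
print([r["provider"] for r in history])  # ['family-doctor', 'surgeon']
```

Once the records live in one patient-held structure like this, sharing "the whole history" with the next specialist becomes a single act rather than a stack of paper.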

And this is sometimes as much technical as it is policy or otherwise. But when you start to think about 10 different doctors and 20 surgeons and this and that, it becomes life or death. And the question then becomes: there are all these wonderful technologies that can analyze our activities and behavior for advertising purposes, and that is great. I'm happy that people are tracking my preferences to know what coffee I want or what have you. That's all well and cool. But the question I often ask people is, why do you think those same analytical tools aren't available to people for things like health and the rest of our lives?

There are a lot of answers, and they come from a positive standpoint. Let's just say I don't know, because I don't want to be negative, and someone already pointed out about companies tracking and such. But let me come at it from a technical and expert standpoint. Every single tool used for advertising and tracking, which has become very nuanced over almost a decade, is completely available to the individual. Period. Meaning, it is not science fiction to say we can build personal algorithmic Terms and Conditions for ourselves. We absolutely can.

So let's make this the positive thing, and I'll wrap up this part of my talk: this is such an opportunity to say to the advertising world, and whomever else, thank you for all these beautiful tools and technologies and ways to track us. If I want a better experience with a product I'm interested in, that's cool. And now, to all of those companies: thank you very much for changing your priorities to say, what a gift to the world to bring these tools to the caregivers dealing with COVID, to the environment, to all of these different things. Along with tracking our preferences for what we want to buy, cool, now what a sacrosanct and beautiful time to use all these beautiful technologies for every individual in the world. It sounds impossible, but Facebook and Google have been flying balloons over Africa for five years to provide Wi‑Fi. The technology is there. But the real opportunity is the narrative, and we have to stop saying it's going to be so hard for people to do these different things. It really isn't. The message has to become: the only way we will know who we are, and have agency, identity and dignity, is to go into the immersive era equipped with the tools to speak back to the algorithmic world in the same way we are tracked now. It's a wonderful opportunity and I'm very excited about it, and I'm so happy we can do it, versus just getting angry about things we can't change. We can change this, and things need to change.

>> MOIRA PATTERSON: Thank you, John and Salma, for your perspectives. I'm seeing that there is already some dialogue going on, some agreement with some of these points about owning the narrative, and also owning data and having some guidelines and controls around that. But I'm not seeing any questions yet, so let me move to the second question that we have for discussion, and we'll take questions later. Let me keep moving and move to our next theme.

I think it's actually related to what we just discussed. John, to your point, a lot of these technologies have already been developed. How can we make sure they get leveraged in other contexts to make our lives better, beyond the specific contexts you mentioned? I think that's a good segue into this. So, do you see any changes that are needed in how we fundamentally approach technology development and how we think about solving problems? And one thing I want to mention before I extend the question to the other Panelists is that I'd like to highlight the role of standards in this context, because as a standards organization, IEEE is of course very engaged in this space, and we see the critical role that standards can play in scaling solutions.

So one thing is the role of standards in empowering people, such as standards on digital literacy. IEEE just approved a standard in this space to help measure and create digital literacy frameworks, which then helps empower people with the necessary skills. But even more fundamentally, I would say that human dignity needs to be at the core of our thinking. And building on what was said before, technology should serve people and people's needs at the very basic level, people and their communities. So developing standards to help support that fundamental understanding will, I think, be very important.

And so, I want to ask, related to this question for Dr. Salma, how do we ensure privacy and trust in this environment where data is commercialized in the algorithmic age?

>> SALMA ABBASI: That's a very good question, and as I said before, we have a lot to do ourselves. I think there is no doubt in anybody's mind that today we are turning out, I use this word, trillions of bytes of data; one billion photographs are posted online on Facebook. With smartphones, the Internet of Things, and data being transmitted at high speed across all sorts of networks, there is a mass of data. This is a whole business that has evolved and is evolving faster every day. There are a couple of things I think are really paramount. The first is to acknowledge and understand that this data explosion is there, that the data is indeed a commodity, and that the people creating and generating it are us. So in order to move forward and address the issue of trust, we need to understand that multiple components are at play: with artificial intelligence evolving and the ability for our personal information to be analyzed in nanoseconds or less, through data mining and deep machine learning, our personal information is now being exploited at new levels.

This issue takes a dangerous form when you look at how people are being bombarded and manipulated for specific ends, political or commercial.

First of all, as I said, and as John said as well, we need to agree that we own our data and that there is a new narrative set by us. Trust has to be earned, but we have to demand our rights. And unless we collaborate collectively, governments, the Private Sector and Civil Society, it is very difficult to shift this huge power, which is bigger than governments', if you will. There is an undertone of risk here that is becoming increasingly hostile, that manipulates us in every single way you can think of. And I think we have to rethink how we are going to give our data to business. The Private Sector has to realize that this data is owned by the individual. What they do with it, how they process it, manipulate it and resell it, is something we need to know about, and we need to be responsible and supported by government, with the right regulations in place to protect us. How is it possible that we can do all these interrogations and manipulations of data for propaganda and promotion, but when it comes to analyzing trends in health, we are unable to do it? When it comes to technology for social good, there are delays and no revenue, none of the billions of dollars that companies and corporations are earning. So that has to be pushed by government, with the right accountability, driven by citizens. And I will go back again and say that we have to develop a balanced mechanism that allows our data to be seen and owned by us, and used only with our permission.

That is going to require new laws and new regulations that are enforceable. The regulations need to be enforced. There need to be consequences. Only when there are consequences will anything change. You can't have the big five sitting in front of Congress in America and the government saying, so, you tell us, how should we better regulate you? That doesn't work. So governments need to increase their collaboration with think tanks and with people like IEEE and other organizations and agencies to increase our negotiating power, and to build a more balanced, ethical understanding of how our data is being used and manipulated. Only by demonstrating this, as the Government of Estonia has done, will we be able to rebuild trust, because at the moment, everybody, everybody knows that they are being manipulated.

The examples of targeted messaging and nudging going on commercially toward 9-year-old and 12-year-old children are inexcusable. The children then start to pester their parents. Right now I'm in London, we are in lockdown, and the kids are exposed to the Internet. With the amount of issues going on, it's like someone has opened a fire hose and there is nothing to protect citizens.

So I think governments need to be held squarely and fully accountable, and brought to the table for some serious conversations about this, which shouldn't take three years. It needs to be done in months, if not sooner. And I think one of the regions of the world that is moving in the right direction is the EU. They have made significant progress in this space, and small pockets of activity are happening in the U.S.

And I think the fear and the anger, for instance around facial recognition systems and CCTV cameras, and all the bias in those algorithms, is something that literally forced the governments in California and Oregon to put laws and legislation in place to stop facial recognition in police body cameras. So I know that when we make enough noise, we can pull things back.

And I think that is what has to happen with the balance between the commercialization of our data and the ethical aspect of us owning our data. But we are also responsible for starting to ask the questions and not blindly clicking. We have implemented GDPR in Europe and it's a mess. There is something there, but it is a mess, because people just click and move on. So we need to do something that is practical to follow as well. Thank you.

>> MOIRA PATTERSON: Thank you for those insights, and for sharing so many different examples and highlighting the roles of different actors in this space. John, how can we rethink our definition of societal success to optimize AI systems design, to really ramp this up?

>> JOHN HAVENS: And thank you for the comments people have made in the chat. And forgive me if I'm mispronouncing your name, Gobori (sp); I don't know as much about 5G, so I'll speak to the current question. I'm just not a 5G expert.

>> SALMA ABBASI: I'll address that later.

>> JOHN HAVENS: Who governs the Internet, and does an Internet law from one country ‑‑ a lot of great questions. I'll answer more generally and hopefully address them along the way.

I'm really glad that Salma brought up Estonia. It's a specific country, about 20 years old in its current form, meaning it was part of the Soviet Union. I have been to Estonia twice and I'm a huge fan of the work Estonia has done with Finland. They have a lot of cross‑border work going on, and data is digital, not physical, but the servers where the data lives are a big part of how data happens. I bring this up because we are at this fascinating time in humanity where we are living between the digital and the physical, and we keep using status quo terms about our physical reality, whereas right now, none of us are actually in the same room together. We are a combination of bits and atoms rearranged on the screens you're looking at.

My identity right now is not physical to any of you. It's virtual. So one thing people tend to forget is that we are already in the virtual, immersive world. And there is the opportunity of Data Sovereignty; Salma mentioned the ownership of data. Let me give another metaphor, since I'm in a metaphor mood today, I guess.

In a lot of talk about privacy, people use this phrase: what have you got to hide? If you're focused on privacy. And I'd like to give the answer back: what is ours to reveal? Now, what do I mean by that? I used to work in advertising and PR at top 10 PR firms. Let me hit you up with some exciting advertising facts. When you track someone online, there are a lot of great tools for it. I had clients like Gillette and HP, and on Facebook you pay money by CPM, cost per thousand impressions, I think it is. But basically, if I'm Gillette and I want someone to buy my razor, I can pay a certain amount, a cost of acquisition, to get the data about them that will help me get them to buy my razor.
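For reference, CPM (cost per mille, per thousand impressions) pricing is exactly the arithmetic it sounds like. A toy calculation, assuming a flat rate per thousand impressions (the numbers are illustrative, not real Facebook rates):

```python
def campaign_cost(impressions, cpm_rate):
    """Total spend for a campaign priced at `cpm_rate` dollars
    per 1,000 ad impressions."""
    return impressions / 1000 * cpm_rate

# e.g. 250,000 impressions at a hypothetical $4 CPM
print(campaign_cost(250_000, 4.0))  # 1000.0
```

The cost of acquisition John mentions is a different metric, spend divided by the number of customers actually acquired, but it is built on the same impression-level accounting.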

Now in one sense, this is fine. It's like hanging a sign outside the store when you walk by: hey, I have razors. Bob, buy my razor. However, the difference now is how this is all aggregated, and how the data about a person gives information well beyond the shave: are they male? Where do they live? What are their preferences? What is a portrait of who they are? What I'm getting at is, the actual job of a good advertiser is to get to know someone and hit them up with an offer that will bring value to their lives. So that's not wrong or evil or whatever; that is why, with morality and ethics, I'm careful how I use those terms. It's just what it is. This was 10 years ago.

Now cut to the present. For any company or organization there are two things: there is building trust, and there is the sustainability of your business. I bring this up because you can buy data from data brokers, who are not those organizations; they are third parties. Portraits ‑‑ are you all right, Moira? I thought maybe ‑‑ okay. Portraits of who is coming in, to answer: how do I get this big group of people to pay for my stuff?

Those portraits are where I have my most concern, because as was pointed out, we don't own the list.  In terms of owning our data, there are the ones and the zeros that represent the code of our data, but the portraits of who we are, these are the analysis and insights about us.  This is what is valuable.  Why?  Because I want John to buy something.  I have this picture of who he is.  I introduce ads into his life, digital or virtual, and he'll click and buy something.  That is the whole system.  Before we get into the economics, and I'm not disagreeing with Salma about the legislative side of things, here is what smart modern businesses know.

Pretty soon the individual is not needed anymore to track me.  You want to know what I ate a month ago?  I have no idea.  But 20 other people do, and you can buy a list that says what I ate six months ago.  But eventually, if you want me, the man, John, to trust you, you actually have to create systems where I know that you are reaching out to me directly.  It's an old-school, peer-to-peer mindset of the general store.  If Salma walks in and I know she likes black coffee with one sugar: here is your coffee.  Now she trusts me.  Why?  Because she has asked for something, I provided it, she trusts the framework, and there it is.

That trust, those frameworks ‑‑ whatever the physical instantiation of the metaphor ‑‑ the general store doesn't exist anymore.

This is why we have no trust, or at least the real potential of an erosion of trust: tracking us from the outside in is fine as far as it goes, that's how governments protect us, but without data-sovereign tools, we as individuals can't say back to the world: this is who I am.  And here is why this is so critical, eventually, almost now, at least in the States with the current election we just had: if you're not allowed ‑‑ if you don't have a platform to simply say things like, I'm a Democrat, I'm a Republican.

This is how I want to vote.  And you have a way to know that when you press the button and share that information, it goes through, say, blockchain or a trusted data governance structure, so that you're saying it in a trusted way.  This is what Estonia does so beautifully.  Without that, it actually means democracy doesn't exist.  And I'm not talking about how news gets to people and all those things.  I mean, literally, the channels in the immersive world: when virtual reality takes over, if right now Moira said, we are going to take a vote about whatever we just talked about, I have to be able to just press a button and know this is going to go right to Moira, like an individual IM versus going to the whole group.  It's about clarity and technical exactitude.  This is why, at IEEE, the standards I'm working on focus on this.

Hold on, before we get into these incredibly important, difficult questions of privacy and whatever else: how do we sustain business?  We sustain business because brands need to be able to know they can talk directly to their customers.  Right now they can't.  If I wanted to know about Salma, the current logic is: hold on, Moira, do you know about Salma?  And then ask 40 other people to see if she likes a certain beverage.  You know what I can do instead?  I can ask Salma directly: do you like this beverage?  And that's cool.  It saves a ton of money too.  So brands who are progressive and smart and want to move beyond the status quo are realizing that along with these critically important questions of privacy, which are essential and seminal, they also need to embrace data channels: actual two-way trust established through blockchain or similar types of channels.  Otherwise we literally will have no way to know, and our customers certainly won't know, how we will be trustworthy, because we haven't built the channels to try to understand how to hear and listen to them directly.

And on legislation, by the way, I'm agreeing with some of the folks in the chat.  Like, what should governments do?  How should legislation work?  What I have found in the past five years, instead of getting angry, because it's easy to get angry and frustrated, is to tell businesses: let me ask you something.  Do you believe what you're saying about trust?  Then you have to build the data-sovereign channels to allow your customers to speak back to you, or it doesn't make a difference that you think you're being trustworthy.  You have not built the tools, and empowered them with the tools, to actually answer you back.  So what you're saying, being responsible in my design, my intentions are to build trust: those things are wonderful.  That's cool.  Thanks.  But you know what you can also do?  Take the same structure you have had in place for a decade to track advertising, learn about data sovereignty, and, news flash, you'll save a ton of money.  Cost of acquisition is so much easier when you can just go to Salma: do you like this coffee beverage?  No, I like tea.  Cost of acquisition once the tools are set up: zero.
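As a rough illustration of the cost-of-acquisition arithmetic being described here, a minimal sketch follows.  All figures, function names, and the amortization model are hypothetical, chosen only to make the comparison concrete; they are not from the session.

```python
# Hypothetical sketch of the comparison in the discussion above:
# acquiring a customer through tracked CPM advertising versus
# asking a consenting customer directly over an established channel.
# Every number here is made up for illustration.

def tracked_acquisition_cost(cpm_usd: float, impressions: int, conversions: int) -> float:
    """Cost per acquired customer when paying CPM (cost per thousand impressions)."""
    ad_spend = cpm_usd * impressions / 1000
    return ad_spend / conversions

def direct_acquisition_cost(channel_setup_usd: float, customers_asked: int) -> float:
    """Once a consented, peer-to-peer data channel exists, the marginal
    cost of simply asking each customer approaches zero."""
    return channel_setup_usd / customers_asked

# Example: $5 CPM, 1,000,000 impressions, 500 buyers -> $10.00 per customer.
print(tracked_acquisition_cost(5.0, 1_000_000, 500))   # 10.0

# Example: a one-time $1,000 channel setup amortized over 100,000 customers.
print(direct_acquisition_cost(1_000, 100_000))         # 0.01
```

The point of the sketch is only the shape of the argument: the tracked path pays per impression on every campaign, while the direct path pays a fixed setup cost whose per-customer share shrinks toward zero.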

And now I don't have to send her ads that cost me tens of thousands of dollars, and it goes into that brand ‑‑ and I'll wrap up.  The brand reached out to me and we had a peer-to-peer data exchange.  They did what they said they were going to do, and then and only then is when trust exists.  Only then, because Salma gets to say they did what they said they were going to do.  They followed up on their action.  Now she has the opportunity to trust, because she is being given agency, dignity, and the tools to do so.

>> MOIRA PATTERSON: Thank you, John.  And I think those are really great concrete examples of how the technology can work, and can work to really create a win-win situation.  So let me quickly jump to the two questions we have.  And I'll start actually with the second one.  Catherine is saying that government regulation is needed, but that especially in the Global South it may be difficult to strengthen the enforcement of legislation.  So one thing I want to mention, and I'll ask Dr. Salma to say a sentence or two about that because I know she works in this space: at IEEE we are actually working on a program where we are looking to create a standardization strategy.  Again, standards can be tools that help translate principles and requirements to make them implementable by communities.  We have a program where we are working with a lot of institutions, from the African centre organization to Smart Africa, and one of the key things is to develop a strategy and then a roadmap, specifically in the context of the industrial revolution, which is very relevant in this context.  The goal is to give governments tools to access this knowledge, and concrete tools.  I know other organizations are doing similar things, and blueprints are being developed for governments as well, which may be helpful in this context.  Please feel free to e-mail us later if you want more information, but I'll ask Dr. Salma to say a word about that and also to address the 5G question.

>> SALMA ABBASI: Thank you.  Before we started the Panel, I was talking about how difficult it is for countries, particularly in Africa and the least Developed Countries, whose governments are being pushed by the Private Sector and by development agencies to accelerate their broadband.  And this morning at 4:00, I was on a webinar with CIS countries and I was talking to them about my experiences in Africa.  And I believe that one of the most important things we have been doing with the IEEE for the last 9 months now is collaborating on a series of targeted webinars across Asia-Pacific on how to build digital resilience and mitigate COVID-19, basically the spread of the pandemic, and support the response.  And in that context, we discovered many, many huge, systemic gaps.  Hopefully we'll be launching a program in a couple of weeks to address those gaps through the missing standards that Moira mentioned.

During the course of these webinar discussions, where countries shared their experiences, policies, regulations, challenges, and testing, tracking and tracing methods, they also shared how regulations were modified because of the pandemic to maintain trust.  This was South Korea and other countries in Asia-Pacific.  What we established in June was a collaboration between Asia-Pacific and Africa.  We built a bridge of south-to-south collaboration to accelerate the adoption of the standards, policies and regulations that were working there, because I'm very sensitive, as a person who works heavily in Africa, particularly in Nigeria and in Ghana, to the fact that we are all quick to adopt technologies when we haven't got maturity in our infrastructures.  So the only thing I can say to you is, as Moira was saying: share knowledge from other countries, and adopt and localize what works for you, to make sure that your citizens are actually protected and you're not going to be exploited by technology companies who are going to come in and take everybody's data.  But I think this is a very good topic for future discussion in this area.

As for 5G, there have been huge concerns.  I remember in the early days, some of you may remember, there used to be ‑‑ this is the size of my speaker at the moment, but it used to be the size of a Motorola mobile phone.  And I used to have terrible migraines in the 80s and 90s, and I went to see my doctor, who said: I don't know what is wrong with you; I think it's your mobile phone.

There used to be a massive headache on the same side I used my phone.  And now, with Wi-Fi, everybody wants Wi-Fi, so there is no public space you can go to where you're not being X-rayed.  You tell me.  The proof is in the pudding.  I think there is a lot of research going on in that area, but the evidence is weak at the moment.  And I think that there are consequences to health with technology.  And as John was saying, the advancement of humanity through technology is what we are looking for ‑‑ what the IEEE is working on.  I'm running over, but I have so much to say.  I'm going to put my e-mail down in here so we can chat later.  Sorry, back to you.

>> MOIRA PATTERSON: These conversations are so interesting that we have taken a little bit of extra time.  I will ask us to contain our passion and keep it to two minutes, because I know we want to highlight these last themes; they are important, and we can always continue via e-mail later.  So the last thing is looking into the future.  What do you view as one of the key technology developments that you think will impact and enable personal sovereignty and the online-offline experience?  And one thing that I would like to highlight is children, and really looking at the needs of children in this space.  Recognizing and acknowledging that we need to have solutions, services and products for them that take into consideration their evolving capacities and their needs: the fact that they are growing up in an online-offline world and that they have different capacities and different maturity levels.  And then we need to balance the need to protect them with the need to let them thrive in this space, and look at holistic and inclusive ways to design these tools.  So that is a space where IEEE is looking to make an impact, working with global stakeholders, including Dr. Salma, who is also passionate about this topic.

So with that, Dr. Salma, let me hand it over to you and what is one key development that you think is going to be critical in protecting the free will in a digital society?

>> SALMA ABBASI: I'll try to say this very quickly.  I think there needs to be a paradigm shift in our view of our data.  We need to reappropriate our data and stop giving it away for free.  As I said before, we upload billions of photos to Facebook a day.  We are the original producers.  They turn around and sell that data for billions of dollars, and we who are the producers get zero.  Until we get this into our heads, we won't be safe.  We have to understand that human beings are actually being degraded; we are just objects to be commercially exploited by the hyper-capitalism running through these Internet companies that are so-called making our lives easy.

The first thing, one of the first things we have to do: stop being lazy.  Don't just keep swallowing it.  We need to push back and say: wait a minute.  What that means, and how that translates, is that we have to turn around and say that we own our data.  Reappropriate it.  I need to give you one example.  I love examples.  There is an American personal data collection company called Acxiom.  I'll put the name in the chat.  It advertises a 360-degree view of your customer.  They gather our data, our behavior, our status.  In the 80s, these were things HR could never ask: Are you married?  How many kids do you have?  This company knows your kids, your job, your hobbies, how much money you make, so they can target you, and they have 70 categories to measure you.  And if you don't fall into the 70 categories, they throw you away.  The individual goes into a dump pile.  It's called waste.  We have analyzed this to the nth degree.  You cannot imagine what it is like reading what it says about us.  It's an awful situation to be in, and we don't want our children to be that way, where they are going to be exploited.

We need to bring back the monetization and the ownership of our data, just like it's an asset.  If any of you own houses, you have a physical property asset; look at your data that way.  If we have that paradigm shift and we own our data, we will be able to say: these are the Terms and Conditions.

This is the contract I'm making with you.  I'm allowing you to use my data.  As John said, people frighten you by saying: so what do you want to hide?  My name is Salma Abbasi.  I could be a terrorist.  What am I trying to hide?  It's ridiculous.  We should not be pushed into guilt, into thinking we have something to hide.  Excuse me: my privacy, whatever I do, is my business.  What I wish to share with you is my choice.  If I want John to be able to buy me a cup of tea, it's my business with John.  I don't like coffee.  That's it.  That's my business for my friend to know.

So we need to be able to understand and define the contractual terms, particularly for our children, to whom we freely say: here is a tablet, play with it, go quietly in the corner.  We don't see what they are doing on Facebook and other social media.  We need to stop giving away our information to these AI grinders, with their algorithms that are trying to process and analyze and predict our behavior, so they can sell to us or manipulate our minds.

We need to understand, in these Terms and Conditions, who we will be giving our information to and how freely we want to give it.  In John's case, he wants to give freely to all hospitals, all medical companies.  How long should they have it for, and what should be the price our data is sold at?  And most importantly, a termination clause.  This is how we can regain the trust, because we will know that after six months, or three months after the pandemic is over, all of our data will be destroyed.  As a final point, I'm going to talk about what is happening in Europe with the AI committee.
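The contract terms listed above (recipients, purpose, price, duration, and a termination clause) can be sketched as a simple data structure.  This is a hypothetical illustration only; no such standard contract format was named in the session, and all names and values below are invented.

```python
# A hypothetical sketch of the personal data-use contract described above:
# who may receive the data, for what purpose, at what price, for how long,
# and with an explicit termination clause. Illustrative only.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class DataUseContract:
    owner: str                # the person the data is about
    recipients: list[str]     # who may receive the data
    purpose: str              # what it may be used for
    price_usd: float          # what the data is sold at (0 = freely given)
    granted_on: date          # when the grant begins
    duration: timedelta       # how long they may hold the data

    def termination_date(self) -> date:
        """The termination clause: when the data must be destroyed."""
        return self.granted_on + self.duration

# Example: health data given freely to hospitals for six months.
contract = DataUseContract(
    owner="John",
    recipients=["all hospitals", "medical research companies"],
    purpose="medical research",
    price_usd=0.0,
    granted_on=date(2020, 11, 17),
    duration=timedelta(days=180),
)
print(contract.termination_date())  # 2021-05-16
```

The design choice here is simply that every term Salma enumerates becomes an explicit, machine-readable field, so "we will know when our data will be destroyed" is a computable property rather than a promise.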

The former President of the European Parliament, Martin Schulz, laid the foundation to engage multiple stakeholders to discuss and build resilience around our data.  And we need to just wake up and participate.  The IEEE is an excellent platform for discussion, and on the question of who owns the Internet: we do.  We are the users of the Internet.  This Internet Governance Forum is a perfect platform to voice, consolidate and collaborate today, to actually make sure we can protect our children from the invisible dangers on the Internet.  It's much more dangerous and risky than we realize to have our children online in their bedrooms.  We are out of time.  I'll stop there.  Back to you.

>> MOIRA PATTERSON: And thank you for all this extra context, and thank you for answering the questions along the way too.  Well done.  Let's give John a chance to talk about extended reality, which I know is a topic near and dear to his heart, and which is getting a lot of attention and opportunity in the data space.

>> JOHN HAVENS: 60 seconds?  We are at 2:30.  Just to build on what Salma said: complete yes.  And I think it's more and more an urgency; it's binary.  And all the stuff that Salma said stands.  We can get angry, and it's true, but extended reality is the idea of the immersive web, and a lot of people probably know what it means.  We are already in it, right now.  Right now.  Hello.  I'm not in the room with you.  This is extended reality.  During COVID, how many hours a week do you all spend here?  Many gamers right now in Second Life or other games are spending 50 to 60 hours a week in games.  And soon, when our glasses have augmented reality and we see digital on top of whatever we look at, pretty soon our physical reality may not be the main place where we spend our conscious lives.  So there is a binary choice here.  Either the system continues to track us, and that's it, and all the, let's say what it is, negative stuff only expands.  It only becomes more concerning.

Or, what a beautiful opportunity to simply say, before you put that headset on, that Oculus Rift headset that is owned by Facebook, because Facebook owns Oculus: hold on.  All the great stuff about that game, you'll still have it, but here are the digital Terms and Conditions that you can write for yourself, with your family, with your kids, that simply say: these are my Terms and Conditions.  And the government is doing their stuff.  Businesses want to sell.  Cool.  When you put on that headset, the second something happens, where maybe whatever actor or algorithm does something that you or your child weren't wanting, there is simply a way to say: hold on, these are my Terms and Conditions.  It doesn't mean they'll be honored.  Ownership is a big, complex discussion.  But the point is, you're in the mix.  So the dream for me is that we are in the mix, because either we are or we aren't.  That's the extended reality: we either extend the economic and all the other realities in which we as humans are not fully present, or we say once and for all, to Salma's point: here are our rights.  Instead of it coming ‑‑ I'm not saying that about you, because I shouldn't rage.  But hey, everyone else, what age is it?  5 years old?  4 years old?  The tracking isn't working for all of us, because it's mainly working for a very small few.

And the technology is so amazing.  We have all these great standards.  IEEE is doing so much good work.  This is where we need all the partners to simply say: we can either be angry or we can just get the work done.  And the work is protecting our kids, honoring our dignity, taking all the tools that are available to advertisers and a certain small group of society, and saying: thank you for developing all this stuff.  Now give it to us as well.  That's it.  It's actually not that hard.  It really isn't.  The hard part is the ideological, political and all these other struggles.  And the demand can also come from: okay, everybody, give us what we need to protect our kids well.  Extended reality will force that, because either it will be complete loss of everything, agency, et cetera, or complete gain.  So I'm working for option B.  And thank you for the opportunity to be here today.

>> MOIRA PATTERSON: Thank you, John, and for ending on that positive note.  I want to reiterate a few things as we wrap up.  If you were interested in any of the topics or activities that John, Dr. Salma or I highlighted, contact us.  We all put our e-mails in the chat, and there are many ways to get engaged.  IEEE is looking to make these things implementable and practical: helping create the solutions and being able to scale them and roll them out broadly.  That is part of putting us in the mix, as John said, and helping build trust.  So feel free to reach out to us.  I saw one question on child online safety, and I know John already highlighted that IEEE is doing work in that space.  So reach out to us.  And Dr. Salma is also one of the leaders of the child online protection work, so reach out to her on that.

And with all of that, I want to again thank everybody for attending, for your time and interest and your questions.  I want to thank our Panelists, the IGF people, and the IEEE team supporting us and coordinating to make this event a success.  So thank you for your time.  Feel free to reach out, and also feel free to check out our booth online, which I believe has also been shared in the chat.  Looking forward to connecting in the future, everybody.  Thank you.



Contact Information

United Nations
Secretariat of the Internet Governance Forum (IGF)

Villa Le Bocage
Palais des Nations,
CH-1211 Geneva 10

igf [at] un [dot] org
+41 (0) 229 173 411