IGF 2019 – Day 2 – Raum II – WS #261 Equitable data governance that empowers the public

The following are the outputs of the real-time captioning taken during the Fourteenth Annual Meeting of the Internet Governance Forum (IGF) in Berlin, Germany, from 25 to 29 November 2019. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 



>> So maybe we will wait two or three minutes beyond 9:30 because it is the first session.  Who is the presenter?  Pretty simple even for idiots like me.  And this is ‑‑ this you control?  So we're good.  Oh, look at that.  Thanks very much.  Thanks.

>> We are going to wait one or two more minutes because it is the first session of the day.  I am expecting many more people to come.  I will see if I'm right.

>> Is there a hashtag for this session?

>> That's a great point.  Yes, there is.  We are going to make it up on the spot.

>> (Off microphone)

>> PHILLIP DAWSON:  Is there?  Okay.  WS261.  And, you know, it just rolls off the tongue.  And then maybe data trusts, the first two words up here: data trusts.  So that's the first order of business, sorting out the hashtags.  We are going to make this a little less formal.  Everyone is in the center.  So that's good.  So I will begin by introducing myself.  We have had a bit of a change.  My name is Phillip Dawson.  I work at a company called Element AI, based in Montreal.  We had a couple of changes in the last week, where the three panelists who were on the program all canceled.  So then it became just me.  That's something you will have to live with.  And then, very kindly, Dr. Roja Seward and Mr. Philippe Rodriguez from the Government of Canada, who are familiar with the topic and also interested in it personally, have agreed to work with me in the session.  It was originally organized as a panel; instead, what we are going to do is have more of an interactive presentation on data governance and specifically data trusts, and then open it up for question and answer and hear what you have to say about some of the points that come up.  So maybe I'll ask Roja to introduce herself, who she works for and her interests.

>> Hi.  I'm a senior program officer in the technology and innovation program at the International Development Research Centre, and the IDRC has been working on data-related issues for over a decade, from open data to big data.  Now we work a lot on Artificial Intelligence and different kinds of Human Rights approaches to data governance.  And we work in the Global South.  IDRC seeds and works on developing research ecosystems and policy influence in countries across Africa, Asia, Latin America and the Middle East, which is actually where I'm based.  We look forward to discussing this topic.  It is a small seed in what we are thinking of, but I'm curious to hear from all of you, whether you have an interest in it and whether you are doing any work on this.  Yep.

>> Hello.  Philippe Rodriguez.  I'm a senior policy advisor in the Government of Canada.  I have an interesting position in that I spend some of my time at our foreign affairs department, Global Affairs Canada, where I work at the Center for International Digital Policy.  We have worked quite extensively on AI and Human Rights.  We have worked on the Global Partnership on Artificial Intelligence and worked extensively with the Freedom Online Coalition on developing normative language around AI, and a few other things.  The rest of my time is spent at our Privy Council Office, which is the part of our Government that serves our cabinet and the Prime Minister.  Right now, as you may know, Canada went through a change of Government, and the questions surrounding data governance and digital governance are front and center in terms of the new mandate.  We are here to discuss this important tool that is the data trust, but also to get information in terms of how to shape the direction of our policy going forward.

>> PHILLIP DAWSON:  Thanks very much.  I want to get a sense of who is here.  We won't go through introductions because that would take quite awhile.  But who is here from Governments?  Just one.  Anybody from a political party?  One political party and one Government.  How about Civil Society Organizations?  A few more.  Private sector?  Okay.  And what about academia?  Okay.  Pretty balanced group.  So I'm going to hold you to account for this.  We now have an official multi‑stakeholder meeting.

Okay.  We are going to start off really basic.  What words come to mind when you think about data?  I'm going to get a couple of the easy ones out of the way first.  I think privacy is one.  What are some other ones?  You can put your hand up or say something from your seats.  What are some of the words that you think about when you think about data today?  Protection.  Transparency.  Resource.  In the back.  Open data sharing.  Free.  Okay.  Yeah.  Over here.  Bias and control.  What about power?  Power is another one, yeah?  Any other words?  What about data governance?  Some of the same words apply.  Anything else?  Data leaking.  Data economy.  Value.  Policy.  Data ownership.  Decision.  Decision‑making processes.  Anything else?  For me power comes back again.  Does this help?  Data governance is a very complex constellation of structures and people and outside forces: the structures we use to make decisions about how data is handled.  And you notice in this image you can see either augmented or virtual reality headsets or blindfolds, right?

So let's go back.  The structures we use to make decisions about how data is collected, managed, shared, used and deleted.  So that's the basic understanding of data governance that we are going to go with.  It is very simple: the structures we use to make decisions about how data is collected, managed, shared, used and also deleted.  Does anybody have something to add there that maybe we are missing?  No.  Okay.  So how does it work?  In the real world, how does data governance work today?  What's going on?  Who is in charge of data governance?  Well, mostly, so far, without us, and this is a slide from one of the panelists who was supposed to be here, Sean McDonald, and his organization Digital Public.  How does data governance work today in the real world?  Mostly without us, and what comes with that is that we often don't really know.  Some of you may have seen the story in the last week that California is making $50 million from selling drivers' information.  It includes people's names, addresses and other personal information, sold to generate revenue without their permission.  This is one example.  What about Project Nightingale?  We see more and more of these all the time.  This is an instance in which a whistleblower informed the Wall Street Journal that Google had acquired the intimate medical records of 50 million patients, information that was not deidentified, without the knowledge of the doctors or patients involved, for purposes not entirely known.  And the investigation is still unfolding.  This is an example I heard recently at a conference, where a group of people had set up a support group on Facebook.  What they all shared in common was a predisposition to cancer, a genetic disorder that they learned about when they were young, and it affected their lives in different ways.
And they had come together on Facebook to share information about their experiences, often very sensitive medical information.  The last few slides illustrate the extent to which, in this context as well, this particular group of people had very little control over their sensitive personal information once it was on Facebook, and the question of what they do now that they have been sharing that information on the platform.

Another word I think about when I think about data and data governance is accountability.  How do you hold the companies that govern data accountable?  You will remember in the last year or so that Facebook refused to respond to subpoenas in a variety of jurisdictions, here in California and also in Canada, to testify about the events concerning Cambridge Analytica.  Succession: we talked about resources, assets, data as bringing value.  What happens when there are no provisions for what happens to data after an entity dissolves, or for an entity without much resources?  This is a case where a huge amount of research data, collected over five to ten years (I forget exactly the number of years), disappeared, and this is research that had been publicly funded.  It ceased to exist after a not‑for‑profit dissolved.  What happens to the succession of data as an asset?  So the current framework is largely governed by terms and conditions, and all of us are familiar with that exercise, clicking through terms and conditions, and are familiar with some of the problems.

So first, they are nonnegotiable.  It is a take-it-or-leave-it situation.  You either accept the terms, and then you have access to the service, or you choose another service provider.  That's how it works.  And the terms and conditions, we know, are very long and complex.  One of the problems of this situation is that there is no other model out there.  Part of the reason there is no other model is that there is not much competition in the technology sector, especially among the large platforms, but there is also not a lot of competition in terms of innovation around data governance.  It is kind of a race to the bottom.  So there is no real choice other than to succumb to the current framework, which revolves around terms and conditions that are lengthy and complex.  You may be familiar with the study from 2008 that reported 244 hours is the time required for the average individual to go through all the terms and conditions that they might engage with in a given year.  You can imagine how that has scaled since then and how it will continue to scale.  We often talk about the problem of privacy self‑management, and it is a burden that individuals should no longer be expected to carry.

There was another statistic that reported that a third of the terms and conditions of Fortune 500 companies require a post‑graduate degree to understand.  There is length and complexity to understanding how our data is collected, how it is used, how it is shared, and, I'm not sure how often, how it is deleted; we put that in there just to be comprehensive.  So the current framework gives people little control over their personal information.  It leaves people too vulnerable to privacy and other Human Rights abuses.  It concentrates data in the hands of a few companies and leaves a lack of representation in decisions that are being made about the use of our data.  Often there is an exclusion from value.

We talk often about targeted advertisements that generate an enormous amount of value as one of the principal ways that data extraction produces value, value that has nothing to do with the individuals who have participated in generating the data, and a lack of accountability.  These are just some of the problems in the current approach to data governance, which is based on terms and conditions.  The companies have become regulators of personal information in all but name.

So I'll just read this for you.  This goes back to the problem of privacy self‑management and being unable, in the current circumstances, to anticipate different risks.  The complexity and opacity of information flows makes it virtually impossible for individuals to discern, much less self‑manage, the risks, or the rights that are engaged, when consenting to the use of their personal data.  We are not able to understand the current or future context of decisions that relate to the use of our data, and this exposes us to great risk.  Privacy is one of them and security breach is another, but also, more broadly, other Human Rights issues including discrimination.

So we need new approaches, right?  We need to start innovating on data governance.  And this isn't going to happen without a collective, a group of people from different constituencies, discussing and thinking out loud about how they would like to see the future of data governance evolve.  These are just some of the things that we are looking for, kind of responses to some of the problems in the current framework.  We want greater control.  We want to address the asymmetries of power that exist and also enable other people, the public, to share in the value of data.  So these are just some of the aspirations for new approaches to data governance.

So this is a shot of a workshop that was held at the company I work for, Element AI, about a year ago, where we did just that.  We took advantage of hosting a large international meeting at our office, actually the G7 Digital Ministers' meeting, to invite a bunch of different people, both attendees of that meeting and people from the Montreal community, for two days to think about a new approach to data governance based in trust law: the data trust.  We went through a variety of scenarios after reviewing some of the legal foundations of this new form of data management, and came up with what we thought, based on some of the research of the experts who were there, Sean McDonald, who was to be a panelist today, being one of them, and Sylvie Delacroix, another panelist who could not make it.

We took some of the current research on data trusts and tried to tease out how it might be applied in different contexts.  Data trusts: what's a data trust?  Who here has been following some of the data trust conversation?  Who has heard about how they have been applied?  I want to know what you think a data trust is.  Sir.  No?  Okay.  No.  Anybody else?  It does.

>> Audience:  (Off microphone)

>> PHILLIP DAWSON:  It sure does depend.  Yeah.  Yeah.  Okay.  Some of you might be familiar: one of the most visible discussions in the news about data trusts occurred in the city of Toronto, as Sidewalk Labs, an Alphabet affiliate, is one of the contractors of Waterfront Toronto, a Government entity that put out a bid for the redevelopment of part of the waterfront.  Sidewalk Labs said the way we will manage the data collection and use in the context of this project is through an urban data trust.  A lot of that sounded kind of good on the outside.  Some of the proposals were great, responsible data use guidelines and impact assessments for invasive data collection, but they stopped short of other components of a data trust, including trust law, a key component whose absence attracted a lot of criticism, for one.  And there was an absence of meaningful consultation on the proposal.  So it might have created another top‑down situation in which a large technology company defined the terms of what this data trust could be.

Okay?  So the Sidewalk Labs example is a key one.  It certainly illustrated the fact that a data trust is not a solution in and of itself: how you construct it, who is involved in constructing it, all of these things will shape whether or not it is effective and serves some of the purposes that we just discussed.  Does anybody else have another example of something they have read about data trusts, or an idea of what a data trust might be?  Okay.  So this is a very, very basic definition.  Somebody actually told me recently that it is still not very basic.  But a data trust creates a legal way, or a structure, that helps with the management of data rights for a purpose that is valuable to a beneficiary.  And we come back to the point that a lot will depend on who the beneficiary is, who is doing the management, who is engaged in managing the access and use of the data.  This is the basic definition that we are going to go with.  It just creates a new way, rooted in trust law, that enables us to manage data rights for a purpose that's valuable to a beneficiary.

I hope you can see that.  But we will go into a little more detail.  Specifically, it is the application of the common law trust to the management of data or data rights.  And this is a key point, because there are instances in other jurisdictions, such as the United Kingdom, where data trusts have been talked about a lot, and they are being talked about in ways that do not include any reference to trust law.  The way we are talking about it today is specifically rooted in trust law and leverages components of trust law, including fiduciary obligations, to enhance the obligations of a trustee who is managing the data.

So there is an asset that's put in the trust; here we are talking about data and data rights.  A trustee would have a fiduciary duty, which could include a duty of care and a duty of loyalty, and these are other concepts that need to be unpacked: a fiduciary duty to manage assets for the benefit of an ascertainable person or group in accordance with the terms and purpose of the trust.  The purpose for which the trust has been created is also very important, and in combination with the fiduciary duty it gives a kind of proactive, forward-looking responsibility to the trustee, who is then managing the data on behalf of the beneficiaries with a kind of ongoing, purpose-driven risk management approach, ensuring that their interests are looked out for.  The trustee negotiates access and use on behalf of beneficiaries, and there is built‑in accountability because, and this is just a feature of trust law, you have the personal liability of the trustee.  So you have direct accountability and built‑in legal mechanisms to keep the trustee accountable.  You also have a collective action mechanism.  There have been extraordinary advances in comprehensive data protection legislation in the last few years; the GDPR is the leading example there.  But still, data rights are exercised individually.  You can imagine the difference that might be created if together we were able to pool our interests through a trust and exercise greater leverage vis-à-vis a data controller.  If a data trustee could withdraw data rights for a group of 5, 10 or 50 million people from a platform, that might have more sway than if I made an individual request to access some of my data or to port it from one company to another.
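The mechanics just described, an asset settled into a trust, a trustee bound to a stated purpose, and collective withdrawal of pooled rights, can be sketched as a toy model in code.  This is purely illustrative; it is not legal advice or a real system, and every class, method and name below is invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class DataRight:
    """A right over some data, e.g. the right to grant or revoke access."""
    subject: str       # the person the data relates to
    description: str   # e.g. "ride-sharing trip history"

@dataclass
class DataTrust:
    purpose: str       # the purpose the trust was created for
    trustee: str       # the trustee is personally liable for breaches of duty
    beneficiaries: set = field(default_factory=set)
    rights: list = field(default_factory=list)
    grants: dict = field(default_factory=dict)  # data controller -> agreed terms

    def settle(self, right: DataRight) -> None:
        """Pool an individual's data rights into the trust."""
        self.rights.append(right)
        self.beneficiaries.add(right.subject)

    def grant_access(self, controller: str, terms: str) -> None:
        """Trustee negotiates access on behalf of ALL beneficiaries at once;
        a crude fiduciary check: the terms must serve the stated purpose."""
        if self.purpose.lower() not in terms.lower():
            raise ValueError("terms must serve the trust's stated purpose")
        self.grants[controller] = terms

    def revoke_access(self, controller: str) -> None:
        """Collective leverage: one decision withdraws every pooled right."""
        self.grants.pop(controller, None)

# Two beneficiaries pool their rights; the trustee grants, then revokes, access.
trust = DataTrust(purpose="sustainable transport", trustee="Independent Trustee Org")
trust.settle(DataRight("alice", "trip history"))
trust.settle(DataRight("bob", "trip history"))
trust.grant_access("RideCo", "aggregate analysis for sustainable transport research")
trust.revoke_access("RideCo")  # withdraws access for all beneficiaries at once
```

The point of the sketch is the asymmetry it reverses: individuals settle rights one by one, but granting and revoking happens collectively, through a single trustee decision constrained by the trust's purpose.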

Okay.  Let's look at just two examples, or theories, of data trusts that have been talked about.  We touched a little bit on this first one already, or at least on what some of the critics of the Sidewalk Labs urban data trust were advocating for: civic data trusts.  This would be a data trust that incorporates civic participation directly into the trustee organization.  You can imagine that community groups or city officials or other stakeholders, outside of the government entity or the Sidewalk Labs vendor in this case, would be participating in decisions about the use of the data.  Maybe the department for urban transport that's going to be involved in some of the decisions about how the data is used could be represented on the board of trustees.  This also illustrates the point that was made earlier: the composition of the data trust, how it is created, has a huge impact on what type of decisions are made and what type of value is extracted from the data.

So this also promotes public interest and accountability in decisions about the collection and use of data, and has an impact on the sharing of value from the data.  Those are the basic features of a civic data trust; we often think of public sector projects, municipal projects, but it could be community‑based projects.  The core features here are representation on the board of trustees extending beyond the data controller.  The second theory is something called the bottom‑up data trust.  This is conceived as a way of returning the power that stems from aggregated data to individuals.  Imagine all of us got together today and pooled our data into a trust.  Let's say it is the data generated from our rides ‑‑ none of us take Uber here, but pick your ride‑sharing service; maybe some of us do, I do sometimes.  We pool the data that's collected from our ride‑sharing trips into a trust that does not belong to the company.  Then maybe we would set up the trust with the purpose of supporting urban transport or sustainable transport in the city of Berlin, or maybe we make the data accessible to other cities that don't have access to the type of data collection platform that ride‑sharing companies create.

Okay.  So the bottom line is that you're empowered to pool data into a trust that would champion a social or economic benefit of your choosing.  The data trustee acts as an independent intermediary that negotiates the terms of data collection and use.  This is the part that addresses some concerns about privacy self‑management, the fact that we can't individually do this on our own anymore and be effective at it.  So maybe we need some type of professional data trustee, or trustee organization, or a group of them, to help us manage this burden.  And then we also have the advantage of pooled or shared interests, which helps the data trustee exert more leverage.  In this particular conception, and this is credited to Sylvie Delacroix, there is an ecosystem of data trusts, not just one.  You have to have a competitive environment where data trusts compete around the value they bring to people, the manner in which they are able to champion a social or economic benefit of your choosing.  The viability of this type of ecosystem is something that needs to be explored, but this is the ideal: an ecosystem of data trusts in which data subjects could choose a trust that reflects their aspirations and be able to switch trusts when needed, so that you are not beholden to a particular trust.  And if the trustee does something that you do not think conforms with your interests or the terms of the trust, you can leave.
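The ecosystem idea, where data subjects choose among competing trusts and switch when a trust no longer reflects their aspirations, could be sketched like this.  Again, this is a toy illustration with invented names, not a proposal for how such a registry would actually work.

```python
# A toy registry of competing data trusts.  A subject picks the trust whose
# stated purpose matches their aspiration, and can switch at any time.
trusts = {
    "GreenTransit Trust": "sustainable transport",
    "OpenHealth Trust": "medical research",
}

memberships = {}  # subject -> name of the trust currently holding their rights

def join(subject: str, aspiration: str) -> str:
    """Choose the trust whose stated purpose matches the subject's aspiration."""
    for name, purpose in trusts.items():
        if purpose == aspiration:
            memberships[subject] = name
            return name
    raise LookupError("no trust matches this aspiration")

def switch(subject: str, new_aspiration: str) -> str:
    """Withdraw pooled rights from the current trust and move to another."""
    memberships.pop(subject, None)  # leave the old trust
    return join(subject, new_aspiration)

join("alice", "sustainable transport")
switch("alice", "medical research")  # alice's aspirations changed; she moves
```

The exit mechanism is the key design choice: because membership is revocable, trusts would have to compete on the value and fidelity they offer, which is exactly the competitive pressure missing from the current terms-and-conditions framework.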

Okay.  We talked about Smart Cities.  We will go quickly through these.  These are some examples that came up at the workshop we held at our office.  So the trust is a separate entity from the companies involved in the project; it is a new entity whose purpose is managing the data collected from the project.  It is a participatory model of data governance, so you get other stakeholders involved in the decision‑making process.  And, as mentioned before, the representative nature of the trustees has a huge impact on trust in the usual sense of the word, and on the legitimacy of the data trust itself.  We talked about health data, and some interesting challenges came up.  You can imagine there is an interest in increasing access to data for research, and people have different interests when it comes to health data.  Some people who have been quite ill may be more predisposed to sharing data because they have maybe a stronger sense of their interest in advancing research for a particular disease.  That could create an incentive to share data, but it could also predispose this particular group to a certain vulnerability, right?  Whereas healthier people may not be as interested in participating in data trusts.  It was an issue that raised some of the challenges that come with giving control, or greater control, over data to people, especially in the context of very sensitive data.

Some of the challenges also come from shared data provenance, in medical imaging for example.  I go in for some type of scan.  I have participated in generating the data, but so has the doctor who performed the scan.  And if you are in a public health care system, maybe the Government has an interest in the data that has been generated.  You have this shared data situation: if the data is in a trust, who has authority to withdraw rights to access that data?  Do all three groups have to agree?  Can they?  Even if they are all on the board of trustees?  This illustrates one of the complexities of thinking about new approaches to governing data, because data is not something that is held exclusively by one person.  It is usually some concept of shared rights over data that is in play.  We talked about portability and erasure rights, and I already mentioned variable patient participation.  Online platforms: we talked about this briefly already in the example of ride sharing, so I won't go into it as much.

Often when we get to talking about online platforms, people say: how are you going to get a company like Uber or a company like Facebook to not collect their own data, but rather have it pooled into a trust?  Well, I think ride sharing creates an interesting situation, where these companies have to obtain a certificate to operate in the jurisdiction where they want to.  Pooling data in a trust could be made a condition of the licensing process, right?  So I think that's a little more feasible.  For social media platforms, I would encourage you to look into some legislation that has been proposed in the U.S., I think in the Senate.  One bill is called the ACCESS Act, and they are starting to talk about different intermediaries that would assist the public in managing data, or act as an interface between the public and platforms.  That is a very similar concept to what we are talking about with data trusts.

Implementation challenges: we have already gone into a few of them.  Maybe I will stop and ask if there are any questions on some of the things that we have been going through.  I have been going quickly.  Any questions?  Everything is crystal clear.  Oh, there is a question.

>> AUDIENCE:  (Off microphone)

>> PHILLIP DAWSON:  Yeah, right.  So I guess my reaction to that would be to say that the structures of data trusts, or of any data governance model, may not be dependent on whatever value comes from the data: who gets a say in some of the decisions made, the purposes for which it might be used.  We don't really know the value of the data that is being collected from us through all the applications on our phones.  We don't know that.  Yes.  Okay.

>> AUDIENCE:  Which direction is the information going?  Say, from Facebook to the trust?  Or do you give your information to the trust, and the trust gives permission to Facebook?

>> PHILLIP DAWSON:  Yes.  This is obviously still theory.  But the way I conceive of it, the data would go directly to the trust, and a platform like Facebook negotiates its own access with the trust based on the terms that, for instance, you and I have agreed to, or established, in the trust.

Okay.  So a platform like Facebook would have a license to access the data according to terms that respect our trust and its purpose, and our trustee would be able to revoke access to the data for a breach of the terms.

>> AUDIENCE:  There is talk about the value of data, which, as you just mentioned, nobody knows.  Are there any people thinking about trusts for the purposes of remuneration, for whatever value the data may have, for whoever wants it?

>> PHILLIP DAWSON:  I'm not sure about that.

>> AUDIENCE:  It is a hot topic.

>> PHILLIP DAWSON:  I would like to chat more with you about that.

>> AUDIENCE:  We have "benefit" from the data trust and then "purpose."  They seem like quite simple words.

>> PHILLIP DAWSON:  Very complex.

>> AUDIENCE:  If you can explain it.  The idea is that what's most valuable is scale: wouldn't it be amazing if a million people joined a data trust?  But then the beneficiaries become a group.  The fact is that we are not placing obligations on the members of the trust, yet trustees are going to be making big decisions which are not binary, correct or incorrect, but judgments.  Those are some of the challenges that I see.  As trusts get more useful and bigger, it makes the structure of the trusts harder to manage.

>> PHILLIP DAWSON:  Yeah, I agree.  One more question, and then I will try to respond to the others.  If you want to respond to either of these questions, please feel free.  Yes, sir.

>> AUDIENCE:  Just following up on the platform question, there are a lot of assumptions going into this idea of data trusts: the idea that basically the individual should be the focus of all this, that it should be about individual privacy and responsibility.  That puts pressure on the individual.  A lot of research has been done showing that people don't understand these things.  I would argue that my 75‑year‑old mother shouldn't have to handle these things.  The other way you can look at this is as a collective issue, because my personal consent is not necessarily legitimate when handing over my data affects others.  In a sense I think we should question whether the whole idea of the data trust, as opposed to some other things, a trust built on an identified group, is the way to go.  I don't know if, for instance, it is a good idea to leave kind of statistical collection up to the private sector and have multiple competing agencies.

>> PHILLIP DAWSON:  Yeah, I can respond to that.  Maybe we will go first to the first question.  Absolutely, I think a lot of it will depend, and I think some of these norms are in flux and I'm not sure which way they will go, on the level of granularity or specificity of decisions about our data.  Right now there is none: we have absolutely no idea what is happening, or any way to exercise some control other than as a collective, right?  An example I think about sometimes is when you choose a portfolio to invest your money.  Maybe I'm opening up a huge can of worms here, and Philippe, maybe you can respond to that, but you don't always know exactly all of the decisions that are being made.  You get reports about what happens, what type of transactions are happening in a portfolio, and how it respects the terms of the portfolio that you are involved in.  But there are also technological solutions that would have to be created, and some already exist in terms of managing permissions and access.  This is something that is happening: high levels of automation are being used to help manage that.  If you think of the identity management field, it is evolving rapidly and also using AI to learn differentiated uses or requests for access.  That could be applied to, let's say, a hybrid trustee that is part automation, part AI, maybe for the low hanging fruit, the low value transactions that clearly meet certain criteria, with flags going out for different transactions that humans would triage.  So I agree with you, this is one of the implementation challenges that we would be coming to.  Do you want to jump in?
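The "hybrid trustee" idea mentioned here, automation clearing the low-value requests that clearly meet pre-set criteria while humans triage the rest, might look something like the following sketch.  All criteria, categories and names are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    requester: str
    purpose: str
    data_categories: list   # e.g. ["trip_counts"] or ["medical_records"]
    identified: bool        # does the request involve identifiable data?

# Criteria that human trustees have pre-approved; anything outside them
# is escalated rather than auto-decided.
LOW_RISK_CATEGORIES = {"aggregate_statistics", "trip_counts"}
APPROVED_PURPOSES = {"urban planning", "academic research"}

def triage(request: AccessRequest) -> str:
    """Auto-approve only requests that clearly meet every pre-set criterion;
    everything else is flagged for a human trustee's judgment."""
    if (not request.identified
            and set(request.data_categories) <= LOW_RISK_CATEGORIES
            and request.purpose in APPROVED_PURPOSES):
        return "auto-approved"
    return "escalate to human trustee"

# A low-risk aggregate request clears automatically; a sensitive,
# identified request is routed to the human trustees.
low_risk = AccessRequest("city", "urban planning", ["trip_counts"], False)
sensitive = AccessRequest("insurer", "risk scoring", ["medical_records"], True)
print(triage(low_risk))
print(triage(sensitive))
```

Note the conservative default: the automated layer can only ever approve within a whitelist set by humans, never deny or approve novel cases on its own, which keeps the trustee's fiduciary judgment in the loop for anything ambiguous.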

>> Sure.  So, first of all, I think it is useful to remember that there is a relatively large body of law on trusts that can help in terms of this decision making.  A general concept that comes to mind there is the notion of due diligence.  There will be mistakes made, and some individuals may be unhappy about some decisions made on their behalf, but the reality is that when you manage the data of a multitude of individuals, usually you will have some checks and balances, some accountability mechanisms, to ensure some kind of due diligence.  In the context of data trusts it is still kind of unclear what this would look like, and there are a lot of questions, at different levels of technicality, that would have to be answered.  But the idea in a more theoretical setting like this, and it goes back to your question, is to ask what are some of the underlying principles that we would want to shape data governance, and to see whether data trusts can be a vehicle through which these principles can be implemented.  And I think there we can learn a little bit from Internet Governance, actually.  It seems to me, looking at this presentation and hearing the discussion, that there are three principles that come out of it that could be useful to drive the design and implementation of the trusts, and I would say that they are actually very similar to those we usually use when we talk about Internet governance.  The first, as you mentioned, is Human Rights: rights-respecting, so grounded in law, in order to allow for freedom from discrimination, due process and, once again, due diligence.  I think that would be one of the ways to ground the development of data governance and data trusts.  The second is multi‑stakeholder participation.  As you pointed out, data governance as it is practiced now is usually driven either by industry alone or by Government alone, which creates problems in and of itself.
And then a third one, which actually comes from the multi‑stakeholder lessons learned in the last little while, is that you can be multi‑stakeholder and still be extremely elitist.  You can have top‑down multi‑stakeholder governance.  We have seen the professionalization of Civil Society Organizations that may not represent the wider audience.  So a third principle there could be something like inclusion, and what that actually entails would, once again, have to be determined.  But I think those are three principles that could help shape the way we think about governance, and about trusts more specifically.  Thank you.

>> PHILLIP DAWSON:  Just to respond to your point about whether some of these situations should be handled by private trusts or if there is a role for public trusts: that's one of the questions.  It goes back to who should be setting up the trust and the trustee, and who we trust to do that.  Right?  So take the public statistics agency of whichever country you are from.  Should data that's collected by that agency be coming from a private data trust?  I would tend to think not.  But it would be case by case, depending on who you trust to manage the data.  And I think it is often not going to be just one entity, like Philippe Andre said.

>> I will add to that.  What we find interesting about the data trust model is that it could act in the public interest of the members of the group.  I think of refugees.  Refugees, for instance, are right now in many situations giving up their biometric information with no protection at all.  It is being sold or used by a lot of different actors, and none of those people have any control over that.  So imagine a situation where a public interest trust could be set up, where there would have to be some controls on how the data is being used ‑‑ so that it is not being picked up by private sector security organizations doing all kinds of nefarious things.  I think of the trust as being something that would act in the public interest, so that it is not up to the individual to always be making constant decisions about where this is going.

>> Yeah, I think you have a question now.

>> AUDIENCE:  Just to clarify that.  It basically comes down to who is going to set the rules.  So this is why I'm kind of disappointed to hear a representative of a Democratic Government talk about it as either driven by industry or by Government, when it could be a multi‑stakeholder approach ‑‑ as if Government were this kind of thing that's isolated from everything.  I have worked in Government, and Government comes up with policies by consulting various stakeholders.  The question comes down to who is going to make the decision.  Should it be, for instance, in your case, Government sitting down with Facebook and Google to negotiate the terms of the trust, which is kind of what you get in multi‑stakeholder situations?  Or should it be Government basically surveying all the various interests and then legislating their definition of public interest?  Because it comes down to who defines the public interest.

>> Thank you for the question.  I agree with you.  The concern coming from the multi‑stakeholder model is more related to the fact that, yes, in Canada this might work, but not every country may have the capacity to understand, or rather to implement, a trust that would work in practice.  And actually in Canada it would be a lot easier to go through a Government‑led process, in part because of one of the main challenges related to trusts, which is cost.  One of the issues that we see, at least from a Canadian perspective, is that with data trusts the most fortunate would have great trustees working in their interest ‑‑ but what happens to those who cannot afford a trust?  What happens to the most vulnerable, those who don't understand anything about their data and may give it to a trustee that might not work in their interest?  So in the context of a society with a robust Government that will do these checks and balances, that will engage with Civil Society and with its population, yes.  But there is a broader concern from, let's say, a global perspective: if we implement that model, is there any possibility that it would be used against the public in other contexts, and how do we fight against that?  That's kind of the challenge there ‑‑ you want to implement something that works domestically, but how do you implement it beyond your borders?

>> PHILLIP DAWSON:  There was a question here.

>> AUDIENCE:  The green party solution would be to go for decentralized structures, but with two differences.  One is that you need to give that trust multiple layers.  So you have a policy board to make decisions, an administrative board or organization, and a technical layer ‑‑ like the separation of powers in a democracy.  That's a helpful structure.  And the second thing is that the fiduciary function is not the only one.  The structure suggests also some notary function: to act as notary of the process, certifying that the process complies with the law.  And then you can scale that model ‑‑ from a very small centralized pool for just some super light groups up to a huge nationwide setup.  You can scale it in every dimension.

>> Yes, thank you very much.

>> AUDIENCE:  I think it goes back to ‑‑ there are some different approaches.  I am wondering, because we conduct our entire daily life online in terms of things we need to do, whether it is realistic for us to accept this every time, even for something minor.  This sets up a whole other layer deciding for us.  Isn't it preferable that companies who hold data face much clearer laws, or have fiduciary responsibilities assigned directly to the company that is out there collecting it, rather than creating a whole other layer?

>> PHILLIP DAWSON:  That's a good point.

>> AUDIENCE:  To get the court's ruling on something that sounds just fundamentally unethical, and the court would say you didn't get their permission ‑‑ there is going to be a step before that, where certain things are clearly not allowed.

>> PHILLIP DAWSON:  That's something that has been talked about a lot for platforms, especially in the United States.  What if we make the platforms fiduciaries?  The idea is that, given the relationship platforms have with Internet users, they have a de facto fiduciary obligation towards them.  So I think there could be some positives from that.  But I do know that some people who have criticized that approach point to the impossible situation a platform could be in: a fiduciary obligation to you and me, and a duty to their shareholders.  And that could place the duties in conflict.

>> AUDIENCE:  Wouldn't a bank be like that?  Banks have fiduciary duties.

>> PHILLIP DAWSON:  That's a good point.  It depends on precisely what is meant by fiduciary duty.  There is a duty of care, and that is more similar to what banks have: a duty to do no harm to their clients as part of their fiduciary duty.  But they don't have the type of positive duty of loyalty that would mean acting in your best interest according to, say, the terms of a trust.  So that's where the conversation lies, and it is an interesting idea that is still being unpacked.  Yeah.  Okay.  Any other questions?  We have a couple more topics to go through.  Over here, yeah.

>> AUDIENCE:  (Off microphone)  Put the information, all this is taken ‑‑ also they are ‑‑ the person on the other side do not allow them to play and continue playing.  But continue offline all the time.  At the end they ‑‑ take all the money aside this visa.  When you tell me ‑‑ if I'm going to trust it, I have to see you like that.  How can we trust a machine with Big Data and then the other side ‑‑ so I think we have to find something that will give ‑‑ it would make the boundary less so we can see each other from a point of trust that will secure your underage or whoever.  And ‑‑ latest ‑‑ which could not stop anyone.  So I think ‑‑ the trust ‑‑ put this point ‑‑

>> PHILLIP DAWSON:  That's a very good point.  What happens to data that minors participate in generating, or to data about minors?  I think there is existing legislation in place that could help us understand the rights of minors in relation to data, but another point you mention is really just the overall need for transparency.  If you think of data trusts ‑‑ one of the things that we were chatting about before the start is what are some of the fears, what could happen with data trusts.  Could they turn out to be shelters?  Trust law has been used in the past to shelter money and other assets; could data trusts ultimately just exist as a shelter for some of the worst data practices you could imagine?  That's where your point about transparency is extremely important.  I know that in some jurisdictions, including Canada, the view is that to use data trusts in society you would have to have some type of statutory oversight capacity or enforcement, so that you would be able to audit the practices of a data trust, or at least understand what those practices are, and then ensure that they are complying with the law.

Okay.  So some of these we talked about already.  Choosing a trustee: this is not so much an implementation challenge as a recognition that data has become extremely political.  What is a legitimate trustee, and who should be a trustee, is something that is negotiated ‑‑ ideally through a Democratic process.  It is not clear who should be responsible for managing the data.  So far we have large companies doing this alone behind closed doors.  We want to bring transparency to that process and get other actors involved, but we don't always agree on who should be the trustee or part of that trustee organization.  That's just a consideration.  Nature and scope of fiduciary duties: we touched on that a little bit.  What is the duty of care?  What is the duty of loyalty?  What level of intensity of fiduciary duty is appropriate?  These are common law concepts.  They have functional equivalents in civil law jurisdictions, but are they understood in the same terms?  How do we talk about them if this is going to be something transnational?  Scale of data transactions: this was mentioned.  What are we going to do ‑‑ just replicate a new burden of privacy self‑management, where we have to choose our participation in hundreds of different trusts and manage how each trust is using our data?  Regardless of what data governance approach you take, trust model or not, this is something you will have to reckon with: the scale of choices about data access and use is increasing every day.  So, as I mentioned, people are starting to research hybrid models, where a level of automation is combined with humans who also act as trustees ‑‑ or even personal AI trustees.

Now, this is thinking far into the future, and it also comes with some of the challenges that Philippe alluded to.  What are the costs of that?  Is it something that the Government might provide as a baseline, for free, as part of citizenship?  These are some of the questions that have to be answered and explored.  And how do you deal with, you know, walking into a smart store that has sensors all over the place?  If we walk in together and the settings of my personal AI trustee on my phone are different from Philippe Andre's, is the technology able to capture that ‑‑ to communicate our different preferences and ensure that my information is completely deidentified in a way that is not exactly the same as his, if his preferences are different?
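To make that smart‑store scenario concrete, here is a minimal, hypothetical sketch ‑‑ the policy fields, defaults and names are my own assumptions, not any real protocol ‑‑ of a sensor consulting each visitor's personal trustee policy before deciding what it may record:

```python
from dataclasses import dataclass

@dataclass
class TrusteePolicy:
    allow_collection: bool
    require_deidentification: bool

def capture(visitor_id: str, policies: dict) -> str:
    # Default to the most protective setting when no policy is broadcast.
    policy = policies.get(visitor_id, TrusteePolicy(False, True))
    if not policy.allow_collection:
        return "discard"
    return "store-deidentified" if policy.require_deidentification else "store-raw"

policies = {
    "phillip": TrusteePolicy(allow_collection=True, require_deidentification=True),
    "philippe": TrusteePolicy(allow_collection=True, require_deidentification=False),
}
print(capture("phillip", policies))   # store-deidentified
print(capture("philippe", policies))  # store-raw
print(capture("stranger", policies))  # discard
```

The key design choice the transcript gestures at is the default: an unknown visitor with no trustee policy is discarded rather than recorded.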

Regardless of whether it is a trust or not, these are some of the challenges that new approaches to data governance will have to address if we are to move beyond the status quo.  We talked about jurisdictional harmonization, common law and civil law.  Technical architecture is not really my forte, but we talked about automation and personal AI trustees.  Some people have said it is time for the data trust legal and governance literature to merge with some of the federated solutions ‑‑ not necessarily trusts ‑‑ like Tim Berners‑Lee's Solid project, where there is a distributed network.  So these are some of the questions that are being explored.  And here is a final quote from an article that a colleague and I wrote recently for this year's edition of APC's GISWatch: if the data trust agenda appears ambitious, this is as much an indication of data trusts' promising features as it is a reflection of the public's aspirations for data governance in the Digital Age.  This is a work in progress.  But these are the things that are important to us: representation in decisions about how data is collected, shared and used; shared rights in data, so that we can also participate in extracting value from it or choosing how that value is distributed; real accountability, remedy and compensation.  These are some of the needs that we think data trusts can respond to.  Trust or not, these are some of the things that we want, and I think that continues to guide the research on data trusts.

Any more questions?  Or comments from ‑‑ questions or comments?

>> AUDIENCE:  Suppose that we are individuals and we have given our data to a data trust.  Our data is by now completely spread over the Internet.  So will these companies also give data to the data trust?  Will they be mandated to give data to the trust, so that we can regain autonomy over this data that the companies now have?

And the second question would be: do you have any idea how you are deciding the size of these trusts?  Because it is going to be a huge dataset if it is set up across nations globally.  Are we going to make them national trusts, regional trusts, company based?  So what are the ideas on the size of trusts?  And are companies going to give the data back to these trusts?

>> PHILLIP DAWSON:  The first question is about the companies, the current large data collectors: are they collecting the data directly, or is it going to a trust?  Is that the question?

>> AUDIENCE:  (Off microphone)

>> PHILLIP DAWSON:  Are they going to give back the data they already have?  That's a very good question.  I don't know that I can answer it.  I'm not sure.  Any action like that would obviously require legislation, and I'm not sure that that's what is being contemplated in the Access Act.  If anybody is aware of more details of that piece of legislation, please feel free to comment.  But I think the idea is that there would be an intermediary that manages the use of the data by the platform, in a nonuniform way.  So it may also depend on what you want the platform to do for you.  As with one of the questions that came up before ‑‑ if you were here at the beginning of the presentation, there was the Facebook group that established itself so that members could share about some of the sensitive medical issues they were all going through.  If I were part of that group, I would like to be able, when I'm creating the group on Facebook, to indicate that the data collected in this group is not going to reside with Facebook, but in a trust ‑‑ let's say a trust of our choosing.  Maybe there is a lot of other information that I don't care as much about, that I'm a little more comfortable having a company like Facebook have access to.  Maybe it could be shared access.  So there is no single answer; these are some of the answers that we all need to participate in defining.  And there was another question ‑‑ oh, the size.  Again, I don't think there is a set answer there.  I could conceive of trusts that are local, that don't have many participants, depending on what project they are for.
If it is related to some community project in the specific area of Montreal that I live in, I don't know that there would be a set cap; it would probably be bounded to the people living inside that community, with maybe some exceptions for other stakeholders ‑‑ but I think it would be highly context specific.  And then maybe there could be transnational data trusts.  But in terms of setting a cap, or ensuring you have sufficient participation for it to be worthwhile, those are some good questions.

>> AUDIENCE:  Because I am wondering about the decision making there.  How will they make decisions?  I understand the (Off microphone)  But then even the board, et cetera ‑‑ how do they decide?  How will they ensure that each individual gets the chance to make decisions for themselves, if it is such a huge size and some people disagree with some decisions?

>> PHILLIP DAWSON:  We talked a lot about trust law, but other important features of this type of structure would come from normal corporate governance.  We do have ways for millions of shareholders to participate in the governance of a company even though they are not on the board of directors.  So you could think of trust law applied to the normal corporate governance structures of a typical corporation.  I think that could be one way.  Any other questions?  Gentleman over here.
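As a toy illustration of that shareholder‑style participation ‑‑ hypothetical names and a bare simple‑majority rule, nothing more ‑‑ beneficiaries could vote on a trust policy much as shareholders vote on a resolution without sitting on the board:

```python
from collections import Counter

def tally(votes: dict) -> tuple:
    """Return the leading option and whether it reached a simple majority."""
    counts = Counter(votes.values())
    option, n = counts.most_common(1)[0]
    return option, n > len(votes) / 2

# Each beneficiary casts one vote on a proposed data-sharing policy.
votes = {"alice": "yes", "bob": "yes", "carol": "no"}
print(tally(votes))  # ('yes', True)
```

Real corporate governance is of course far richer (proxies, quorums, supermajorities); the point is only that large-membership decision making has well-worn mechanisms a trust could borrow.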

>> AUDIENCE:  Thank you for the information that was shared.  One of the things that led us to data trusts is accountability ‑‑ being accountable for the data or information that I am presenting.  Rather than starting with regulation and governance, it would be much better to start with it as a culture: don't publish something unless it is correct or reliable.  Is it practical to say, for example ‑‑ as we have certified accounts today on social networks like Twitter and Instagram, where a tag says this is a certified account ‑‑ can we have a tag for a trusted source of data, so that whatever is published from there references its sources correctly, something like that?  Does an idea like that look practical, rather than jumping into regulation or Government?

>> PHILLIP DAWSON:  Yeah, and part of the attractiveness of a solution like this is that it could be a private law mechanism for the governance of data, without a whole lot of need for new legislation or regulation, which, as we know, can be pretty difficult to come by.  It may be something we can use in the meantime to govern new, data‑driven projects.  It may not immediately address problems that come from platforms or existing data controller situations, but for new projects of our own that are data driven, we can create these ourselves.

Another thing you alluded to is the importance of industry standards, whether about the provenance of data or data quality.  These are efforts that are underway in parallel to the data trust conversations, a little bit all over the world: how data is accessed or shared, or even metrics for what counts as a trustworthy or explainable AI solution.  All of this, I think, is part of the enabling environment for any data governance model ‑‑ or frankly for enterprise data governance within a company or a community.  So looking to things we can do on our own, through private law or standard creation, I think there may be a lot of value in that.  Philippe Andre.

>> You mentioned the notion of culture.  Trusts can be useful in that sense, given that you can have a trustee that reflects the cultural reality of a certain community and acts upon it.  So if we want to take a cultural approach to this issue, I think trusts can be an interesting mechanism.  At least in the Canadian context, that's one of the reasons we are interested in the concept: having trustees that can represent various different parts of society can be a useful way to deal with some of the tensions that can exist in a broader data governance structure.

>> PHILLIP DAWSON:  I think we had this gentleman over here and a question over here.  Who was first?

>> AUDIENCE:  I wonder about a few things.  First of all, you mentioned that one of the advantages of this is that we can rely on basically private law ‑‑ in other words, contracts ‑‑ in order to handle these things.  But that actually removes a lot of the accountability, because you don't have to go through legislation, and legislation is where you have accountability in a Democratic system, or at least one of the areas where you have Democratic accountability.  When you put things in private law, it may not be a big deal for a contract between two companies, but when you have a contract that regulates how people's data is used, that means that the recourse has to be through the courts, which can be time consuming and expensive, and you are vulnerable to what's in the contract, which is actually probably harder to change than legislation.  So there are some problems there, and that's one of the concerns we actually see in the Toronto project.  And on the idea with respect to multiculturalism, and how it would be great to have these Councils because they would be representative: in a representative democracy you already have that with respect to Government.  It is Government's job to represent the people ‑‑ in Canada's case, for instance, our multiculturalism.  So I wonder whether this is just kind of a way of dancing around the issue of who is going to be making the decisions.  Can data trusts work in a situation where a country has poor governance?  I don't think they can ‑‑ or rather, they will work just as well as the overall society works.  It is still the same people and the same institutions that are involved.  And so the idea that in Canada we have to export data trusts because other countries have bad governance ‑‑ I don't buy that.

>> PHILLIP DAWSON:  Okay.  Maybe just in response to the first part of your question, or some of the comments: I didn't mean to suggest that data trusts in private law are a silver bullet that in and of itself could replace Democratic processes such as legislation and regulation.  Certainly all of those would still continue to apply and, as Philippe Andre said, many of them should be used to condition the governance of data trusts ‑‑ whether you are talking about privacy, Human Rights or other theories of agency.  But trusts also have built‑in accountability structures beyond lawsuits, whether it is arbitration or something else.  So there are some possibilities beyond just going through the courts, which, I would agree with you as a former litigator, can take a long time ‑‑ you can wait two years for a court date, sometimes longer than passing or changing regulation.  This is not to suggest that data trusts should be pursued as a unique solution, or that there is no need for new legislation or regulation.

Lots of countries are in the process of considering amendments to their privacy legislation, for example to bring it in line with GDPR‑like provisions.  I think that's extremely important too, and that's also a Democratic process.  But I'm not sure to what extent that can cover everything, or whether we can rely on legislation and regulation alone to be the way that we govern data.  It may be that other data governance models, including data trusts, fill some of the gaps as part of this entire governance ecosystem for data.  That's a little more like how I see it.  So I appreciate your points.  Philippe.

>> Yes.  So I agree: you cannot replace legislation with trusts; I think that's obvious.  The question that we have, at least from our perspective, is more whether it can be a supplement ‑‑ another tool in the data governance toolbox that could be used in addition to, let's say, a General Data Protection Regulation.  Because that is going to come, but is it enough?  And I tend to agree with Phillip that we need additional tools to ensure an even more robust data governance structure.  But I don't think it is anybody's claim that we should replace the role of the Government with private law; there are a lot of issues that would raise.  And I actually agree with your other point, and that's a concern from a foreign policy perspective: if this tool is implemented or enabled in a context like Canada, what kind of message does that send abroad, and how could it be used for purposes that are not in line with the ones intended in the Canadian context?  That's a big concern on our part.

You raise the point that it would be very easy to see ways in which this could undermine even more vulnerable populations.  Those questions are still there.  We are still figuring them out, which is why, when I started, I said that from our perspective we are still exploring the concept, how it can be implemented and all of its implications ‑‑ we are not there yet in terms of, you know, selling a product or coming to a decision.

>> PHILLIP DAWSON:  A question over here.  Maybe just before the next question, I want to share something.  I have been to many workshops now where questions like yours have been raised, and I am reminded of one particular workshop where people were thinking about the composition of a data trust.  The questions were: wow, how do we make a group of trustee organizations representative and Democratic?  ‑‑ forgetting that we actually have Democratic institutions that are part of our Government.  In some cases it may be that an entity from the Government, which is Democratic and has Democratic processes built in, could play that role.

>> AUDIENCE:  I'm very interested in the concept of data trusts, since I come from a country where, even though we have local legislation on data protection, what I have observed is that when it comes to, for instance, personal data breaches by major platforms, our citizens do not have access to remedies.  There is no way that the local data protection authority can actually have enforcement power over corporations that are not there.  So my question is: how can we ensure that data trusts will be accessible to the peoples of these countries, so that if the local data protection authority doesn't have enforcement power, at least the citizens can avail of another avenue for remedies?

>> PHILLIP DAWSON:  That's a good question.  I think one of the questions about a trustee's ability to exercise the rights of the people who are part of the trust is at least touched on by your comment: can we assign the rights we have in data to a trustee, who can then sue on our behalf, and obtain a remedy on our behalf for, say, a large group of people?  I think it may be that supporting regulatory changes would be required to enable the trustee to exercise those rights on people's behalf.  That partially addresses your question.  Okay.  Yeah.  Thank you.

Anybody else?  We only have a few minutes left.  Thanks very much.  I don't know, Philippe Andre or Roja, if you wanted to share anything before we close, but for me I will say thanks for coming, and thanks for all your questions.  These are some of the questions that come up a lot, and there were some new questions and new comments, too.  I hope it was interesting to all of you.  If you want to read a little bit more ‑‑ and I hope you will believe this has nothing to do with me ‑‑ the chapter that I mentioned is in this year's edition of APC's GISWatch.  It is a survey of data trusts and fiduciary data governance models, and it touches on information fiduciaries and data trusts.  It is not very long, and it has a lot of references to other people's work, which is probably where most of its value lies.  But it is a quick read that gives you an overview of some of these ideas and implementation challenges, and then a list of many other resources where you can read more about this topic.  That publication is getting launched tomorrow in room 4, so you can stop by and see some of the other chapters in this year's edition, which touches on AI and Human Rights and development.  It is a great publication.  You should check it out.

>> Sure.  So I guess just some concluding thoughts on my end, to sum up at least what I take away from this.  One, the point that Phillip made in his presentation: there is already a governance structure in place.  Sometimes we tend to think there is a gap, a void ‑‑ but actually there is already a governance structure in place, and it is extremely problematic, which is why we need to find solutions to improve it or completely replace it.  That's something we need to keep in mind: there is maybe not a sense of urgency, but there is a sense that we need to act, because otherwise that governance structure will continue to exist.

Second, something that came out in the questions: the problem of data governance is global by nature.  We have to think about the international implications of any type of governance structure that we decide to implement domestically, because it may have important ramifications elsewhere ‑‑ something that we tend to forget sometimes when we think about these issues, because we are a big part of them.

The third one is that we still need to think long and hard about the kind of principles that we want to underlie any type of data governance ‑‑ governance in general.  Any legal tool, as we saw, can be implemented in a way that goes against the public.  So even if the tool in itself can be useful, if the underlying principles are not ones that we want to see in society, this may raise fundamental problems.  I mentioned rights respecting, multi‑stakeholder and inclusive.  That is another part of it.  Thank you.


>> PHILLIP DAWSON:  Thanks everyone.  Thanks for being here.