IGF 2020 - Day 8 - WS184 Children's Rights and Participation in Data Governance

The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

[ Session in progress ]

‑‑ affiliate of the Internet Society and studies the governance of data collected by digital platforms and ICTs.  Varunram Ganesh is a research engineer working at the MIT Digital Currency Initiative.  His background is in technology research and data privacy.  I am a visiting fellow at the London School of Economics and Political Science and founder of a start‑up which aims to visualize educational data and empower students to learn about their data.  And according to my youngest kid, I'm a person holding a cooking pan.  I put that in on purpose to highlight and emphasize the importance of seeing what children see, what they understand, what they experience and what they think, which is the whole point of our presentation and our workshop today.

Today we have the opportunity to be together, all of us, and to talk about the emerging problems from highly digitized educational environments.  I put up a glossary of a couple of terms that I invite everybody to think about, in terms of the kind of normative assumptions and normative consequences that may follow from words we use on a daily basis like EdTech.  What do we actually understand by EdTech?  What do we think about when we talk about applications?  Can we make a distinction between an application and a platform?  And we are going to involve youth in the discussion so we can hear their voices and see their perspectives.

So the pandemic propelled global education towards two extremes, it seems.  On the one hand, it exacerbated inequality, and on the other it forced radical digital transformation, driven by those groups who, perhaps we can argue, seek to profit directly from these transformations: from instruction to assessment to daily practice, increasing the use of EdTech.

This generates an immense amount of data.  And to understand what kind of data, I looked at primary and secondary education in Europe and in the United States, and here is an example just to visualize what I mean by that.  I hope it works.

Imagine 10‑year‑old Avery, a pupil at a typical primary school, whose data about ethnicity and race, parental employment, and grade transcripts are collected by Pearson Education, which is a media company but also a software developer, among many other things.  Pearson Education collects data about Avery from online communications like her e‑mails, from behavioral and conduct data, and from her responses on survey questionnaires.  And there are hundreds of other EdTech companies that tap into Avery's data.

While some legal frameworks, such as the General Data Protection Regulation within the European context, exist and offer data privacy and data ownership on an individual level, recent research by, for example, the International Digital Accountability Council looked at the most commonly used EdTech applications during the first lockdown of the COVID‑19 pandemic, used by parents and schools.  It revealed that these companies still manage to circumvent data privacy protections, share location data, expose personal data to third parties and so on.

Now the obvious thing is that EdTech providers collect data; they need data to fine‑tune products and drive profits.  I have two questions that I want to pose today, and the first one is about agency.  Can children and youth be empowered through data literacy?  It should be a collective effort to push industry and educational institutions to lean towards this collective learning.  And it's not just about students and children learning about data; it's collective.  So teachers, educational leaders, school leaders, and all of us collectively as a society.

Now research shows that people have a better understanding once more abstract policies like data processing are visualized.  And that is part of this collective data learning and data literacy I'm talking about.  However, research also points out that even where jurisdictions provide frameworks for data ownership and measures for data literacy, the majority of individuals still don't know much about them, or there is a lack of clarity with regards to the processing of personal data, or simply people don't care.  But I want to hold on to those words, "don't care".  When we don't care about our personal data, who is collecting it?

And link it to my second question: what drives EdTech in schools?  The total spending falls on governments and families, and in the current crisis these expenditures are putting a strain on finances.  EdTech providers rush to offer saving solutions, as we see with the Global Education Coalition, and you can see some of the names that are part of that coalition.  They offer saving solutions and improvements.  While acknowledging the inequality and lack of connectivity, especially in the current situation, I want to look at the evidence of what really improves education.

Back in 2009, a really great book was written which argued that injecting aid money into Africa isn't working.  Similarly, research shows that injecting EdTech hasn't produced any substantial impact on learning.  Since education is a national responsibility that falls on governments and families, they should collectively find other means to aid development.  Educational development is not about merely scaling up and mixing existing EdTech.  And where is our children's voice in this, anyway?  Could we, in the pursuit of better means, collectively learn to care about our personal data?  I'll stop here because I want to give voice to my colleagues first.  Nick?

>> NICK COULDRY: I want to approach freedom.  Freedom brings together all the tracks of this conference, environment, inclusion, trust and data.  Freedom is a fundamental value that we want to apply to all of our dealings in the technological environment and obviously trust is essential to it.  So we know that education today at all levels is being transformed by commercial platforms.  And we also know that the pandemic is intensifying the practical and social pressures to use commercial educational platforms.  As a result, the spaces and the times of the educational process are being transformed.  And I want us to focus on that point for a moment.

Now we know that space and time are the dimensions in which everyone, including children, acts out their freedom.  So what are the implications for the space and time of education when, for example, education is now being conducted on platforms with their massive capacity to extract data and to archive data for the future?

More broadly, we have for a century at least, maybe many centuries, thought about education as a preparation for adult freedom and as a space of free development for children themselves.  So what are the implications of educational platforms for children's freedom right now and their future freedom when they grow into adults at a time when no doubt, data from their childhood education on platforms may still be stored?

And if there are issues here that educational technologies raise for the fundamental value of freedom, are we currently giving children themselves opportunities to contribute to discussions on those very issues?  And if not, why not?  I'll pass over to Doaa.

>> DOAA ABU‑ELYOUNES: Hi, everyone.  Building on what my colleagues said before, we are going to come back to these points and discuss them more later.  The point I want to focus on in my brief presentation is the law, and in particular the General Data Protection Regulation, which is a European regulation adopted in 2016 that applies to everyone who resides in Europe as well as to European citizens who reside outside of Europe.

It has special implications regarding children, so I thought it would be interesting to address.  Recital 38 of the GDPR states that children require special protection with regard to their personal data, as they might not be aware of the risks, consequences and safeguards.  This specific protection should apply particularly to the use of personal data in marketing and profiling, and basically in any digital services provided to children.

And the rationale behind that is that children might not understand that sharing their e‑mail address or their hobbies could lead to marketing, for example.  And they may be unable to critically assess the content and the ads they receive from the services they subscribe to, which might encourage them to buy certain products or to adopt an unhealthy lifestyle, for example.

So the special protection should be part of the design.  For example, privacy statements have to be adapted so they are understandable for people of any age.  And defaults can be differentiated by age, so that, for example, people who are less than 14 have certain privacy settings that are different from others.

Now, consent.  According to the GDPR, you need a reason to process data; you need a legal basis for the processing.  And if the processing is done based on consent, the competence of the child should be assessed: their ability to understand the implications of the collection and processing of data.  In this regard, the GDPR clearly states an age of 16; under that age, the child's own consent is not enough and the consent of a parent has to be given.

It can be lowered, but to no less than 13.  There is an exception, of course: if services are provided through an intermediary like a school, what I said would not apply as much, because it will not be considered a service provided directly to children.  However, in general, the GDPR treats children similarly to adults with regard to the following rights: the right to object to the processing of their data, with some exceptions; the right to be given a copy of their data if they ask for it; and the right to ask for their data to be deleted, especially if they consented to give it when they were kids and ask for deletion as adults, so the company cannot say "you consented"; that doesn't work.  And lastly, data portability: they can ask for their data to be taken to another platform, and they can ask not to be subjected to further processing.  Thank you.  Beatriz or Gretchen?
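To make the age rules just described concrete, here is a minimal sketch of how a sign‑up flow might encode them.  The country threshold entries and function names are illustrative assumptions for this write‑up, not legal advice or any particular platform's implementation.

```python
# A minimal sketch of the GDPR Article 8 age rules described above.
# The default age of consent is 16; member states may lower it, but
# to no less than 13. The thresholds listed are illustrative examples.

DEFAULT_AGE_OF_CONSENT = 16
MEMBER_STATE_THRESHOLDS = {
    "FR": 15,  # hypothetical example entries
    "AT": 14,
    "IE": 16,
}

def needs_parental_consent(age: int, country_code: str) -> bool:
    """True if the child's own consent is not a sufficient legal basis."""
    threshold = MEMBER_STATE_THRESHOLDS.get(country_code, DEFAULT_AGE_OF_CONSENT)
    return age < threshold

def default_privacy_settings(age: int) -> dict:
    """Stricter defaults for younger users, in the privacy-by-design spirit."""
    if age < 14:
        return {"public_profile": False, "personalized_ads": False}
    return {"public_profile": False, "personalized_ads": True}

# Usage: a 15-year-old in a country that lowered the age to 15 can
# consent themselves; elsewhere the default of 16 applies.
print(needs_parental_consent(15, "FR"))  # False
print(needs_parental_consent(15, "DE"))  # True (falls back to the default of 16)
```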

>> GRETCHEN GREENE: I'm not sure either.  I heard it both ways.  I'll tell you some of the ways that AI is being used in education today and also talk about where I think the promise is, where AI is strongest, and what we might want from it for education.

So, first let me tell you a number of ways it's being used.  Some are not about the learning process itself but about the infrastructure of schools.  It can be used for threat detection: face recognition has been used in some schools for identifying whether someone has permission to be in the hall, on campus, or at a school event.  It can be used for another kind of threat detection, which is: is someone likely to fail out of school?  Are they likely to drop out of school?  And by identifying who you think is likely to drop out, you can then make an intervention.

It's used for adaptive and personalized learning, and also for learning engineering, which asks: can we learn more about how we, on average, learn things?  And can we learn more about how individuals are learning things, and then change lessons to improve how we teach?

So as an example of that: how often, if you're learning a new language, do you need to be reminded of and tested on a certain vocabulary word?  What is the optimal repetition time, on average and for you as an individual?  There is research around that, and then applying it: I need to hear a word every so often until I really have it nailed down.  If you ask me all the time, you're wasting my time, and if you wait too long, I don't remember the word.
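The scheduling idea described here is what spaced‑repetition systems implement.  As a hedged illustration, here is a minimal sketch loosely in the spirit of the SM‑2 family of algorithms; the interval growth factor and the Card fields are assumptions for this example, not any product's actual implementation.

```python
# A minimal sketch of spaced-repetition scheduling. The numbers are
# illustrative assumptions, not a tuned algorithm.

from dataclasses import dataclass

@dataclass
class Card:
    word: str
    interval_days: float = 1.0   # days until the next review
    ease: float = 2.5            # how quickly intervals grow for this learner

def review(card: Card, recalled: bool) -> Card:
    """Update the review schedule after one practice attempt."""
    if recalled:
        # Asking too often wastes the learner's time, so stretch the gap.
        card.interval_days *= card.ease
    else:
        # Waiting too long meant the word was forgotten: start over,
        # and grow future intervals a little more slowly.
        card.interval_days = 1.0
        card.ease = max(1.3, card.ease - 0.2)
    return card

card = Card("Schadenfreude")
for outcome in [True, True, False, True]:
    card = review(card, recalled=outcome)
    print(f"next review in {card.interval_days:.1f} days")
```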

It's being used for identification and proctoring for remote exams; you can, for instance, do gaze detection and try to identify where someone is looking, if you want to enforce that they are looking at the screen and not at something else that is off camera.  Some colleges are using it to track how early and how often people are engaging with the college's website, and in what ways, because it affects your U.S. News & World Report ranking if you make offers to people and they don't accept.  So there are some interesting questions there.  That's not really about somebody learning, but it affects the whole system.

Chat bot tutors.  So, automated tutors that try to answer questions for a student, to kind of scale the teacher up, because there are a lot more students than teachers, and if you can free up the instructor's time, they can spend it on other things.  Also, if this is their homework buddy, students can ask questions at 2:00 in the morning when the teacher just doesn't want to be available.

Emotion and affect detection, both online and in some classrooms in China.  Looking for joy and engagement based on students' facial expressions or their voices.  You can imagine different ways that can be used.  It could be: is this student paying attention?  Should we intervene with this student?  Or: is this teacher compelling attention from their students?  Are they causing them to be engaged?  And should we use that to evaluate teachers?  Algorithms are also being used there.  There are U.S. court cases about algorithms evaluating teachers using various factors and how we should decide; which we do anyway.  We evaluate teachers and children, but some of it is getting automated now with algorithms and AI in ways it wasn't before.

Automatic grading.  As an instructor, when you're grading, well, I don't know that I have ever met a colleague who loved grading.  I definitely don't.  So you may want to automate that.  Multiple choice exams are great for that; that's the low‑tech way to do it, and there are ways to run those through machines.  But now there is also AI being used to try to grade essays and to give feedback on essays.  So you could use it either for evaluation or for feedback to help someone improve, which isn't quite the same use.

On the infrastructure side, there are even things like predicting maintenance needs and failures for the district's bus fleet.  So now you have some idea of what is happening right now.  It has not yet reached its promise of relieving teachers of all the grading and evaluation they would rather not do, so that they could either have more time for things they did want to do at work, or do things they wanted to do at home instead of grading all night.  But what have we gotten really good at?  Let's look at education more broadly.  Let's talk about libraries.

Let's talk about your laptop or your phone, your connection to the Internet, and Google.  Where do you go now when you want to find something?  What is the first thing that you do?  You use AI in a search engine.  You use AI in recommendation systems to get information.  So the already achieved ability of AI to create access, to control access, and to drive us to certain information is enormous.

Scaling education is one of the biggest hopes for AI in education: you could remove barriers of geography, but you can do a lot more than that.  You can remove the barriers of having gotten a certain grade in a certain class, of having gone to a certain school, of having had a certain amount of money; because a lot of barriers are set up because we have a system where we only want a certain number of people to qualify for certain things.  We have these funnels where only a certain number can come to this school.  They are artificial in a way.  Some of them are real, in the sense that if you have not mastered a subject at a certain level, you cannot then master the next level.  But some of them aren't.

AI scales cheaply.  Once you have set something up, you can then distribute it broadly.  So, adaptive and personalized learning: AI is good at pattern recognition.  Natural Language Processing: we have had enormous advances, so the automatic translation of languages is huge for creating broad access to informational resources of all kinds.

So, engagement.  One of the problems with distance learning, back when it was by correspondence through the U.S. mail and then was turned into MOOCs at some point.  I don't know how many of you have signed up for a MOOC class; if you have and you finished it, I'm impressed.  I think I finished one, but I dropped out of a lot of them.  However, there are enormous numbers of the best and brightest computer engineers in the U.S. and beyond working on engagement.  How many times can they get you to click a thing?  If that same kind of energy were used on how we keep students engaged, that would be the thing it would take to really scale education with AI, and to really create not just access, but to help people be really excited in the way that a really great teacher does.

All right.  Thank you.  I think Beatriz?

>> VELISLAVA HILLMAN: Thank you.  Beatriz?

>> BEATRIZ BOTERO ARCILA: Thank you.  It's nice to be with you here today.  I want to bring in a little bit of the U.S. context on data regulation and education.  One of the key challenges is that privacy laws in the U.S. are sector‑based.  Education has already been recognized in regulation as a sector where data is sensitive, and there are additional protections for students.  The thing is, however, that when technology companies start providing education services, they are not really covered by the sectoral privacy protections that apply to schools, universities and so on and so forth.

So I think in those scenarios in which AI and data analytics are used as Gretchen was describing, keeping track of students who might be at risk of dropping out or who might not be learning, or helping professors and teachers grade and so on, the privacy risk is that these companies are not covered by those regulations and extra protections.  And thus there is a risk of exactly the type of consequences for liberty and freedom that Nick was talking about materializing, because there are few controls on what happens to information downstream.  So I think one of the key things that we should be thinking about in a space like this is expanding the protections that exist, along those lines, to companies that aren't necessarily educational institutions but are increasingly embedded in the educational ecosystem, which, for example, might be happening right now as so many schools and universities go online.  And even for web service providers like Google and so forth, I think there is an important need to think about how we engage these actors of the education world in the protections and privacy that are important for students.  That is what I wanted to say.

>> VARUNRAM GANESH: Thank you.  I'd like to touch upon one issue in data privacy.  It is very important to understand that prevention is in most cases better than a solution after a data privacy violation or breach.  You can see in the slides here that trusted and really big companies, very popular toy makers, YouTube, and children's hospitals, trusted companies and institutions which we generally don't assume to have these kinds of hacks, have been involved in these hacks.  Which begs the question: can we prevent such events before they happen?  Because the implications of such data privacy breaches are huge, especially for children, because there are a lot of bad people who might do things, and we really want to protect our children.  The real goal is to bridge the gap between awareness and action, because a lot of people are aware of privacy breaches.  We read about them in the news all the time: Cambridge Analytica, or hacks at the Pentagon or different governments.  But we don't really know how to report them.  And scaling it back to the school level: if there is a data privacy breach, for example in the attendance record, whom do I report it to?  Do I report it to the teacher?  To the principal?  To the authorities?  I think it's really important to bridge this gap so students and parents can understand how to take this process forward.

And parents are concerned about student privacy because, as Beatriz was saying, with Zoom and all these online applications, no one knows what is going on.  There was a very famous article about Zoom earlier this year which pointed out that Zoom had very critical security flaws.  If these well‑capitalized companies have these problems, and we are trusting our children's education to lesser‑known tools, we need to shift our focus to try to understand what they do with respect to privacy.

Another important thing, which Nick probably touched on, is that we need to balance utility and privacy.  During the COVID‑19 pandemic we have seen the advent of contact tracing apps, and while they are created with extremely good intent, we have also seen instances where the records were open to the public or had security flaws.  A small notion of that can be seen in these different snippets from across the world, where things were seemingly created with very good intent but the implications are huge, as you can see with a teacher creating a national database or tools being developed to track the pandemic.  It's really important to track what is really going on behind the scenes so we can try to prevent a future privacy breach.

And as I'd like to close our presentations: our goal, as Gretchen pointed out, is to utilize technology.  Technology has so much potential and has improved our learning systems a whole lot, and we don't want to lose the benefits of technology.  But at the same time, we don't want to have a Big Brother watching over us who can track whatever we have done in our past lives.  I think the goal is to have safer classrooms, without fear of Big Brother, while still utilizing the benefits of technology.  Back to you.

>> VELISLAVA HILLMAN: Let's open the floor for discussion, and we have some really great youth representatives from the Youth IGF.  Nick, my colleague, will moderate the discussion.  There are questions already coming in.  Nick, over to you.

>> NICK COULDRY: We have two questions so far, from Jose and Antonio.  But first we want to give an opportunity to Joao and Emanuel to give us any brief comments you have as representatives of the Youth IGF.  Whichever of you wants to go first.

>> JOÃO PEDRO MARTINS: I'm speaking to you from Portugal.  I have been part of, like you said, a youth initiative that was running in parallel to the local IGF and gathered many youth from all around the world, and we produced some specific messages on topics of interest to youth.  This is a very nice session because it links directly to some of the topics that were raised.

On what has been said: I guess from a youth perspective, youth usually doesn't have the opportunity to really choose, or opt in or opt out of, the software and the platforms that are being used for their education.  They take for granted the good judgment of teachers, of the principal of the institution, of national education agendas, and it's debatable up to a point how they can influence these decisions when they come from that high up.

A point I would like to share with you is a specific situation where AI and data can be a little bit damaging to youth.  And I speak in a personal capacity.  I have done exams, in this new era, in this online virtual reality, through software for what is called proctoring.  It's basically software that you run on your computer while you take the exam.  It assures the teacher that you're not consulting Google or navigating your PDFs with notes and trying to get the solution from somewhere other than your own mind.  I then saw a very interesting article that got me thinking, and that I shared already.  It basically raises the question: how is this not enforcing surveillance on young people from an early age?  How is it affecting the trust that society holds as a goal or valued principle of education, and undermining the relationship between young students and the teachers themselves?

A solution is then proposed: perhaps we should be rethinking the questions that are asked of students, rather than trying to overwatch everything they are doing, capturing their webcam, their sound, and, on a deeper level, the processes that run on our own machines.  And I guess that would be my greatest input for this debate.  What are the things that we should change at the educational level that would allow for more transparent use of these technologies, without undermining, for instance, the trust that should exist between these different stakeholders in the education process?  Thank you.

>> NICK COULDRY: Thank you very much.  You raised a fundamental point and we must try to get to that very soon.  First I want to give Emanuel, the other representative from the Youth IGF, the chance to speak if he wants.  Emanuel?

>> EMANUEL: Thank you.  Good evening from Nigeria.  Just a few points I would like to raise in relation to the subject we are currently discussing.  My first point relates to the conceptualization across three strands of thought.  We have a growing mistrust towards the behaviors of AI, as we have seen recently.  Then there are the black‑box operations that characterize AI algorithms in many applications, which are oftentimes shielded by intellectual property laws in many countries.  And then there is the reality, borne out by data and statistics, that some of the schools and many of the students who actually subscribe to these education technology platforms may not understand the ways in which AI uses learners' data on these platforms, as well as the possible ramifications in the short and long term.  So in reality, there are legitimate fears pertaining to how the things that AI learns about students and their academic performance might influence their job prospects, positively or negatively, fairly or unfairly.  And then there is a concern that raises the question of possibility and occurrence: is it possible that this can happen?  Well, antecedents give us grounds to answer this question in the affirmative.  Is it going on now?  Well, that would make an interesting research question, which I think Gretchen would be one to talk about later.

And more importantly, how are youth voices involved in the choosing and imposing of these learning tech platforms in many areas?  In Europe and America, yes, there are regulations that could help govern how AI behaves in these settings and protect the interests of learners to a large extent.  But when we come to the global South, to Africa and Asia and the countries of the Middle East and other regions, how would this play out?  With the same ramifications as in Europe and America?  Recently, research has also shown that the privacy policies, the privacy regimes, applied across these various platforms vary across jurisdictions, so we have a different regime of operations in Europe, a different regime of operations in America, and another in Africa.

This is subject to whatever data governance regimes apply in those jurisdictions.  How are the larger interests of the learners and the various participants on these platforms protected?  Those are just a few points that come to my mind and that I would like to share with the rest of the discussion.  Thank you.

>> NICK COULDRY: Thank you very much.  Again a really fundamental contribution.  Basically, Emanuel has raised points about the institutional complexity of all of this, in terms of who is consulted, whose interests are heard, mistrust, and so on and so forth.

And then Joao raised the question about consent.  Is there consent from young people to this process?  So before we get back to the other questions, I would like to put a quick question back to the Panel themselves, for a very brief answer from whoever wants to answer.  Which is: is this all really about children's freedom?  Is that the fundamental question here?

And put another way: who are the real users of EdTech we should be thinking about?  The institutions that supply it?  The schools that pay for it?  The governments that regulate it?  Or the children themselves?  Brief responses from anyone on the Panel who wants to comment on those questions, which I think were underlying both great points.  Beatriz, do you want to come back?

>> BEATRIZ BOTERO ARCILA: We see several layers of users of these applications, and I think there are varying distributive effects that also change per user.  As an example, if we are using a type of grading app that helps teachers grade, then I would say the direct user is actually the teacher.  But there is student data running through it.  And it is most likely being paid for by the educational institution.

And how it affects the different actors also varies.  For example, it might be helping teachers focus on planning classes instead of grading, or it might not.  It might be really hard to manage the software, and teachers might not really be helped by it.

It might be helping students get a fair assessment, or it may not.  It depends on the underlying data and premises and so on and so forth.  And it also depends on what type of government money there might be for acquiring the software.  So something that we shouldn't ignore in these conversations is: what are the incentives in place?  You see many schools in cash‑strapped districts and so forth that see an opportunity in federal funds for educational institutions to attract money to the school, to attract attention.  Which is why my personal interest lies a little bit in trying to address the main challenges.

So we don't want student data to go outside of the school, for example, regardless of whether it goes to a private provider; at least, I would argue, unless the data can be deidentified.  We might want to make sure that there are some forms of impact assessment or transparency mechanisms to ensure that the technologies aren't biased, and so on.  My point was that it is important to add some complexity to who the users are.  There are many layers of users, advantages and disadvantages, and many layers of risk.
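As a concrete illustration of the de‑identification point, here is a minimal sketch of pseudonymizing student records before they leave a school's systems.  The field names and the keyed‑hash approach are assumptions for illustration; real de‑identification also has to deal with re‑identification risks this sketch ignores.

```python
# A minimal sketch of pseudonymizing student records before sharing them
# outside the school. Field names are illustrative; real de-identification
# needs more than this (e.g., suppressing rare values that could
# re-identify a student).

import hashlib
import hmac

SCHOOL_SECRET = b"keep-this-key-inside-the-school"  # never shared downstream

def pseudonym(student_id: str) -> str:
    """Replace a real ID with a keyed hash only the school can link back."""
    return hmac.new(SCHOOL_SECRET, student_id.encode(), hashlib.sha256).hexdigest()[:12]

def deidentify(record: dict) -> dict:
    """Drop direct identifiers and pseudonymize the ID before export."""
    return {
        "student": pseudonym(record["student_id"]),
        "grade_level": record["grade_level"],
        "attendance_rate": record["attendance_rate"],
        # name, e-mail, and address are deliberately not exported
    }

record = {"student_id": "A1234", "name": "Avery", "grade_level": 5,
          "attendance_rate": 0.96}
print(deidentify(record))
```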

>> NICK COULDRY: So it's complex.  But does anyone on the Panel want to say something about freedom?  It may or may not be fundamental here.  Briefly, before we get back to the three questions we have.

>> VELISLAVA HILLMAN: I want to build on Beatriz's point, because of the complexity and that sense of urgency that is created.  Like I said in the beginning, we are lumping together technology's benefits and risks when we are not really clarifying: are we talking about a school where there are teachers, or where there isn't even a building?  Or where the ratio between teacher and students is 1 to 100?  Are we really talking about an area where there is a lack of connectivity, a real lack of resources and of local languages?  Or are we talking about users, students, and learning environments where there is already a structure of learning and teaching in place?  So there is always that lack of clarity when we are discussing this.

And there is the sense of urgency in the meantime.  Suddenly there is the notion that we need to automate, for example, the process of taking attendance for children, which suddenly means more data being generated.  Do we really need to automate that kind of process?  Why is that sense of urgency created?

And like Joao said about the proctoring systems: is this really a necessity?  How necessary is it?  Is it imposed by the teacher?  Is it agreed upon by the school?  Who is imposing these technologies?  In that sense, the student is definitely not the user of the applications.

>> NICK COULDRY: Okay.  I think we should get on to the three questions which have been asked.  We had one from Jose, one from Lillian, and then another as well, and maybe one or two more.

Now, Jose's question, which was asked in Spanish (we have more coming too), was: does technology substitute for affect, for emotion, in the process of education?  Or, as he put it more poetically, could a robot smile?  Could a robot teach a smile?

So does anyone on the Panel want to answer Jose's question about affect in education?  Is it being lost through EdTech?

>> GRETCHEN GREENE: Robot teachers are smiling today.  The MIT Media Lab has social robots, and tutoring is one of their applications.  That research is about how you can put affect and personality into a robot tutor in a way that encourages a student to work harder and learn better, and they are having positive results.  They have very friendly robots that do a number of things, and these tutor robots have appropriate affect.

What we would think of as a friendly interaction, which students really seem to enjoy, and they form a certain relationship with it that you would normally have with a human.  Not entirely, but they laugh with it and have kind of a relationship with it, and the robot tries to figure out whether they are frustrated and responds appropriately, to sort of pull them along into doing more homework.  So yes, there is work on that.

I think that kind of work matters, whether or not it's in an embodied thing.  There is also research showing that an actual physical robot elicits much more of a relationship: a person will attribute things to it and build more of a relationship with it than they will with even a video of the same robot, or with something that has no video or image of a physical representation at all.  However that sort of pseudo‑relationship works, affect does seem to be very important for the kind of engagement and inspiration that we associate with good teachers.  I certainly remember that throughout my educational path, personality really helped inspire me to be more interested in a subject.

And there are also relationships: even kids who aren't enjoying the learning have friends at school, and that makes a big difference.  So whatever that is, as well: why do we have schools?  What is the point of a school?  Part of it is that there is a body of information we want students to leave with, a body of academic skills we want them to leave with.  But that's not close to all of it.  We also want out of schools a certain kind of socialization, learning certain kinds of social and emotional skills.  And that too, I think, is related to this relationship between AI or robot and student: if we go all remote, all AI, can students still learn the social things they are supposed to be learning?  I don't know what the answer is, no matter how good the robots are.

Certainly I hear stories.  I have not read the research, but there is a certain lack of social skills, human to human, in physical real time, eye to eye, that appears to be caused by texting, by phone attachment.  If you actually take those devices away and have some teenagers in your car, they don't seem to know how to talk to each other, much less to the adults.  So, AI aside, devices aren't doing a great job with some of the relationship building that we need.

>> NICK COULDRY: And interestingly, in the comments in Spanish, Jose made a similar point about how you build this social, child‑to‑child bonding in an online connected situation.  So this question of affect is complex.  Thank you, Jose, for raising that.  We should move on to the next question: how can we efficiently protect children, and how at the same time can we protect the classic educational curriculum?  That is partly a regulatory point and also a fundamental question about the protection of children themselves, their right to be protected.  Does anyone on the Panel want to pick up on that point?

>> DOAA ABU‑ELYOUNES: I'm happy to.  I think this is the million dollar question: how do we make it work?  How do we protect children?  We talked very specifically about the GDPR, and there are implications of the GDPR, and it has been considered in other countries as well.  I totally agree with Emanuel's comment that it's a very European thing.  But as I said, even in the context of the GDPR we have exceptions, and if the technology is provided to the student through an intermediary rather than directly, we don't have the same protections.  Still, even if it looks like we don't have a lot of protection, one good thing that came out of the GDPR in general is that it allows individuals to ask for their data.  And while most of us probably don't do that on a daily basis, there are researchers and NGOs and specific organizations for whom this is exactly what they do.  So by utilizing that, we can know where our data is going, where it is stored, and what kind of data is being collected about us.

So even if it's a bit harder to see the immediate connection, it has a broader implication.  And I think awareness is perhaps the first step, before we talk about actual protection.  The more awareness there is among children about what their rights are and what they can do in case of a breach, the better.  Because, honestly speaking, no one is going to read the privacy statement of any website.  Adults don't do that; I don't expect children to.  So trying to find ways to talk about this subject, in workshops like this one, in schools, and making children aware that it is possible to say no: that, I think, is the first step of protection.

>> NICK COULDRY: Thank you.  Officially, we have five minutes left, but we are allowed to go on a little later because this is the last session of the day.  In case anyone has to leave in five minutes, we'll take three points next and put them to the Panel.  We have a question from Rebecca, which is: are there any good practical models for educational platforms out there to use with her children?  Anything we could recommend?  So if the Panel could bear in mind Rebecca's question.  And Joana, do you want to make your point, and then I'll go to Jutta?

>> JOANA: Thank you.  I want to thank all the contributors so far.  I already have five pages of notes I can write a paper with.  They have all been invaluable, and it was a multiperspective Panel, I would say.

So I want to follow up on Gretchen's point around how AI enables us to do this social thing in education.  And the reason is that alienation, the idea of alienation, is at the crux of the criticism against the current, traditional paradigm of education.  So I'm kind of trying to bring everything together, and I feel that in these new circumstances we have a new form of alienation emerging.  Just in case not everyone here is an education expert: usually we speak of alienation between students and teachers, between the school and external realities, and so on.

So my impression is that we have a new form of alienation emerging, and it has to do with the alienation of key education stakeholders from this new actor that is coming in, the EdTech industry.  And this alienation is consolidated by the hype, the political pressure to bring in more technology, and of course the panics around how we do this more efficiently and so on.

To go to my question, and apologies for the big intro: as these actors become more and more consolidated outside the sphere of what is actually happening in education, how do we compel them and make them more accountable?  How do we compel them to educate the public about what they do, and to become more transparent, as the youth on our Panel have asked?  And how do we increase their accountability?  Because, as I said, there are so many ideas and so much content here today, and the production or generation of content raises other concerns around data and privacy and what constitutes data and so on.

To boil it down to one thing: how do we bring them into the conversation and make them make us part of this?

>> NICK COULDRY: That's a big and important question.  I want Jutta to have a chance to pose her question, and then we'll go back to the Panel.  Jutta?

>> JUTTA CROLL: I have two things to say.  The first is to pick up on what Doaa said about the GDPR: we have to keep in mind that the GDPR is the first data protection regulation that has made this age differentiation and put children in the spotlight as needing more protection.  But at the same time, with that intention to protect children more, it also has some aspects that infringe children's rights, especially children's right to privacy, which might be infringed when they need the consent of their parents up to the age of 16, which is incredible.

And the second thing I would like to pick up on is what Gretchen said.  I do think the best protection that we can provide for children is doing more research.  With the pandemic, in so many countries of the world, children are now in a situation of online education and online learning.  We have a basis, databases, for more research on how that affects learning processes.  We will learn how their social interaction is affected by this type of education, by not being with their classmates in school, and we will also collect data on how they perform in the future.  So to protect them, I do think we need to do more research in this regard, and we could make use of and benefit from the data that we can gain now during the pandemic situation.  Thank you.

>> NICK COULDRY: Thank you very much.  So, does anyone on the Panel, apart from me, want to comment on Jutta's points, or on Rebecca's question, are there any good models for EdTech we should be recommending, or on Joana's big point about how EdTech is made accountable?  Who wants to pick that up?  We have time, I think, because we are allowed to go over since this is the last Panel of the day.  But not too long.

[ Multiple Speakers ]

Could I just get a reaction from the Panel to those three questions?  I'll come back to you in a minute if that's okay.

Thank you.

>> VELISLAVA HILLMAN: Thank you for the questions.  I'll address the accountability question because it links with the message I was trying to convey earlier.  I touched upon the economic question and who finances education: governments and families.  They have the responsibility, and they should have the upper hand.  They should be in control, able to push and to ask how, what kind of EdTech, and what the role of EdTech will be.  So that is one thing.

The second thing goes back to the question of the sense of urgency: can we slow down and really think about what EdTech we are bringing in, and for what purposes?  The more we think about the kind of EdTech that is coming in to supplant or improve things, the further we seem to be moving away from what we really should be talking about when we talk about education and learning, from what really matters to education and learning.  So these are the two things.  Just to sum up: governments and families, collectively, in the interest of young people (and I say we, as I'm also a parent), should have a bigger say in the role of EdTech actors.

>> NICK COULDRY: Okay.  Does anyone else on the Panel want to pick up on these three points we just heard?  Gretchen?

>> GRETCHEN GREENE: Yes, so on the question of how you control what corporations are doing.  I think one of the problems in the U.S. is that we underfund education.  If you do that, then local school districts and local schools are left looking for money, and for resources that don't cost much money.  If a corporation then comes and says, we will give you low‑cost laptops, or we will give you low‑cost software that comes with the following conditions, it becomes very hard to say no.  So I think funding education better is part of the answer.

I think it's also too much to ask each individual school or school district to negotiate this alone; they don't have a lot of negotiating power, and they also don't have a lot of resources to figure out: what is negotiable in this contract?  What could be negotiable?  What data are they giving up and how is it being used?  That should be done at a higher level that districts can then draw on as a resource.  It could be done at the state or national level in the U.S., and I speak of the U.S. because that's where I am and I know the system best, and those resources could be shared internationally.  Education resources along the lines of: we have evaluated this product, and here is a checklist of what any school could be asking about.  In the U.S., when there is something up for people to vote on in an election, there is often a paragraph written that summarizes it and tries to be neutral.  That kind of idea could be applied here, so that you wouldn't be asking each school, and certainly not each parent, to do this themselves.

>> NICK COULDRY: Yes.  And that point has been made in the chat by Dee Gordon, who argues that we should slow this down and that we should be putting more money into local systems of managing education; which is precisely the civic knowledge you're talking about.  But Rick wanted to make an intervention, if he is still there.  I can't see him right now.  His stream seems to have just gone.  Do you want to make the point you were going to make?  Can you hear me?  I think he is gone.  He is still on the call but I can't see him.  I can't send him a message.

I think we are coming to the end of the time, unless anyone else has an urgent question.  As I said, Rick, if you want to speak, now is the chance.  Or does anyone on the Panel have any final comments to wrap things up?

>> VRIKSON ACOSTA: I just came out of a convention on civil society in Venezuela.  I also put in the chat a problem that (?) I would like to comment on.  How do we deal with the problem?  How do we teach our children Internet education?  It's very difficult, but we have to manage two areas, education and ethics.  We have to mix education with ethics, because through education we can get children to be safe online and protect themselves.  And to the question I think (?) asked, whether there are any institutions doing things right: well, I am finishing a thesis on education in the sciences, and I have been looking worldwide but haven't found any institution that has, like, one solution.  I don't think there is one right solution. ‑‑

[ Dropped audio ]

>> NICK COULDRY: We seem to have lost your connection there.  While we wait, since he may have something more to say: does anyone on the Panel want to comment on the point he made about how to introduce ethics into this discussion?  How do we sustain the role of ethics, which is part of education, in the digital setting?  Any comments on that from anyone?

>> VRIKSON ACOSTA: Since the pandemic got everybody, are there any best practices emerging in other countries?  Right now there isn't anything like that.  (?) children have more stress academically, but it's because before the pandemic they ‑‑

[ Lost audio ]

>> NICK COULDRY: We keep losing the connection with Venezuela.  But we have the question about education and ethics, and we are also reminded of Rebecca's question: can we recommend any good models here?  In all the complexity, are there any models we can recommend?  Thank you.  I have just summarized your point and am putting it back to the Panel now.  Thank you very much.

Any comments from the Panel?

>> VRIKSON ACOSTA: It's weird, my phone hasn't moved and it's connected.

>> NICK COULDRY: I put your question to the Panel and they are going to answer it now.  Thank you very much.

So, any points about education and ethics, or on the final point: are there any good models of EdTech out there we should be sharing and thinking about?  Anyone on the Panel?  Beatriz, do you want to comment?  Your microphone is on.

>> BEATRIZ BOTERO ARCILA: I don't know; I think I'll just repeat a bit of what my colleagues have said already.  I think there is a good case for engaging students in conversations early on, as I think Doaa suggested.  But I want to emphasize the importance of taking into account the structural incentives that lead to the situations we might sometimes be concerned about.  There is an important place for discussions about whether the technologies might be biased or not and so on and so forth.  But we should also look at how education is funded, and whether schools have the capacity to provide quality education without having to start bargaining with companies that are offering equipment and so on.  I think it's very important to at least see to it that schools and principals and teachers are in a good situation, so they can actually make choices that are not too influenced by mounting challenges like budget or too many kids in a classroom.  I think that's an important place to start.  Because they are the experts.

>> NICK COULDRY: Maybe that is a good place to end as well: the idea that we need a good context for actually making decisions about this, so that all the interests are taken into account, including the interests of children.  I'd like to thank everyone for all of their comments and questions.  It's been a great discussion.  I want to thank Velislava Hillman for the concept.  We are happy to be part of it.  Have a good evening, and thank you for being with us.

>> VELISLAVA HILLMAN: Thank you, Panelists.  I want to thank you for being with me and for bringing different perspectives.  And thanks, everybody, for joining us.

>> NICK COULDRY: Can we save the chat, if someone is able to save it?

>> VELISLAVA HILLMAN: I think they can save it.

>> NICK COULDRY: It's been a good discussion.  Thank you to everyone, and good evening or good morning or good afternoon, whichever is appropriate.