You are here

IGF 2017 - Day 4 - Room XXVII - WS154 The Distributed Denial of Democracy: Threats to Democracy: Threats to Democratic Processes Online

 

The following are the outputs of the real-time captioning taken during the Twelfth Annual Meeting of the Internet Governance Forum (IGF) in Geneva, Switzerland, from 17 to 21 December 2017. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

>> DANIEL O’MALEY: Good morning, everyone. I think we have a really great panel, and we only have an hour, so I want to get things started and make sure there's time for questions and discussion afterwards. This panel is “The Distributed Denial of Democracy, Threats to Democratic Processes Online.” This is organized by the National Democratic Institute, Center for ‑‑ and the Center for International Enterprise. I'm going to open it up with a brief statement. I'm going to introduce our six panelists. Each of them are going to have five minutes to discuss the issue that they are here to take about from their region. Then we're going to open it up to questions and answers from the audience here and also from the audience online.

 My name is Daniel, and I'm from The Center for International Media Systems. Let's go ahead and get started.

 The distributed denial of democracy refers to the online use of multiple actors or channels to deny citizens access to or interrupt the flow of legitimate political discourse, thereby undermining democratic culture and practice. Thus, while distributed denial of services act takes down a specific website, a distributed denial of democracy attack attempts to remove certain voices from what appear to be an open and participatory dialogue.

 As media platforms have grown, so has online activity by actors and individuals that seek to silence or exclude voices online. Anti‑democratic actors have created innovative new techniques that turn the attributes of the Internet against open institutions harnessing hyper-partisanship, filter bubbles, and age old human biases accelerated with content driven by criminals or outright disinformation to erode trust and increase social strife.

 These efforts which are often coordinated and well-resourced are frequently harder to detect and have the overall effect of undermining civic engagement. The challenges to democracy online are not a product of governments alone.

 For example, at the Center for International Media Assistance where I work we noticed how they are undermining the high quality news and information. This weakens the broader ecosystem and also undermines democracy, which depends on a robust free press.

 In a similar fashion, online violence against women is similar and has the effect of silencing voices of women and discouraging them in the participation of public dialogue.

 We need to find ways to make sure that the Internet fulfills its promise to enable people to become more active participants in the democratic process. Insuring that the future of the Internet empowers universal human rights and democratic values will require corporation from government, policymakers, the private sector, and civil society. This is why it is so important for us to discuss democracy and the Internet in places like the ITF that are multi‑stakeholders. I would now like to briefly introduce this panel, which offers a variety of different perspectives from the private sector, from media, social society activists, among others. I'm looking forward to their comments on the faces that we ‑‑ that they're seeing through Democratic processes online and the way that we need to work together to make sure that every region, every sector is heard in this debate.

 To my right is Hunane Boujemi. She's also the co‑chair of the IGF dynamic coalition on Internet rights and principles. She's originally from Morocco, but she's now living in the U.K. Next to her is the legal director of the Software Freedom Law Center based in New Delhi. She consults with and advises students in the U.S., India, China, and Korea. She's a member of the Bar Council of Delhi.

 Next to her is Jehan Ara. Pakistan Software Houses ‑‑ it's the I.T. body and technology‑enabled services and businesses in Pakistan. She is an entrepreneur and social activist and a strong propagator of extending the power and use of information and communication technologies beyond pure, traditional businesses to empower and enable communities.

 Next to her is Martha Roldos. She's an Ecuadorian. She's been Deputy of the National Congress and she was member of the Ecuador Constituent Assembly, and she's currently the executive director of an organization committed to human rights in Ecuador.

 Next to Martha is Chris Doten, who is the chief innovation officer at the National Democratic Institute. He has designed programs in dozens of countries that have provided technology to reach more citizens, track political processes, and improve organizing capabilities.

 Last but not least is Matt Chessen, who is a diplomat and technologist who has some of the most challenging assignments in the U.S. Foreign Service. He has also led implementation of an open source crowd working program called Open Source Opportunities and his particular expertise here that he will be sharing with us is how artificial intelligence enhances computational propaganda.

 Now I'm going to ask the speakers to give five minutes, and we're going to start with Martha.

>> MARTH ROLDOS: I will talk to you briefly about an experience in Ecuador. Internet was like the last frontier. We had a regime and communication law that was a censorship law. Then journalism moved to the Internet in order to be able to keep publishing. The problem is that they refresh and move to the Internet too by the government. We are not China. Our government didn't have the capability. They didn't want that too because they didn't want to be labelled as a dictatorship.

 They did things more intelligently, and then devised ‑‑ we have all the sorts of things that now we are seeing around the world. We have a lot of attacks in journalists and female journalists in particular. There is a lot of propaganda. In the case of Ecuador, it was curious because they say the fake news came from the government, not from the civil society. And the government used all these devices to express these views. It's to overcome the attempts of the government. People got afraid to talk on the social media and then a lot of participants, many journalists platforms. It's an important piece to interact with this course when you have ‑‑ you teach the public to discriminate against contents, and then you have ‑‑ you are asking someone who are putting contents on the social media. What is ‑‑ are you sure about that? People who have this year to discriminate they try to have the information campaign, but now people know it's better. Maybe you can do it by a date, but like the next day, the disinformation is contested, but by informing people. You have MP's in social media. You have lawyers in social media. You have a lot of human rights activists all in social media.

 We begin to counteract also the attack. Nude photographs were posted by people related to this government center, but all the women, like ‑‑ all the feminist ‑‑ not only feminists, but journalists, all people were pouring support and were condemning what they were doing. Now we are in a kind of ‑‑ we were able to still to keep publishing the news about corruption. We weren't able to human rights agenda because all the main players got into the social media too.

>> DANIEL O’MALEY: We will now turn to Jehan who can tell us about some of the experiences they're having in Pakistan.

>> JEHAN ARA: Thank you. In Pakistan since I represent the ‑‑ I would like to first own up to the fact that until still about five to ten years ago the I.T. sector in Pakistan was very new. We only engaged the government on policy in a very civilized fashion. We would write to them. We weren't very aggressive. We thought that was the work of activists. In 2007 when the government tries to push through is when we woke up, and we realized that we would have to do something, and I think one of the smartest things we did was we used the multi‑stakeholder approach, and we connected with activists, with media organizations, and with all stakeholders we knew where we had the policy meetings. That way we got together. We had meetings. We shared what the signs were, and we approached the government and lawmakers with one voice. That made us a lot stronger. That over the years has become stronger and has engaged with parliamentarians, and that's how democracy is supposed to function.

 Some of the things that have been happening as far as Pakistan is concerned is other than the bill which we continue to have government, they have finally made it into an act, and there's still some areas of concern. Although having engaged with them, we've been able to push through some of our concerns, and it is not as it was when they first drafted it. Platforms and mobile services being shut down under the guise of security. Are those kinds of things continuing to happen? That affects communication. It also affects essential services like emergency services when you can't work on your mobile ‑‑ your number is inoperative, you cannot contact emergency services. You can't contact services like Uber, and multiple services now that citizens, business people all depend on a regular basis. Those are things that we continue to engage with government on.

I think that's something that we will have to make sure that they understand. I think one of the things that I have seen myself is that there are always champions within parliamentary groups. You can engage with them, and sometimes it is ignorance that results in some of the worst policies. If you engage them and educate them, then they become your champion.

We have held press conferences with activists, with politicians and media to make sure that the message gets across. That people understand that it's not all a lot of noise. Activists, of course, have ‑‑ whether it's activists who are younger, who have now engaged on social media, or whether there's an activist who has been engaging with government with the last 30 years. They're not hiring social media and the Internet and all that sort of thing, but the younger people are engaging with them and making sure their voices are also heard on the Internet. Many of these women do not want to be openly visible on social media or the Internet simply because they feel threatened. It's our responsibility for Congress and the activist to teach them how to keep themselves secure on the Internet.

 We've all been working together, and I think that has resulted in a very powerful stance. Of course, the challenges continue, and we will have to continue to make sure that any policy that is made or any threats to privacy, there's no data protection law platforms. Sometimes they tend to be very compliant with the government completely because they want to continue to make money. Also, telecom providers whose licenses depend on the government sharing data with anybody and everybody. Those are the kind of things that concern us. I'm just grateful that the private sector, the I.T. sector, has woken up and is now actually engaged.

 Many of the I.T. business people are now more activists than you would even ‑‑ you know, than you would expect some people who were otherwise, and, you know, only you engage with them in a very professional manner. This is something that I wanted to share.

>> DANIEL O’MALEY: Thank you. I think that's a really powerful story. Kind of different sectors ‑‑ people listened, and you were able to find champions in government, which is really important. I'm going to now turn to Mishi, who is going to talk about what problems she's been working on and seeing in India.

>> MISHI CHOUDHARY: Thank you. Just for clarification, the law center is established in New York. I'm going to just concentrate on a little bit of ‑‑

(Audio fading)

It's not just the next platform that keep us watching. We also have a lot of ingenuity of objectives, and when we get ‑‑ now we can be as objective, and the power of shaping the politics or reshaping of politics, I think the government is finding it hard to have a focus on the human effect on the net as much as they want to figure out the political or the economic effect.

 There's a complete breakdown of the ‑‑ censorship and ‑‑ they don't actually solve the problem of the Internet, which is now something which we have. It suits the state sometimes, but in other regards, it's something that is absolutely ‑‑ it's the experience of the limitation of distraction.

 My organization's work is a catalog and the Internet shutdown. You find it on the Internet shutdown. It shows that in India the shutting down of the net is Japan's locating problem. It's broken over there and all over Kashmir. It's really an example in a way in which the government has yet to figure out how it has been being with the net. It's mostly about aggression and hostility, which is what the government usually has as an explanation.

 Three months from June 1, there are places that are completely covered, and the original shutdown and there's a fixed line shutdown. Most developing countries have access to Internet through the mobile network. 93% of Indians who are online connect to mobile and not fixed line Internet. Once the mobile network goes down, at least their connection to Internet is totally snipped.

 We tracked stories also. They become just numbers. There are monthly installments, and that is to apply to university because now there's only one place, and there are universities which are not at the same time as ‑‑ then if there's no Internet and no fixed lien or mobile Internet, it's difficult for them to have a life, and if it's not there for three months, then you can only imagine that after high school you have something which is fatigue, and you're not able to take it because it's not an effort connected and how it backs your future with respect to the rest of the country, which is actually ‑‑ we also ‑‑ there will be a catalog on our website.

 (Audio fading)

People run and take their orders, and once they have no activity, there are people that do not have a lot of capital in which they can rely on. We do appreciate having worked with the government to understand, but the policeman's job is mentioning public law and order on the ground. They're doing it right. Then they're doing it without much. That’s what ‑‑ the next shutdown imposes a fair amount of ‑‑ that's the conclusion. They are extremely limited, and the fact that they have this one, it means it's going to be overused. Not because there is a need for justification for this. If you only have one camera to shut down ‑‑ also, I particularly think that ‑‑ in order to have advertising, it’s become the political economy of the net. So much so it's now work on companies to be able to carry people's package in their own expense in return for the right of being able to look into what people are talking.

(Audio fades)

How this works in U.S. democracy is now very much a developing country problem. We are watching it play out in different parts of the world. It's a very different tone.

(Audio fading)

If you don't like it, get off the platform, but that's only just telling you what ‑‑ now we live in that era. It's not just wonderful people or women who do have opinions because they are equally citizens. When they are kept off some kind of discussion, you know that it would be everybody else who will face it. That's where ‑‑

>> DANIEL O’MALEY: Thank you. I like the fact that you pointed out that these are issues that are global, and I'm just ‑‑ sometimes we see some of the issues like the free basics. India ‑‑ people are really global, and we're all trying to grapple with that. I think it's really an important point to highlight. I'm now going to turn to Chris.

>> CHRIS DOTEN: It's great to be here. At NDI we work a lot on improving citizen access to political processes and particularly around elections, and so we've seen a huge level of impact on the political conversations ask discussions that take place at those most sensitive pivot points in the history of a country, which is an election where things can really swing dramatically based on a sometimes very, very small change in outcome. You know, the rise of a lot of these challenges to the democracy online, whether it's disinformation or trolling or so on, people will often poo‑poo this and say, well, this is not new. Mark Twain talked about how rumors are ‑‑ go all the way around the world. Thomas Jefferson ran a whole series of newspapers that was defined as fake news, attacks on John Adams. That may be true, but something that is new and different in terms of propaganda that is using computers to accelerate or increase the flow of targeted messaging, something that I think is new is manufactured consensus.

 You know, there are a number of things that are true and challenging when it comes to disinformation and political communications online. You know, it turns out that Mishi referred to human psychology here. You know, turns out that, like, consistency of messaging is as important. It doesn't matter how many times you get hit with variance on different messages that are falsehoods or incorrect ‑‑ it will still serve to undermine a narrative. We see a lot of attacks on what have been gatekeepers of proof or kind of authorities or leaks that had been before people that helped us establish what was correct or what was not.

There's a huge amount of implicit pressure built in to go along with what seems to be ‑‑

(Audio fading)

That can be problematic online thanks to the bots and propaganda. If a message is relayed a whole bunch of different messengers, what we think are different messengers, and we see that popping up in our news feeds or in our Twitter streams or even our messaging APPs over and over. Then it will appear to us as though that is what everybody thinks. That may, in fact, not be true. That's something we see with crowd sourcing in general, anything that people tend to use volume as a proxy for accuracy or attempts.

One of my favorite examples is an open survey of where Justin Bieber should do his next concert. Pyongyang was the selection of the masses. Not because that's where most of Justin Bieber's fans wanted him to go, but enough other people jumped on it. The whole democratic system can be undermined. We see this playing out globally. Whether it's things like bots in Mexico or the extraordinarily effective use Facebook messenger and other countries around the developed and developing world where we are seeing this take place. It's hard to know what the answer to some of this is.

There is a literacy question to this. People can be more aware. It also may be that the switch to messaging applications may actually help reduce some of the threats. When we're not in the kind of open public spheres of Twitter, Facebook, it may be that some of the smaller conversations that are more typically between people that you actually do know may be less impacted by this. We would like to label bots to help people distinguish what is legitimate conversation from actual organic humans to what is being pushed or propagated. If the genie is truly out of the bottle and there is little that can be done to stem the flow of computational propaganda, we may be in a weird world where democracy advocates have to start training who we consider the good guys, pro‑democratic forces, on how to use ‑‑ build and use their own bots. Leave it at that and look forward to questions later.

>> DANIEL O’MALEY: Great. Thank you. Matt, now you're going to talk a little bit about artificial intelligence. It's kind of like the next level of the bot armies that Chris was mentioning.

>> MATT CHESSEN: Thank you very much. My work has focused on how we combat disinformation online. My government, the United States, is still trying to fully understand this phenomenon, and I want to talk about some of the research that I have done on looking at some of the emerging A.I. technologies and how they can impact institutional propaganda.

First, I want to put in a word for preserving freedom of expression and avoiding censorship. We believe that people have the same rights online as they do offline, and the Internet has a tremendous amount of value to us from our ability to really associate with each other, share ideas, express ideas and opinions. To do that free of interference or censorship. We don't want to utilize. We don't want to take this problem and disinformation online and use it as a reason to censor free speech online.

That being said, I just want to play on a couple of things that Chris had already mentioned here. What are folks doing with some of the technology tools. Docking is very common where they'll release personal information about people in an attempt to comply it and use these armies of bots, basically, to reinforce the messages and the threats against people. They'll do hash tag standing. They'll just stand that hash tag with a bunch of garbage information so that the activists can't actually communicate with each other or organize.

Most of these techniques use ‑‑ most of the techniques violate terms of service, so they're not particularly illegal, but they do violate other social media platforms. What was mentioned was the use of fake social media accounts. These are accounts that impersonate real people and they try to manufacture consensus. These are sort of the things that we're facing and that we're dealing with. You know, we can't be judging the actual speech that's coming out because that gets us into the realm of determining truth from fiction. What we need to focus on are the bad actors and the bad effects they're having on.

 Now, let's talk about the technologies that are going to impact on propaganda, and to be clear on this, the technologies by themselves are not inherently bad. There's nothing inherently bad about these technologies. They will be used for a lot of good, but they can be used for some of these malicious purposes, and we need to be aware of what these are. Very quickly, what is artificial intelligence? We're not talking about sentient computers here. A.I. is defined as machines. Better definition is too think about it like biology or chemistry. They deal with pattern recognition, autonomy, perception, recognition, learning, and decision-making. It's a better way to think about it like that. It's a little more inclusive, and it gets us out of this realm of thinking about what Elan Musk and Stephen Hawking are talking about.

 Chat bots are increasingly able to have more human‑like conversations. People have proclaimed love. They wish she was a real girl, and they want to marry her. People develop very close relationships with these chat bots, and they're only going to get better and better at having conversations that emulate human beings.

 A.I. systems are getting good at creating a dynamically creating content. Books and screen plays. They've created real good visual art and music now. There's one out of Stanford where they manipulate Donald Trump in a Youtube video, and there's some dynamically generated speeches that Obama never made. These are going to generate all the content, and basically create the sort of pliable reality where they're modifying and shaping events in real‑time and creating fake events. You can figure out all of these things with a few pieces of data and use it to create personalized propaganda and disinformation. Either difficult tools. So once you create one, it's easy to then create many more of them. They learn, so they can optimize their behavior, and they can be working 24 hours a day, seven days a week. While we're still trying to shape responsibilities, they can be shaping narratives.

 We're going to shift from this bot‑driven propaganda that are pushing out human‑generated content to the computational content that's pushing out this machine‑driven content, and that's going to complicate some of the existing problems that we're seeing. Also, just interesting sort of cultural note. You know, we're going to see a lot more of these machines and accounts communicating with each other and sharing information with each other, and so we're going to have other intelligence moving into the online communications space. That's a big unknown how that's going to work. That will be human communication online. I wrote a paper about this, and it's called the –

(Audio fading)

>> DANIEL O’MALEY: Now we're going to turn to Hanane, whose expertise is working on principles, and that's one of the reasons why we have her on this panel. You just heard a number of different issues from trolling, surveillance, and in Pakistan the blunt use of shutdowns in India. Also, the surveillance of the business model, computational bots. We've heard about a lot of things that are going to shape how people engage in the future and how people will participate in the diplomatic processes. What are some of the things that we can do and where does this go?

(Audio fading)

>> HUNANE BOUJEMI: It's always political. They shape public opinion, and ‑‑ how do you solve all this? I was very, you know, pleased to hear all the stories from Pakistan, and how they tend to be an efficient actor to affect, you know, what's happening at the local level and open channel in the government. You cannot trust them to know what's happening. Shutdowns happen when we understand that operation is dead. Sometimes ‑‑ how do we solve all this? Work of the coalition is specifically concentrated on what we call ‑‑ you know, we can address these things. Why? Because we think that regulation is usually slow to adapt technology. Also, hampers many values in general.

 We do have a lot of decisions now which affect how people are going to connect to the Internet. I think with what's happening with the discussion, it’s a turning point, actually, in our field, and I think we are fighting for the perfect country for a long time to establish. I think the next ten years there is a set of principles, and it was a collaborative force which includes activists in general. Unexpectedly, we managed to ignore some of the important policy processes in New Zealand, Italy, you know, in different countries which viewed the principle as a way to design their own policy. Two becomes a framework of core values, and it's for our work. The importance and how important it is to preserve it for the values. That applies to many fields, and I think the discussion in the forum was about me, and I pointed that out. So we don't have the group specializing. We can look at all the issues specific to media and how we can link it to what's happening in the content of industry powered by technology and how it applies to the Internet. There's a lot happening, and I'm still learning what's going on in the media center. I think if we agree to maybe combine our work, to integrate the values that we've seen --

(Audio fading)

He just came across our work and there are certain sections which we probably need to adapt to the Russian context. We keep, you know, adding points to the chart. I mean, we think it's great. You know, we really appreciate, you know, people going through the details of the charter and finding the value and doing it.

>> DANIEL O’MALEY: We're going to open up the floor for questions. Do we have any questions from the online space? Okay. So we're going to take three questions from the floor, and then we're going to let our panel respond to that. If anyone has a question, you know, please raise your hand now.

>> My name is Joanna Bryson. I'm professor of artificial intelligence. I also work at the Pearson Center for Information Technology Policy. Maybe I can work with some of you there. In the first introduction you said that regulation can actually be ‑‑ I think of law as ultimately we do want to, you know, capture the good ideas and push it forward. Also, I think much of the space ‑‑ most of artificial intelligence does happen already in a regulated space, and what we need to do is improve the regulation because it's not the purpose. Those things change. I think of regulation is something we need to as a parallel process catch up. Why do you say it's bad for democracy?

>> Since I have attended and you have attended the session, my question would be how sensible. If you look for units online, you will get easily hundreds of ‑‑

(Audio fading)

Half a day or even more to find some documented article analyzing beyond just the slogans. What do I mean by that? I mean that in the world of much information, poverty but information excess, censorship and manipulation go this way. It is why some cliches are plenty and some are truth even. This mirrors to a large extent their own treasury. What makes things all the more difficult is in Ecuador's government it has a good image with society. Rightly or wrongly. They can hide behind the rhetoric, and so it will be very difficult to convince a civil society of some truths because they will feel that they are stabbing their own bodies.

(Audio fading)

>> PANELIST: When I mentioned that regulation can be ‑‑ I meant it because I work in the league, and I can assure you that it's not –

(Audio fading)

>> DANIEL O’MALEY: There was a second question about excessive information as a form of manipulation or censorship in Ecuador.

>> I would like to ask both questions. Sometimes I like to be able to speak Spanish because it would be faster, but I think something that is different from a minute ago is that we need to think how to foster good journalism. Part of the disinformation is that you need to focus good journalism. You need to be able to find ways to convince people maybe to write money, like this multi‑stakeholder operates money for good projects of journalism because that's the question. You know? To do good journalism takes time. It took us six months to investigate how our highways were being built in Ecuador because it was the main propaganda of our government, you know? Now we have highways. Then we find that some of our highways cost ten times ‑‑ 100 times more than in Europe.

 When you ‑‑ but it took two persons six months to go through that misinformation. It takes time. If you have underpaid journalists, it takes time, and it costs a lot of effort. People need to value that.

 In the Ecuadorian -- grade school is not the case anymore about our government. They are very, very ‑‑ they lack government. There's low popularity. When he was in his hay day, the ones who were pushing towards recruits, they were executed. I have friends that went to jail. I have leaders that went to jail just for protesting. It's very revolutionary to ‑‑ don't go to the minds, compromise companies and, I don't know, I think it's strange that you cannot tell everybody all the time. Somebody sometimes ‑‑ you cannot fool everybody all the time. It's the cost.

In Ecuador we work in seven platforms at the same time, and I have a big, huge group of feminist friends, and we have a what's APP group. When some of them knew that another friend was asked what being attacked by the government and they were talking, there's a photograph auditor and Facebook. We all organized. It's ‑‑ it takes a lot of effort, and it takes a lot of organizing, but it's happening. Ecuador ‑‑ you know, the table broke. So when you come from countries like ours, sometimes you are afraid of loss. Freedom of expression is the last ‑‑ it came back that ‑‑ that's our last resource. Our last resort. It helps because when you cry, someone helps. Thank you.

>> DANIEL O’MALEY: This is one of the tactics that these malicious actors use to spread disinformation. It's not just with bots. They'll actually create, you know, dozens and dozens of web pages that look like local NGO's. They look like regional newspapers. They look like expert. They're replicating the same disinformation over and over again. Part of that is to gain the algorithms. The platforms are always, you know, having a struggle to work on these things. What they're actually trying to do is capture people into this ecosystem of disinformation so they lure people in with emotionally pleasing information, and then they capture them in what looks like a rich media environment where they see a lot of web pages with the same information and a lot of social media accounts that are saying the same thing. Those are all being run by the same disinformation actors.

 I think Chris mentioned we might need to use some of the same tools to counteract this. Now, we should never use disinformation, and we should never use ‑‑ that's just going to contribute to the problem. A lot of the same tools can be used to direct streams of truthful information at these vulnerable and targeted populations to make sure that there's no gap that these malicious actors can fill. I think that's something we all need to figure out is how can we actually make sure that we use technology in a positive proactive way.

>> PANELIST: I'm very interested, and also I think that we move to messaging APPs and that can have some respect ‑‑

(Audio fading)

>> It's a free speech and expression with restrictions creeping in in various other ways. I am careful, and this happens. We as a lawyer at the time for the last several years say you don't have the U.S. First Amendment. You have a different limit. All your laws will go and face that limit. The interesting thing that is alarming is that it can be unpacking ‑‑ I just want to put out that it's very ‑‑ it's very unpopular that I would like to restore ‑‑ if people would stop thinking the people's time or Internet's time, perhaps we would have a little more ‑‑ we don't even let the silence begin to permeate the thinking again. I do not understand what is the extremely stack that you have in that line getting into this building which we're holding up, but there's so much ‑‑ you are just scrolling through shopping carts. You can do it elsewhere. People send emails, and people send important information in elevators, walking in the door, and everywhere else. If people could do their computing standing still, preferably using a ‑‑ we all know that there's secure WI‑FI, perhaps you could have a little more sanity restored in the system, and we don't have to get instant gratitude for everything. It's like it used to be I think, therefore I am. Now it's if I Tweet, therefore I am. Does that happen?

>> I'm not anti‑regulation because I just checked on my Twitter account, and it's flooded with anti‑regulation.

>> Yours and mine, there are huge populations of young people. Instant gratification is what we are looking for. The latest technologies are being to extricate important messages or very unimportant messages all the time, and this is going to include. We are not going to be able to stop it.

>> I totally agree about the fact you are observing, but I am more optimistic. I think it's more ‑‑ very fascinating. If you have not drawn upon, this is your first computer, it is very fascinating. It's very, very attractive. When I look at this video on demand, there are pictures on demand, it's very fascinating. It's going to be an education about the impact of this on us as human beings. I do think that people ‑‑ it's the same thing. We demand it even if it's not given, but we still demand it. I think it's a conversation that started to pay and discuss a little bit more.

 I know my idea is unpopular. It's not unpopular, but I'm just saying it may be.

>> DANIEL O’MALEY: I apologize we only have had one hour to talk about these issues. I think it's great that we end on a positive note and we heard some optimism. Also, the way that the Internet is enabling her feminist group to organize and mobilize. You know, we're not here to say that the Internet is bad. We need to marshal different social actors to make sure that it remains a positive place for everyone.

 I want to plug something that kind of piggybacks on Hanane's work on the principle coalition. Our institution, The Center for National Media Assistance, are working on, which is principles framework for democratic processes. We're kind of talking about these very issues and what it would look like in practice for people so we can take that civil society activist to some governments that just need a little bit of education. You can check that out at openInternet.global. We would love you to check that out, look at the principles that we have, the frameworks you've created. Many of the people on this panel enjoy this community.

We hope that ‑‑ I agree with you. I think that this type of conversation is going to be one that is going to continue going forward and become even more important. We want you to become a part of the discussion.

 Thank you very much. A round of applause for our panelists. They were excellent.

(Applause)

(Session concluded at 11:30 a.m.)

 

Contact Information

United Nations
Secretariat of the Internet Governance Forum (IGF)

Villa Le Bocage
Palais des Nations,
CH-1211 Geneva 10
Switzerland

igf [at] un [dot] org
+41 (0) 229 173 411