12 November 2015 | WS 123: Indicators to Promote Evidence-Based Policy-Making | Workshop Room 8 | Finished

The following are the outputs of the real-time captioning taken during the Tenth Annual Meeting of the Internet Governance Forum (IGF) in João Pessoa, Brazil, from 10 to 13 November 2015. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

*** 

>> ALEXANDRE BARBOSA: Good morning, everyone.  Welcome to our session.  Well, first of all, I have to say that as a Brazilian I would really like to give you all a warm welcome to Brazil, and I really hope this IGF edition will further contribute to the debate on Internet Governance.  So welcome to Brazil.  I'm Alexandre Barbosa, head of the Regional Centre of Studies for the Development of the Information Society, which is a center linked to the Brazilian (?) operation center, and of course Cetic.br.  We are (?) based in São Paulo. 

Well, for me it is a great pleasure to moderate this session on Indicators to Promote Evidence-Based Policymaking, with such a distinguished group of experts.  This panel was initially conceived around the idea that having reliable and internationally comparable data is very relevant in the debate on public policy.  So we invited people that are engaged in this debate related to ICT indicators.  The original format of this panel included different actors from different segments in a multi-stakeholder approach, but unfortunately the person representing government, from the government of Egypt, could not make it to Brazil.  We do, however, have representatives from academia, international organizations and organizations from civil society.

     Before I give the floor to our experts, let me briefly introduce some key concepts for our discussion, and I want this session to be very interactive.  The literature on public policy tends to converge on the idea that better use of better statistics leads to better policies, and reliable data is really essential for ensuring public accountability.  Thus the impact of policy can be measured with good statistics at different stages of the policy-making process, and as (?) (2005) says, if policy cannot be measured, it's not a good policy.  So this shows that statistics and reliable data are very important for policy making.

     We just had a session on big data, and big data is, I would say, a very important topic nowadays, and of course producing reliable data will feed the policy-making process in a very crucial way.

     If you think of policy-making as a process, we can say that a good policy requires good statistics at different stages, as I have mentioned, including the very initial phase of agenda setting, then design and legitimization of the policy, implementation, policy monitoring and of course impact assessment.  If you go to the literature you will see that many authors argue that there exists a policy-practice gap, meaning that decisions made at the policy level do not necessarily address real issues at the practical level.  So despite the growing debate on measurement and the production of statistics on ICT access, there is a lack of systematic and reliable ICT statistics production, expressed, for instance, by the lack in several nations of data that can be used to measure the development of information and knowledge societies.  So I hope that we will bring to this debate key issues related to the use of this data.

     And the idea behind this workshop has been inspired by the ongoing debate at several international organizations, such as ITU, OECD and UNESCO, that are dealing with standards, definitions and concepts, since the use of reliable, internationally comparable data is crucial to address ICT policies, especially those related to the digital divide, just to mention one key area in this process.

     Well, without further delay let me introduce our presenters in the session.  We have on my right Dr. Alison Gillwald.  Dr. Gillwald is the executive director of Research ICT Africa, RIA, based in Cape Town, South Africa.  She also holds an adjunct professorship in the management of infrastructure reform and regulation program at the University of Cape Town (?) School of Business.  Dr. Gillwald was also appointed by then President Mandela, on the advice of Parliament, to serve on the founding council of the South African Telecommunications Regulatory Authority.  She also served on the first South African national digital advisory body, and she's currently deputy chairperson of the South African national broadband advisory council.  In 2013 she assisted the (?) communications with South Africa's broadband plan, and in 2014 she served on the Internet Corporation for Assigned Names and Numbers, ICANN, (?) panel on multi-stakeholder innovation.

     Following, on my left, we have Professor Hernan Galperin.  He's a research associate professor at the Annenberg School for Communication at the University of Southern California.  Previously Dr. Galperin was associate professor at the Center for Technology and Society at the University of San Andres in Argentina.  He's also a Steering Committee member of DIRSI, which is an ICT policy research consortium in Latin America.  He's an expert on telecommunications policy and development and leads a number of research projects related to the regulation and impact of new ICTs in Latin America.  He has published extensively in major journals, such as Telecommunications Policy, Development Policy Review and Information Technology and International Development.  His most recent book is Information Technology for Development: Opening the Internet Black Box.

     Then on my left we have Mr. Fabio Senne.  Fabio is a project coordinator at Cetic.br, where he coordinates our national ICT surveys in Brazil, carried out by Cetic.br.  He has a master's degree in communications from the University of Brazil and a bachelor's degree in social science from the University of São Paulo. 

     Last but not least, Lorrayne Porciuncula.  Lorrayne is an economist and policy analyst at the Digital Economy Policy Division, linked to the science, technology and industry directorate at the OECD.  Lorrayne works on the (?), with the Inter-American Development Bank, a broadband policy (?) for Latin America and the Caribbean that aims to situate policy recommendations in the specific regional and local context.  Previous to her current position, Lorrayne worked as an economic analyst at the International Telecommunication Union, ITU, at the Broadband Commission, examining national broadband plans.  She holds a master's degree in development economics from the (?) Institute of International and Development Studies and a degree in international relations from the University of Brasília, Brazil.  This is a long introduction, but I thought it was relevant just to say that we really have expectations on the panel to discuss this issue.

     So I would like to pass the floor to Professor Alison.  We have for each speaker about eight minutes and then we will open the floor to participants, and also remote participants.  So you have the floor.

     >> ALISON GILLWALD: Thank you very much.  Good morning, everyone.  I am from Research ICT Africa, which is a policy think tank based in Cape Town but which hosts an Africa-wide network.  We have been collecting ICT indicators and doing policy and regulatory research for over ten years in this area, and so I really wanted to move beyond the platitudes of indicators that we need for regulation and talk about some of the challenges we face and some of the problems we're having with what is being presented as evidence for evidence-based policy.  So I want to look at some of the institutional challenges and then at the quality of the evidence we're getting.  So essentially looking at the global evidence on ICTs that we have.

     And I think in that sense I want to understand this from a political economy point of view rather than a purely statistical point of view, because I think the unevenness of indicators reflects the uneven development that we see across the globe, and in some ways it's self-perpetuating: we keep saying we need these international indicators, but we're not addressing the fundamental causes of why we have these uneven indicators.  And so we're in a kind of vicious cycle between the north and the south that have these indicators, with some exceptions.

     So I really wanted to refer to our experiences over the last few years.  Even the understandings of policy formulation and so-called best practice, et cetera, are very much informed by assumptions, and by global indicators, that reflect the political economy of mature economies and democracies in the north.  The political process we're trying to feed this policy into, assuming it's going to be objectively assessed by competent officials in policy making, is a challenge in many of our countries and many of our jurisdictions.

     So the work that we do in ICT policy and regulation, together with our partner network in Asia, specifically in the policy and regulatory area but also with DIRSI, is really about looking at alternative regulatory strategies.  What are presented as best outcomes from (?) countries and OECD studies are not necessarily going to be the best policies for our countries, and therefore we may require different evidence.

     The very different access and use trajectories in the global south make some of the standard indicators quite meaningless and difficult to gather in our context, so we need to look at alternative data-gathering strategies and alternative indicators, but ones that can align with the global indicators for comparative purposes.  And of course also influence those, which I think we have done.  I mean, basically the international standard for all ICT indicators and work is the ITU, so even if you're looking at World Bank indices or OECD indices or whoever's indices, basically you're looking at the ITU data that's underlying that, and for great parts of the global south, and certainly great parts of Africa, that data is at best out of date, very often full of gaps, and quite honestly sometimes entirely spurious.  The measures that are being used don't reflect the kind of usage: they're measuring prices of certain kinds of packages and broadband plans and that sort of thing, which doesn't reflect the sort of multiple-SIM-card prepaid use, et cetera.

     So even with indicators like numbers of users, particularly broadband, these have been very problematic measures, and self-measurement, the problem of supply side data coming unaudited through regulators but from the operators, really makes the evidence that we're getting questionable at best.

     So I think really the problem in Africa, let me just speak of Africa and not the global south more generally, is this absence of institutionalized data collection.  The publicly available data collection that's happened over the past few years has largely been on the supply side, but even on the supply side, getting timely data out is a struggle, and we do assist the ITU where we can with that.  Getting surveys done on the demand side, which is obviously critical, means that data collection is subject to donor and commercial agendas.  And certainly with the global recession of the last few years, together with the entry of conservative governments in many of our standard donor countries, we've actually seen an enormous cutback in public funding.  The vacuum that's been created by this has been filled by noninstitutionalized, nonofficial, nonacademic, in the sense that it hasn't gone through a rigorous process, commercial research that has been put out by big industry associations and private companies, and it is actually being presented as if it were nationally representative surveys, rigorous statistics. 

     So, for example, just because they have recently been produced, take the studies on gender.  We all know that there are disparities there, but the reasons for them are not clear.  A lot of these studies are put out and called global studies; they may include five countries globally, 10, 15 maybe if you're lucky, and they're sometimes only done in the urban centers, not across the country, but the claims that are made are for the country, for the continent: the continent of Africa has this situation.  And, in fact, because they're not nationally representative and because they're not rigorous, the policy options that are coming out of them are really quite unhelpful. 

     So if you are actually looking at disaggregated statistics, for example, and you see there's a disparity and you don't know what the causes for it are, the solution then is to have discounted packages for women and pink cell phones, which are the kinds of recommendations proposed in some of these reports.  So I really think we've been in the doldrums.  There was a time when we had no statistics at all, especially reliable supply side statistics, specifically in Africa.  We then had a period of really positive support, and at the height of our research we had 17 countries across Africa with up-to-date supply side data and pretty representative household surveys.  That has been cut back through this process of cutting back donor funding, and through the failure to institutionalize this work in national statistics offices or dedicated units such as you have here.

     I should add that at the time this cutback happened, major donors of this kind of ICT work put this money into some of the richest commercial organizations that exist on the continent, on the planet.  So the GSMA has received vast amounts of money to do this research, various other groups have received vast amounts of money to do this research, but it's not nationally representative, it's not replicable, and it doesn't actually inform good policy.

     So basically I just wanted to quickly touch on a few things in terms of the kind of data we do need and hopefully will get again.  Basically, there are not good and bad indicators; it's really that some indicators can measure some things and some things can't be measured.  So in a prepaid mobile environment you cannot assess gender from supply side data.  There's no way you can use supply side data, and some people have suggested, in the GSMA context, that with the compulsory registration of SIM cards you could begin to measure the gender breakdown.  Anybody who lives in a developing country knows: I have at least six people who are registered in my name because they're not South Africans and they can't register, and they happen all to be men, so what are those statistics going to help us? 

     So there is certain supply side data that could be gotten more timely.  Most of the supply side data that's collected from the mobile operators goes to the ITU and takes a year to process and come back; in a dynamic mobile environment, it's out of date by the time it's collected.  So trying to get that data more quickly and more rigorously is critical.  There are useful relationships one could have with mobile operators and suppliers.  Instead of the GSMA or one of these big groups going out and trying to do global studies, not working with universities or groups that are already working in this area to pull it all together but, as I said, doing substandard studies, et cetera, there are opportunities for worthwhile collaborations with mobile operators and mobile associations on supply side data, on more timely data.

     The access indicator is of course important, and we need to understand its importance, but it's obvious that the real equality issues arising from research, and from a policy point of view, are in the usage issue.  So it's not just sufficient any longer to measure, say, how many men or women are accessing the Internet; it's how they are able to use that Internet, and we know from our research that the factors of income and education are critical underpinnings.  So take mobile, for example.  If you control for income and education, there are no gender differences between men and women in terms of ICT access and use (a little less so in rural areas and with the Internet).  So your policy intervention then is not about the ICTs.  It's actually about ensuring that those large numbers of women that are concentrated at the bottom of the pyramid, with less education and income, are actually lifted up.  That's where the inequality issue lies.
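
     The point about controlling for income and education is essentially a composition effect, and can be sketched with a toy calculation.  All figures below are invented for illustration, not survey results:

```python
# Sketch of a composition effect: a raw gender gap in usage can vanish once
# you stratify by income/education. All rates and shares are invented.

# (stratum label, share of men in stratum, share of women in stratum,
#  usage rate within stratum -- identical for men and women by construction)
strata = [
    ("low income / low education",   0.40, 0.60, 0.20),
    ("high income / high education", 0.60, 0.40, 0.80),
]

def usage_rate(sex_index):
    """Population usage rate for one sex (0 = men, 1 = women),
    mixing the within-stratum rates by that sex's stratum composition."""
    return sum(s[sex_index + 1] * s[3] for s in strata)

print(f"raw usage, men:   {usage_rate(0):.2f}")   # 0.40*0.20 + 0.60*0.80 = 0.56
print(f"raw usage, women: {usage_rate(1):.2f}")   # 0.60*0.20 + 0.40*0.80 = 0.44
# Within each stratum the rate is the same for both sexes (0.20 and 0.80),
# so the 12-point raw gap is purely composition: more women sit in the
# low-income, low-education stratum. The intervention target is the stratum,
# not the technology.
```

The same logic is why disaggregated statistics without controls can point policy at the wrong lever.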

     And then I just want to make two quick last points.  A lot of research is presented as impact.  Impact assessment is very, very difficult.  You need proper baselines and you need proper endlines, and without a baseline it might just be an endline study and not an impact study.  And so I think that's just one area we need to flag, because of course everybody wants impact assessments and monitoring and evaluation, and that doesn't necessarily give you an impact outcome. 
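
     Why an endline without a baseline is not an impact estimate can be shown with a toy difference-in-differences calculation.  All numbers are invented for illustration:

```python
# Minimal difference-in-differences sketch (invented numbers).
baseline = {"treated": 0.30, "control": 0.28}   # usage rate before the programme
endline  = {"treated": 0.55, "control": 0.45}   # usage rate after the programme

# Endline-only comparison: attributes the whole gap to the programme.
endline_only_gap = endline["treated"] - endline["control"]

# Difference-in-differences: nets out the pre-existing gap between groups.
did = ((endline["treated"] - baseline["treated"])
       - (endline["control"] - baseline["control"]))

print(f"endline-only gap:          {endline_only_gap:.2f}")  # 0.10
print(f"difference-in-differences: {did:.2f}")               # 0.08
# Without the baseline you would overstate the programme effect; a proper
# design also needs a credible comparison group, which is the hard part.
```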

     And then I had mentioned it, but I wanted to flag very quickly that I think there are lots of complementarities we're not exploiting between public, private and university analysis of the data that national statistical offices should be doing.  There's a complete underutilization of that expertise in many of our African countries, not necessarily South Africa, but in many of our African countries.  I think there really are opportunities in resource-constrained countries to use the data in a complementary way: we collect the supply side data as regularly as we can, possibly in collaboration with big data or independent audited data of some kind; we use the national statistics data and the household surveys to verify various points in between; and we go into the deep household surveys on a two or three-year basis, because it's too expensive for many of our countries to do more often. 

     And then I just want to flag one last thing.  I know we had a previous session on it and I know it's very controversial, but I really do think that there is such an enormous amount of data in big data for development use, not only in mobile but particularly if we get mobile measurements.  There are ways of ensuring that big data can be a public good in the sense of national statistics, a public good available to all for use, with all the caveats around anonymization and aggregation, et cetera.  It's something we need to seriously be looking at in resource-constrained environments.

     >> ALEXANDRE BARBOSA: Thank you, Alison, for this very interesting introduction.  I think you have touched on very important points related to surveys: the importance of surveys not funded by private industry, which may introduce some bias and lead to not so well-informed decisions, and also the supply side and the demand side indicators.  This is very relevant in impact assessment.  But we will leave the questions to Dr. Alison for after the interventions of our other speakers.  With that, I would like to pass the floor to Professor Galperin for his initial remarks.

     >> HERNAN GALPERIN: Thank you, Alexandre.  I'd like to start with an example I've heard over and over these few days at the IGF, and it's probably the hot topic of the session, which is zero-rating and Internet.org.  In several sessions the representatives from Facebook and some of the mobile carriers that are partners in this initiative have mentioned the statistic, the data point, that 50% of Internet.org users over a short period of time later become paid data customers.  And this is used as a way to say, well, this is a good entry point to the Internet for those who are not users. 

     Now, this sounds great.  But can any of us, except if you work for Facebook or a carrier, verify this data?  Can any of you contest or corroborate this data, upon which policy may eventually be made?  Because this is an important input for us to think about how we regulate or not those kinds of services, and for the (?) debate as well, but we have no way of corroborating or contesting this data because we simply don't have the data.  So this is, I think, a very contemporary example of how important it is for regulators, academics and NGOs to collect this data and to share this data, because this is a classic example of information asymmetry, which is one of the mechanisms through which governments get captured by private interests.  Again, I'm not saying this is not true.  I'm not making a comment about Internet.org; I'm making a comment about the need for collecting data so that the government, the public, NGOs, civil society can participate meaningfully in this debate.  And there are many other examples of how much ICT data is critical for good government policy. 

     Another example is digital connectivity initiatives.  We've come to a new generation of connectivity initiatives where we say, well, those blanket initiatives are not the best way to go anymore, because we know now that we need to target initiatives to certain groups, maybe women, maybe the elderly, and we need data to target initiatives.  And of course we also need data to monitor and evaluate, as Alexandre was saying.  No public policy should be left without being monitored and evaluated, and that might lead to conclusions that we may not even like, but we have to accept them and improve on those policies.  A good example: we did studies on the impact of OLPC-type initiatives in Latin America.  We did one in Chile and a different one in Brazil, where we evaluated the (?) plan, which some of you may know, and in these evaluations we found out that the impact on education and achievement was basically neutral.  There was very little evidence of impact on education or achievement.  There was evidence of impact in other areas, and we can discuss why, but without the data we cannot discuss why.  We cannot discuss whether it's good or bad.  We cannot assume those programs are working without the proper data to look at them, evaluate them and improve on them.

     So these are just examples of why ICT data is so critical.  We have been involved not just in collecting the data but, more recently, in using the data.  I think in Latin America we are in a fairly good position now, different from the situation that Alison is describing in Africa, because now, I would argue, we have a fairly good amount of data.  So our focus has been not so much on collecting the data, although we do collect some data ourselves, but on trying to make meaningful use of this data.

     A question that is always on the table is: okay, data collection is great, but who should pay for it?  I really commend the efforts that Alison in Africa and many others around the world are making, but my perspective is that they are filling the gap left by a government failure to properly collect the data that the government should be collecting.  It really should be the job of the government, for many reasons.  The most basic is that information is a public good: there will always be underinvestment by the private sector in data collection, and that information will be proprietary.  It will not be shared, as in the case I mentioned of Facebook and the carriers.  I am perfectly fine with them not wanting to share their own commercial data, but the government should come in and collect its own data.

     Also because in many ways the infrastructure for data collection is already there, in countries which have, as we have in the majority of Latin America, national statistics offices that are already collecting household data on a large scale; it is a matter of bringing them to the table to start collecting more and more ICT indicators.  So clearly I think the efforts of NGOs, the private sector and others are great, but the central actor here should be government in a broad understanding.  I mean, Cetic.br is part of government in that broader sense, and so on; not only the national statistics offices.

     And for that reason I think information should always be free.  I think it's a tragedy that some of those agencies are either charging or thinking about charging; I think it's a tragedy that the ITU charges for the data, because it's mostly publicly funded.  Following the same reasoning, it really should always be a public service, so that others can participate meaningfully in the debate.

     I recently completed a fairly extensive survey of the situation in Latin America, and again, I think we are in a fairly good position, much better than a decade ago.  We have several countries that collect ICT data through large household surveys on a regular basis.  This is still only about half of the countries, so we still have the other half to go.  And they really vary in terms of the scope and the regularity of the surveys.  I would say there are basically three different situations.  Some countries have inserted maybe two or three ICT questions in their regular annual household surveys, which is great, but it's clearly not enough; it is very little to work with.  Other countries have gone a step further, and within household surveys they have what is typically referred to as an ICT module.  Mexico does that.  Peru has done that.  Argentina is doing that as well.  So every X number of years, typically every two years, they insert an ICT-specific module, typically 15 to 20 questions, within large household surveys.  That's a step forward and it's great.  And others have gone even a step further in having dedicated ICT surveys. 

     But as I said, more progress is clearly needed: we need more countries, we need more regularity and we need more in-depth questions.  For example, one of the things I've been pushing for is more questions and more data on affordability and expenditure.  And here's another data point that nobody has ever been able to explain to me: why do we use 5% of your income as a threshold for affordability?  I have not met a person, maybe I'll meet them today, hopefully, who can explain to me with evidence why we use 5%.  You can argue, well, it's always arbitrary.  Yes, it's always arbitrary, but you've got to give me some evidence so I can say, why not 6?  Why not 4?  There's got to be more understanding of affordability issues.  And please, let's stop measuring affordability against average income, because we know averages in our countries, in Brazil, in Latin America and in the developing world, don't mean a thing, because we have a very wide distribution of income.  We need to have better, more fine-grained measures of affordability that go beyond 5% of average income.
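
     The problem with average-income affordability measures can be sketched with a toy skewed income distribution.  The incomes and the plan price below are invented for illustration:

```python
# Sketch: with skewed incomes, "price < 5% of average income" can say a plan
# is affordable while it consumes a large share of income at the bottom of
# the distribution. All numbers are invented.
from statistics import mean, quantiles

monthly_incomes = [120, 150, 180, 200, 220, 260, 300, 400, 900, 4000]  # skewed
plan_price = 25  # hypothetical monthly broadband plan

avg = mean(monthly_incomes)            # 673: pulled up by the top earner
share_of_mean = plan_price / avg       # ~3.7%, i.e. "affordable" at the 5% bar

q20 = quantiles(monthly_incomes, n=5)[0]  # 20th-percentile income (156 here)
print(f"share of mean income:      {share_of_mean:.1%}")   # 3.7%
print(f"burden at 20th percentile: {plan_price / q20:.1%}")  # 16.0%
# A distribution-aware measure (burden at quantiles, or share of households
# for whom the burden exceeds the threshold) captures what the average hides.
```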

     And there are many questions that we need to understand.  To understand zero-rating, if we want to go back to that debate, we need more data on how people are using the Internet, accessing it from different devices and platforms, on different plans.  All of this needs more data.

     We need more standardization of methodology and even of questions.  In the work I've done, even a basic question varies throughout Latin America: how do you define an Internet user?  The question is, have you used the Internet in the last three months, six months, twelve months?  Some use three months, some six months.  Even within Latin America it varies.  So we need more coordination.  I know Cetic.br has been involved in this coordination through the Partnership and other initiatives, and that's great. 
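
     How much the recall window matters can be shown with a toy population.  The months-since-last-use values below are invented for illustration:

```python
# Sketch: the same population yields different "Internet user" rates
# depending on the recall window used in the survey question.
months_since_last_use = [0, 0, 1, 2, 2, 4, 5, 7, 9, 11, 14, 20,
                         None, None, None]   # None = never used the Internet

def user_rate(window_months):
    """Share of respondents who used the Internet within the window."""
    used = [m for m in months_since_last_use
            if m is not None and m < window_months]
    return len(used) / len(months_since_last_use)

for window in (3, 6, 12):
    print(f"users (last {window:>2} months): {user_rate(window):.0%}")
# Prints 33%, 47% and 67%: three "penetration rates" for one population,
# which is why cross-country comparisons need a harmonized reference period.
```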

     We also need more coordination between government agencies.  For example, there are countries where you have the national statistics office collecting data, the regulator collecting data and even sometimes specialized agencies collecting data.  I'm thinking of Uruguay, for example, which has specialized agencies that collect data.  All of this is great, but there needs to be better coordination between those agencies.  Sometimes they don't talk to each other.

     And finally, and I insist on this, I think we've come a long way, but what I start to see, which is a bit ironic, is that we have a sort of usage gap.  There's a lot of data out there, beyond the small samples collected here and there that Alison was characterizing, and sometimes researchers, NGOs and civil society are not even aware that this amount of data exists.  And it's not hard to get: you don't have to talk to anybody or have friends in the national statistics office; it's there, on the Web site.  So again, I see a usage gap.  We have the data, but we have to be able to use it in meaningful ways, to show that this is money well spent.

     >> ALEXANDRE BARBOSA: Thank you.  Very interesting points that you raised, especially about the transparency that is needed when discussing the methodology used to produce data.  It has to be fully transparent, and of course, when we have a survey from industry, if they don't disclose the methodology, you never know how the data was produced.  And this may mislead policy makers in their decisions.

     Also in this regard, the engagement of policy makers in this production process is very relevant.  With that, I would like to pass the floor to Fabio from Cetic.br to give his impressions on this issue.

     >> FABIO SENNE: Good morning, everyone.  In order to cover some of these issues, I would like to quickly highlight some general principles that we in Brazil think should guide the collection and also the use of indicators for evidence-based policy making, and which should be addressed both by data producers and users.  I also think the connection between data producers and users is quite important to advancing this area.

     Regarding our experience in Brazil, I speak as survey project coordinator at the Regional Center of Studies on the Development of the Information Society, Cetic.br, which was created in 2005, aiming at providing government and society with (?) reliable data on ICT access and use.  Cetic.br is a department of NIC.br, a nonprofit civil organization that implements the decisions and projects of CGI.br, and at this moment we conduct ten national sample survey projects, covering both the demand and the supply side and focused on issues such as ICT in households, use by children, education, health, e-government and use in nonprofit organizations, among other issues.

     The first point I'd like to address, and I think maybe the most important one we have been discussing here, is that indicators must be reliable.  So reliability is a main issue in producing indicators.  And as Alexandre (?), transparency is a principle that CETIC is very fond of, and in this sense we think it's necessary to raise awareness about the importance of quality in the survey process.  So we do have a lot of data, but we need to stress the importance of quality in data, so we need transparency about what we are presenting regarding statistics. 

     Hence the importance of the planning phase, questionnaire design and the standardization of questionnaire design.  We at CETIC do lots of qualitative analysis, like cognitive interviews, to design questionnaires that are really understandable in different parts of the country, and careful sample selection design, because if you don't have a representative sample of your country, we are talking about only small parts of the problem.  Then control and monitoring of data collection, data processing, and dissemination and analysis.  So we need to improve on all of that.

     And from this perspective we started to develop at CETIC capacity-building strategies in Brazil and also in Latin American countries and Portuguese-speaking African countries, to discuss how to use statistics and indicators for policy making.  So we are trying to create opportunities to stimulate the (?) and the national and international debate on methodology, which we think is critical to support the policy decision-making process.

     The second point that I'd like to bring to the floor is that we must support timely decision-making.  We have very fast-changing patterns of Internet use, and it (?) on the data and information.  We are talking about a field that is changing very fast, and I think the national statistics offices in our countries, which are traditionally responsible for providing statistics for policy making, have been challenged by this need for up-to-date information on the use of the Internet.  Big data has been treated as a solution, but at the moment there is no consensus on how to use big data to produce official statistics.  So I think there is a challenge to be addressed between the work of national statistics offices and the use of big data to produce relevant statistics.

     Our experience at CETIC shows that stand-alone ICT surveys, such as the ICT Households survey, a large-scale household survey, help to provide society with more systematic and updated data on Internet use; in our case, having annual data from a survey specializing in ICT delivers information that usually could not be provided by NSOs, the national statistics offices, with the same regularity.

     The other question already mentioned here is accessibility.  Accessibility is a main issue when it comes to indicators, for sure.  We have huge amounts of data that are produced by industry or even by governments, such as administrative data, but they are not available to society.  The example of Facebook is very interesting.  So it's not the lack of existence of data, but how accessible it is to the main stakeholders.

     So at CETIC we try to promote data sharing, and also (?) data in order to support academic research and policy making.  This is one of our main goals.  And we have also created a tool for data utilization, which is available on our Web site.  This is one effort to disseminate data and make it more accessible to the other stakeholders.

     The other question is the question of cost, and of course indicator production must be cost-effective.  The important question here is how to achieve sustainability in collecting data about Internet access and use.  Because of the costs, implementing large-scale surveys, especially in developing countries, is very difficult, and several nations usually lack systematic time series for monitoring ICT adoption.  Our experience shows that it is important not only to start measuring but to create a culture of measurement and regularity; the commitment of CGI and NIC.br to producing and disseminating (?) is relevant in Brazil, allowing us to have ten years of regular time series data on Internet use and access.

     The question that Alison raised about adaptation is very important.  Maybe we can reduce cost if we have this connection.  What can be collected in Brazil is not the same as in France or in Africa, so you need this process of adaptation of methodologies.

     The investigation of socioeconomic outcomes of ICTs also needs to be interdisciplinary, so we rely on an interdisciplinary approach that connects (?) from different backgrounds in our countries.  In this sense we think that producing statistical indicators from a multi-stakeholder perspective is also important: not only discussing the role of the different stakeholders, but at CETIC actually producing the statistics through a multi-stakeholder process of discussion about what needs to be measured and how to conduct the survey.  We now have more than 200 voluntary experts from academia, Civil Society and Government who are part of our projects from the planning to the dissemination phase.  For each of the issues, such as children, education and health, we have groups of experts who really support the production in a multi-stakeholder process.

     And finally -- and that's a very important question -- the indicators must be relevant to address policy issues, not just for monitoring goals established nationally and internationally, but also for understanding the barriers and drivers related to digital inclusion. 

     So -- and I end with an example of Brazil.  We recently found out that -- by using qualitative methods, we found out that in Brazil sharing WiFi connection was very common among households, so you had one household that shared one -- one --

     >> Subscription.

     >> -- subscription of Internet connection with the others.  After discovering that, we included a new question in our household survey, and we found that 13% of Brazilian households share a WiFi connection with their neighbors.  And in the northeast region, which is where we are now, that number increases to 22%.  So understanding that there are unintended uses of ICT is also very important for policy makers, and this is not possible to obtain from an administrative database or from other kinds of data; this approach is only possible by listening to people's voices and doing research in the field, and that's why sample surveys, in our view, are so relevant to conduct.  So with this I pass the floor to Alexandre again.
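
As an illustration of why a sample survey can speak for a whole population, the precision of an estimate like the 13% WiFi-sharing figure can be gauged with the standard confidence interval for a proportion.  This is only a sketch: it assumes simple random sampling and a hypothetical sample of 2,000 households, whereas real surveys such as CETIC's use complex designs with weights and design effects that widen the interval.

```python
import math

def proportion_ci(p, n, z=1.96):
    """95% confidence interval for a survey proportion,
    assuming simple random sampling (no design effect)."""
    se = math.sqrt(p * (1 - p) / n)   # standard error of the estimate
    return (p - z * se, p + z * se)

# Hypothetical: 13% of households share WiFi, measured in a sample of 2,000
low, high = proportion_ci(0.13, 2000)
print(f"95% CI: {low:.1%} to {high:.1%}")  # roughly 11.5% to 14.5%
```

A complex sample design inflates this interval by the square root of the design effect, which is one more reason full methodology disclosure matters.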

     >> ALEXANDRE BARBOSA: Thank you, Fabio, for this detailed account of the work that CETIC is doing to produce survey data.  With that I will invite our last speaker, Lorrayne, from the OECD, to give us a bit more of the perspective of international organizations, which is very much linked to the debate on policy and how the digital economy is being affected.  So Lorrayne, you have the floor.  Thank you.

     >> LORRAYNE PORCIUNCULA: Thank you very much, Alexandre.  I'm Lorrayne Porciuncula.  I work at the OECD in the Digital Economy Policy Division, and as many of you may know, the OECD does a lot of evidence-based policy; that's basically our mission: better policies for better lives, based on evidence.  So we collect a lot of statistics, and that's what we do. 

     When Alexandre invited me to join this panel I was thinking, okay, my impression is that many of these meetings end up preaching to the converted; I feel that many of you who are here and are interested in this topic and feel the sense of urgency of doing evidence-based policy already know all that.  So I was asking myself, how do we convince those policy makers who are not doing evidence-based policy?  How do we convince them that indicators and measurement need to go hand in hand with policy making? 

     With that in mind, in my view, the activity and exercise of doing measurement and trying to find evidence is something that needs to be intertwined with policy.  It needs to happen before policy, while policy is being implemented, and after policy has been implemented.  You need measurement before you even think about what policy you actually want to implement.  You need evidence to understand what the market failures are and what you are actually trying to correct, so you can have targeted policy.

     You need evidence while you're implementing the policy, so you can adapt it and see whether the policy is actually achieving the objective that you set out to achieve in the first place.  And you need evidence to evaluate whether the policy objectives were actually achieved or not.  We need evidence also to improve the accountability of policy makers.  There is growing pressure from citizens, from taxpayers, to see what their money is going to, how these policies are being chosen, what the criteria are, and why this is considered a priority by the government.  So evidence, measurement and cost-benefit analysis are crucial for informing policy makers.

     And then, we all talked about the lack of systematic and reliable ICT statistics.  We work with our 34 member countries at the OECD, but as Alexandre mentioned, right now I'm working on the broadband policy toolkit for Latin America.  Just to give a brief introduction to what that project entails: it is a project with the IDB, and the IDB came to us because they felt that there was a lack of capacity in many of the governments in Latin America to deal with broadband issues in a holistic way, thinking not only about supply-side issues but also demand-side issues.  That was very important for them.  That's something they heard from the stakeholders, and they came back to us and said, can you help us do that?  The first stage for us was: well, we don't normally work with Latin American and Caribbean countries, apart from our member countries, that is, Chile and Mexico, and Colombia, which is close to completing the process of (?).  And then we thought, we need to understand the region. 

     So we sent questionnaires to 26 countries, divided into modules.  We had focal points both in the ministries of communications and in the regulators, and the point was that, because this is such a broad issue that deals specifically with demand-side questions, we chose focal points who would be able to deliver these questionnaires to other parts of the government.  We got these questionnaires back, we got good responses, and we were asking basically: what kind of initiatives are you doing, what are your challenges, and what do you feel you need to do more of?  The mission was to understand and to hear from those countries what was going on.

     At this point we are in the second stage, where we are producing chapters, evaluating what's happening in the region and trying to find good practices -- not only those that come from OECD countries, but trying to identify what countries in the region are doing that their neighbors can maybe learn from.  And my understanding is that, as Fabio mentioned, there is a break between those who produce the data and those who use the data.  We'd sit around the table talking about indicators, and because it was the first time that supply-side policy people were meeting with demand-side policy people, they'd say, oh, you actually have data on that?  And that was surprising and not surprising, I guess, because so much of these conversations tends to be left in silos; they don't leave the room of the experts and the statisticians and the people who are actually interested in that.  Policy makers don't know what is available and how crucial that is.

     With that in mind we aimed at doing a toolkit that would put at the center this interconnection, which cannot be separated, between policy objectives and indicators and measurements.  So as we set out to draft the chapters -- and it's at an early stage, we're drafting now -- we have two parts.  For every single topic we cover, and it goes from the ICT framework and spectrum licensing to affordability issues, taxation, eHealth, eEducation and eGovernment, we first set out the policy objectives.  When you think about these issues, what is your goal?  What is your goal with those policies?  And right after that we have the measurements and tools, and we link to available tools that exist in other countries, to indexes that may exist, and also to platforms to share data.  In our understanding, that's crucial for informing policy makers, because one thing cannot go without the other.

     And then, further to that -- and that's just talking about the basics and the basic indicators that exist -- there is also a need to move beyond that.  I think every panelist has mentioned that we need a shift in measurement from diffusion -- how many firms and individuals have access to ICTs -- to actual use, what they do with those technologies.  But at the same time, those individual and firm surveys normally cannot capture the actual use, so we need to think about other forms of producing indicators and collecting data.  As we talk about big data, for example: big data doesn't have to be the silver bullet, and you don't have to use big data for everything, but I think the national statistics offices need to be more flexible, able to take input from other sources gradually, and not necessarily think that surveys are going to be replaced by big data in the next two years.  That's not going to happen.  But they do need to be able to revise those indicators and open up to new sources, to new indicators and to new partnerships.

     I think that's it for now.

     >> ALEXANDRE BARBOSA: Thank you, Lorrayne, for bringing the international organization perspective.  And I do agree that we have a lot of data available; the question is what to do with this data.  I think we have to move the existing data from the shelf into action, into policies, but first we need to raise awareness among policy makers that it is important to use the data, and how to use it.  Right?

     So with that I would like to open the floor for questions, and also for remote participants.  So please, if you have any questions?  We have one -- two questions.  The microphone?  Who is taking care of the microphone?  If you can just come here and use this microphone, please.

     >> Good morning.  My name is Esteban, I'm from the (?), and I do a lot of research about Internet issues in my country.  So I was wondering, what was your approach to the statistical offices, at least in Latin America?  Because there is a great lack of data; it's not that it's sitting on the shelf, it just doesn't exist at this point.  So I wanted to know if you had an approach to those offices, because there is no other official data that we can collect.  Thanks.

     >> ALEXANDRE BARBOSA: Thank you.  Next?

     >> CHENAI CHAIR: Hi, I'm Chenai from --

     >> ALEXANDRE BARBOSA: It's not working.

     >> Hello?

     >> Yes.

     >> CHENAI CHAIR: Hi, I'm Chenai Chair with Research ICT Africa.  Thank you for the very interesting presentations.  My question is that the focus has been on policy makers, and I'm assuming within government, the people with the power.  But in your opinion, how do we also engage activists (?) who come to these spaces and engage on these particular issues around Internet Governance, but often without the evidence that we have?  How do we, as collectors of this evidence, push out our data to these other members of society?  Thank you.

     >> ALEXANDRE BARBOSA: Thank you.  Any other questions?  Yes, one more.

     >> Hello, Kasama from Thailand.  We are on the team drafting the digital economy plan coming out this year.  I have a quick question: do you have some sort of global guidelines for collecting ICT data?  Because we are practitioners, most of the time we make policy first and then find data to justify the policy (chuckle).  So it would help to have some sort of guideline on how to go about collecting ICT data.  For example, just a few months ago we were trying to measure the contribution of ICT, or digital technologies, to the country's GDP, so we were looking everywhere, and we found cross-country ICT data, but there was no Thailand in there.  And I was looking for a way to measure it.  Since this is an international forum, if you have a guideline, you could help a lot of people, a lot of countries, in this regard.

     >> ALEXANDRE BARBOSA: Thank you very much.  We have three questions: one related to the role of NSOs in providing data -- in fact, we do have some countries with a lack of data; the second, how to engage other actors in the process of data production; and the last one related to global guidelines for measurement.  I would like to pass the floor to our participants.  Maybe we start with Anderson and then -- ladies first.

     >> ALISON GILLWALD: Thank you very much for all three questions.  To the first question, I'm not sure which information you're looking for particularly, but certainly across many African jurisdictions NSOs are extremely unresponsive.  It's very difficult to penetrate them.  It's even difficult to get, you know, public data for many of our IDRC-sponsored surveys -- donor-sponsored surveys that we're doing without government support -- which need to be nationally representative.  We have to get the (?) from countries.

     Of the 17 countries we did, when we were doing 17 countries, I would say in at least five to ten of those countries we actually had to bribe national statistics officers to get the national (?) frame in order to do the study.  A public interest study, and it was not (?).  Certain governments require that you pay a percentage, even though it's donor funded -- it's a public interest thing -- a percentage of the fieldwork.  So in addition to these high costs, you have to pay them just to use the data: not to conduct the survey for you, just to get access to the (?) frame, et cetera.

     So yes, NSOs have to be made more transparent and have to be part of a national agenda, and I think there are some good examples where that is happening.  The bureau of statistics in Kenya has gone a long way toward opening up and responding to ICT indicators.  They're trying to do something in Mozambique, for example, that's been far more transparent than anything that's happened in the past.  So I think it is part of this general lobbying, making governments accountable.  Now, a lot of the best practice and indicators assume functioning governments, not fragile states -- institutions that have the capacity to do this work, et cetera.  And so a lot of our work is actually about how we get data: if we can't do a national survey that requires a sample of, you know, 20 or 30,000, what methodologies can we use with samples of 2, 3 or 4,000 that could still produce nationally representative data?  So that was the first point I wanted to make.
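
For readers wondering how a sample of a few thousand can still be nationally representative, the arithmetic behind the required sample size is short.  The sketch below is illustrative only: it uses the textbook formula for a proportion at the worst case p = 0.5, and the design-effect value of 2 is an assumption for clustered surveys, not a figure from any of the studies discussed here.

```python
import math

def sample_size(moe, p=0.5, z=1.96, deff=1.0):
    """Respondents needed for a given margin of error (moe) on a proportion;
    deff inflates the number for clustered, complex survey designs."""
    n = (z ** 2) * p * (1 - p) / (moe ** 2)
    return math.ceil(n * deff)

# A 2-point margin of error at 95% confidence, simple random sampling:
print(sample_size(0.02))             # 2401 respondents
# With an assumed design effect of 2 for a clustered national sample:
print(sample_size(0.02, deff=2.0))   # 4802 respondents
```

Notably, the result barely depends on population size, which is why a well-designed sample of 2,000 to 4,000 can represent a whole country when the sampling frame and selection are sound.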

     The second was in relation to, I suppose, the misuse of information and the neglect of existing evidence.  You know, as people who spend our lives, as Chenai was pointing out, producing this evidence and trying to get it out there, it's enormously frustrating to come into a forum like this and have people say, oh, well, there's no African data on access, or there's no gender data at all, when really a basic Google search would turn that up very quickly.  So I suppose, again, like the others, it's the same thing about engaging and advocating and making people aware of it.

     And the Thailand example was a very, very interesting one.  I was just going to say that our partners in Asia do nationally representative household surveys -- they have done household surveys and bottom-of-the-pyramid surveys, I think in Thailand as well, though maybe not recently, and they hope to do so again soon.  So look at those, because I think you're right: getting evidence from, you know, mature economies is not particularly helpful in developing digital plans for these contexts.  Best practice there is probably not best practice in the developing context.

     >> ALEXANDRE BARBOSA: I would just like to add, on the question from Thailand, that we do have a lot of activities going on in terms of defining guidelines for measurement, especially through the Partnership (?) for Development, which includes UN agencies such as the ITU, the Economic Commissions in the five different regions, the OECD and UNESCO, in different areas.  UNCTAD (?), for instance, works on measuring the digital economy, and there is also a large debate on ICT measurement -- just to add that.  Maybe Lorrayne, can you give input on the three questions, please?

     >> LORRAYNE PORCIUNCULA: I would like to take the question from Thailand, which is a very interesting one.  Actually, since we started working on the broadband policy toolkit for Latin America and the Caribbean, our member countries and many who are present in the region thought it would be interesting to do one for Southeast Asia, so we are actually starting a broadband policy toolkit for Southeast Asia.  In the context of that project, which is just starting, we visited Thailand, and we were received by the Electronic Transactions Development Agency, ETDA.  They are producing great data -- I think the best data on ICTs in the country -- and we were very impressed with the level of the data they have. 

     On that note, we will be producing the same kind of guidelines for Southeast Asia, so hold tight; we'll do that.  We are also part of the partnership on measuring the information society, and there is an ICT guideline on measuring the information society; I can pass you the link.  Certainly we can exchange contacts.

     >> ALEXANDRE BARBOSA: Professor Galperin?

     >> HERNAN GALPERIN: Thank you.  On the first question -- I don't know which country in Latin America it was referring to regarding the lack of data -- as I mentioned, in our mapping of the existing data we found that about half of the countries in Latin America have fairly good statistics that are collected on a regular basis.  Of course, the other half isn't quite there.  But I think there is also sometimes a question of culture within national statistics offices, that data is for government use only; there really is a cultural tradition of bureaucratic secrecy, where you have the data but it's only for internal users and for government policy making.  But I think that's changing, and it's changing in part due to the efforts of organizations that are pushing for open data initiatives. 

     Many countries in Latin America have actually signed on to declarations of openness of government data, so I think it's a question of trying to hold those governments accountable for what they signed and for their commitments to open data, and of trying to change a culture.  Of course, we don't have corruption in Africa and Latin America, so that wouldn't work (laughter), but I do think there's an education component to this, directed at the statistics offices and whoever is collecting the data.

     And on the other question, how to measure the contribution of broadband and ICTs to GDP: I think Lorrayne and others have mentioned there are great efforts there.  What I would add is that I think it's great that countries are undertaking this, because it's still appalling -- I've heard it at least twice in these sessions over these days -- that people are still referring to the World Bank study on the contribution of broadband, which has been highly criticized; there are no heterogeneity controls in that study.  People are still citing that study over and over again.  The IDB has done a similar study, and even though it's also been highly criticized, its conclusions are even more glorious for broadband and so on.  But I think now we've come around, and there are more sophisticated efforts to try to measure that, which is far from trivial.  I think the partnership is making some efforts; there are several projects that are trying to build more refined measures of the real contribution of ICTs to economic activity.

     >> ALEXANDRE BARBOSA: Fabio?

     >> FABIO SENNE: I'd like to add something regarding the role of other actors.  I think we should also talk about the role of Civil Society in using and pressing for more data.  In Brazil it's interesting, because most of the time the results of the surveys are not good news, so governments sometimes don't like to explore this evidence, and you usually need Civil Society to press for more data and to ask why the digital divide is so high, and this kind of thing.  So in our case in Brazil, we try to make the data as available as we can, involving Civil Society organizations in this process.  All of these surveys are (?) in reports that are fully available online, with aggregated data from each survey.  So I think Civil Society also has an important role in this discussion.

     >> ALEXANDRE BARBOSA: Well, let's go to the second and last round of questions.  We have one question here, and one there.

     >> In my teaching I sometimes ask our students to tell me whether a city where the traffic lights are erratic is better regulated than a city where the traffic lights don't work at all.  So my question is about the quality of data, and the fact that the evidence stream is being polluted by poor-quality research, as Alison clearly pointed out.  Now, the American Psychological Association actually worked with the U.S. media to try to make sure that every time a survey is presented in a responsible newspaper, the sample size and certain methodological details are published.  Is that something we should be trying to talk about?  How do we communicate the quality, or lack of quality, of some of the studies that are being brought up and discussed in public settings?

     >> ALEXANDRE BARBOSA: Thank you very much.  We'll have a second question here and another one over there.

     >> Good morning, my name is Pana, from the (?) agency from Thailand.  This is not a question but perhaps some observations on things the panel mentioned.  You mentioned the Sustainable Development Goals, with ICT as a tool to improve the quality of life of the people in a country as well.  But we have many perspectives from different organizations, such as UNESCO and the ITU, and no common core indicator.  So I'm thinking: what if one organization said, these are the key indicators that every country needs to collect at least, and then custom indicators could be tailored to each country's needs, based on whatever they would like to measure?  That might allow individual countries to look at the whole, the big picture of the world -- the impact of ICT throughout a country, comparing each country -- and give governments better information on what kinds of elements need to be improved, based on these core indicators.

     The second one: you mentioned the need for mechanisms to make sure that, when we provide good data, governments actually know how to use it.  Even if we have good data collected in a systematic way, a systematic (?) approach needs to be established.  I agree that this might be done through multi-stakeholder or public/private partnership programs that bring the private sector in to be part of the data collection, on a systematic basis, plus incentives for them to contribute to the data collection.  This might be the link between the government and the partnership to institutionalize the technology, or at least the data collection, throughout the country.  And third, there might be (?) collaboration.  I know each country has different methods, and communicating about them might help us improve the situation of data collection.  Thank you.

     >> ALEXANDRE BARBOSA: Thank you very much.  We have two more questions here.

     >> Hello.  Can you hear me okay?

     >> ALEXANDRE BARBOSA: Yes.

     >> My name is Rutin -- I'm sorry, I have a cold, picked up in the main hall (laughter).  I represent eNACSO, the European Network of Children's NGOs, present in 23 countries in Europe, and I want to make a slightly different point from what's been raised so far, about how NGOs can use data and insights -- not necessarily the macro kind of data you're talking about, but very much microdata.  We've done that successfully in Europe through the NGOs I represent, where there have been focus groups of local children and young people telling us what they're actually experiencing online. 

     For example, in advertising, or when they play online, what kinds of issues are emerging.  And we've been quite successful in getting those kinds of insights -- the new challenges that are posed by children and young people, because they're online a lot -- to have an impact at the European policy-making level, at the European Union.  I really want to say that because NGOs might think this is all above us, we can't really use any of this kind of data; but we can collect data in a different way, and it can also be very successful in impacting policy.

     >> Hi.  Is this on?  My name is Taylor Roberts.  I'm a research fellow at the Global Cybersecurity Capacity Center at the University of Oxford.  This might be a bit of a curveball, but given that the title said effective policy-making and didn't specify what kind of policy, I'm going in a slightly different direction.  I've noticed that while there's a plethora of cybersecurity policy initiatives, the data that would reflect the measurement of cybersecurity policy effectiveness isn't directly connected to the policy at this point.  There is a lot of security data of various types available, but no one has made a great connection as to what is quality data and what really represents cybersecurity as a discipline, and to my understanding there hasn't been a link back to the policy to make sure that the national cyber strategies or awareness campaigns are doing what they were intended to do.  I don't know if there's anything you all have seen that could counter that.  Thank you.

     >> ALEXANDRE BARBOSA: Is there any question from the remote participants?

     >> Yes, over here.

     >> ALEXANDRE BARBOSA: Thank you.

     >> Hi.  The last question, maybe -- another (?) we are having is how to measure the ICT industry as a vertical market, or segment.  It's not an easy task.  We are doing this (?), which is the ICT (?) in Latin America.  We have a lot of lessons learned.  I don't know if anybody else has gone through this process.  We started in Argentina -- I don't know if you remember, I started with Andrei Lopez and (?).  You remember that?  So from that point of view we're trying to replicate that methodology of measuring the ICT industry.  Because we are lacking, across all the international organizations or the NGOs, a single set of standardized parameters or indicators for the industry.  For example, how much is exported from the region or the country?  How many direct employees are there?  There are all sorts of very interesting items that are very useful for public policy. 

     So the path we are going through is not just helping the industry and the national associations, but also making liaison with governments, joining efforts together, going from the bottom up in this process.  Also, this year we have started to replicate it at a worldwide level with the support of UNCTAD (?), which always helps in this case.  But it's a very difficult path; there are a lot of efforts to make.  We have a lot of asymmetry.  We have countries like India or Argentina that have a lot of experience in this issue, but there are countries that have not -- in Asia, in Africa, even in Latin America.  Even so, my question is: is there any other initiative like this that is interested in joining efforts?  Let's do something.  By the way, my name is Sylvia Vidard (?).  I'm vice president of WITSA, (?) national director of (?) member, and also within the elect process, in which some of us are very involved.  We are working on this.  So this is my call to everybody: let's join forces.  Thank you.

     >> ALEXANDRE BARBOSA: Thank you very much.  It is indeed a very difficult task.

     Well, let me try -- we have five questions.  I will choose some of you to speak up.  Let's start with you, Professor Galperin, on quality of data and methodological issues.  If you can give us some insights.

     >> HERNAN GALPERIN: Thank you.  Well, it's definitely a difficult task to ensure quality, particularly because sometimes the problems with quality are so ingrained in the data that then gets aggregated and (?) aggregated that we actually never see the lack of quality in the data.  The obvious example is ITU data, which is gathered through regulators but is really gathered through the industry and the operators.  So if incorrect or biased data is reported, it then gets aggregated and re-aggregated, and you see these beautiful ITU graphs with development all going up.  There are lots of problems, but they go down to the level of the regulators and actually the operators that are reporting to the regulators.  So sometimes it's really about data forensics: you really have to dig down into the data and see where the quality problems are.

     And another issue pertaining to quality -- I think this was Rohan's point in terms of the public sphere -- is who is occupying the data space in the public sphere.  The problem is that when government and good researchers are not occupying the space, it's companies that, perfectly reasonably, try to occupy it.  I think we had a good example in Latin America, something that was called the Cisco barometer, which isn't happening now, but it became the authoritative source on these issues in Latin America.  It was a great effort on the part of Cisco, but it took the place of solid and authoritative data when, in fact, when you look at the data, it was not.  I mean, there were so many issues with the data -- which, again, is not a critique of Cisco, simply a critique of those who should have been occupying that space, especially the government and the statistics office, and instead it was occupied by this project.

     And excuse me, I want to mention one more thing, which I think is encouraging and is actually related to a previous question.  I remember a recent case in Argentina related to the question of engagement.  Yes, we need to engage with the government and the statistics office and collaborate with them.  But if they're not collaborating, sometimes you can challenge them, and Argentina has good cases now.  Of course we have a tragic case of statistics in Argentina, but many NGOs have gone to the statistics office and challenged it on access grounds.  We have not quite a law but a regulation on access to information.  They took a challenge to the court and they won, and the statistics office had to release data on poverty.  It's not ICT data, but it's a good example of how NGOs, if the government is not engaging, can go to the courts and challenge it.

     >> ALEXANDRE BARBOSA: If you want to add something on quality, data quality, and also how CETIC is providing or offering microdata access at this disaggregated level.

     >> HERNAN GALPERIN: Yes.  Of course quality is the key word.  Just regarding the NSOs, CETIC works a lot in collaboration with NSOs, not only in Brazil but in Latin America, and we do agree that they are unresponsive in some sense to uses of the data, for instance.  But they are also -- and it's important to remark this -- very focused on quality and rigor.  So if we start to collaborate with NSOs from the aspect of quality, is it possible to use other forms of data collection while maintaining the standards of quality?  I think we have a type of collection that might change the situation. 

     So at CETIC we try to establish agreements with academia and government in order to provide microdata from these aggregate surveys and to give them the opportunity to study in more depth -- I would like to use the title of that book, you know, opening the black box.  Sometimes you have to open the black box of the national data that we have: in Brazil it's very difficult to talk about a Brazil total, because it doesn't say much about the inequalities that we have in the country.  So I think we should attend to the policy of disaggregating data and giving accessibility to the information we have produced.

     >> ALEXANDRE BARBOSA: Thank you very much.  Alison, if you can add something on data quality, and perhaps the cybersecurity issues and also the ICT industry, as our colleague from Argentina has mentioned.

     >> ALISON GILLWALD: Thank you very much.  On the quality issue, I really think the issue I was speaking about when I started -- this vacuum of institutionalized data being filled by commercial research, market research, and also by donor-funded research of particular interest groups -- is really responsible for a lot of the very poor quality that's around.  And definitely I think we have to raise the question of whether bad evidence is better than no evidence at all, because I think really in the last few years that's the point we've reached, because of the failure, certainly in Africa, of the public space to be filled and of academic institutions to be testing that rigor.  But even where they are, I think there's a problem.  We all like to use indices because they're nice for advocacy purposes -- you can name and rank -- but there are problems with the indices, and at the moment they are so supported by globally, institutionally powerful players that, even though the World Bank itself has said that a study is problematic, it continues to get used. 

     I mean, the World Economic Forum index is a highly problematic thing from a methodological point of view.  Essentially it's a business survey of policy effectiveness, so somebody who runs a fisheries business is actually deciding whether, you know, government procurement, et cetera, is effective.  And then of course it's complemented by the ITU data, which is problematic, and then a single country expert assesses the data.  So you can see how it plays out ideologically in terms of the results: countries with more market-driven policies score well, and countries like South Africa, which arguably should score far worse on some things, do not. 

     So I think looking at these indices more critically is actually critical, and so is the misnaming of these indices.  Calling something an affordability index when it's actually looking at cost drivers and policy is nonsense.  What is it assessing as affordability, when the countries that come out on top of that index in Africa don't have the cheapest prices -- the top few have some of the highest prices?  So what are we actually measuring? 

     So these I think are very problematic, and simply making the methodology available is not enough.  Some of us who have critiqued some of these indices and the platforms that are putting out this research have seen them respond by putting their methodology in an appendix at the end.  Even if you can critique that methodology and say, the way you're doing this gender study, you can't make these claims at the international, national, whatever level -- that's acknowledged, but the report still goes out, it's published, and the same methodology is used the following year with slightly different adjustments.  So it's a real challenge, and I really think addressing the quality challenges is about getting the work institutionalized, getting it rigorous, getting proper national coalitions of public interest groups, the national statistics offices, the universities, and of course Civil Society and others -- these multi-stakeholder activities -- to participate in it.  Although I think it's important to publish the methodology, we have to go beyond that, because otherwise people just say, oh, I published my methodology, it's highly problematic, but never mind, we're still going to make these claims.

     I actually wanted to say, about the cybersecurity question, that I don't know a lot about it, other than that we do have a project, which Enrico Calandro is involved with, that is trying to reconcile some of the technical measures with policy, particularly in the African context of course -- not just leaving it in the technical terrain, with outcomes expressed purely in technical terms, but also contextualizing it in the Human Rights environment, which is so problematic.  So it's also looking at technical measures around censorship, but holding those technical measures to the same quality standards as the instruments we're using, in terms of outcomes. 

     So maybe, if there's time, I'll ask Enrico to speak a bit more about that, because I'm not as familiar with it.  But I think there is a serious problem, in that cybersecurity is grasped by policy makers in Africa while a host of other things that are arguably as or more important are not, because it fits comfortably into the surveillance and national-interest agenda, without proper measures of its effectiveness.  And quickly, I want to refer to the last question about measuring the verticals, because on the ICT sector I think a lot of work has been done in South Africa, and particularly, if I'm correct, Chile has done it for several years. 

     But essentially it's not only looking at the vertical in terms of your SIC codes, where ICT might be tied up with transport or whatever it is, but looking at the contribution of ICT horizontally across the sectors -- from the national accounts, from the labor survey, et cetera -- and extracting that data to try and understand the real contribution to the economy.

     As I said, Chile I know does it more regularly, I think Australia has done it, and South Africa is in the process of doing it.  It's incredibly underutilized by government itself: it has put all this money and effort into it, and, you know, you still have budget addresses that don't refer to this very valuable data, et cetera.  So that's just something else I wanted to mention.

     So just in that regard, on the indicators that we need in order to assess the SDGs and the 2020 targets, I really think we need to move beyond the current indicators that we're using in the ICT field, and really begin to look at what ICT data is available in the labor force surveys and those kinds of sources to get a better understanding and better measurement of those indicators.  We're going to need a whole new set of indicators, not only for the SDGs but for the (?).

     >> ALEXANDRE BARBOSA: Thank you, Alison.  Indeed, cybersecurity is a whole complex area to be addressed in a panel like this, and we lack a formal connection with a framework for measuring cybersecurity.  We have some initiatives, but nothing as well established as what we have for measuring other segments like households, enterprises, or the ICT sector.  And the ICT industry is also something very complex: it encompasses services and products, so it is a very comprehensive area to be addressed, but we do have some initiatives, like UNCTAD's work on e-commerce, electronic government, et cetera.  Our time is almost up, but I would like to give the final word to Lorrayne, if you want to address one of these five topics that were mentioned.

     >> LORRAYNE PORCIUNCULA: Sure.  There were many questions, and I think all the panelists have touched on issues that I'd like to touch on myself.  Looking back at the questions -- the quality of data, core indicators versus sustainable development indicators, the cybersecurity data -- I'll come back to my first point about making this a priority for policy makers.  When you talk about the lack of quality data internationally, for example: I've been there at the ITU, and I know the quality of data that they get, and there's little you can do with the data that you get.  That's what the professor was saying; it's the data regulators collect from operators, and the data is not there.  So what do you do when there's no national data?

     And so it boils down to the question of convincing policy makers that this is such an important issue.  It goes back also to, okay, how do we find core indicators (?) development?  It's about making sure that those two go together.  On cybersecurity, at the OECD we now use the term digital risk management -- I'm also not an expert on that, and my cybersecurity and digital risk management expert colleague is here at the IGF -- and we've just launched a recommendation and a companion document on that.  The idea is to move beyond the concept of cybersecurity, because it's not "cyber" and it's not only for experts; it's supposed to be interconnected with any strategy of companies and with any government plan, because those are risks that exist in real life, because our real life is now connected. 

     So it's the idea that those issues need to be taken as endogenous problems.  Indicators are not an afterthought, and cybersecurity is not an afterthought either.  It needs to be embedded in the policy-making process.  That was a topic raised in the panel we organized yesterday at the OECD, where someone from the private sector said data wasn't a problem.  We had a question on that and he said, we have data, we have the data, we are aware of the risks and we know how much they cost.  To which we responded: data is a problem, because policy makers don't have this data, and we need it.

     So I think if we are able to take this conversation further and closer to policy makers, so they can feel the urgency of taking indicators and measurement into account as they design policies and as they evaluate them, then we can respond to many of these questions.  Because if it is a high priority to have measurement and to have proof and evidence for policy makers, maybe we'll have better oversight of the data that's being produced, and maybe we will improve the quality of data too, and the quality of research, and then we'll be able to fill those voids the professor was also speaking about.  And with that I'll end my contribution.

     >> ALEXANDRE BARBOSA: Thank you very much.  I hope that these very insightful presentations will help us to understand the existing gap between policy and practice, and that we'll go back to our organizations and start to focus on the importance of measurement.

     With that, I would like very much to thank all our guest speakers, and to thank all of you for attending this session.  Thank you very much.

     (Applause)