Internet Universality Indicators: measuring ICT for development

31 May 2024 09:00h - 09:45h


Disclaimer: This is not an official record of the session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed.

Full session report

Experts convene to update UNESCO’s Internet Universality Indicators for contemporary challenges

During a consultation session on revising UNESCO’s Internet Universality Indicators (IUIs), experts including Ms. Anriette Esterhuysen, Mr. Cédric Wachholz, Mr. Fabio Senne, and Dr. David Souter, engaged with participants to discuss the need for updating the IUIs to address contemporary internet challenges. The IUIs are tools for assessing internet universality in countries, focusing on rights, openness, accessibility, multi-stakeholder participation, and cross-cutting issues like gender equality and environmental concerns.

Mr. Wachholz introduced the session by highlighting the holistic approach of the IUIs, which consider human rights, openness, accessibility, and multi-stakeholder participation. He noted that 55 countries have started assessments, with 40 having either finished or being in the process of finishing them. The panel clarified that the IUIs are not for ranking countries but for helping them improve their internet environment according to their own objectives.

Mr. Senne discussed the revision process, emphasizing the goal of updating the indicators while retaining the framework’s structure. A consultation and survey with participating countries were conducted to understand their experiences and identify new demands. A steering committee was formed to support the revision process, and new themes like environmental issues and emerging technologies have been introduced.

Dr. Souter spoke about the new draft, mentioning the reduction of indicators to make them more manageable and selective. He stressed the importance of qualitative indicators, which draw on expert analysis and stakeholder views and provide insights into impacts and changes that quantitative data alone cannot capture.

Participants provided feedback, with one from Uruguay suggesting adjustments to enhance the effectiveness and applicability of the IUIs, such as ensuring the research team’s composition and focusing on the most impactful indicators. Another participant inquired about aligning their dissertation on digital transformation in Africa with the IUIs, indicating academic interest in contributing to practical policy applications.

The panel acknowledged the complexity of analysis and recommendation generation, emphasizing that the IUIs should complement existing research and work with national statistical agencies. Ms. Esterhuysen invited participants to become reviewers of the draft, highlighting the collaborative nature of the revision process.

The session concluded with a commitment to continue consultations, with the updated indicators to be launched at the global Internet Governance Forum in December. The panel underscored the importance of diverse research teams and multi-stakeholder engagement throughout the process.

Key observations included the recognition of the internet environment’s evolving nature and the need for the IUIs to adapt. The emphasis on qualitative indicators and the challenges of data collection and analysis reflect an understanding of quantitative data limitations and the importance of nuanced evaluations. The session demonstrated a strong commitment to inclusivity and active participation from various sectors and regions in shaping internet governance and policy development.

Session transcript

Ms. Anriette Esterhuysen:
Welcome to the consultation on the revision of UNESCO’s Internet Universality Indicators. My name is Anriette Esterhuysen, I am based in South Africa, I am associated with the Association for Progressive Communications, but for the purpose of this consultation I’m supporting UNESCO in doing outreach and consultation on the revision of the indicators. So just to give you a brief overview, you’re going to find out more about the indicators. Just a quick show of hands, who in the room has some familiarity with the UNESCO Internet Universality Indicators? Good, not everyone, but enough. But we are going to tell you more about it. So we have a very distinguished panel. We have on my right Mr. Cédric Wachholz, whose name I’m not sure I pronounced correctly, who’s the Chief of Section for Digital Innovation and Transformation at UNESCO. And on my left is Fabio Senne, who’s the Survey Project Coordinator at Cetic.br in Brazil. And Fabio is leading this process of revising the indicators. And then on his left is Dr. David Souter, he’s the Managing Director of ICT Development Associates. And David is actually doing the writing and the drafting of the revision. And as you will hear a little bit when Cédric gives you the background, both David and myself were involved in the drafting of the original indicators, which was a process that took place several years ago, but Cédric will tell you more about that. So welcome everyone, and it’s good that we don’t have too many people because we do want to hear from you. So Cédric, can I hand over to you?

Mr. Cédric Wachholz:
Thank you so much, Anriette, and a warm welcome to you all. And thank you for getting up early to join us here today. Yes, so I saw that a number of you are already familiar, so I will keep it short, but still, for those who are new to the indicators, say a few words about them. Most of us here at ITU are aware of connectivity questions, 2.6 billion people not connected, but the internet is so much more, and that is what the Internet Universality Indicators look at. UNESCO, in a long process starting in 2012-13 and leading up to 2018, looked at the internet in a very holistic way: looked at the human rights dimension, looked at the openness, looked at accessibility to all, but also at the extent to which the internet is shaped in a multi-stakeholder fashion. So you see here the abbreviations for the rights, for the openness, the accessibility and multi-stakeholder participation, and the X stands for cross-cutting issues such as gender equality or the environment. We will get to a few of these in just a second. So we have these categories with a number of themes and questions, and then 303 indicators currently and 109 core indicators. I don’t know to what extent you can see the different themes here. Sometimes there are five to six themes, depending on the category, and honestly, you might think this is a tool for specialists, 303 indicators is really a lot, but if you dive into it, you will find questions, asked in a very systematic way, which are really important and which you might not have thought about before. And you will understand this more holistic view. So I encourage you to deep dive. I could go now into some of these, but we will rather take the time for deep diving into some new topics. So actually, 55 countries have started, and 40 have either finished or are in the process of finishing their assessments, the policy recommendations, and then parts of the implementation. And of course, we wonder also why some of them dropped out, and that is part of the revision: how to enhance the effectiveness and the usability of our indicators. A typical indicator process looks like this. The country initiates, there’s a group in the country which wants to do this indicator assessment, and we help them establish a multi-stakeholder advisory board. And then a research team needs to be put together, an action plan for research, the data gathering, data analysis, but of course also the report writing and the recommendations. And that is, of course, a very important part of this process, done in a multi-stakeholder fashion. And that’s not always easy, because there are different groups having different interests and different interpretations of the analysis and assessments. And then, of course, a national validation workshop and the impact assessment and monitoring afterwards. This is the process. Is it me still? Or are you going over now, Fabio?
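
As a minimal sketch only, assuming Python and not reflecting UNESCO's official schema, the ROAM-X categories, the indicator counts and the typical assessment workflow outlined above could be represented as plain data. The category names and counts come from the session; the stage labels are paraphrased from the speech.

```python
# Illustrative sketch only: a simple data model of the ROAM-X categories and the
# typical national assessment workflow described in the session. Labels are
# paraphrased; this is not UNESCO's official schema.

ROAMX_CATEGORIES = {
    "R": "Rights",
    "O": "Openness",
    "A": "Accessibility to all",
    "M": "Multi-stakeholder participation",
    "X": "Cross-cutting issues (e.g. gender equality, environment)",
}

# Size of the current (2018/2019) framework as stated in the session.
TOTAL_INDICATORS = 303
CORE_INDICATORS = 109

# Typical stages of a national IUI assessment, as outlined by the speaker.
ASSESSMENT_STAGES = [
    "Country initiates the assessment",
    "Establish a multi-stakeholder advisory board",
    "Assemble a research team and an action plan",
    "Gather and analyse data",
    "Write the report and draft recommendations",
    "Hold a national validation workshop",
    "Monitor implementation and assess impact",
]

if __name__ == "__main__":
    for code, name in ROAMX_CATEGORIES.items():
        print(f"{code}: {name}")
    print(f"{TOTAL_INDICATORS} indicators, of which {CORE_INDICATORS} are core")
    for i, stage in enumerate(ASSESSMENT_STAGES, 1):
        print(f"Stage {i}: {stage}")
```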

Ms. Anriette Esterhuysen:
OK. Just before you start, Fabio, I just want to ask you, Cedric, some people think that the purpose of the indicators is to rank countries. So can you just respond to that? Does it involve any ranking? Is the purpose to compare countries with one another?

Mr. Cédric Wachholz:
No, so thank you for asking that question. That’s an important detail. It is, of course, to advance the country along its own objectives. And it is initiated by the country. And there is no ranking at the end. That is an important thing, because we don’t want countries to not do the assessment because they fear a bad positioning. And sometimes that also distorts the policy recommendations and the actions, because it becomes just about getting the country up in a ranking. So this was a conscious choice, and thank you for that question. I think it’s an important part of the approach. It is really about enhancing the country’s Internet environment in a holistic way, as the country wishes.

Ms. Anriette Esterhuysen:
But along the categories which are set. And thanks, Cedric. And I think just to add to that, the indicators are designed for any country, with any level of Internet uptake, infrastructure or economic development. It’s not a tool that’s designed in the global north for the global south. It’s actually a tool that has been used by countries in the developed world. And the example I always like to use is Germany, who thought they would do extremely well on all aspects of the framework. And to their surprise, through applying the indicators in Germany, they realized that there were areas where they were not sufficiently universal in access to the Internet, for example access for migrants in refugee centers. So it is a tool that’s designed for any country at any level of Internet development to use. So, Fabio, tell us more now about why we are revising it and how we’re going about it.

Mr. Fabio Senne:
Thank you, Anriette, and thank you, Cedric, for this invitation. Well, first, just to explain why we at Cetic are working with this: Cetic.br is a research center based in Sao Paulo, Brazil, and it’s connected to the multi-stakeholder model of Internet governance in Brazil, NIC.br and CGI.br. And we were, by the way, one of the first countries that implemented the model, back in 2018. Actually, we piloted one of the test versions of it, and we supported the revision, the creation, the publication of the third phase in 2018. If you go to the actual publication from 2019, you can see that the idea of revising it five years after is already there, because the circumstances and the dynamics of the field are changing so much. We’re going to see from David that lots of things happened in these five years. The idea here is really to update and keep the indicators updated, but at the same time keeping the structure, the dimensions and the main components of the framework. So, Cetic started to support this process last year, in 2023. We first went into all these 40 countries’ reports in order to understand what countries did and what their experience was in producing the national assessments. Then, by the middle of 2023, we developed a consultation and a survey with participant countries. We did it in a qualitative way, interviewing people in research teams from other countries to understand what they did, and also sending an online survey, which received 27 responses from 23 countries. By doing this, I think it was important not just to understand better how the process of implementation went, but also to get a feel for what the new issues are, what new demands countries have when they are doing this type of implementation. The main conclusion of this consultation is, of course, that the framework was seen positively by the countries that decided to implement it. We had some complaints about the unavailability of specific data, especially disaggregated data, in some countries. This is an issue because the framework is also meant not just to assess the national state of a particular country, but also to identify the gaps: what are the data that we need but don’t have? So this is also something important. And of course we had COVID-19 in the middle of the process, so this is also something they mentioned as important. And finally, just before, if you can come back one slide, Cedric, just to say that after Cetic contributed with this report and consultation, UNESCO decided to create a steering committee, and this is the group of the steering committee which is supporting the process of revision and taking all the steps to have a really participatory process, in events and other meetings, to guarantee that this is going in the right direction. So this is where we are, and I think now David can tell us a little bit about how the revision is shaping the new indicators.

Mr. David Souter:
Okay, so let me start by going back to the purpose of this. This is not an index, like so many other things which are done within the UN system in this area; it’s a set of indicators which are intended to enable in-depth analysis by a group of people within the country. So it’s not about box-ticking, it’s about looking in depth at the evidence that is available, a collage of evidence from different sources, and using that evidence to assist the country’s own development towards an internet environment that is more beneficial to society as a whole, to users, to all stakeholders. And that requires it to move towards realistic recommendations. So the purpose is to have recommendations that can then be considered and implemented by the stakeholders within the country. Realistic recommendations, things that are actually achievable, so not utopianism but actual practical measures which can improve the quality of the internet environment. So that’s the purpose, and we’ve had this experience over, what, five or six years now of doing this in a number of countries, and from that we have learned a number of things, as a result of which we come to this revision. A key part of which is that the proposal for the future indicators is that there will be fewer indicators and that they will be more selective, in the sense that you do not need to measure everything; in fact it’s not practical to measure everything. One thing we have seen is that most users of these indicators have chosen to use only the core indicators. So what we’ve tried to do is restructure it in such a way that the majority of the indicators are the core indicators, and there are some supplementary ones that research teams might like to consider in their own countries because that will be useful to them. We are also trying to make it easier to manage the data collection process. So we have a clearer structure. I’ll say something about the indicators themselves in a moment, but they have a clearer structure. Each category now starts with an overarching question, which we’re asking the research team to address. So we’re looking here for analysis rather than "the answer to question one is X". We’re looking for an analysis that brings together all of those questions within particular themes or within particular categories. And Cedric, if you pass on to the next slide, it shows the overarching questions we’re asking in those four main ROAM categories. As you’ll see from those, they’re broad generic questions that allow the research teams to focus on the things that are particularly important in their countries. They’re not looking for binary answers here. They’re not looking for positive or negative, good practice or bad practice. They’re looking for an analytical response. Can you go back now, Cedric, to the previous one? So in each of these ROAM categories, and in each of the themes within the X category, there’s an overarching question which we’re asking the research team to address. And then what follows is the structure that is there within the current indicators, but with adaptations to make it simpler and more straightforward. So each category is divided into a number of themes. You saw the slide earlier in which those themes were illustrated. And within each theme, there will be one or two, usually two, core questions, which will have one or two, usually two, core indicators. And those are the things that we’re asking the research teams to look at in order to make their assessment.
So, as I say, there will be one or two, usually two, core questions with indicators, and then one or occasionally two subsidiary indicators within a theme. It’s also possible, of course, for research teams to think, "well, in my country this other thing is also especially important", and to add that, and we would encourage that. We haven’t yet developed the list of sources and means of verification that will go with this, but that’s the next stage of the process. One thing I want to say about indicators is that there are three types of indicator in this structure. There are institutional indicators: that’s basically the question of whether there is a law in place and whether that law is enforced, for example. So that’s about the governance structure that exists. There are quantitative indicators: what are the numbers that tell us something about the prevalence of a particular thing that we’re examining? And there are qualitative indicators, and I would emphasize the importance of these. The qualitative indicators are about looking at the expert analysis that is available and the views that are expressed within the stakeholder community about how well or how badly things are going, what the state of play is in a particular area. And I think the qualitative indicators are really important, because so often we end up measuring just quantities, and when we measure quantities we measure inputs. We’re not measuring impacts. So the qualitative indicators are about trying to get to the impact, to the changes that are resulting from things which may not be nearly so impressive as the numbers suggest. So I’m showing you the overarching questions; I’ll just helpfully move on two slides now and give two or three examples of the changes we’ve made. These tend to be in the qualitative area. So the first of these is where we’re looking at digital strategies.
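
A minimal sketch, assuming Python, of one possible way to model the revised structure described above: each category carries an overarching question and a set of themes, each theme has one or two core questions with core indicators plus occasional supplementary ones, and each indicator is institutional, quantitative or qualitative. All example category, theme, question and indicator texts below are hypothetical, written only to show how the pieces fit together; this is not the draft framework itself.

```python
# Illustrative sketch only, not UNESCO's specification of the revised indicators.
from dataclasses import dataclass, field
from enum import Enum


class IndicatorType(Enum):
    INSTITUTIONAL = "institutional"   # e.g. is a law in place and enforced?
    QUANTITATIVE = "quantitative"     # e.g. prevalence numbers
    QUALITATIVE = "qualitative"       # expert analysis and stakeholder views


@dataclass
class Indicator:
    text: str
    type: IndicatorType
    core: bool = True                 # core vs. supplementary


@dataclass
class Theme:
    name: str
    core_questions: list[str] = field(default_factory=list)   # usually one or two
    indicators: list[Indicator] = field(default_factory=list)


@dataclass
class Category:
    code: str                         # R, O, A, M or X
    overarching_question: str         # analytical question for the research team
    themes: list[Theme] = field(default_factory=list)


# Hypothetical example showing how the pieces fit together.
rights = Category(
    code="R",
    overarching_question="How well does the internet environment support human rights?",
    themes=[
        Theme(
            name="Economic, social and cultural rights",
            core_questions=["Are ESC rights reflected in national digital strategies?"],
            indicators=[
                Indicator("National digital strategy addresses ESC rights",
                          IndicatorType.INSTITUTIONAL),
                Indicator("Stakeholder assessment of how strategies work in practice",
                          IndicatorType.QUALITATIVE),
                Indicator("Participation in cultural activities online",
                          IndicatorType.QUANTITATIVE, core=False),
            ],
        )
    ],
)

core_count = sum(i.core for t in rights.themes for i in t.indicators)
print(f"Category {rights.code}: {core_count} core indicator(s) in {len(rights.themes)} theme(s)")
```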

Ms. Anriette Esterhuysen:
This is an evolution of an existing area within the current framework. David, perhaps let’s say now, under which category does this fall and which theme?

Mr. David Souter:
Yes, this falls under R for rights, does it not? The rights with which it is concerned here are the economic, social and cultural rights, which are very often ignored in assessments of rights in the Internet environment. So this is concerned with the digital strategies which can be enablers of those rights, and the relationship between digital strategies and national development strategies. So the two questions here are concerned firstly with the overall framework for policymaking and the inclusion of these economic, social and cultural rights within those national strategies. And then secondly, to take certain aspects of economic, social and cultural rights (employment, education and health) and look at whether those ESC rights are included in strategies concerned with sectoral digital development in those areas. So this is an analytical process that needs to be done here; it requires consideration and it requires discussion within the multi-stakeholder team that is doing the research process. The supplementary question that we are proposing here is a much more specific one, which is to do with the participation of people in cultural activities, obviously a key part of UNESCO’s mandate. The second one, and this is one of the newer areas, Cedric, is on the environmental issues. I think the environmental issues have become much more prominent in discussions of digital development over the last five or six years since we first designed the IUIs, and indeed over the last 15 years, and it seemed important to have this as a new standalone theme within the X group, that is, within the cross-cutting group. It’s much more important than in previous times, and I think the three critical elements of environmental analysis here, which you’ll find emphasised in UNCTAD’s forthcoming digital economy report, are to do with emissions and climate change, with e-waste and pollution, and with the overuse and over-exploitation of scarce resources. So what we’ve done here is propose questions that are to do, again, with the overall policy structure and framework. Are government or other stakeholders incorporating environmental impacts in their policy development, in their business model design, and so forth? And is there adequate understanding of the statistical situation that enables those policies and business models to be developed appropriately? Question two specifically chooses e-waste, on which there is international agreement around a particular statistical model, though that’s not currently enormously well deployed; we were discussing this at an earlier session this week with UNITAR, the agency that has been responsible for developing it. So, specifically concerned with e-waste, and then the supplementary question is concerned with carbon emissions. It’s a difficult area, clearly a crucial area to understand, but it is a difficult area for governments to achieve statistical evidence on, not least because businesses are not very forthcoming. And then the third one that I’ll pick out is the preparation for the emergence of new technologies. Again, this is a broad policy area, and it’s really concerned with whether governments in particular, but other stakeholders too, have a strategic approach to this, which you might say is consistent with the overall WSIS goal of shaping the information society and the internet towards the common good, as it were.
So is there a legal framework, is there a national strategy? Are there multi-stakeholder fora in which the evolution and the deployment of new technologies, artificial intelligence in particular, are discussed with reference to ethical frameworks and the common good, what I tend to describe as preserving what we value, promoting what we want, and preventing what we fear? So those, again, are the three areas. The last two are the areas that we’ve particularly added; digital strategies is one where there’s been significant adaptation. In many of the other areas, it’s simply a simplification of what has been there in the past. Was that within my time?

Ms. Anriette Esterhuysen:
Thanks, David. Yes, just before we open to the floor, I just want you to explain the X category. So you would see, we talk about ROAM: rights, openness, accessibility, and multi-stakeholder; but the indicators have the X, which stands for cross-cutting. So maybe just tell people a little bit about how we use the cross-cutting category, why it’s so important, and maybe mention an example, if you can think of one that’s linked to one of these new areas you mentioned, or just give a general overview of what we do with the cross-cutting area.

Mr. David Souter:
So when the IUI framework was first developed, it came from a concept which UNESCO had adopted of internet universality, which was based around the R-O-A-M categories, so rights, openness, accessibility for all is how it was defined, and multi-stakeholder engagement. And then it was thought that it would be useful to have indicators which were analogous to UNESCO’s existing media development indicators, which have been around for some time and which were intended to bring about this kind of in-depth analysis of the media environment in a country. So there was a process of developing this, which Anriette and I were both involved in, and in the course of doing that, it clearly made sense to place these four elements, which are particularly relevant to UNESCO’s mandate, within a broader framework of the internet environment as a whole, because you couldn’t really understand the rights issues unless you also understood them through a gender lens, or you looked specifically at the interests of children, in terms of protection and in terms of promotion of their rights and access. And clearly, things needed to be looked at through the overall UN goals of sustainable development. And then the other areas: trust and security, cybersecurity, was obviously a large area which covered all of those four main themes, and the legal and ethical aspects were increasingly important to all discussion. So the X category was put there as a way of ensuring that there was an overarching lens on the overall Internet environment that was applicable in each of the four ROAM categories.

Ms. Anriette Esterhuysen:
Thanks, David. So, let’s hear from you. Any questions or any reflections, if anyone has had any experience of working with the indicators? Please go ahead, just introduce yourself and make your contribution.

Audience 1:
Good morning to all. I am Duarte from Abrinches, the Brazilian Association of Internet Service Providers. First of all, I’d just like to highlight that we have a very good relationship with Cetic, and it’s really a pleasure to see your work and to highlight, I think, the importance of Cetic to the Brazilian environment and all the good data they produce. Looking at your presentation, and thinking also about the latest report from Cetic about meaningful connectivity, on the A line, let’s say: how are we moving forward with data and indicators specifically on meaningful connectivity? Not only looking at access, but also quality, speed and all that, because I believe that involves a lot of challenges, as seen in the latest report from Cetic; even when you talk about digital skills, how can we define that? Are we also working on refining concepts around meaningful connectivity?

Ms. Anriette Esterhuysen:
Thanks very much for that. Abraham, you just go ahead.

Audience 2:
Hello, thank you very much. My name is Abraham Sobi. I am from Ghana, but currently I am a postgraduate student at University College London in the UK. My studies are focused on public administration and management, but I’m focusing on digital transformation and public policy. The dissertation I’m currently working on is on digital transformation in Africa. I want to know how my dissertation and research can contribute to this work, specifically on the indicators, because I know academic research can also contribute, and whether you have working groups that people can join to learn and contribute. And have these indicators had an impact in Africa? I would like to learn more about the African scope. Thank you very much.

Ms. Anriette Esterhuysen:
Thanks very much, Abraham. And yes, we’ll take one last question, and then we’ll go back to the panel.

Audience 3:
Right. Thank you very much. I’m Mercedes Aramendia. I’m from Uruguay, from the regulator, actually. Of course, once we received the draft, I asked the specialists on this topic. One thing that I really appreciate is that in Uruguay nowadays we work with more than 40 actors, so that all of them are involved in implementation in Uruguay. We have civil society, academia, and different organizations from the public sector and also the private sector. And we have some questions related to the proposal that you sent to us. I don’t know if this is the moment to ask them or not. Will it take a long time, the questions? No, it’s easy, go ahead, we still have time. Great. In relation to the proposal, we suggest the following adjustments to enhance the effectiveness and applicability of the indicators. Firstly, related to the methodology: for us, it’s crucial to ensure the composition of the two expert groups and, mainly, the leadership of the process. In most countries, the same actors are often repeated, which can compromise the independence of the experts. Secondly, related to access and, firstly, rights: we were not sure whether the participation topic is being removed or not, because we read the proposal and for us it’s not clear whether participation still remains or not. And of course, we believe that it is very, very important. In relation to openness, we agree on differentiating open data from open government to provide clearer insights. In relation to cross-cutting, we support the change and think it is beneficial to place additional emphasis on environmental issues and their impact, as well as including emerging technologies. Finally, in relation to indicators, we believe that currently there are too many indicators, which can be overwhelming. We propose reducing the number of indicators to focus on the most impactful ones. Actually, in Uruguay, for example, we could only advance on the main indicators. Avoid including theoretical indicators whose data sources do not exist in most countries, such as the number of court sentences or compliance on various topics. It might be beneficial to use closed and codified response options to enhance the precision and comparability of the data. And finally, we think it could be positive to better define the request for compliance evidence. Currently, evidence can be very diverse in hierarchy, which may include non-existent sources. Thank you.

Ms. Anriette Esterhuysen:
Thanks very much. Now, that’s exactly why we invite countries that have used the indicators to be part of the process. Maybe I’ll ask David and Fabio if you can start responding, because it was complex. And you’ll send that to us in writing as well; the consultation process is not finished, it’s actually just starting. And then I’ll respond to Abraham’s question, and Fabio, if you can also respond to the question about Brazil. David, do you have some immediate reactions to the extent to which we are addressing those issues?

Mr. David Souter:
Yeah, so can we be frank about the initial process here? One of the problems we had in the initial process (this is the 2017 process) was that we started with a much smaller number of indicators, and there was pressure put on us from consultation processes to add subjects to the framework. And this is why we ended up with the core and the supplementary, because that was a way of dealing with that particular problem. But you’re right, there are too many indicators. Now, there will be about a third of the total number, I think, when we come to this particular process, and they’re much more carefully structured. And one of the things I think is important in this consultation process we’ll now have is that if somebody wishes to propose an addition, they should also propose something to remove in its place, so that we don’t have that same sort of problem. In terms of the evidence base: the evidence base varies from country to country. This is meant to be usable in all countries, but that doesn’t mean it should be the lowest common denominator of an evidence base. So if there isn’t a substantial evidence base in a particular country, that is part of the findings and should lead to recommendations concerning the need for a more developed evidence base. One of the reasons I talked about the qualitative indicators earlier is that I think it is really problematic if we rely solely on quantitative indicators here, because those quantitative indicators tend to be indicators from the supply side: how many people have subscriptions, for example. And that’s not really telling us about what the impact of the internet is. So the qualitative analysis is really important in getting to understand what’s actually happening as a result of the internet. So I think it is important for those qualitative indicators to be there. And e-waste is a good example of an area where some countries do have quantitative evidence and others don’t. But everywhere should, and the UN promotes a model for analysis that can and should be adopted everywhere, because this is a really crucial issue. It’s not at the top of people’s agendas, but it actually is really important. So I think it is good to have those quantitative things there, even if a lot of countries aren’t currently able to do them.

Mr. Fabio Senne:
Thank you. I would just like to add something on this. We had a very sound discussion on meaningful connectivity. And it’s interesting to note that if you look into the framework, from the beginning the idea is that we are talking about accessibility to all, not just access. So if you go to the original categories, you find things like usage, affordability, equitable access, local content. So the framework, from the beginning, really goes towards this idea of meaningful connectivity, although the concept was not used in that way. Now we are updating it to make sure that, in the indicators, countries can report policies, for instance, to achieve meaningful connectivity. So one of the new indicators is whether the country has any statistical information concerning meaningful connectivity. We are advancing in proposals, and along with ITU we are also developing some suggestions for measurement; a study in Brazil does this. And also whether the government has a policy clearly related to meaningful connectivity or meaningful access, in a way that the connections are reliable, affordable, and effectively implemented. So I would say that although we didn’t use the concept at the time, it’s already present in the way that we cover all these dimensions of what we nowadays call meaningful connectivity. Thanks.

Ms. Anriette Esterhuysen:
Thanks. Cedric, do you want to add? But I just wanted to respond a little bit. I think what’s so interesting about your contribution, the Uruguay experience, is that it points to the fact that we should update not just the indicators, but also the guidelines. Because I think your point about the composition of the research team, and how that can influence the results, is very important. And I think that was part of the original conceptualization of the indicators: that it’s a multi-stakeholder process, but also people from different institutions, different sources, who have to, in a sense, engage with one another, sometimes disagree with one another, to give that holistic picture that Cedric mentioned. And I also want to emphasize that we do not want the research done for the indicator assessment processes to be new research that runs in parallel to existing research. We want it to work in a complementary way with national statistical agencies, because ultimately every government needs to have the capacity to gather and analyze data. So we actually want to work collaboratively, and also with specialized, in-depth research initiatives, like the Brazilian study on meaningful connectivity. So it’s really not intended to generate a kind of parallel new initiative to gather data. It’s really intended to work with existing data sources and then, as David says, to identify where the data doesn’t exist and feed that into recommendations. Abraham, on your question, actually, I think the last slide will probably help us come to that. We now have a new draft: a reduced draft with fewer indicators and a simpler structure, like David mentioned, and we’re now going into the phase of consulting. So you, for example, because it sounds to me as if your research is covering Africa as a whole and you’re not focusing just on Ghana, could be one of our expert contributors. We would send the draft to you and you could have a look at it and assess whether you think the indicators, the questions, the themes, the way we’ve structured it, are not overlooking something particularly important. So, in fact, I invite everyone in the room who would like to be a reviewer of the draft: just come and leave your email address with me afterwards and we’ll follow up with you. And Fabio, can you say a little bit more about the next steps, because I’m losing my voice. Oh, Cedric, you go.

Mr. Cédric Wachholz:
Just, thank you, and we really appreciate all the inputs received; it also reflects the interest. In terms of our work in Africa, we currently have 15 countries, and Ghana is one of them, which have undertaken or are currently undertaking assessments. And some are also in their second round; Kenya is on its second round. So perhaps even, I mean, Ghana hasn’t indicated an interest yet, but if it did, then that would of course be an opportunity to really contribute directly and update. And that’s part of the interest: on one hand, to see how the policy recommendations were implemented and what has progressed over time. So there is, of course, a huge need. In Africa, only 34% of women have access to the internet; in some countries, it’s only 21%. There’s a huge opportunity to improve the internet environment in a holistic way. So yes, in addition to what you said.

Ms. Anriette Esterhuysen:
Thanks very much, Cedric. In fact, Africa is the region, I think, where we’ve had the largest number of studies completed in one region. I won’t dwell on this, you can see this: we’re in the consultation phase now on the new draft, so let’s comment on that, let’s get your input. And our goal is to launch the updated, revised framework and guidelines for implementation at the Internet Governance Forum in December. We have a few minutes left. Are there any other questions, or panelists, do you want to make any other contributions? Go ahead, please.

Audience 4:
Thank you very much for the brilliant presentations. I’m Graziela from Cetic.br. It’s just a brief question. I understand that we have this roadmap to collect indicators, and that there is this huge effort to provide all the possible indicators, and even knowing that when you don’t have an indicator, that is itself an indicator, no? It means you have to improve that area. But my question is: how can we provide some tools to help countries develop the analysis? Because, you know, when you have 109 or 40 indicators, how could we help, how could we be more suggestive in terms of analytical tools and in terms of how countries could absorb this kind of information in a way that they can prioritize actions and policies? Thank you.

Ms. Anriette Esterhuysen:
And develop recommendations, actually. And I think that you’re absolutely right. It’s very, very difficult to do the analysis in order to do the report. But I think what we found even more complex is how to then come up with realistic, implementable recommendations. Fabio, do you want to say a bit more about that? David, do you want to say something?

Mr. David Souter:
Well, actually, one thing I think is really important, and this is from my experience working on the media development indicators, is that your research team needs to be diverse. And clearly, individual researchers in that team will look at the particular areas in which they have most expertise. But the really important part of the process is the discussion amongst the research team about the findings across the board. So that although one researcher may have looked at a particular area, what comes out in the report should be the outcome of the discussion amongst the whole team, which has different perspectives, different viewpoints, different ideas, different experiences. So it’s a sort of methodological point. It’s about how to work, as opposed to how to research.

Mr. Fabio Senne:
Yeah, I think one of the richest things about the model is the multi-stakeholderism in it. So the multi-stakeholder part is not just a set of indicators on multi-stakeholder participation, but also the process itself, which needs to be multi-stakeholder. So this is something. And when we did the Brazilian report, for instance, for some indicators that are more qualitative, on the perceptions of the sectors, we decided to show the differences between sectors in the report. So there is no need to have one single answer to the problem; you can reflect the actual debate that you have in the country. So this is, I think, one of the ways to go beyond the list of indicators and really come out with good recommendations.

Ms. Anriette Esterhuysen:
Cedric, why don’t you add to that? And then you can also close for us, because our time’s up. Such a good time manager you are.

Mr. Cédric Wachholz:
Thank you. And just to add that I was impressed also by how generous the different teams who did IUI assessments are in terms of sharing their experiences. So there is a dynamic coalition at the IGF where many come together and where countries exchange, and where some questions that researchers had were also addressed. And we had also some online meetings where researchers exchanged experience. So that’s an important point. The second one is, of course, the one of implementation and how we can assist, because it’s not just about producing a nice-looking book, it is also about how to improve. And that’s another area where UNESCO is helping too, in areas of our competence, of course, and then we try to connect to other agencies who can do better work in fields which we can’t cover but have assessed. So it’s the research, but also the implementation, where we try to facilitate. Good. Now I’m invited also to conclude, because we are on time. Just thank you very much. The room has filled up, even though we started early, and so thanks to all of you who participated actively. 45 minutes is, of course, a little bit short to cover everything, but I will say two words about how the process will go on. Thank you, of course, also to Anriette, David, and Fabio, not only for today, but for more than a year of work. And it was intensive, and continues to be intensive, work in consultations, many, many hours of interviews and so on, leading up to not just a routine update but really also, in response to Uruguay, streamlining the process, increasing the relevance and feasibility, and in the end also the impact of what we are doing, where we will continue to be inclusive. We have the new themes; we have looked at some of the environmental and some of the new technology areas, including meaningful connectivity, better spelled out. There will also be something about online platforms and the governance of these. So the update will address more contemporary challenges and remain relevant and effective. Looking ahead, we will have more and longer sessions in different areas, at EuroDIG, for example, and we will also publish online and invite people to have a comprehensive view and comment on the entire set of indicators, for feedback and improvement. 45 minutes, again, is too short to go through everything. But we will be at all the regional IGFs, and there will be more upcoming consultations. The updated indicators will then be launched in December at the global IGF. But we hope and trust we will see you before then. And don’t hesitate to come to us if you have any specific inputs or other questions after the session. So thank you again to all of you for joining. Do you want to add anything to close?

Ms. Anriette Esterhuysen:
No, just thanks very much, everyone. And just come to the front and I’ll take your details if you want to be a reviewer of the draft. Thanks very much, everyone. Thanks also to those who joined online. And thank you to all; I think that’s the end of the session.

Speakers’ statistics

A1 (Audience 1): 142 words per minute; 167 words; 70 secs
A2 (Audience 2): 155 words per minute; 137 words; 53 secs
A3 (Audience 3): 156 words per minute; 444 words; 171 secs
A4 (Audience 4): 154 words per minute; 148 words; 58 secs
MC (Mr. Cédric Wachholz): 162 words per minute; 1472 words; 546 secs
MD (Mr. David Souter): 156 words per minute; 2639 words; 1016 secs
MF (Mr. Fabio Senne): 145 words per minute; 1006 words; 417 secs
MA (Ms. Anriette Esterhuysen): 168 words per minute; 1534 words; 549 secs