The potential of technical standards to either strengthen or undermine human rights and fundamental freedoms in case of artificial intelligence systems and other emerging technologies

29 May 2024 10:00h - 10:45h

Disclaimer: This is not an official record of the session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed.

Full session report

Exploring the intersection of technical standards and human rights at the WSIS forum

At the World Summit on the Information Society (WSIS) high-level forum, a panel discussion was held to explore the complex relationship between technical standards, human rights, and the involvement of diverse stakeholders in the standard-setting process, especially in the context of emerging technologies such as artificial intelligence (AI). The session was moderated by Niki Masghati from the US Department of State and was initiated by the Kingdom of the Netherlands.

The panel recognised the crucial role played by open, transparent, and consensus-based standard development organisations (SDOs) in the success of the global, interoperable internet. These technical standards not only facilitate international communication and innovation but also have significant implications for human rights, as detailed in a report by the Office of the High Commissioner for Human Rights (OHCHR).

Gurshabad Grover, from Article 19, addressed the impact of ICT standards on human rights from a civil society perspective. He identified three key areas: universal and meaningful connectivity, censorship and surveillance on the internet, and data-intensive technologies including AI and machine learning. Grover stressed the importance of standards in expanding internet access, enhancing privacy through encryption, and managing AI systems’ lifecycles. He also highlighted the need for inclusive representation in standard-setting discussions to ensure the voices of underrepresented groups are heard.

Isabel Ebert, representing the UN Office of the High Commissioner for Human Rights’ B-Tech Project, discussed the roles of the private sector and governments in ensuring rights-respecting emerging technologies. She emphasised the private sector’s responsibility to respect human rights during standard-setting and implementation, and the state’s duty to protect individuals from private sector human rights abuses. Ebert noted the challenges of integrating human rights into technical communities and called for increased expertise and capacity within SDOs to support this integration.

Mallory Knodel, from the Center for Democracy and Technology, shared her extensive experience in advocating for human rights within SDOs. She suggested several strategies to lower barriers to meaningful participation, including public documentation of discussions, welcoming systems for new participants, clear codes of conduct, mentorship programmes, and financial support. These measures would enable broader engagement from civil society, small businesses, and the Global South.

The panel responded to audience questions from representatives of Cloudflare, the Czech Republic’s digital diplomacy, the Atlantic Council, and the European Commission. Topics included the role of startups in shifting paradigms, the importance of clear communication between technical and human rights communities, and the coordination of efforts across different SDOs to leverage past lessons and successes.

In closing, the session underscored the complementary nature of a free internet and human rights, advocating for the continued integration of human rights considerations into standard-setting processes. The panellists highlighted progress in organisations like the IRTF, IEEE, and W3C, and the need for ongoing international alignment to promote human rights in the digital era. The discussion concluded with a call to utilise high-level agreements and documents, such as the Global Digital Compact and the NETmundial outcome, to further the conversation on human rights in standard-setting.

Session transcript

Niki Masghati:
Hi, everybody. We’re going to go ahead and get started. So good morning. I hope everyone has had a good WSIS High-Level Forum so far and is excited to do the panel today and get into AI for Good during the rest of the week. I’ll be your moderator today. My name is Niki Masghati. I’m joining you here from the US Department of State. I would like to thank the Kingdom of the Netherlands for bringing us together here today for this important and timely discussion. As we all know, open Internet protocols and a broad range of digital technology standards, largely developed and maintained in open, transparent, and consensus-based multi-stakeholder standard development organizations, or SDOs, have been the key to the success of the free and open, global, interoperable, secure, and reliable Internet. Technical standards enable communications networks to operate worldwide, allowing billions of devices to seamlessly interact across borders. Standards also help to manage risk, security, safety, privacy, and quality in the development of new innovations. There are ways in which some standards, and by extension the technologies that incorporate such standards, can impact the exercise or enjoyment of human rights. OHCHR examined this complex issue in their report last year on human rights and technical standard-setting processes for new and emerging technologies, which emphasized industry’s responsibility to respect human rights in standard-setting processes consistent with the UN Guiding Principles on Business and Human Rights, as well as states’ obligations to respect, protect, and fulfill their human rights commitments, which apply to their involvement in standard-setting processes. Today we will consider what these responsibilities and obligations mean in practice and across the various standard development organizations. As we know, among the principles for international standards development processes are that they are open, transparent, and consensus-based. These are intended to be inclusive of experts across the multi-stakeholder community, so industry, academia, civil society, and government, to reach the solution with the most merit. The most accurate assessment of the potential benefits and risks results from bringing together these diverse stakeholders and developing a comprehensive understanding. For example, the inclusion of geographically diverse civil society stakeholders in these processes, including women in all their diversity, people with disabilities, and LGBTQI individuals, is particularly important, given that those who face multiple and intersecting forms of discrimination in societies may be heavily impacted by standards set for how new and emerging technologies are developed, and then also how they are deployed. So how do we facilitate greater multi-stakeholder exchange about the potential risks across technological development and deployment, including how to help ensure that these standard-setting processes are inclusive of relevant stakeholders? So today, our esteemed panelists will dig deeper into the intersection between standards and human rights, and consider strategies for how we can build a better shared understanding of these linkages across diverse stakeholders and a complex standard development ecosystem. So first, Gurshabad, are you online? Yes, hello. Perfect. Gurshabad Grover is joining us virtually today; he is a technologist and legal researcher based in New Delhi, currently serving as interim head of the Global Digital Program at Article 19 and co-chair of the Public Interest Technology Group.
Gurshabad, thank you so much for joining us here today virtually. From your purview in civil society, in which areas do you believe technical standards can generate human rights impacts? In your response, it would be helpful to hear your recommendations for how we can build a better shared understanding of the nature and extent of human rights and technical standards linkages across diverse stakeholders and diverse organizations in the standard development ecosystem. Gurshabad, over to you.

Gurshabad Grover:
Thank you. It’s a pleasure to be here and perhaps I’ll start by speaking from Article 19’s perspective, which is a human rights organization focusing on freedom of expression, the right to access information and media freedom more broadly. So the underlying framework of our engagement with technical standards and standards bodies has always been international human rights, and we aim to bring these considerations to information and communication technology at large and technical standards in particular. On the question of what areas of ICT standards create or influence the exercise of human rights, we primarily work on three, but I’m happy to mention a couple of others as well. I think the first is universal and meaningful connectivity, and where technical standards play a role here is that they can dictate avenues for expanding internet access to regions that are currently not served or underserved. My colleague, Dr. Raquel Reino, for instance, along with many civil society allies, has been highlighting how standards at the ITU and 3GPP matter here; now we’re looking at the decisions on whether the 6 GHz band, for instance, will be licensed or unlicensed, et cetera. And these decisions can influence whether small internet service providers or community networks can gain a foothold in what is emerging to be a really concentrated telecom market. But at the same time, you’re seeing a plateauing of internet access or otherwise increasing rates for data. The second focus area we have, and of particular interest to me, is censorship and surveillance on the internet. If we dial back to just a few years ago, the internet in terms of security and privacy looked quite different, and in my opinion, objectively worse. Internet service providers and any intermediary network device or service could easily tell what website a user was trying to access. And now work, in particular at the Internet Engineering Task Force, has tried to encrypt portions of the domain name system and strengthen parts of the HTTPS protocol, the Secure Hypertext Transfer Protocol, to prevent that information from being leaked to the network. And similarly, there have been advances in preventing mass surveillance, especially after the Snowden revelations. We had evidence of states using these vulnerabilities to engage not just in targeted surveillance and censorship, but also mass surveillance and censorship. And we see recent work at the IETF sort of closing the gap and patching these vulnerabilities so that the potential for censorship really all across the world is reduced. The third I’ll say is data-intensive technologies, especially artificial intelligence and machine learning. At Article 19, we’ve been focusing on biometric surveillance, and here standards can dictate the lifecycle of an AI system, including how such systems are trained and deployed in practice. These standards are at many bodies, but IEEE and ISO/IEC are also working on them. And Niki, thank you for mentioning representation and diversity. I think even in the two examples I gave, at the IETF and ITU, it is quite rare to see representation from people who are underserved in terms of access or are otherwise prevented from the full enjoyment of the internet because of network filtering and censorship. And both these topics, I think, are sometimes not even mentioned in the rooms where standards are being discussed. So that definitely is an important factor that belongs in the human rights framework as well.
And I’ll just give an example of how these linkages have been articulated. I know Mallory and others in the room have been involved with the Human Rights Protocol Considerations Research Group at the Internet Research Task Force, where the effort was to concretely establish these linkages, to link particular human rights to particular properties or elements of network protocols and architectures. So that is one way to articulate these linkages. And I think civil society, of course, can be an important counterforce to powerful state or corporate actors if their decisions are undermining the exercise of human rights. And these articulations provide a framework for that engagement as well.
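As a concrete illustration of the encrypted-DNS work referenced above, here is a minimal sketch, assuming Python with the requests library, of a DNS-over-HTTPS lookup against Cloudflare’s publicly documented JSON resolver endpoint. The standardized wire format (RFC 8484) uses binary DNS messages; the JSON interface is used here only to keep the example short. The point is simply that the query travels inside an ordinary HTTPS connection to the resolver, so on-path networks no longer see the looked-up domain name in cleartext.

```python
# Minimal, illustrative DNS-over-HTTPS (DoH) lookup using Cloudflare's public
# JSON API. The DNS question and answer are carried inside an HTTPS request,
# so intermediaries observe only an encrypted connection to the resolver,
# not the domain name being resolved.
import requests


def doh_lookup(name: str, record_type: str = "A") -> list[str]:
    resp = requests.get(
        "https://cloudflare-dns.com/dns-query",
        params={"name": name, "type": record_type},
        headers={"accept": "application/dns-json"},
        timeout=10,
    )
    resp.raise_for_status()
    # The JSON response mirrors a DNS answer section; return the record data.
    return [answer["data"] for answer in resp.json().get("Answer", [])]


if __name__ == "__main__":
    print(doh_lookup("example.com"))
```

By contrast, a traditional lookup over UDP port 53 exposes the queried name to any network on the path, which is exactly the metadata leak the IETF work described here is designed to close.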

Niki Masghati:
Gurshabad, thank you so much. You know, hearing you highlight the three sort of areas that you’re looking at is really relevant both to the discussions we’re having at WSIS and to the upcoming AI for Good conversations. I imagine there will be quite a bit of conversation on your third prong, as folks are looking more and more into the, you know, types of bias and discrimination that happen while using AI technologies and how to mitigate against that, whether it’s intentional or unintentional. So thank you for that. Next, we’ll hear from Isabel Ebert. She works in the UN Office of the High Commissioner for Human Rights’ B-Tech Project. Isabel, having engaged with the private sector and governments, from your vantage point on the B-Tech Project, what do you see as the role of the private sector in ensuring emerging technologies are rights respecting? And similarly, what is the role of governments in developing and implementing technical standards around emerging technologies in a stakeholder-led standards ecosystem?

Isabel Ebert:
Thanks very much for the invitation. Many thanks also to the organizers who brought this panel together. I think it’s a very important one, in particular in this setup and in our interdisciplinary community. It’s a great pleasure to be here. Just to speak to the necessity to really be aware that the role of the state and also the role of the private sector in this area are very closely interlinked, in particular when it comes to ensuring that technologies are used in a rights-respecting manner. And that includes evidently the focus of today’s session in terms of standard setting, but it also includes the wider policy engagement happening at this important forum today and in the coming days. As many of you in this room will know, international human rights law places a duty upon states to take the necessary action to protect people from human rights abuses. And this includes abuses by the private sector, and that, again, also includes abuses by tech companies. In that manner, the state can employ both binding and non-binding instruments to fulfill this duty. And it can also support intergovernmental as well as business initiatives in terms of elevating important voices around human rights. The UN Guiding Principles bring to the table that corporates also have a responsibility to respect human rights, and they conceptualize what this role of the private sector embodies. In that regard, what they really emphasize is that the corporate responsibility to respect is a process-oriented and risk-based approach, meaning that the UNGPs speak to the governance of human rights within tech companies, but also to the actual management systems that companies have to have in place in order to identify, prevent, and mitigate adverse impacts on human rights. And these human rights responsibilities, in the more general sense, carry, of course, into, A, engagement around standard setting, and, B, the moment when companies are implementing technical standards, so the two sides of the coin, really. And when companies are implementing technical standards, it’s important that businesses do that in a rights-respecting manner. What we found during the research and the work in writing the report that Niki already mentioned on technical standard setting and human rights is that the standard-setting work is really a prime example of how we need to work across thematic silos, and why conferences like today’s are really important to make human rights understood as something that can offer concrete value to the standard-setting work and also bring life to human rights in a very concrete manner. The report that we issued last year discussed the relevance of technical standards, including AI technical standards, for the enjoyment of human rights. The consultations around it and the report itself really demonstrated that it is, in fact, possible to break those silos, to work across those thematic silos, but it also showed that this is not a very easy task, and it was, at times, difficult to bring human rights aspects into technical communities and really operationalize them. Many of those obstacles to the integration of human rights are intimately linked to the limited scope of input as well as the limited scope of actors that feed into technical standard-setting processes, and here it’s really also the private sector that has an important role to play in terms of increasing the quality of inputs regarding respect for human rights in standard-setting processes.
A condition for doing so is to carry out meaningful engagement with potentially affected people and stakeholders as part of human rights due diligence. There are some companies in the room today that engage very actively in standard-setting processes. Cloudflare is one of them. They have been really quite vocal about privacy-enhancing aspects in standard-setting processes, and so it’s one of those examples where we see that business actually can commit to human rights in its standard-setting engagement. In terms of what the standard-setting organizations do, you can kind of, to a certain extent, also look at them as an enterprise: they often have good practices in one area and then some not so good ones in others. Generally speaking, we need to increase the expertise and capacity of standard-setting organizations for human rights integration. There has really been some progress in the last months. Often, of course, standard setting is evidently a very technical endeavor, which also influences which type of people are in the room, and as we’re moving to much more advanced types of technologies that have a huge impact on people, it’s also important that some people are in the room who actually know how to, so to speak, carry out quantitative and social science work and bridge socio-technical discussions, from both the data science point of view as well as some of the human rights considerations. Most of the standard-setting organizations at the moment don’t have clear human rights commitments. To some degree, there has been active resistance to embodying human rights in some of the processes, but, as I said, in recent months there has also been a lot of progress. I’m naming a couple of examples. The IRTF has hosted the Human Rights Protocol Considerations Research Group and continues to host it. IEEE has adopted Ethically Aligned Design principles for autonomous and intelligent systems that have really elevated respect for human rights as a core principle. And the Ethical Web Principles from W3C’s Technical Architecture Group state that, I quote, we need to put internationally recognized human rights at the core of web platforms. End of quotation. So we really see there’s a lot of movement. So that’s in terms of what the private sector can do, also bearing in mind that the primary duty bearer is the state. Do we have a little bit more time to talk about the states? Great. So there’s also some movement in terms of the ITU member states. 45 ITU member states have recently called for a human rights-based approach for technical standard-setting processes, including at the ITU. The ITU also recently hired a human rights expert who is responsible for standard-setting processes. And as you know, states can act both as participants in standard-setting processes and as regulators. And with some of the AI regulation being adopted in recent months, this is, of course, a very important function. And if states regulate, they need to ensure that they’re doing so in a transparent and accountable manner, and also assess the impact of regulation on human rights, so the impact of the regulation itself. That also means that when states delegate powers to standard-setting organizations, they need to take steps to ensure that both processes and outcomes are based on human rights, ensure transparency, accountability, and participation, and also address challenges around potential access to remedy. There are some examples that might have been discussed already in some of the panels that you attended.
We know that with the adoption of the EU AI Act, a standard-setting process has been initiated. Human rights are currently not part of it, but I’ve heard that there’s some interest from some of the working group members in conceptualizing human rights as part of that process. But the report also spoke to the current participation of stakeholders in those standard-setting processes. There is a certain gender gap: 27% of ITU study group participants are women. Hopefully that will improve soon. And then there’s also an imbalance in terms of the representation of the majority world. The Global South is dramatically underrepresented. Civil society is often struggling to have the necessary resources and staffing capacity in order to attend all the working groups. So yeah, it’s a relatively mixed picture that’s emerging, but also a lot of hope in terms of some trends towards rights-respecting conduct by many of the important actors. And last but not least, really, awareness raising has increased. This is a really good opportunity to also use the voices that we have here in the room today to continue raising awareness. Thanks.

Niki Masghati:
Thank you so much. That was very comprehensive. And if folks have not actually read the OHCHR report from last year, we highly encourage it. It’s actually linked on the page for this panel. I agree. I was here actually in Geneva last year when you ran that session, eight hours in the Palais. And I feel like there’s actually been quite a bit of progress since then. But as you said, more to be seen. Finally, we’ll hear from Mallory before we turn to questions from all of you. Mallory Knodel is the Chief Technology Officer at the Center for Democracy and Technology. She is the co-chair of the Human Rights Protocol Considerations Research Group of the Internet Research Task Force, which we’ve actually heard mentioned multiple times, and an advisor to the Freedom Online Coalition. Mallory, you’ve spent years elevating human rights issues within standard-setting bodies. What are ways these organizations could lower barriers to meaningful participation, which, as Isabel highlighted, remains a challenge, and incorporate more civil society and human rights expertise in support of a strong multi-stakeholder approach to internet governance? We welcome hearing in your response any challenges that you’ve personally experienced in ensuring broad multi-stakeholder engagement in technical standards development processes, and how you and other experts have overcome such challenges.

Mallory Knodel:
Thank you so much for inviting me, and I really appreciate you holding this workshop today. I just want to say, in the last year, after that meeting at the Palais, I’m glad to hear folks feel like this space is moving fast. But since I have such a much wider lens on this, I just want to say that that meeting at the Palais was a real moment. And this is, as well, massive progress. It’s so good to know that we actually have folks paying attention to this now, that the human rights community itself is really invested in this work. It’s just a massive milestone for me, so just to acknowledge that. Because it has been years. It’s been a long time since we realized that these spaces where standards are being set, where an industry is coming together to design, deploy, and then talk about the implementations of technology, are really consequential for human rights. So just massive impact for all people who use the internet. So that impact is really wide. Then I think the ways that we’ve been enabled or we’ve been able to engage, we started, of course, with the more open fora. It’s just been easier to engage in internet governance spaces because of this commitment to multistakeholderism and this view that it’s an open space and it’s structured accordingly. But not all SDOs operate in that way. And I also am going to just speak from the perspective of global fora because, of course, there are national bodies, there are regional bodies, and those can be actually very impactful and they have a role to play in the landscape. But in terms of where we’ve prioritized our time historically, where civil society needs to with its limited ability to engage, that’s where we primarily have focused. Simple things. Meaningful openness to participation will include things like public documentation, not just of the outcomes, but of the meetings themselves, of ongoing discussions, of how decisions are made. Drafts as well as final documents are really important. When this work is being done in a transparent manner, it means that oversight is possible, that there are multiple ways to engage. One can be deeply, deeply engaged. One can even be an author of such documents. Others can be running analysis over vast quantities of mailing list discussions to understand trends and so on. So there are many different modes to engage when that kind of transparency is possible. And it also allows for moments of public feedback. There may be times where someone like me calls on the real experts in my community to parachute in, right, for the right reasons and at the right time, to lend their voice to a discussion to help tip it over. And that’s not possible when the discussion is closed, or it’s more difficult. Other things, I’ll just list a few. Systems, actual systems in place, and effort being made to welcome and introduce new participants is really important, because it can be very daunting to enter these communities that are very closely knit. They’ve been working together for a long time in that context. Clear and meaningful codes of conduct are now something we can take for granted but have not always been in place. You have to really recruit and go out of your way to find participants from underrepresented areas. And that includes whole sectors. Civil society as a sector is a sort of minority. Global South governments, small businesses, these are all underrepresented. And open space is not enough. You actually have to then figure out how to boost these participations.
Another mode that we’ve seen work is mentorship programs. So some SDOs use mentorship as a way to bring in new people within communities, even if it’s actually focused on areas of expertise, right, new technologies and so on. And then actually all of this kind of indicates a need for some sort of support and funding, right? This is all relatively expensive when meetings are held in different places multiple times a year. The amount of time that needs to be dedicated to follow discussions can be full-time equivalents, maybe multiple ones. And so that’s something to really be mindful of. I also want to point out, too, that it’s not just the full-time equivalent of a junior staff member in a civil society organization. You’re actually looking for your most senior experts who understand both the social issues and have technical grounding. And that can be difficult. So I’m actually getting into the challenges part, which is also part of your question. So certainly, some of these processes are open. Not all of them are. So even if something is maybe multi-stakeholder, it may have membership fees. It may have different kinds of requirements. Multilateralism is difficult, because even when we’ve seen this excellent trend of states opening up their delegations to other stakeholder groups to become members of the delegation, that’s not always possible. It’s not a sustainable strategy for the whole globe. It works in finite places, and we’d love to see more of that. But multilateralism itself, I think, needs to stretch a bit and maybe even adopt practices that would effectively make it multi-stakeholder. I would also want to just say that, yeah, we’ve talked sort of about the expertise issue. But I want to confront that a little bit head on. I think there are always these two competing views. Do you teach technical people social, legal, ethical, human rights issues? Or do you train those experts in technology? And I don’t think it has to be either one, actually. I think this is one of the biggest things. When I’m talking about the milestone of a year ago, the biggest sort of change, in my view, is that we’re not trying to take the human rights model and build it anew within a technical organization. The two communities are in conversation with one another. The human rights community is saying, we want to engage. We are interested in what you’re doing. And then the technical community, in response, is hopefully willing and able to receive that expertise as a sort of diversity of competence. So that’s sort of the hope, right, where you get experts working with other experts. It’s actually something that experts are used to doing. We talk about silos, but actually we’re quite willing to hear other people’s point of view. I think we just need to expand it beyond technology to include these other things. I think that there are plenty more things I could say. In our submission to the OHCHR report from the Center for Democracy and Technology, and in the submissions that the Internet Architecture Board and the World Wide Web Consortium also made, you get three different perspectives on these questions. I think those submissions can give finer details on some of the things that I’ve said today, but I’d be happy to expand more in the questions part.

Niki Masghati:
Thank you. That was really just helpful to hear, and I think you touched on quite a number of the solution steps forward that we’ve discussed internally within our own government as well. So we have a little bit of time for questions from the audience. I actually see a very great representation of the multi-stakeholder community in the audience. So if you could please raise your hand, please introduce yourself and where you’re coming from, and then maybe I’ll gather a few questions and then turn it over to the panelists. Please go ahead.

Audience:
Thank you, Niki. Good morning, everyone. My name is Patrick Day from Cloudflare. Thank you so much for having this panel. I was super excited to see it on the agenda as part of the week. For those who don’t know, Cloudflare is a global network and security company. We’re based in the United States. We run one of the world’s largest networks. We have 300 data centers in 111 countries. Our mission is to help build a better internet, and part of that is how we position our products for our customers, our free programs, where we offer DDoS mitigation and content delivery services to the public for free. But a big part of it is the work we do improving the basic standards and protocols, and many of the things we’re talking about today, at the IETF and other SDOs for the internet. So we’ve been involved in Oblivious DNS over HTTPS. We’ve been involved in Encrypted Client Hello over TLS. There are certain people at Cloudflare who are much better positioned to talk about the technical details of those, so I welcome connecting you with those folks. But for one, I just want to say thank you for bringing it up in this forum. And I think, to the point of today’s discussion, one of the things that our team is focused on, as part of the Summit for Democracy in 2023, one of our commitments as a company was to work with civil society to provide them with the necessary expertise to be able to weigh in on these issues, exactly in the ways that Mallory was just discussing. So my question for the panel, and you’ve touched on it already, but just sort of for the room in general: we welcome those suggestions. That’s sort of why we wanted to be here, to learn from you all, and that’s something that’s important to us. So thank you.

Niki Masghati:
Thank you so much. I think I saw another question right next to you. Go ahead.

Audience:
Good morning to everybody. Marek Janowski from the Czech Republic, based here in Geneva, digital diplomacy. I just wanted to maybe speak about a few things that we managed, with the EU, to actually do here with the ITU, and, of course, many thanks for bringing this topic forward. Thanks to the Dutch colleagues as well, because they’ve been really one of the leaders on this, jointly with the other EU members plus other regions. And maybe I would start with that remark, because it’s good to see other regions represented around the table. I think what is key to what you guys mentioned is to have a converging understanding between the regions that such a thing as merging or bringing together healthy standards based on human rights and, you know, the economic viability of these new products being then rolled out to the market is the way forward. And in that regard, I see that there have been a lot of things that we did here in Geneva as well, and so I would encourage the others to basically stir the discussion in their respective communities even more, in the Asia region, the Africa regions and the Latin America regions, so that it’s not sort of an EU-plus-other-countries initiative; we’re just trying to inspire the others to think about that. The other issue that I would like to touch upon is something which actually concerns, let’s say, the better dialogue between the communities, as you hinted. So maybe one way forward would be to try to adapt the lingo that the two communities are speaking to one another. I realize, when speaking to the technical experts here at the ITU, at the TSB, that we need to be really clear and bring clear cases. So it’s not about being general, which is actually irritating to them; we need to really say, this is the problem, we want to solve it, now let’s find a solution together, full stop. So there’s not much room for any fluffy language, which the diplomats are good at, including myself, but we really need to present clear cases and try to really go head-on to find solutions. Another point that I would like to mention is to try to find good companies, maybe startups, that are able to change the paradigm, because if we are really far along with the digital transformation, I’m sure that there are ways to marry human rights and technologies. It’s just a matter of being technically out of the box, trying to think differently, and I think there’s a way to do it. So the room is open for startups that can be innovative, so the innovation goes with it. And that’s my last point: it’s not about stifling innovation, vice versa, companies can actually benefit from it and be trustworthy for the future. So by that, I’d like to encourage, you know, the companies to also join in the efforts and be maybe champions in the ITU, but elsewhere as well, with ISO and other standard-setting organizations. Thank you.

Niki Masghati:
Thank you so much. We have one more question and then we’ll go to the panel. Go ahead.

Audience:
Thanks, Niki. Good morning. Konstantinos Komaitis with the Tech and Democracy Initiative at the Atlantic Council. I think that my question is more targeted to Mallory. There are some processes, like in the IETF, and I know that you have been leading this process and have been doing a lot of work and spent a lot of time trying to get that community to talk about human rights, processes that preceded and started long before and have made some significant progress. And now we know, of course, that UN institutions are getting more involved, like the ITU. How best do you think, or better yet, what will be the best way to coordinate those processes? Because there are some lessons learned from all the years of the work that has been done in the IETF, and it would be good to just take them and not replicate the failures, if you want, or the successes that we have seen in the context of the IETF, within the ITU. So, what is the best way to coordinate in order to make sure that we actually advance and take advantage of everything that we have learned in the past few years? Thanks.

Niki Masghati:
I’m going to summarize the questions very quickly, but I do see we have one more very quick question.

Audience:
Good morning. Fabrizio Benigni from the European Commission. It’s a question for Dr. Isabel Ebert. You mentioned that most standard-setting organizations do not have human rights commitments to which they subscribe. I wanted to ask whether you have seen a case of a standard-setting organization that carries out an impact assessment of the effects on human rights of the standards that are adopted, and if yes, whether these can be used as a model. Thank you.

Niki Masghati:
Okay. So, just to quickly summarize for the panelists: a question on what are some of the additional solutions, suggestions, and steps forward; a comment on regional diversity, and you had a second question in there, but remind me when we get to it; and our last question on what are some of the good examples of what has been done, which I guess is related to Konstantinos’s question that, you know, there’s been so much that’s already happened in some SDOs, so how do you sort of share those best practices? So maybe Mallory, why don’t I turn to you. Gurshabad, please feel free to come in if you want to answer any of the questions, and then I’ll turn to Isabel. Go ahead.

Mallory Knodel:
Yeah, I was gonna respond to Marek, and then thanks also to Konstantinos for also talking about this, because I see there are actually a lot of problems, or a lot of things, that governments could be working on, problems that can only be solved by states, and it can sometimes be frustrating that the open processes are mostly large companies that are ossifying the processes. We need smaller companies, we need new companies to come in, and then we also desperately do need governments. I mean, I would frankly prefer if governments were coming to the IETF, because I just feel like it’s a fast-moving, fast-paced, you know, solution-oriented kind of culture, but it also doesn’t have simultaneous translation, so that’s a barrier. It moves around the world, so it’s difficult to follow, and, you know, frankly, again, it’s sort of ossifying itself, so the things that it’s working on are not necessarily of interest to enough parts of the world, frankly. So I think the ITU is where a lot of this is going to happen, and I do think that this then should be coordinated with other UN processes. I think there could be agenda setting that happens, and again, the focus should be on what are some of the problems of the internet and our digital age that can only be solved by states. The most obvious one, I think we can all agree, is access to the internet, which is not being solved by the market by any stretch, and you have to get governments involved somehow, so that’s a really obvious one. But then there are some others. Also, we need to ask governments to back off of encryption. That would be really helpful, because that tends to be a policy threat that states are leveling. Another one, or another handful of things, is around weapons in war, right? The private sector or civil society can’t solve that. So there are some things, right, that we could actually really benefit from governments getting involved in, in good faith, in a multi-stakeholder way. So that would be my sort of suggestion there.

Niki Masghati:
Thank you so much. Gurshabad, do you have anything you’d like to add?

Gurshabad Grover:
Just to that point: wherever states have immediate access to a standards body in terms of delegations, they should really open it up. And I think the IETF, despite its other faults, shows that if you open up a mailing list, if you open up an event, all hell does not break loose, right? Like, everything is structured, you can still get work done. And second, of course, I totally agree with Mallory. And I will just say that in the IETF, where access is not state-mediated, sometimes state actors may not be received very well because of the history of trying to undermine encryption or other standards. So I think those cultural aspects are also important to understand. And if states are explicit in their advancement of human rights, I think that barrier is broken down all the more.

Niki Masghati:
Thank you. And then I wanted to just really quickly jump in on the question of solutions: one of the ideas we’ve heard in the last year is integrating human rights impact assessments, as appropriate and with the appropriate standards development organizations, along with the codes of conduct that Mallory mentioned earlier, as a potential sort of solution and way forward. But Isabel, I’ll turn it over to you.

Isabel Ebert:
Thanks. Thanks for the question. And I think that was a really broad array that we covered. Very helpful. Yeah, definitely these initiatives in terms of where states can get active to support civil society are very important. I stressed earlier the imbalance in terms of geographical representation and gender representation, and these are some of the areas where state support can definitely help to accelerate more diversity. And also that could, of course, happen in sort of business initiatives where larger companies work with smaller companies to also bring them on board in terms of all the requirements for engaging in these conversations. Also, startups do not necessarily have the expertise in terms of how you have to engage in such fora, so teaming up with larger companies can also help bridge some of these gaps. Just to send home the core message of today: an open, free and interoperable internet is not an aim that is opposed to human rights. Quite on the contrary, human rights and a free internet are very complementary aims, and that’s also why we are promoting the UN Guiding Principles on Business and Human Rights as a standard that can also speak to the technical standard-setting community. And we’re also working closely with colleagues at the OECD and its expert group on AI, who have a working group on AI risk and accountability that will influence standard setting in the long run. So also in terms of responsible business conduct, we’re trying to create this international alignment with the OECD Guidelines for Multinational Enterprises on Responsible Business Conduct and the upcoming guidance by the OECD on due diligence and AI. And it’s very important that institutions such as the UN Human Rights Office and the OECD continue to work together and cooperate in terms of also ensuring that business plays its part in standard-setting processes. As companies operate globally, it’s really important that standards also follow that global operation and that we’re not risking fragmentation in terms of standards. Just regarding the question of which standard-setting organizations are currently carrying out impact assessments: at the time of writing of the report, no standard-setting organizations had clear human rights commitments or systematic impact monitoring going on. At the same time, I also want to stress that that does not necessarily mean that there are no ongoing discussions internally or that there are no work streams that might consider human rights as a relevant aspect. It was very much a snapshot in time, and we’re also happy to follow up bilaterally on some of the ongoing developments on that.

Niki Masghati:
Yeah, thank you. I think I’m going to have to close the session, so I’m going to turn it over to our Dutch colleagues, who are our hosts for today. Thank you so much. Go ahead.

Jacco-Pepijn Baljet:
Yeah, thank you so much, and thank you all for joining us, also online. We can actually see comments coming in, but we cannot actually open them, so sorry for that, but thank you for your engagement online as well. I know that we’re running late, so I will keep it very brief, but we are very happy to have hosted this session as the chair of the Freedom Online Coalition this year. This is a very important topic to us, also for the Freedom Online Coalition, where we really want to combine the multistakeholder approach with the human rights-based approach, also in the standard-setting area. I think we had a very constructive discussion. We were all both realistic and optimistic, so I think that’s a very good balance that we found here, and also among all the stakeholders and the mix of stakeholders that we have in the room. So, yeah, I just want to finish by mentioning a few points. I think this discussion really shows that it’s a sort of long-term game, where I think it’s more important that we are all on the same page and all cooperating towards this objective to really get this done, instead of trying to rush things through, and I think Mallory can also agree to that, because I think it’s really important that everyone shares this objective and is really invested in this. And I also think, next to that, it’s important that we do look at these high-level documents and negotiations that are ongoing, because they can also be a stimulus for moving this conversation forward. I mean, when we talk about the Global Digital Compact, where there’s language about respecting and protecting human rights in the entire life cycle of digital technologies, that’s also about standard setting, and when we talk about the NETmundial outcome that we had last month, we also had a lot of agreements there and, in the outcome document, we had a lot of stakeholders present that are all engaging in these standard-setting organizations. We all agreed that human rights and the rule of law, as well as diversity and inclusion, are key to take into this process. So I think it’s important that we do both of these things: the bottom-up, working together and moving forward, and also the higher-level agreements. Thank you so much. Thank you, everyone. Recording stopped.

Speakers’ statistics

Audience: speech speed 193 words per minute, speech length 1263 words, speech time 392 secs

Gurshabad Grover: speech speed 144 words per minute, speech length 911 words, speech time 380 secs

Isabel Ebert: speech speed 169 words per minute, speech length 1977 words, speech time 701 secs

Jacco-Pepijn Baljet: speech speed 225 words per minute, speech length 505 words, speech time 135 secs

Mallory Knodel: speech speed 177 words per minute, speech length 1744 words, speech time 592 secs

Niki Masghati: speech speed 170 words per minute, speech length 1500 words, speech time 530 secs