Internet Human Rights: Mapping the UDHR to Cyberspace | IGF 2023 WS #85
Event report
Speakers and Moderators
Speakers:
- Michael Kelly, Civil Society, Western European and Others Group (WEOG)
- David Satola, Intergovernmental Organization, Western European and Others Group (WEOG)
- Joyce Hakmeh, Civil Society, Western European and Others Group (WEOG)
Moderators:
- Joyce Hakmeh, Civil Society, Western European and Others Group (WEOG)
Disclaimer: This is not an official record of the IGF session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed. The official record of the session can be found on the IGF's official website.
Session report
Michael Kelly
The analysis explores two main topics: the importance of defining digital human rights and the roles of big tech companies ahead of the AI revolution, and the preference for a multistakeholder approach to internet governance over a multilateral approach.
Regarding the first topic, it is argued that as human rights transition from physical to digital spaces, regulation is needed to protect and promote them. The AI revolution necessitates a paradigm shift towards the regulation of creativity-based AI platforms, making the definition of digital human rights and tech companies' responsibilities crucial in this evolving landscape.
The analysis emphasises proactively defining digital human rights and the roles of big tech companies so that clear regulations governing the interaction between technology and human rights can be established. This approach is essential to ensuring the responsible and ethical use of evolving technologies.
Regarding the second topic, the analysis supports a multistakeholder approach to internet governance, which brings various stakeholders, including governments, tech companies, civil society, and individuals, into decision-making processes. It aims to ensure diverse perspectives and interests are considered for balanced and inclusive governance.
Concerns are raised that a multilateral approach may exclude big tech companies and civil society from decision-making processes, hindering effective internet governance. The analysis also identifies a draft cybercrime treaty proposed by Russia as a potential threat to digital human rights, one that could limit freedom of expression and privacy online.
In conclusion, the analysis highlights the importance of defining digital human rights and the roles of big tech companies in the AI revolution. It emphasises proactive regulation, particularly of creativity-based AI platforms, supports a multistakeholder approach to internet governance, and raises concerns about exclusionary processes and threats to digital human rights. Taken together, it offers valuable insight into the challenges and considerations at the intersection of technology, human rights, and internet governance.
Peggy Hicks
The discussion centres around the relevance of human rights in the digital space and the potential impact of government regulations on online activities. It is acknowledged that the human rights that apply offline also extend to the online realm. However, there is ongoing deliberation regarding their practical implementation.
The significance of the human rights framework in the digital space is highlighted due to its universal applicability and legally binding nature. This framework encompasses obligations that the majority of states have committed to. Additionally, a multistakeholder and multilateral approach plays a key role in addressing human rights in the digital realm.
There are concerns about potential government overreach and its negative impact on free speech. Many legislations globally are viewed as hindering human rights rather than protecting them, raising apprehensions about government interference and censorship.
The responsibilities of companies in respecting human rights, particularly within their supply chains, are recognised. Companies are urged to understand and mitigate risks associated with human rights violations in their operations. The UN Guiding Principles on Business and Human Rights outline the role of states in regulating the impact of companies on human rights and establishing accountability and remedy mechanisms.
However, there are also concerns about legislation on content moderation, which is seen as often leading to the suppression of free speech. Pressure on companies to take down excessive amounts of content can result in the repression of opposition or dissent. The Cybercrime Convention is highlighted as an area where potential overreach is observed, which can curtail rights.
The implications of legislative models, such as the German NetzDG statute, in different global contexts are discussed. It is noted that exporting these models without considering the varying contexts can lead to problems and conflicts with human rights principles.
Furthermore, worries are expressed about regulatory approaches in liberal democracies that could potentially compromise human rights and data encryption. Measures such as client-side scanning or undermining encryption are viewed as problematic, as they could have adverse global impacts.
The breadth and severity of punitive measures under the Cybercrime Convention also raise concerns. Instances where individuals have been imprisoned for three to four years over a single tweet prompt questions about the proportionality and fairness of such measures.
While negotiation processes are still ongoing, there is a recognised need for continued dialogue to address concerns and improve the Cybercrime Convention. Multiple states share the concerns expressed by the Office of the United Nations High Commissioner for Human Rights (OHCHR).
In conclusion, the discussion highlights the importance of upholding human rights in the digital space and cautions against excessive government regulation that can impede these rights. The responsibilities of companies in respecting human rights are emphasised, along with concerns about the negative effects of content moderation legislation. The need for careful consideration of context when enacting legislative models and the challenges posed by regulatory approaches in liberal democracies are also brought to light. Ultimately, ongoing negotiations are required to address concerns and enhance the Cybercrime Convention.
David Satola
The analysis explores the importance of upholding equal rights in the digital space, irrespective of an individual’s identity. It stresses the need to establish virtual identity rights prior to the impending AI revolution. The fast-paced progress in AI technology adds a time constraint to defining these rights, making it crucial to formulate and establish them promptly.
One of the key arguments in the analysis emphasizes that while everyone theoretically enjoys the same rights in physical spaces regardless of their identity, the emergence of a new front in the digital space necessitates extending principles of equality and non-discrimination to the virtual realm.
Another aspect highlighted in the analysis concerns the rights of avatars and posthumous social media accounts, raising questions about the legal framework and rights that should govern these virtual identities, particularly in the context of the AI revolution. Addressing these issues in advance becomes essential to safeguard individuals’ virtual identities within a legal framework that ensures the same rights and protections as in the physical world.
Furthermore, the analysis underscores the potential challenges to the universality of rights brought about by the migration of our daily lives into cyberspace. As our activities and interactions increasingly occur online, it becomes crucial to ensure the preservation of fundamental human rights in this digital domain as well.
Additionally, the incorporation of national or regional laws without adequate context may pose a threat to online rights. This observation underscores the importance of crafting carefully designed and globally aligned legal frameworks governing the digital space, to prevent discrepancies and inconsistencies that could undermine the universality of rights.
In conclusion, the analysis emphasizes the need to guarantee equal rights in the digital space, highlighting the significance of defining virtual identity rights in anticipation of the AI revolution. It also discusses the challenges posed by the migration to cyberspace and the potential threats to online rights in the absence of cohesive global legal frameworks. Given the rapid advancements in AI, it is essential to act swiftly in establishing these rights to pave the way for a fair and inclusive digital future.
Joyce Hakmeh
Joyce Hakmeh, Deputy Director of the International Security Programme at Chatham House, moderated a session organized around a report by the American Bar Association’s Internet Governance Task Force. The task force was co-chaired by Michael Kelly, a professor of law at Creighton University specializing in public international law, and David Satola, Lead Counsel for innovation and technology at the World Bank.
In the session, the speakers discussed the complexities of internet governance, stressing the need to find the right balance of responsibilities. They highlighted concerning practices of some autocratic countries that suppress dissent and violate human rights. They also drew attention to regulatory approaches proposed by liberal democracies that raise human rights concerns, such as proposals to break encryption for ostensibly legitimate purposes.
Peggy Hicks, Director at the Office of the United Nations High Commissioner for Human Rights (OHCHR), participated in the session as a discussant. She raised questions about the responsiveness of countries at both national and global levels to the concerns raised by the speakers. Her inquiries covered issues related to autocratic countries and the potential human rights implications of regulatory measures proposed by liberal democracies.
The session also touched upon the Cybercrime Convention, with Peggy Hicks noting that the OHCHR has been actively engaged in publishing commentary and providing observations on the content and progress of the convention. Although specific details of the convention’s progress were not explicitly covered, they discussed its complexity and potential for abuse, particularly regarding procedural powers and broad criminalization.
In conclusion, the session emphasized the importance of raising awareness about the complexities of internet governance and the potential for human rights abuses. The discussion shed light on various perspectives and challenges related to this issue, contributing to a better understanding of the topic.