Non-regulatory approaches to the digital public debate | IGF 2023 Open Forum #139

12 Oct 2023 00:45h - 01:45h UTC

Event report

Speakers and Moderators

Speakers:
  • Pedro Vaca, Special Rapporteur for Freedom of Expression of the IACHR (OAS), Intergovernmental Organization/ treaty-based international organizations, Americas/Latin America
  • Anna Karin Eneström, permanent representative of Sweden in the United Nations and co-facilitator of the Global Digital Compact, Intergovernmental Organization/ treaty-based international organizations, Europe
  • María Elósegui, judge at the European Court of Human Rights, Europe
Moderators:
  • Jonathan Bock Ruíz, Director of the Foundation for the Freedom of the Press (FLIP), Civil Society, Latin America
  • Agustina del Campo, Director at the Center for Studies on Freedom of Expression and Access to Information (CELE) at Universidad de Palermo, Civil society, Latin America

Disclaimer: This is not an official record of the IGF session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed. The official record of the session can be found on the IGF's official website.

Session report

Juan Carlos Lara

The discussions revolve around the challenges posed by online violence, discrimination, and disinformation in the digital public debate. These harms have far-reaching impacts, particularly on marginalised and vulnerable communities and groups. The failure of both private tech companies and states to fully comply with their human rights obligations has worsened these challenges.

Regulatory proposals have emerged globally in response to these issues in the digital public sphere. These proposals aim to address concerns such as competition, data protection, interoperability, transparency, and due diligence. Efforts by international organisations to provide guidelines and regional blocs reacting with their own concerns have contributed to this regulatory landscape.

While regulation is necessary, it is crucial that it does not infringe upon the principles of freedom of expression and privacy. The question of how to strike a balance between regulation and these fundamental rights remains a point of debate. It is important to consider the potential fragmentation of the internet and the lack of regulatory debates in many regions of the majority world.

Soft law principles, as well as the application of international human rights laws, play a crucial role in guiding the behaviour of companies in the digital sphere. They have provided valuable guidance for alternative frameworks. However, the effectiveness of these principles and laws is a matter of discussion.

In conclusion, the discussions highlight the urgent need to address online violence, discrimination, and disinformation. While regulatory proposals have emerged globally, regulation must strike a balance between protecting rights such as freedom of expression and privacy and addressing the harms of the digital public sphere. Soft law principles and international human rights law provide valuable guidance for company behaviour, though their effectiveness remains under discussion. Ultimately, collaborative efforts between governments, tech companies, and civil society are essential to a digital space that upholds human rights and promotes a more inclusive and equitable society.

Chantal Duris

Chantal Duris stressed the importance of adopting both regulatory and non-regulatory approaches to address challenges related to social media platforms. She expressed concern about legislation that primarily holds platforms accountable for user speech rather than addressing the underlying business models, warning that such approaches can endanger freedom of expression. She advocated for platforms to operate according to the UN Guiding Principles regardless of regulatory status, emphasising the need to respect human rights.

Duris also emphasised the importance of addressing the root causes of issues like disinformation and hate speech, both by regulating business models and by exploring solutions outside the digital space. She supported the decentralisation of social media platforms to empower users and enhance freedom of expression. Concerned about the limitations of automated content moderation tools, she suggested the need for more human reviewers with language expertise.

She discussed the trend of strategic litigation against platforms, noting that it could hold them accountable for failures to respect human rights. While recognising the challenge of keeping pace with evolving technology and regulatory initiatives, she argued that both platforms and regulators should take responsibility for upholding human rights. She also noted the growing recognition of civil society’s role in the digital space and the increasing consultations and engagements sought by platforms and regulators.

Overall, Duris called for a multi-faceted approach: regulatory measures, adherence to the UN Guiding Principles, addressing root causes, decentralisation, improved content moderation, and recognition of civil society’s role, with platforms and regulators sharing responsibility for upholding human rights.

Ana Cristina Ruelas

Addressing harmful content online requires a multidimensional approach that takes into account linguistic nuances, cultural context, and the protection of freedom of expression. Content moderation, in particular, must contend with the complexities of different languages and crisis situations. Companies must align their actions with the UN Guiding Principles to ensure their policies prioritise transparency, accountability, and human rights.

Education and community engagement play integral roles in tackling harmful content. Media and information literacy programmes empower users to navigate online spaces responsibly, while fostering a sense of shared responsibility in maintaining a safer online environment. Furthermore, a synergistic effort is necessary, combining policy advice, regulation, and the involvement of multiple stakeholders. This involves a multi-stakeholder process that includes the development, implementation, and evaluation of regulations.

Collaboration between regulators and civil society is vital to effective enforcement. Creating conversations between these groups can help reduce tensions and enhance the efficacy of regulations. Regulators should not feel abandoned after legislation is passed; ongoing enforcement and operation of laws must be a key focus.

To achieve a balanced and collective approach in dealing with companies, stakeholders from different regions are coming together. For example, the African Union is taking steps to address companies with a united front. This collective approach allows for better negotiation and more equitable outcomes.

It is important to emphasise a balanced, human rights-based approach when dealing with companies. Some of the 40 countries analysed consider this approach the correct path forward. By prioritising the principles of human rights, such as freedom of expression and inclusive stakeholder participation, governments can create a regulatory framework that safeguards individuals while promoting peace, justice, and strong institutions.

In conclusion, tackling harmful content online requires a comprehensive and nuanced strategy. Such an approach considers linguistic nuances, cultural context, and the protection of freedom of expression. It involves aligning company actions with the UN Guiding Principles, prioritising education and community engagement, and establishing effective regulatory processes built on collaboration between regulators and civil society. With these measures in place, a safer online environment can be achieved without compromising individual rights or the pursuit of global goals.

Pedro Vaca

The current dynamics of freedom of expression on the internet are concerning, as public debate is deteriorating. This underscores the need to ensure that the processes, criteria, and mechanisms for internet content governance are compatible with democratic and human rights standards. Moreover, limited access to the internet, including gaps in connectivity and digital literacy, poses a challenge to enhancing civic skills online.

Recognising the importance of addressing these issues, digital media and information literacy programmes should be integrated into education efforts. By equipping individuals with the necessary skills to navigate the digital landscape, they can critically evaluate information, participate in online discussions, and make informed decisions.

State actors have a responsibility to avoid using public resources to finance content that spreads illicit and violent materials. They should instead promote human rights, fostering a safer and more inclusive online environment. In addition, internet intermediaries bear the responsibility of respecting the human rights of users. This entails ensuring the protection of user privacy, freedom of expression, and access to information.

Managing the challenges in digital public debate requires a multidimensional approach. Critical digital literacy is vital in empowering individuals to engage in meaningful discourse, while the promotion of journalism supports a free and informed press. Internet intermediaries must also play a role in upholding human rights standards and fostering a healthy online debate.

Upon further analysis, it is evident that there is a lack of capacity and knowledge among member states regarding internet regulation. This poses a significant challenge in effectively addressing issues related to content governance and user rights. Efforts should be made to enhance understanding and collaboration among countries to develop effective and inclusive policies.

Shifting the focus towards the role of public servants and political leaders presents an opportunity to reduce discrimination and inequality. Under inter-American and international standards, political leaders enjoy a narrower scope of protected expression than ordinary citizens, which justifies applying stronger standards to their speech. Adhering to these standards can serve as a guideline for ensuring accountability and promoting a fair and inclusive public sphere.

Overall, the discussion highlights the importance of protecting freedom of expression online, promoting digital literacy, and holding both state actors and internet intermediaries accountable. It also emphasises the need for increased collaboration and knowledge-sharing among member states to effectively address challenges in the digital realm.

Ramiro Alvarez Ugarte

The global discussion on the regulation of online platforms is gaining momentum, with diverse viewpoints and arguments emerging. The Digital Services Act (DSA) implemented in Europe is being viewed as a potential model for global regulation. Bills resembling the DSA have been presented in Latin American congresses. Additionally, several states in the US have passed legislation imposing obligations on platforms.

Legal challenges concerning companies’ compliance with human rights standards and the First Amendment are being debated. These challenges can have both positive and negative implications for holding companies accountable. For instance, companies have faced litigation in the US for alleged violations of the First Amendment.

In addition to regulatory measures, there is recognition of the potential of non-regulatory initiatives, such as counter-speech and literacy programs, in addressing the challenges posed by online platforms. These initiatives aim to empower individuals to discern between fake and real information and combat disinformation. Successful implementation of counter-speech initiatives has been observed during Latin American elections.

Nevertheless, concerns exist about the potential negative consequences of well-intentioned legislation on online platforms. It is argued that legislation, even if well-designed, may have unintended harmful effects in countries with insufficient institutional infrastructure.

The tension between decentralization and the need for regulatory controls is another point of contention. A fully decentralized internet, while offering freedom of choice, may facilitate the spread of discriminatory content. Balancing the desire for increased controls to prevent harmful speech with the concept of decentralization is a challenge.

Polarization further complicates the discussion on online platform regulation. Deep polarization hampers progress in implementing regulatory or non-regulatory measures. However, it also presents an opportunity to rebuild the public sphere and promote civic discourse, which is essential for overcoming polarization.

In conclusion, the global conversation on regulating online platforms is complex and multifaceted. The potential of the DSA as a global regulatory model, legal challenges against companies, non-regulatory measures like counter-speech and literacy programs, concerns about the unintended consequences of legislation, the tension between decentralization and regulatory controls, and the challenge of polarization all contribute to this ongoing discourse. Rebuilding the public sphere and fostering civic discourse are seen as positive steps towards addressing these challenges.
