Protect people and elections, not Big Tech! | IGF 2023 Town Hall #117

10 Oct 2023 07:30h - 08:30h UTC

Event report

Speakers and Moderators

Speakers:
  • Alexandra Pardal, Campaigns Director at Digital Action, Civil Society, WEOG
Moderators:
  • Bruna Martins dos Santos, Global Campaigns Manager at Digital Action, Civil Society, GRULAC


Disclaimer: This is not an official record of the IGF session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed. The official record of the session can be found on the IGF's official website.


Session report

Daniel Arnaudo

In 2024, several countries, including Bangladesh, Indonesia, India, Pakistan, and Taiwan, are set to hold elections, making it a significant year for democracy. However, smaller countries often do not receive the same level of attention and support when it comes to content moderation, policies, research tools, and data access. This raises concerns about unfair treatment and limited resources for these nations.

Daniel highlights the need for improved data access for third-party researchers and civil society, particularly in smaller countries. Platforms are currently disinvesting in civic integrity and in trust and safety work, which further exacerbates the challenges these nations face. They are also increasingly restricting third-party access to APIs and other forms of data, making it harder for researchers and civil society to gather valuable insights. Large countries often dominate access arrangements, leaving high barriers for smaller nations seeking data.

Another pressing issue raised is the insufficient addressing of threats faced by women involved in politics on social media platforms. Research shows that women in politics experience higher levels of online violence and threats. Daniel suggests that platforms establish mechanisms to support women and better comprehend and tackle these threats. Gender equality should be prioritised to ensure that women can participate in politics without fear of harassment or intimidation.

To effectively navigate critical democratic moments, such as elections or protests, social media platforms should collaborate with organisations that possess expertise in these areas. Daniel mentions the retreat from programs like the Trusted Partners at Meta and highlights the potential impacts on elections, democratic institutions, and the bottom lines of these companies. By working alongside knowledgeable organisations, platforms can better understand and respond to the needs and challenges of democratic events.

Algorithmic transparency is a desired outcome, but it proves to be a complex issue. While it has the potential to improve accountability and fairness, there are risks of manipulation or gaming the system. Striking the right balance between transparency and safeguarding against misuse is a delicate task that requires careful consideration.

Smaller political candidates seeking access to reliable and accurate political information need better protections. In order to level the playing field, it is crucial to provide resources and support to candidates who may not have the same resources as their larger counterparts.

The data access revolution is transforming how companies provide access to their systems. This shift enables greater innovation and collaboration, particularly in sectors such as infrastructure. Companies should embrace this transformation and strive to make their systems more accessible, promoting inclusivity and reducing inequalities.

Deploying company employees in authoritarian contexts poses challenges. Under certain regulations, these employees might become bargaining chips, compromising the companies’ integrity and principles. It is essential to consider the potential risks and implications before making such decisions.

Furthermore, companies should invest in staffing and enhancing their understanding of local languages and contexts. This investment ensures a better response to users’ needs and fosters better cultural understanding, leading to more effective and inclusive collaborations.

In conclusion, 2024 holds significant democratic milestones, but there are concerns about the attention given to smaller countries. Improving data access for researchers and civil society, addressing threats faced by women in politics, working with organisations during critical democratic moments, and promoting algorithmic transparency are crucial steps forward. Protecting smaller political candidates, embracing the data access revolution, considering the risks of deploying employees in authoritarian contexts, and investing in local understanding are additional factors that warrant attention for a more inclusive and balanced democratic landscape.

Audience

The analysis raises a number of concerns regarding digital election systems, global media platforms, data access for research, and the integrity of Russia’s electronic voting systems. It argues that digital election systems are susceptible to cyber threats, citing a disruption in Russian elections caused by a denial of service attack from Ukraine. This highlights the need for improved cybersecurity measures to safeguard the accuracy and integrity of digital voting systems.

Concerns are also raised about the neutrality and transparency of global media platforms. It is alleged that these platforms may show bias by taking sides in conflicts, potentially undermining their neutrality. Secret recommendation algorithms used by these platforms can influence users’ news feeds, and this lack of transparency raises questions about the information users are exposed to and the influence these algorithms can have on public perception. The analysis also notes that in certain African countries, platforms like Facebook serve as the primary source of internet access for many individuals, highlighting the importance of ensuring fair and unbiased information dissemination.

Transparency in global media platforms’ recommendation algorithms is deemed necessary. The analysis argues that platforms like Facebook have the power to ignite revolutions and shape public discourse through these algorithms. However, the lack of understanding about how these algorithms work raises concerns about their impact on democratic processes and the formation of public opinion.

The analysis also highlights the challenges of accessing data for academic and civil society research, without specifying the nature or extent of these challenges. It takes the position that measures need to be taken to fight against data access restrictions in order to promote open access and support research efforts in these fields.

The integrity of Russia’s electronic voting systems is called into question, despite the Russian Central Election Commission not acknowledging any issues. These systems, developed by the large technology companies Kaspersky and Rostelecom, lacked transparency and did not comply with the Commission’s recommendations, raising doubts about their reliability and their potential for manipulation.

The use of social media platforms, particularly Facebook, for political campaigning in restrictive political climates is also deemed ineffective. The analysis argues that these platforms may not effectively facilitate individual political campaigns. Supporting facts are provided, such as limited reach and targeting capabilities of Facebook’s advertising algorithms and the inability to use traditional media advertisements in restrictive regimes. An audience member with experience managing a political candidate page on Facebook shares their negative experience, further supporting the argument that social media platforms may not be as effective as traditional methods in certain political contexts.

In conclusion, the analysis presents a range of concerns regarding the vulnerabilities of digital election systems, the neutrality and transparency of global media platforms, challenges in data access for research, and the integrity of Russia’s electronic voting systems. It emphasizes the need for enhanced cybersecurity measures, transparency in recommendation algorithms, increased support for data access in research, and scrutiny of electronic voting systems. These issues have significant implications for democracy, public opinion, academic progress, and political campaigning in an increasingly digital and interconnected world.

Ashnah Kalemera

Social media platforms and the internet have the potential to play a significant role in electoral processes. They can support voter registration (helping ensure rolls are complete and accurate), remote voting for excluded communities and remotely based voters, campaigns and canvassing, voter awareness and education, results transmission and tallying, and the monitoring of malpractice.

However, technology also poses threats to electoral processes, especially in Africa. Authoritarian governments leverage the power of technology for their self-serving interests. They actively use disinformation and hate speech to manipulate narratives and public opinion during elections. Various actors, including users, governments, platforms themselves, private companies, and PR firms, contribute to this manipulation by spreading disinformation and hate speech.

Disinformation and hate speech thrive in Africa partly because of the increasing penetration of technology on the continent, which provides a platform for spreading false information and inciting hatred. Additionally, the growing youth population, combined with long-standing ethnic, religious, and geopolitical conflicts, creates an environment in which disinformation and hate speech can flourish.

To combat the spread of disinformation, it is crucial for big tech companies to collaborate with media and civil society. However, limited collaboration exists between these actors in Africa, and concerns arise regarding the slow processing and response times to reports and complaints, as well as the lack of transparency in moderation measures.

Research, consultation, skill-building, and strategic litigation are identified as potential solutions to address the challenges posed by big tech’s involvement in elections and the spread of disinformation. Evidence-driven advocacy is important, and leveraging norm-setting mechanisms can help raise the visibility of these challenges. Challenging the private sector to uphold responsibilities and ethics, as outlined by the UN guiding principles on business and human rights, is also essential.

Addressing the complex issues surrounding big tech, elections, and disinformation requires a multifaceted approach. While holding big tech accountable is crucial, it is important to recognize that the manifestations of the problem vary from one context to another. Therefore, stakeholder conversations must acknowledge and address the different challenges posed by disinformation.

Data accessibility plays a critical role in addressing these issues. Organizations like CIPESA have leveraged data APIs for sentiment analysis and monitoring elections. However, the lack of access to data limits the ability to highlight challenges related to big tech involvement in elections.
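The sentiment analysis mentioned here can be sketched, in highly simplified form, as a lexicon-based scorer applied to collected posts. This is an illustrative assumption only: the word lists, function names, and sample posts below are hypothetical, and real election-monitoring work such as CIPESA's depends on platform data APIs and far richer models.

```python
# Illustrative lexicon-based sentiment scoring over collected posts.
# The word lists are hypothetical examples, not a production lexicon.
POSITIVE = {"free", "fair", "peaceful", "transparent", "credible"}
NEGATIVE = {"rigged", "fraud", "violence", "fake", "suppressed"}

def score_post(text):
    """Return a crude score: positive word hits minus negative word hits."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def summarise(posts):
    """Aggregate per-post scores so trends around an election can be tracked."""
    scores = [score_post(p) for p in posts]
    n = len(scores)
    return {
        "posts": n,
        "mean_score": sum(scores) / n if n else 0.0,
        "negative_share": sum(s < 0 for s in scores) / n if n else 0.0,
    }

posts = [
    "The vote was free and fair",
    "This election is rigged and fake",
    "Turnout was high today",
]
print(summarise(posts))
```

A real pipeline would replace the hand-made lexicon with a trained model and feed it posts retrieved through whatever platform data access remains available — which is precisely why the API restrictions discussed above matter.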

Furthermore, it is important to engage with lesser-known actors, such as electoral bodies and regional economic blocs, to effectively address these issues. Broader conversations that include these stakeholders can lead to a better understanding of the challenges and potential solutions.

In conclusion, social media platforms and the internet offer significant potential to support electoral processes but also pose threats through the spread of disinformation and hate speech. Collaboration between big tech, media, and civil society, as well as research, skill-building, and strategic litigation, are necessary elements in addressing these challenges. Holding big tech accountable and engaging with lesser-known actors are also crucial for effective solutions.

Moderator – Bruna Martins Dos Santos

Digital Action is a global coalition for tech justice that aims to ensure the accountability of big tech companies and safeguard the integrity of elections. Headquartered in Brazil, the coalition has been gaining support from various organizations and academics, indicating a growing momentum for their cause.

Founded in 2019, Digital Action focuses on addressing the impact of social media on democracies and works towards holding tech giants accountable for their actions. Their primary objective is to prevent any negative consequences on elections and foster collaboration by involving social media companies in the conversation.

Moreover, Digital Action seeks to empower individuals who have been adversely affected by tech harms. They prioritize amplifying the voices of those impacted and ensuring that their concerns are heard. Through catalyzing collective action, bridge-building, and facilitating meaningful dialogue, they aim to make a positive difference.

On a different note, the summary also highlights the criticism faced by social media companies for their lack of investment in improving day-to-day lives. This negative sentiment suggests that these companies may not be prioritizing initiatives that directly impact people’s well-being and societal conditions.

In conclusion, Digital Action’s global coalition for tech justice is committed to holding big tech accountable, protecting election integrity, and empowering those affected by tech harms. By involving social media companies and gaining support from diverse stakeholders, they aspire to create a more just and inclusive digital landscape. Additionally, the need for social media companies to invest in initiatives that enhance people’s daily lives is emphasized.

Yasmin Curzi

The legislative scenario in Brazil concerning platform responsibilities is governed by two main laws. The Brazilian Civil Rights Framework for the Internet (Marco Civil da Internet), established in 2014, sets out fundamental principles for internet governance; under its Article 19, platforms are held responsible for illegal user-generated content only if they fail to comply with a judicial order. The Consumer Defense Code, in turn, recognises users as vulnerable parties in their interactions with businesses.

However, the impact of measures to combat false information remains uncertain. Although platforms have committed to creating reporting channels and labelling election-related content, detailed metrics for assessing the effectiveness of these measures are lacking. There are concerns about whether content is removed quickly enough to prevent it from reaching a wide audience. One concerning example is the case of Jovem Pan, which disseminated fake audio on election day; the clip had already been viewed 1.7 million times before it was removed.

The analysis indicates that social media and platforms’ content moderation have limited influence on democratic elections. Insufficient data and information exist about platforms’ actions and their effectiveness in combating false information. Content shared through official sources often reaches a wide audience before it is taken down. Despite partnerships with fact-checking agencies, it remains uncertain how effective platform efforts are in combating falsehood.

There is a pressing need for specific legislation and regulation of platforms to establish real accountability. Platforms currently fail to provide fundamental information, such as how much they invest in content moderation. However, there is hope: the Dynamic Coalition on Platform Responsibility (DCPR) has developed a framework for meaningful and interoperable transparency that could guide lawmakers and regulators in addressing the issue.

Furthermore, platforms should improve their content moderation practices. Journalists in Brazil have requested information from Facebook and YouTube regarding their investment in content moderation but have received no response. Without the ability to assess the harmful content recommended by platforms, it becomes difficult to formulate appropriate public policies.

In conclusion, the legislative framework in Brazil regarding platform responsibilities comprises two main legislations. However, the impact of measures to combat false information remains uncertain, and the influence of social media and platform content moderation on democratic elections is limited. Specific legislation and regulation are needed to establish accountability, and platforms need to enhance their content moderation practices. Providing meaningful transparency information will facilitate accurate assessment and policymaking.

Alexandra Robinson

The vulnerability of online spaces and the ease with which domestic or foreign actors can manipulate and spread falsehoods is a growing concern, especially in terms of the manipulation of democratic processes. The use of new technologies like generative AI further complicates the issue, making it easier for malicious actors to deceive and mislead the public. This highlights the urgent need for stronger protections against online harms.

One significant observation is the glaring inequality between different regions in terms of protections from online harms. The disparity is particularly alarming, emphasizing the need for a more balanced and comprehensive approach to safeguarding online spaces. It is crucial to ensure that individuals worldwide have equitable protection against manipulation and disinformation.

Social media companies play a pivotal role in creating safe online environments for all users. This is particularly important with the upcoming 2024 elections, as these companies must fulfill their responsibilities to protect the integrity of democratic processes. However, concerns arise when examining the allocation of resources by these companies. Despite investing $13 billion in platform safety since 2016, Facebook’s use of its global budget for combating false information appears disproportionately focused on the US market, where only a fraction of its users reside. This skewed allocation raises questions regarding the equal treatment of users globally and the effectiveness of combating disinformation on a worldwide scale.

Furthermore, non-English languages pose a significant challenge for automated content moderation on various platforms, including Facebook, YouTube, and TikTok. Difficulties in moderating content in languages other than English can lead to a substantial gap in combating false information and harmful content in diverse linguistic contexts. Efforts must be made to bridge this gap and ensure that content moderation is effective in all languages, promoting a safer online environment for users regardless of their language.

In conclusion, the vulnerability of online spaces and the potential manipulation of democratic processes through the spread of falsehoods raise concerns that require urgent attention. Social media companies have a responsibility to create safe platforms for users worldwide, with specific emphasis on the upcoming elections. Addressing the inequities in protections against online harms, including the allocation of resources and challenges posed by non-English languages, is crucial for maintaining the integrity of online information and promoting a more secure digital environment.

Lia Hernandez

The speakers engaged in a comprehensive discussion regarding the role of digital platforms in promoting democracy and facilitating access to information. They emphasized the importance of independent tech work to advance digital rights across all Central American countries. Additionally, they highlighted the collaboration between big tech companies and electoral public entities, as the former provide tools to ensure the preservation of fundamental rights during election processes.

The argument put forth was that digital platforms should serve as valuable tools for promoting democracy and facilitating access to information. This aligns with the related United Nations Sustainable Development Goals, including Goal 10: Reduced Inequalities and Goal 16: Peace, Justice, and Strong Institutions.

However, concerns were raised about limitations on freedom of the press, information, and expression. Journalists in Panama faced obstacles and restrictions when attempting to communicate information of public interest. Of particular concern was the fact that the former President, Ricardo Martinelli, known for violating privacy, is a candidate for the next elections. This situation has the potential to lead to cases of corruption.

Furthermore, the speakers emphasized the necessity of empowering citizens, civil society organizations, human rights defenders, and activists. They argued that it is not only important to strengthen the electoral authority but also crucial to empower the aforementioned groups to ensure a robust and accountable democratic system. The positive sentiment surrounding this argument reflects the speakers’ belief in the need for a participatory and inclusive democracy.

However, contrasting viewpoints were also presented. Some argued that digital platforms do not make tools widely available to civil society but instead focus on providing them to the government. This negative sentiment highlights concerns about the control and accessibility of these tools, potentially limiting their efficacy in promoting democracy and access to information.

Additionally, the quality and standardisation of data used for monitoring digital violence were subject to criticism. The negative sentiment regarding this issue suggests that the data being utilised is unclean and lacks adherence to open data standards. Ensuring clean and standardised data is paramount to effectively monitor and address digital violence.

In conclusion, the expanded summary highlights the various perspectives and arguments surrounding the role of digital platforms in promoting democracy and access to information. It underscores the importance of independent tech work, collaboration between big tech companies and electoral entities, and empowering citizens and civil society organisations. However, limitations on freedom of the press, potential corruption, restricted access to tools, and data quality issues represent significant challenges that need to be addressed for the effective promotion of democracy and access to information.
