How prevent external interferences to EU Election 2024 – v.2 | IGF 2023 Town Hall #162

12 Oct 2023 06:10h - 07:10h UTC

Event report

Speakers and Moderators

Speakers:
  • Esteve Sanz, Representative of EU Commission (government)
  • Alberto Rabbachin, Head of Unit Media Convergence & Social Media
  • Paula Gori, EDMO (Academia and Technical Community)
  • Giovanni Zagni, Pagella Politica (media industry)
  • Stanislav Matejka, Chair of the ERGA Sub-group 3, on disinformation (regulatory authority)
Moderators:
  • Giacomo Mazzone, WEOG (Civil society)

Disclaimer: This is not an official record of the IGF session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed. The official record of the session can be found on the IGF's official website.

Session report

Audience

The audience raised several concerns and inquiries, centred on TikTok, platform APIs and data access, engagement with overseas countries, fake news and disinformation, algorithm transparency, and online content moderation.

One of the main concerns regards TikTok’s censorship and the “bubbling” of user information. This refers to a situation where TikToks from certain countries, such as Russia and Ukraine, cannot be viewed by users in other countries, even via a direct link; instead, the linked videos are replaced with unrelated content, such as videos of cats and dogs. This has triggered negative sentiment among users and raised concerns about control over user information.

Additionally, there is an inquiry about how TikTok’s policy is regulated and how the platform controls viewer access. The speaker seeks clarity on these mechanisms; although no supporting facts are provided, the question reflects a neutral stance and highlights the need to understand how TikTok’s policy is governed.

Another concern raised relates to platform APIs and data access. The de-emphasis of CrowdTangle, restrictions on APIs, and expenses for research organizations are mentioned as supporting facts. These issues have generated negative sentiment among users who worry about the limitations and costs associated with platform APIs and data access.

Furthermore, the speakers express curiosity about engagement with overseas and partner countries. While one speaker mentions work done with these countries, no details are provided about the nature and extent of this engagement. Nonetheless, this topic is approached with a neutral sentiment, indicating an interest in learning more about the engagement process.

The increasing spread of fake news and disinformation in Taiwan also raises concerns. It is highlighted that private-sector platform providers play a crucial role in enforcing regulations and dealing with such information. This negative sentiment reflects worries about the impact of fake news and disinformation on society.

The desire for algorithm transparency in content recommendation is another argument put forth. However, no supporting facts are mentioned regarding this issue. Despite this, the neutral sentiment reflects a general interest in making the content recommendation algorithm more transparent.

There is also a speaker who wants to understand how online content moderation systems work. While no supporting facts are provided, this neutral stance suggests a curiosity about the mechanisms and processes involved in content moderation on platforms like TikTok.

Lastly, there is an inquiry into whether it is possible to retrieve a post or video once it has been removed by the content moderation system. No additional information is provided on this topic, but the neutral sentiment implies a desire to explore the potential for content recovery.

In conclusion, the concerns and inquiries presented in the statements cover a wide range of topics, including TikTok’s user privacy and information control, policy regulation and control over viewer access, platform APIs and data access, engagement with overseas and partner countries, manifestation of fake news and disinformation, algorithm transparency, online content moderation systems, and content recovery. These matters highlight various aspects of platform management, user experience, and the impact of social media platforms on society. The analysis helps identify the speakers’ viewpoint and concerns while emphasising the need for further insights and information on these subjects.

Paula Gori

The European Digital Media Observatory (EDMO) is an independent consortium of organizations that focuses on fact-checking, academic research, and media literacy. Although funded by the European Commission, EDMO operates autonomously. It aims to combat misinformation by providing a platform where experts can collaborate on addressing this issue.

One of the main objectives of EDMO is to provide tools and evidence to counter disinformation. The organization establishes networks of fact-checkers who work together to identify false narratives and share information with one another. This collaborative approach allows for quicker and more efficient debunking of misleading information, especially when done within the first 24 hours.

In addition to combating disinformation, EDMO also focuses on mapping and evaluating media literacy initiatives. It strives to thoroughly understand the impact and effectiveness of these initiatives, ensuring that efforts to enhance media literacy are productive and fruitful.

An important consideration for EDMO is data accessibility. They have produced a code of conduct for accessing online platform data and are working towards creating an independent intermediary body that handles requests for such data. EDMO recognizes the necessity of granting access to platform data for research purposes while fully respecting GDPR regulations.

However, there are challenges in accessing platform data, particularly for researchers from smaller universities and countries with minority languages. Data access is more readily available to well-established universities, which amplifies the inequality in research opportunities between larger and smaller educational institutions.

Paula, in her stance, advocates for the accessibility of platform data, especially for researchers from smaller universities and countries with minority languages. She points out the difficulty faced by these institutions in accessing data and emphasizes the importance of ensuring equitable research opportunities. Paula also acknowledges the need for proper infrastructures to effectively handle and manage data, highlighting that data accessibility is not the only concern; having the necessary infrastructure is equally crucial.

In conclusion, EDMO plays a significant role in addressing misinformation by providing a collaborative platform for experts in fact-checking, research, and media literacy. Their efforts to combat disinformation, map media literacy initiatives, and promote data accessibility are commendable. However, challenges remain in terms of accessing platform data, particularly for researchers from smaller universities and minority language contexts. It is essential to address these challenges and create a level playing field for all researchers to contribute to the fight against misinformation.

Erik Lambert

The European Commission is currently engaged in the process of regulating artificial intelligence (AI) with a specific focus on preventing the manipulation of public opinion. These regulations aim to curb coordinated activities by foreign powers or specific groups seeking to influence public sentiment. It is important, however, that these regulations do not impede freedom of speech.

According to Erik Lambert, an expert in the field, the younger generation’s trust in social media platforms is shifting. Platforms like Facebook and Twitter, which have traditionally dominated the digital sphere, are experiencing a decline in trust. Instead, younger people are turning to platforms such as TikTok that offer more personal experiences. This shift underscores the need for social media platforms to adapt and address the concerns of their user base.

Furthermore, Lambert emphasizes the importance of understanding and evolving our approach to public opinion formation in the 21st century. The rise of digital platforms, social media, and the rapid dissemination of information have changed the way public opinion is shaped. It is essential to recognize and adapt to these changes in order to effectively engage with the public and address their needs and concerns.

In conclusion, the efforts of the European Commission to regulate AI and combat the manipulation of public opinion are commendable. However, it is crucial to strike the right balance between preserving freedom of speech and preventing coordinated activities that aim to deceive or manipulate the public. Additionally, social media platforms must adapt to the changing trends in trust among the younger generation. Finally, understanding and evolving our approach to public opinion formation is essential for effective engagement with the public in the 21st century.

Esteve Sanz

Esteve Sanz highlights the crucial role of the Internet Governance Forum (IGF) in discussing critical issues related to disinformation and internet governance on a global scale. The attendance of the Vice President of the European Commission further emphasizes the importance placed on the forum and the seriousness with which disinformation is being addressed.

At the IGF, countries exchange ideas and concerns about disinformation, demonstrating collaborative efforts to combat its spread and the need for international cooperation. Esteve Sanz emphasizes that the IGF provides a substantial and concrete platform for these discussions.

One specific concern raised is the increasing influence of generative Artificial Intelligence (AI) in amplifying disinformation. Policymakers are urged to be alert and proactive in countering this issue. The affordability and ease with which generative AI can produce disinformation campaigns make it a significant threat. The European Commission is considering measures such as watermarking AI-generated content to tackle this challenge.

Esteve Sanz also emphasizes the importance of a clear definition of disinformation within the European Union (EU). It is argued that disinformation is an intentional action carried out by specific actors. This aligns with the EU’s human-centric approach to digital policies and underscores the need for accurate understanding and identification of disinformation to effectively combat it.

In conclusion, Esteve Sanz’s stance on the IGF underscores its critical role in addressing global disinformation and internet governance issues. The attendance of the Vice President of the European Commission and the exchange of concerns among countries highlight the significance placed on the forum. The threat posed by generative AI in amplifying disinformation calls for heightened alertness from policymakers. Moreover, a clear definition of disinformation is deemed essential within the EU, reflecting its human-centric approach to digital policies. These insights shed light on the international and regional efforts to combat disinformation and ensure the integrity of online information exchanges.

Stanislav Matejka

The European Regulators Group for Audiovisual Media Services (ERGA) plays a vital role in enforcing and implementing the Audiovisual Media Services Directive, with a strong focus on effectiveness. ERGA’s members have the responsibility of not only enforcing European legislation but also their own national legislation, ensuring comprehensive media regulation.

ERGA is particularly focused on political advertising, establishing rules for advertising in general and paying particular attention to political advertising. Since the creation of the first code of practice in 2018, ERGA has consistently directed its efforts towards this issue. Their aim is to ensure fair and transparent political campaigns.

ERGA also places significant importance on election integrity and transparency. They have introduced a code of practice that includes transparency obligations and commitments to publish transparency reports. ERGA emphasizes the effective enforcement of platforms’ own policies and closely monitors this aspect. Transparency is key to protecting election integrity and ensuring accountability.

To combat misinformation on online platforms, ERGA supports the establishment of reporting mechanisms. They propose the creation of functional reporting mechanisms for regulators, researchers, and anyone else who wishes to report or flag instances of misinformation. This initiative aims to address the spread of false information and provide a platform for accountability.

Access to data is crucial for ERGA in promoting public scrutiny through independent research. They recognize the significance of data for the research community in informing the enforcement of regulatory frameworks. ERGA supports the idea that independent research should have access to relevant data, enabling a more informed analysis and evaluation of media services.

In summary, ERGA is dedicated to effectively implementing the Audiovisual Media Services Directive. Their focus on political advertising, transparency in elections, reporting mechanisms for misinformation, and access to data for independent research are essential aspects of their work. By addressing these areas, ERGA aims to ensure fair and transparent media services in Europe.

Giovanni Zagni

The European Digital Media Observatory (EDMO) has recently established a new task force with a specific focus on addressing disinformation during the 2024 European elections. This task force aims to build upon the success of a previous one that focused on tackling disinformation during the Ukraine war. Comprising 18 members from various sectors, the task force is committed to understanding the nature of disinformation and disseminating valuable insights to combat its harmful effects.

One of the key objectives of the task force is to review past electoral campaigns, analyze their outcomes, and identify the main risks associated with the upcoming European elections in 2024. Through this process, they seek to develop strategies and frameworks to counteract disinformation and safeguard the integrity of the electoral process. Additionally, the task force plans to disseminate best practices from the media and information literacy world. By sharing successful approaches, they hope to enhance media awareness and empower citizens to critically evaluate and navigate the information landscape.

Giovanni Zagni, a strong advocate for democracy and inclusivity, fully supports this initiative. He emphasizes the need for a democratic and inclusive approach in addressing disinformation, ensuring that the diverse issues faced by each country are properly represented. Zagni highlights the task force’s role in facilitating the exchange of best practices and experiences in combating disinformation, thereby enhancing the effectiveness of efforts to promote peace, justice, and strong democratic institutions.

In conclusion, the establishment of the new task force by EDMO represents a significant step in addressing disinformation during the 2024 European elections. Building on the success of the previous task force, they aim to develop comprehensive strategies to tackle disinformation, review past electoral campaigns, and disseminate best practices. With the support of individuals like Giovanni Zagni, the task force aims to foster a democratic and inclusive environment where diverse issues are adequately considered. Through these collective efforts, they hope to reinforce media literacy, combat disinformation, and uphold the integrity of the electoral process.

Caroline Greer

TikTok actively participates in the Code of Practice on Disinformation, taking a leading role in developing structural indicators. They, along with other platforms, recently published their second reports on tackling disinformation. As a signatory of the Code of Practice on Disinformation, TikTok co-chairs the election working group, demonstrating their dedication to addressing disinformation during elections.

TikTok advocates for a multi-stakeholder approach to combat disinformation, promoting partnerships with fact-checkers, civil society, and other actors. They are part of a larger ecosystem that encourages collaboration in combating disinformation.

To ensure the integrity of elections, TikTok has a comprehensive global election integrity program in place. They work with local experts for each election and provide authoritative information about the election on their platform. Additionally, TikTok collaborates with external partners to gather additional intelligence.

TikTok has a strict policy against political advertising, which it has upheld for several years. The platform also restricts the activities of political parties and politicians during elections, including campaign fundraising.

TikTok runs media literacy campaigns to promote critical thinking and verification of information. They sometimes partner with fact-checkers to enhance the effectiveness of these campaigns.

TikTok applies community guidelines globally, which help create a safe and inclusive environment for users.

In response to the war in Ukraine, TikTok has implemented special measures to mitigate the spread of harmful content and support peace and justice.

TikTok offers features to enhance user experience, such as the ability to refresh the content feed for a broader range of content. They have also introduced a second recommender system as required by the Digital Services Act, which presents popular videos based on the user’s location.
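The alternative recommender described above ranks popular videos by the user’s location rather than by a personal profile, as the Digital Services Act requires platforms to offer. A minimal sketch of how such a non-personalized ranking could work; all names, data structures, and the popularity signal here are hypothetical illustrations, not TikTok’s actual implementation:

```python
# Illustrative sketch of a non-personalized, location-based feed of the
# kind the DSA requires as an alternative to profiling-based recommenders.
# The Video class and view_count signal are hypothetical.
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    region: str      # where the video is trending
    view_count: int  # aggregate popularity signal, not a user profile

def popular_feed(videos: list[Video], user_region: str, limit: int = 3) -> list[str]:
    """Rank videos by raw popularity within the user's region,
    using no personal data beyond coarse location."""
    regional = [v for v in videos if v.region == user_region]
    regional.sort(key=lambda v: v.view_count, reverse=True)
    return [v.video_id for v in regional[:limit]]

videos = [
    Video("a", "EU", 500),
    Video("b", "EU", 900),
    Video("c", "US", 1200),
]
print(popular_feed(videos, "EU"))  # ['b', 'a']
```

The key design point is that the ranking input is an aggregate statistic plus coarse location, so no individual browsing history is needed.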

The Digital Services Act (DSA) plays a crucial role in promoting transparency in online platforms, including TikTok. Platforms must provide a detailed explanation of their recommender systems and reasons for any action taken. Users have the right to appeal platform decisions, and transparency reports are published to provide insights into content moderation practices.

In summary, TikTok actively engages in combatting disinformation, ensuring election integrity, promoting media literacy, and enhancing user experience. They adhere to policies and regulations such as the Code of Practice on Disinformation and the Digital Services Act, upholding transparency and fostering trust. Through collaboration and effective measures, TikTok creates a safe and engaging platform.

Albin Birger

The European Union (EU) is taking comprehensive action to combat disinformation. This includes implementing measures in three key areas: legislation, external actions, and communication. The EU institutions, such as the Commission and the European External Action Service, reflect these actions through their institutional architecture. Albin Birger represents DG Connect, the Directorate-General (DG) of the European Commission responsible for legislation regarding disinformation.

The EU is strengthening its regulatory framework with the introduction of the Digital Services Act (DSA), which mandates that online platforms be accountable for content moderation, advertising, and algorithmic processes. The Commission has been granted extensive investigatory and supervisory powers under the DSA.

Furthermore, the Code of Practice on disinformation, a voluntary and industry-based measure, plays a significant role in combating disinformation. Established in 2018 and strengthened in 2022, the Code aims to reduce financial incentives for those spreading disinformation and empower users to better understand and report disinformation content.

The EU is particularly focused on addressing disinformation related to electoral processes. To tackle this issue, a specific working group has been established. This group aims to exchange information and develop actions that can be implemented during elections to effectively counter disinformation-related risks.

The European Digital Media Observatory (EDMO) also plays a crucial role in the EU’s fight against disinformation. This observatory supports the development of a multi-disciplinary community of independent fact-checkers and academic researchers. EDMO operates as a central system, with national or regional hubs covering the EU territory and population. Additionally, EDMO has a specific task force for elections that carries out risk assessments ahead of European elections.

The DSA adds an additional layer of accountability for large online platforms, introducing mechanisms to audit the data and information provided by these platforms. Failure to comply with DSA obligations may result in enforcement measures and fines based on a percentage of the platform’s global turnover.

While signing the code of practice is voluntary for online platforms, it serves as a tool to demonstrate their compliance with DSA obligations. Even if platforms choose not to sign, they can still align their actions with the expectations outlined in the code of practice.

In conclusion, the European Union is taking comprehensive action against disinformation through legislation, external actions, and communication. The implementation of the Digital Services Act and the Code of Practice on disinformation provides a framework for accountability and empowers individuals to combat disinformation. The EU’s focus on tackling disinformation related to electoral processes, along with the support of the European Digital Media Observatory, further strengthens its efforts in this area.

Giacomo Mazzone

This town hall meeting focused on the upcoming European election in 2024 and the measures being taken to secure the elections and minimize interference. Representatives from the European Commission, the European Digital Media Observatory (EDMO), the regulatory body ERGA, TikTok, and civil society were present.

The European Commission, as the main proponent of this initiative, discussed the broader framework of the election and the role of independent regulators. They emphasized the importance of securing the elections and minimizing interference while enabling voters to freely express their views.

EDMO, responsible for tackling disinformation, addressed concerns from other regions about the creation of a “minister of truth.” They clarified that involvement of independent regulators, like ERGA, ensures a multi-stakeholder approach and prevents any monopolization of truth.

A representative from civil society questioned the effectiveness of self-assessment reports from big tech companies in preventing social harm on digital platforms. They discussed additional measures and actions that need to be taken for better results.

TikTok’s representative highlighted the platform’s commitment to preventing harm and maintaining a safe environment during the elections. They emphasized the responsibility of platforms like TikTok to proactively address harmful content and uphold the integrity of the democratic process.

The issue of what happens if large platforms refuse to comply with the code of practice was also discussed. The European Commission representative addressed this concern and assured that remedial actions would be taken to prevent significant harm.

Research in the field was another topic raised in the meeting. The EDMO representative acknowledged the importance of research in understanding and addressing election security and disinformation.

The meeting briefly discussed concerns about European citizenship modules and their impact on the election process. The need to address these concerns and provide clarity was mentioned, though no specific solutions were discussed.

Overall, the meeting aimed to provide valuable insights into securing elections, minimizing interference, and combating disinformation during the European election in 2024. The multi-stakeholder approach, involving the European Commission, regulators, platforms like TikTok, and civil society, demonstrated a collective commitment to ensuring the integrity of the electoral process.
