Unlocking Trust and Safety to Preserve the Open Internet | IGF 2023 Open Forum #129

11 Oct 2023 05:45h - 07:15h UTC

Event report

Speakers and Moderators

Speakers:
  • Alison Gillwald, Research ICT Africa, Civil Society, Africa
  • Rishika Chandra, Government of Fiji, Online Safety Commission, Government, Asia-Pacific
  • Nobuhisa Nishigata, Government of Japan, Ministry of Internal Affairs and Communications, Government, Asia-Pacific
  • Angela McKay, Google (DTSP member company), Private Sector, WEOG
  • Kyoungmi Oh, Civil Society, Asia-Pacific
  • Brent Carey, NetSafe, New Zealand, Civil Society, WEOG
Moderators:
  • David Sullivan, Digital Trust & Safety Partnership, Private Sector, WEOG

Disclaimer: This is not an official record of the IGF session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed. The official record of the session can be found on the IGF's official website.

Session report

Brent Carey

New Zealand has shown its commitment to online safety by enacting the Harmful Digital Communications Act in 2015. This legislation takes a principles-based approach to addressing various forms of online harm, including incitement to suicide, breach of confidentiality, and harassment. The act covers both criminal and civil matters, with NetSafe, a government-approved NGO, managing the civil side.

NetSafe plays a vital role in helping New Zealanders resolve online disputes through mediation. Each year, over 25,000 individuals seek assistance from NetSafe, with more than 7,000 engaging in the mediation process. This demonstrates the effectiveness of NetSafe in providing a platform for conflict resolution in the digital realm.

NetSafe has also led the development of the ‘Aotearoa Online Safety Code’, launched in July 2022. This code, supported by major platforms like TikTok, Meta, Amazon, Twitch, and Twitter, addresses issues such as hate speech, disinformation, and misinformation. By adopting risk-based approaches, the code aims to tackle these challenges and create a safer digital environment.

New Zealand is actively seeking innovative approaches to address emerging online harms and to learn from global best practice. The government has produced a discussion document, ‘Safer Online Services and Media Platforms’, to explore content regulation. Additionally, NetSafe participates as an observer in global regulators’ forums, engaging in relevant discussions.

To ensure a comprehensive and collaborative approach to internet safety, different stakeholders need to come together for discussion. Such collaboration creates space for diverse parts of the digital ecosystem to engage in meaningful conversation.

Industry-led interventions are considered crucial in promoting online safety. By providing a platform for different voices, these interventions contribute to a balanced and effective response to online threats.

However, the regulation of platforms has raised concerns about the withdrawal of news services and the consequences for media plurality. Some platforms have signalled plans to stop providing news in response to regulation such as New Zealand’s Fair Digital News Bargaining Bill. This highlights the challenge of balancing regulation with maintaining a diverse media landscape.

The importance of media plurality and media literacy is widely acknowledged. Media plurality is seen as crucial for a vibrant civil society, while media literacy empowers individuals to critically navigate the digital world.

The existing media landscape is undergoing significant transformation under the influence of both old and new media. Brent Carey suggests that understanding these changing dynamics, and responding to them effectively, is necessary in this evolving landscape.

Preserving online privacy is of utmost importance, and New Zealand has implemented stringent measures to tackle privacy violations. The Harmful Digital Communications Act imposes penalties of up to a NZ$50,000 fine or two years’ imprisonment for posting intimate images without consent. The New Zealand police actively prosecute such offenses, underscoring the seriousness of this issue.

Brent Carey supports severe repercussions for online privacy violations and highlights the effectiveness of the Harmful Digital Communications Act in addressing such breaches.

Encouraging the online industry to uphold the highest standards of safety and corporate citizenship is essential. Brent Carey believes in striving for the highest standards, rather than settling for lower ones exhibited by certain platforms. Companies like Twitter have taken steps in this direction, as evidenced by Brent Carey’s involvement with Twitter’s Trust and Safety Council and their commitment to online safety through localized data.

Notably, there was one argument that Brent Carey explicitly declined to discuss. This suggests that certain limitations or sensitivities exist around specific topics within the broader discourse of online safety and governance.

In conclusion, New Zealand’s enactment of the Harmful Digital Communications Act reflects its commitment to online safety. NetSafe’s mediation services and the ‘Aotearoa Online Safety Code’ further enhance efforts to address online disputes and tackle issues such as hate speech and misinformation. New Zealand actively explores innovative strategies and seeks global best practices to combat emerging online harms. Collaboration among stakeholders is crucial for effective internet safety, and industry-led interventions play a vital role. However, challenges remain regarding platform regulation and media plurality. Preserving online privacy and promoting the highest standards of safety and corporate citizenship are key priorities.

Rishika Chandra

Fiji is at the forefront of recognising the significance of online safety and has taken concrete steps to ensure a secure digital environment for its citizens. In 2018, Fiji enacted the Online Safety Act, which laid the foundation for the establishment of the Online Safety Commission in 2019. The commission has made considerable progress in organising awareness and education programmes to educate people about potential risks and equip them with the necessary tools to protect themselves online.

Furthermore, Fiji has been actively involved in fostering international cooperation and knowledge sharing in tackling online abuse through its participation in the Global Online Safety Regulators Network. Formed in 2022, the network includes regulators from Fiji, the UK, Australia, Ireland, South Africa, and South Korea. This collaboration has been instrumental in promoting the exchange of ideas and experiences in combating online abuse on a global scale.

The partnership between the Online Safety Commission and Australia’s eSafety Commissioner, along with social media platforms such as Meta and TikTok, plays a crucial role in promoting online safety. Under this arrangement, the organisations support online safety in Fiji and Australia by sharing best practices, raising awareness of online safety trends and emerging issues, and developing national online safety strategies. One of the primary ways they collaborate with these tech companies is through content reporting systems, which enable users to report harmful and inappropriate content for swift action.

Governments around the world face the challenge of balancing regulations on online content and data privacy without infringing upon individuals’ rights to free speech or impeding innovation. While it is important to protect users from harmful content or cyber threats, it is equally essential to ensure that regulations do not stifle freedom of expression or impede the progress of technological advancements.

Fiji has taken a strong stance against online harassment, cyberbullying, image-based abuse, and child exploitation, criminalising these offences. The penalties are significant, including imprisonment and fines. However, it is worth noting that defamation is not covered under Fiji’s Online Safety Act.

To effectively regulate social media platforms, Fijians need a better understanding of their design, policies, and community guidelines. It is crucial for individuals to be aware of how these platforms work to navigate them safely. While social media platforms can be dangerous, they also serve as a means of connectivity and communication.

Building strong relationships and collaborations with social media platforms is vital in achieving a balance between regulation and individual rights. By working in a collaborative manner with these platforms, it becomes possible to address online safety concerns effectively.

In conclusion, Fiji’s commitment to online safety is commendable, with the enactment of the Online Safety Act and the establishment of the Online Safety Commission. The country’s active participation in international networks and partnerships, along with efforts to educate its citizens and collaborate with social media platforms, further solidifies Fiji’s position as a leader in this field. However, it is essential for governments to find a balance between regulation and individual rights, ensuring the protection of users while fostering innovation and free speech.

Audience

During the discussion, several key points were raised by different speakers. One audience member expressed concern about the involvement and engagement of civil society within the Internet Governance Forum (IGF), questioning the extent to which civil society is included and heard in participatory discussions such as the IGF. This raised questions about the space afforded to civil society and its ability to influence decisions.

Another speaker highlighted the importance of partnerships and their role in addressing the demands and concerns of civil society. They emphasized the need for the partnership to consider and respond to the voices and needs of civil society, particularly in the areas of peace, justice, strong institutions, and partnerships for the goals.

Doubts were also raised about the effectiveness of voluntary industry associations, specifically in sectors such as automotive, advertising, and digital identity. The audience member noted that voluntary industry associations in these sectors have failed to bring about significant change or address the concerns of stakeholders. This raised skepticism about the potential success of a new voluntary industry association.

The need to strike a balance between government and private sector involvement in regulating the internet was a key point of discussion. One speaker questioned the current system of industry-led regulation of the internet and advocated a more balanced approach that includes government involvement. They cited the example of Canada’s Online News Act (Bill C-18), which required tech companies to pay news outlets for posting or linking to news content. This led to Meta removing news from its platforms in Canada, raising questions about the control that companies have over the digital space.

On the other hand, a speaker argued that less regulation can lead to better outcomes. They referenced the positive effects of the relatively unrestricted early internet and suggested that excessive government regulation can hinder innovation and progress. This viewpoint advocated for self-regulation as a solution, suggesting that businesses should take responsibility for their actions and address any potential harm caused.

Notably, there were contrasting viewpoints on self-regulation between different cultural contexts. A South Korean panel member advocated for self-regulation, while Europe has shifted towards government regulation. This highlighted the different perspectives on how best to regulate the internet and the need for cross-cultural understanding and collaboration.

The enforcement of online moderation rules and regulations was a point of concern, with many users expressing dissatisfaction. The speaker called for transparency in the enforcement process while noting that such transparency may reveal business strategies. Striking a balance between transparency and the protection of business strategies, while maintaining customer trust, was deemed essential.

In terms of partnership expansion, there was a call to bring more gaming companies into the fold and to establish rules and expectations specific to the gaming industry. This recognizes the unique challenges and dynamics within the gaming sector and the need for tailored approaches.

The challenges of information sharing within companies and of content moderation were also discussed. Companies have kept a relatively low profile about how information is shared across their internal functions, but there is a shift towards more sharing, with the trade-offs considered. Additionally, the stress and challenges faced by content moderators were highlighted through the game “Moderator Mayhem”, underscoring the need for a deeper understanding of the positions of, and support given to, those responsible for content moderation.

The credibility of voluntary industry action in trust and safety was called into question, particularly considering the activities of certain companies in this space. There were concerns that their actions might undermine the overall credibility and effectiveness of voluntary action in ensuring trust and safety.

Finally, a speaker suggested that a non-prescriptive duty of care for user safety would be a better legislative approach. This would involve holding companies accountable for ensuring the safety of their users without prescribing specific actions or methods.

In conclusion, the discussion covered a wide range of topics related to civil society involvement, the effectiveness of voluntary industry associations, government and private sector involvement in regulating the internet, contrasting viewpoints on self-regulation, the enforcement of online moderation rules, challenges in the gaming industry, information sharing within companies, the credibility of voluntary industry action, and legislative approaches to user safety. Noteworthy observations include the importance of considering civil society demands and concerns, the need for balance and collaboration in regulation, and the challenges faced in content moderation and information sharing.

David Sullivan

The Digital Trust and Safety Partnership (DTSP), launched in February 2021 and led by David Sullivan, works to establish best practices for trust and safety online using a risk-based approach. It develops specific standards and practices for companies’ services, emphasizing that assessments and practices should be tailored to company size and risk. One of its key goals is to prevent internet fragmentation and support a free and open internet by developing international standards on trust and safety; DTSP believes that adopting a risk-based approach and conducting third-party assessments can help achieve these goals.

The partnership values the input of stakeholders, including industry perspectives, and aims to engage in broad consultations. It regards independent third-party reviews as significant for providing objective assessments of company practices, and it recognizes that the concept of self-regulation within companies is changing as regulatory regimes are established globally.

David Sullivan advocates greater transparency in online moderation processes and regulations, while acknowledging the trade-offs involved. DTSP refrains from commenting on specific companies’ activities in order to maintain industry credibility. The partnership acknowledges the challenges previously faced by voluntary industry associations and emphasizes the need for proper implementation and alignment with emerging regulations. It also recognizes the spread of digital authoritarianism and the need for collective action beyond individual company initiatives.

Overall, DTSP aims to establish best practices for trust and safety online by tailoring assessments, considering diverse perspectives, advocating international standards, and promoting transparency in online moderation, and it is committed to driving positive change in the trust and safety of the online environment.

Nobuhisa Nishigata

In Japan, except for broadcasting, there is no direct regulation of online content by the government. However, there are certain issues that persist, such as cyberbullying, online slandering, and the distribution of pirated content, particularly manga. Despite these challenges, the Japanese government places great importance on respecting freedom of speech and expression.

Measures have been taken to address these issues, including regulations against spam and finding a balance between public safety and human rights. The government acknowledges the need to protect children from online harm and encourages voluntary efforts for software installation and filter optimization. Additionally, discussions have arisen about the liability of internet service providers and their prompt actions in response to harmful content.

There is a positive outlook for the future development of the Digital Trust & Safety Partnership (DTSP) and a recognition of the importance of combating pirated content without direct regulation. Japan believes in learning from successful practices of companies and sees co-regulation as an effective approach to tackle online content issues.

Concerns have been raised regarding public safety and the activities of tech companies, and the frustrations of tech companies with government involvement are acknowledged. However, Japan remains committed to maintaining an open and free internet. The commitment of Japanese Prime Minister Kishida and Japan’s support for the Declaration for the Future of the Internet exemplify this dedication. Additionally, the importance of effective internet governance was emphasized at the G7 digital and tech ministers’ meeting in Takasaki.

Media literacy, and caution about relying too heavily on online media and social networking sites (SNS) for information, are highlighted. Concerns are expressed about companies that lack journalistic backgrounds and about how the information available varies from country to country.

The handling of content-related matters, such as harassment and defamation, as criminal offenses varies depending on the case. Jurisdiction plays a role in determining the approach taken, and for more serious offenses, law enforcement may directly charge individuals. In other cases, private lawsuits can result in sanctions or mitigation.

Nishigata expresses optimism about the further development of digitalization work in the United States. He supports private-led investment in digital infrastructure and believes the government should act primarily as a coordinator; Japan has already established a basic law concerning digitalization and the digital society, which emphasizes private-led investment in digital infrastructure.

Lastly, there is an expressed interest in a Japanese company joining global partnerships. The importance of partnerships and global cooperation, particularly in relation to the United Nations’ Sustainable Development Goal 17: Partnerships for the Goals, is emphasized.

In summary, while the Japanese government does not directly regulate online content, challenges and concerns persist regarding cyberbullying, online slandering, and pirated content. Respect for freedom of speech and expression is highly valued by the government. Measures such as regulations against spam, finding a balance between public safety and human rights, and involving tech companies in ensuring public safety are being discussed. The future development of the DTSP and the interest in joining global partnerships reflect Japan’s commitment to addressing these issues while maintaining an open and free internet.

Angela McKay

Angela McKay, a technology risk expert, strongly supports the concept of a free, open, and interoperable internet. She acknowledges the desire of global companies to operate in a global market and expresses encouragement toward conversations surrounding this vision. McKay recognizes that collaborative solutions are necessary to address the changing technology and harms landscape. Drawing from her experience in technology risk, she identifies similarities between the discussions around online harm and her field. She notes that governments, civil society, and companies have realized the importance of collaborating to tackle these issues effectively.

In terms of regulation and transparency, McKay believes that these approaches should reflect the cultural values and norms of a region. She acknowledges that regardless of the approach taken, governments represent the cultural values of their respective regions. This implies that regulatory and transparency approaches must be sensitive to cultural variations.

McKay advocates for a risk-based approach to address online harms. She highlights the need for companies to adopt risk-based approaches and emphasizes the importance of considering trade-offs to ensure a safe online environment. This approach allows for a more nuanced and flexible response to the complexities of online harms.

Cross-sector dialogue is another crucial aspect highlighted by McKay. She emphasizes the importance of conversations between different entities, citing examples such as the DTSP (Digital Trust and Safety Partnership) among companies and the Global Online Safety Regulators Network among regulators. Through dialogue and collaboration, learning can occur, leading to improved practices.

The exchange of best practices among companies of varying sizes is seen as instrumental in spreading good practice globally. McKay notes that the DTSP has partnered with the Global Network Initiative to involve civil society in work related to the Digital Services Act. This collaboration prevents knowledge and expertise from being confined to large companies, ensuring that medium-sized and smaller companies also have an opportunity to benefit from best practices.

McKay recognizes that the field of operational maturity is continuously evolving. Companies are constantly seeking out novel methods and practices that have not been previously implemented, highlighting their commitment to continuous learning and improvement.

The importance of exchanging ideas among different communities of civil society is stressed by McKay. It is not sufficient for companies alone to engage in dialogue; the participation of civil society is crucial to ensure a more inclusive and comprehensive approach to addressing online harms. McKay mentions that Google has been actively involving civil society members and academics in discussions on topics like child safety. They are also exploring the use of requests for information and online forums to catalyze conversations and gather diverse perspectives.

Advocating for active engagement with civil society, McKay suggests that companies should proactively encourage dialogue and collaboration among different communities. By bringing in external voices and perspectives, companies can better understand and address societal concerns.

While acknowledging the potential benefits of regulation and transparency, McKay cautions against viewing them as a panacea for all problems. She believes that focusing on which behaviors an intervention aims to drive is more important than fixating on the enforcement method. This perspective challenges the false dilemma of regulation versus transparency, shifting the focus towards the fundamental goal of shaping positive online behaviors.

The progress made in managing cybersecurity risks is acknowledged by McKay. She highlights the evolution from solely focusing on vulnerability management to a more holistic, risk-based approach. This progress highlights the continuous efforts to enhance cybersecurity measures and protect online users.

In conclusion, Angela McKay’s perspectives highlight the importance of a free, open, and interoperable internet and of collaboration to address online harms. She argues for culturally sensitive approaches to regulation and transparency, risk-based management of online harms, and cross-sector dialogue for learning and improvement. She also stresses the exchange of best practices among companies of varying sizes, continuous improvement in operational maturity, engagement with civil society, and a focus on driving desirable behaviors rather than fixating on enforcement methods. Her insights contribute to a more comprehensive understanding of the complexities and potential solutions within the digital landscape.

Kyoungmi Oh

The South Korean government is currently making efforts to exert control over content on various platforms, which has posed challenges and highlighted the need for increased transparency. Civil society organizations in South Korea are requesting platforms to disclose government requests for user information and content takedown, a practice that ceased around 2011.

The inadequacy of the SAFE (Safety, Audit, Feedback, and Enforcement) framework in addressing the unique aspects of the digital industry has been noted. The framework fails to consider the importance of freedom of expression and privacy, and the potential harms that occur when content is taken down or censored. This calls for a more nuanced approach to trust and safety that prioritizes protecting freedom of expression.

Collaboration with digital rights organizations and civil society is crucial for effectively managing trust and safety in the digital industry. The Trust and Safety Council of Twitter serves as an example of successful collaboration, incorporating a wider range of perspectives and insights into content regulation decisions. Limited transparency with recognized human rights organizations under appropriate non-disclosure agreements is also seen as beneficial.

Incorporating industry-specific considerations and placing greater emphasis on enforcement and transparency within the SAFE framework is necessary. The current framework falls short in addressing the unique characteristics of the digital industry, with abstract questions that do not cater to its specifics. Clarity on what content should be taken down is lacking, leading to confusion and potential bias in decision-making.

Self-regulation is preferred over governmental regulation, as endorsed by South Korean civil society organizations. However, transparency in the self-regulation process is crucial due to the diverse interests, goals, and missions of different organizations.

South Korea has enacted legislation to address cybercrimes, particularly harassment and sexual abuse. An act punishing offences committed over communication networks provides a legal framework for combating these crimes.

In conclusion, the South Korean government’s control over platform content and the shortcomings of the SAFE framework have raised concerns regarding transparency, freedom of expression, and privacy. Collaboration with digital rights organizations and civil society, industry-specific considerations, and enforcement are essential for effective trust and safety management. While self-regulation is preferred, transparency in the self-regulation process is crucial. Legislation addressing cybercrimes demonstrates South Korea’s commitment to combating online abuse. Addressing these issues will contribute to a more inclusive and secure digital environment.
