Meta and Center for Open Science collaborate ahead of Congress hearing on child safety

Meta partners with the Center for Open Science ahead of a congressional hearing on children’s online safety, launching a pilot program that embraces open science research practices.


Ahead of a congressional hearing on children’s online safety scheduled for 31 January, Meta has announced a partnership with the Center for Open Science, a nonprofit dedicated to transparency in academic research. The collaboration centres on a pilot program under which Meta will share ‘privacy-preserving social media data’ with selected academic researchers studying well-being. The pilot will also incorporate research practices popularised by the open science movement, such as preregistration and early peer review. The move represents a promising advance beyond traditional industry-academia partnership models, aiming to widen access to data that academic researchers have long sought.

Curtiss Cobb, Meta’s VP of Research, expressed the company’s commitment to advancing the scientific understanding of factors influencing well-being while respecting user privacy. For years, academics have pressed platforms to provide more data for research purposes, and these efforts have gained urgency as Congress has grown more concerned about social media’s effects on mental health. With the November release of the Meta Content Library, a transparency tool, Meta expanded the data available to academics and enabled large-scale analysis of existing material such as public posts, comments, and reactions.

At the hearing itself, Meta CEO Mark Zuckerberg will testify before Congress on children’s online safety alongside other industry leaders, including the CEOs of X, TikTok, and Discord. Alongside the research collaboration, Meta recently unveiled new messaging restrictions on Facebook and Instagram, preventing users under 16 from receiving messages from unfamiliar adults and giving guardians control over teens’ privacy settings. Meta has also implemented measures to limit teens’ exposure to content related to self-harm, suicide, and eating disorders. Nevertheless, the company faces heightened scrutiny: unredacted documents in an ongoing lawsuit reveal its repeated reluctance to protect children on its platforms.

As congressional attention to social media’s impact on mental health grows, Meta’s collaboration with the Center for Open Science and its pre-hearing initiatives aim to address concerns about children’s online safety and to contribute to the ongoing debate over responsible platform practices.