Republican states push to limit Big Tech moderation powers

The GOP attorneys general argue that state laws limiting Big Tech’s ability to moderate content on their platforms do not violate the US Constitution’s First Amendment.


A group of twenty Republican state attorneys general, led by Missouri Attorney General Andrew Bailey, has filed an amicus brief in the Supreme Court, asking the Court to uphold laws that limit internet platforms’ ability to moderate content.

The AGs have expressed support for the Florida and Texas laws that require large social media companies to host content from third parties and preclude them from blocking or removing users’ posts based on political viewpoints. The brief argues that the First Amendment provides no grounds for excluding social media companies from state regulation. The AGs contend that ‘must carry’ requirements, applied to telegraph companies as early as 1888, should also apply to social media companies, as it makes no difference whether the company carrying other people’s speech is digital or analog. The Supreme Court is expected to hear oral arguments in the case in February of this year.

Why does it matter?

The case outcome would be consequential for the future of social media content moderation. If the Supreme Court accepts the AGs’ arguments in favour of the state laws, it could result in more government regulation of Big Tech companies, following the European model. However, many argue that vagueness in how such laws define categories of speech could lead governments to suppress permitted speech alongside prohibited speech, causing more censorship on social media platforms.

On the other hand, if the Supreme Court maintains the status quo, it would protect Big Tech’s ability to moderate content on its platforms as it sees fit. However, this could lead to a less safe and less inclusive online environment, as demonstrated by the role of Twitter (now X) in the Capitol Hill riots on January 6, 2021. Social media algorithms operate to increase user engagement, often amplifying harmful content posted by third parties, which calls into question how effectively the companies would be able to self-regulate without external government regulation.