The US Supreme Court seems torn over whether to trigger a radical transformation of the internet. The nation’s highest court heard arguments Monday over state laws in Florida and Texas that restrict how platforms like Facebook and YouTube moderate speech. If the court lets them take effect, social media feeds could look very different, with platforms forced to carry unsavory or hateful content that today is blocked or removed.

The high stakes gave long-standing questions about free speech and online regulation new urgency in Monday’s arguments. Are social platforms akin to newspapers, which have First Amendment protections that give them editorial control over content—or are they common carriers, like phone providers or telegraph companies, that are required to transmit protected speech without interference?

A ruling is expected by late June, when the court typically hands down its biggest decisions of the term, and could have sweeping effects on how social sites like Facebook, YouTube, X, and TikTok do business beyond Florida and Texas. “These cases could shape free speech online for a generation,” says Alex Abdo, litigation director of the Knight First Amendment Institute at Columbia University, which filed a brief in the case but did not take sides.

Florida and Texas passed the laws under debate in 2021, not long after social media platforms booted former President Donald Trump following the January 6 insurrection. Conservatives had long argued that major platforms unfairly censored their viewpoints, and laws barring companies from strict moderation were pitched as a way to restore fairness online.

The laws were quickly put on hold after two tech-industry trade associations representing social platforms, NetChoice and the Computer & Communications Industry Association, challenged them. If the Supreme Court now allows the laws to stand, the state governments of Florida and Texas would gain new power over social platforms and the content posted on them, a major shift from the status quo, in which platforms set their own terms of service and hire moderators to enforce them.

Polar Opposites

Monday’s arguments, spanning nearly four hours, underscored the legal confusion that still surrounds regulating the internet. Justices raised questions about how social media companies should be categorized and treated under the law, while the states and the plaintiffs offered opposing views of social media’s role in mass communication.

The laws themselves leave open exactly how their mandates would be enforced. The questions posed by the justices showed the court’s frustration at being “caught between two polar opposite positions, both of which have significant costs and benefits for freedom of speech,” says Cliff Davidson, a Portland-based attorney at Snell & Wilmer.

David Greene, senior staff attorney and civil liberties director at the digital rights group Electronic Frontier Foundation, which filed a brief urging the court to strike down the laws, says there are clear public benefits to allowing social platforms to moderate content without government interference. “When platforms have First Amendment rights to curate the user-generated content they publish, they can create distinct forums that accommodate diverse viewpoints, interests, and beliefs,” he says.
