Late last week, a US court effectively declared social media moderation illegal in Texas. The ruling doesn’t change anything for now. But it just set the stage for a Supreme Court decision that could transform the internet. And with that context… it’s remarkably bad.
The case I’m talking about is NetChoice v. Ken Paxton, a fight over a law called HB 20, which we’ve discussed before. HB 20 bans large social networks from deleting (or hiding or demonetizing or downranking) content or users based on “viewpoint.” It’s positioned as an anti-“Big Tech” censorship bill, but it’s poised to cover many sites beyond your average Facebook or YouTube behemoth. It would require a radical reworking of many major services, plausibly including an end to bans on things like hate speech and extremism, despite supporters’ denials.
A district court found HB 20 likely unconstitutional for these reasons and blocked it. Then, earlier this year, the Fifth Circuit Court of Appeals disagreed. It reversed the block without offering an official explanation, following a hearing where judges implied social networks should be treated like phone companies. Now, it’s offered more solid grounds for the Supreme Court to set a nationwide precedent.
It’s a ruling from somebody who seems uninterested in how social networks work
The ruling, delivered on Friday, is mostly a tirade against internet moderation in general terms. Techdirt’s Mike Masnick has broken down the specifics already, but to sum up, Judge Andrew Oldham finds that sufficiently large private websites become public services that don’t have a First Amendment right to control what appears on their sites. In Oldham’s interpretation, HB 20 “does not chill speech; if anything, it chills censorship.” But the ruling barely bothers trying to connect that claim with how social networks actually function. In fact, it’s outright dismissive of the real questions HB 20 poses for anybody on these platforms.
In one of the weirdest pieces of the ruling, Oldham all but mocks the idea that violent extremists might use the law as cover, deriding platforms’ “obsession with terrorists and Nazis” and complaining that they’re bringing up “the most extreme hypothetical applications” of the law. Oldham is also convinced that sites “use algorithms to screen out certain obscene and spam-related content,” but then “virtually everything else is just posted to the platform with zero editorial control or judgment” — something that makes no sense in a world of constantly changing recommendation algorithms.
We’re not feeling the effects of this decision now because NetChoice successfully got the Supreme Court to block the law while it plays out in a lower court.
For tech companies, the biggest fear isn’t that Oldham’s ruling will take effect as written but that its arguments will lay the groundwork for a similar ruling from the Supreme Court. As Free Press senior counsel Nora Benavidez noted on Twitter, Friday’s ruling tees up a sharp split between two of the nation’s highest courts. The Eleventh Circuit Appeals Court struck down a very similar Florida law last year, reaching a conclusion almost diametrically opposed to the Fifth Circuit’s. The Supreme Court already seemed inclined to take up the question, and at this point, it seems downright inevitable.
Once it’s there, the results are an open question — or, at least, far more open than many legal experts once expected. Justice Clarence Thomas has been actively spoiling for a platform regulation fight, and four justices (including Thomas) voted to let the law go into effect, although that doesn’t necessarily translate into upholding it.
There are real questions about tech companies and the First Amendment, and this doesn’t touch them
There are real questions about how social media moderation can chill speech. But this ruling seems totally uninterested in grappling with their complexities or even admitting they exist — and there is a real chance that conservatives on the Supreme Court won’t be interested either.
At the heart of all this, there’s the legitimately difficult question of how the First Amendment should apply to social media companies. As Public Knowledge senior vice president Harold Feld points out, the fact that nearly everything social networks do is related to speech shouldn’t necessarily mean the First Amendment prevents all regulation. Some courts, like the Eleventh Circuit, have carefully examined this question and found potential limits, albeit ones that would raise their own difficulties.
Other states are passing their own legislation, which will probably draw further legal challenges. California Governor Gavin Newsom just signed a series of bills that will test First Amendment protections for social media but are motivated by vastly different concerns than the Texas and Florida laws. And an Ohio court has declared its intention to weigh whether to treat Google as a common carrier. So these problems aren’t going away.
But while the issue deserves real thought, the realities of US politics mean we mostly get rants against the stock villain of “Big Tech,” suggesting US courts are mainly interested in punishing companies they don’t like.
The current wave of tech companies probably won’t reign forever, and the Supreme Court’s decision — as well as other lower court cases to come — will shape whatever comes next. Hopefully, those rulings are less flippant than the Fifth Circuit’s.