A person walks down the sidewalk near the U.S. Supreme Court building in Washington, D.C., February 16, 2022. (Jon Cherry | Reuters)
The Supreme Court blocked a controversial Texas social media law from taking effect in a decision released on Tuesday, after the tech industry and other opponents warned it could allow for hateful content to run rampant online.
The law, HB20, prohibits online platforms from moderating or removing content based on viewpoint. It stems from a common charge on the right that major California-based social media platforms like Facebook and Twitter are biased in their moderation strategies and disproportionately silence conservative voices. The platforms have said they apply their community guidelines evenly and have noted that right-leaning users often rank among the highest in engagement.
“HB20 would compel platforms to disseminate all sorts of objectionable viewpoints,” two industry groups that represent companies including Amazon, Facebook, Google and Twitter argued in their emergency application to the court, “such as Russia’s propaganda claiming that its invasion of Ukraine is justified, ISIS propaganda claiming that extremism is warranted, neo-Nazi or KKK screeds denying or supporting the Holocaust, and encouraging children to engage in risky or unhealthy behavior like eating disorders.”
Texas Attorney General Ken Paxton, a Republican, has said this is not the case, writing in a response to the emergency application that the law does not “prohibit the platforms from removing entire categories of content.”
“So, for example,” the response says, “the platforms can decide to eliminate pornography without violating HB 20 … The platforms can also ban foreign government speech without violating HB 20, so they are not required to host Russia’s propaganda about Ukraine.”
In the 5-4 decision, Justice Samuel Alito dissented from the vote to vacate the stay, issuing a written explanation that was joined by two other conservative justices, Clarence Thomas and Neil Gorsuch. Justice Elena Kagan, a liberal, also voted against vacating the stay.
Alito’s dissent opened by acknowledging the significance of the case for social media companies and for states that would regulate how those companies can control the content on their platforms.
“This application concerns issues of great importance that will plainly merit this Court’s review,” Alito wrote. “Social media platforms have transformed the way people communicate with each other and obtain news. At issue is a ground-breaking Texas law that addresses the power of dominant social media corporations to shape public discussion of the important issues of the day.”
Alito said he would have allowed the law to remain in effect as the case proceeds through federal courts.
The legislation was passed in September but blocked by a lower court, which granted a preliminary injunction keeping it from taking effect. That changed when the U.S. Court of Appeals for the Fifth Circuit ruled in mid-May to stay the injunction pending a final decision on the case, meaning the law could be enforced while the court deliberated on the broader dispute.
That prompted two tech industry groups, NetChoice and the Computer and Communications Industry Association (CCIA), to file an emergency application with Alito, who handles emergency matters arising from that circuit.
NetChoice and CCIA asked the court to keep the law from taking effect, arguing that social media companies make editorial decisions about what content to distribute and display, and that the appeals court’s ruling would strip them of that discretion and chill speech. The groups said the court should vacate the stay while the appeals court reviews the important First Amendment issues central to the case.
The Supreme Court’s decision has implications for other states that may consider legislation similar to that in Texas. Florida’s legislature has already passed a similar social media law, but it has so far been blocked by the courts.
Soon after the tech groups’ emergency application in the Texas case, the U.S. Court of Appeals for the Eleventh Circuit upheld an injunction against a similar law in Florida, unanimously concluding that content moderation is protected by the Constitution. Florida’s attorney general filed an amicus brief on behalf of her state and several others in the Texas case, urging the Supreme Court to let the Texas law remain in effect and arguing that the industry had misinterpreted the law and that states are within their rights to regulate businesses in this way.
Testing ground for Congress
The state laws serve as an early testing ground for the ways the U.S. Congress is considering reforming the legal liability shield tech platforms have relied on for years to moderate their services. That law, Section 230 of the Communications Decency Act, keeps online platforms from being held responsible for content users post to their services and also gives them the ability to moderate or remove posts in good faith.
The law has come under fire from both Democrats and Republicans, but for different reasons. Democrats seek to reform the law to give tech platforms more responsibility to moderate what they see as dangerous content, including misinformation. While Republicans agree certain types of content, such as terrorist recruitment or child sexual exploitation material, should be removed, many seek to make it harder for platforms to engage in other forms of moderation that they view as ideologically based censorship.
One of the authors of Section 230, former Rep. Christopher Cox, R-Calif., filed an amicus brief supporting the industry groups’ plea for the Supreme Court to reverse the stay. In the brief, Cox argues that HB20 “is in irreconcilable conflict” with Section 230, which should preempt the state law.
Still, at least one justice on the Supreme Court has already expressed interest in reviewing Section 230 itself.
In 2020, conservative Justice Clarence Thomas wrote that “in an appropriate case, we should consider whether the text of this increasingly important statute aligns with the current state of immunity enjoyed by Internet platforms.”
Last year, he suggested in a concurrence that online platforms may be “sufficiently akin to common carriers or places of accommodation to be regulated in this manner.”
This story is developing. Check back for updates.
CNBC’s Dan Mangan contributed to this report.