Missouri says the Biden administration censored speech. The Supreme Court will judge

In March 2021, as COVID-19 vaccines were increasingly available to the general public, officials inside the Biden administration were growing frustrated.

The administration had launched an aggressive campaign urging people to get the shot, hoping it would put the deadly pandemic in the past. Instead it kept bumping into a major source of resistance: viral posts on social media questioning the vaccines.

“We want to know that you’re trying, we want to know how we can help, and we want to know that you’re not playing a shell game with us when we ask you what is going on,” Rob Flaherty, the former White House director of digital strategy, wrote to someone at Facebook. “This would all be a lot easier if you would just be straight with us.”

Andy Slavitt, the White House senior adviser for COVID response, followed up.

“Internally, we have been considering our options on what to do about it,” Slavitt wrote.

The exchange, one of many between the White House and social media companies over content moderation policies, was uncovered in a sprawling case that argues the federal government violated the First Amendment by urging social media companies privately and publicly to take down posts, ban users and use their algorithms to deemphasize what it labeled disinformation.

The Supreme Court will hear oral arguments on the case Monday.

The lawsuit, initially brought by former Missouri Attorney General Eric Schmitt, former Louisiana Attorney General Jeff Landry and five people who say their social media posts were censored, focuses mostly on content questioning the administration’s public health policies during the COVID-19 pandemic and the legitimacy of the 2020 presidential election.

But it could have widespread implications for how the government is able to communicate with social media companies in an era when misinformation and disinformation run rampant across the sites, and as the government weighs whether and how to increase regulation of technology companies.

“I think it’s the most important free speech case in the history of the country,” Schmitt said in an interview with The Star. “So much of our dialogue and our debate not only just takes place in the town square, but more and more now in the virtual town square.”

The case focuses on whether several federal government entities — the White House, the Centers for Disease Control and Prevention, the Surgeon General, the Federal Bureau of Investigation, and the Cybersecurity and Infrastructure Security Agency — coerced or significantly encouraged social media companies to block and remove content from their sites, which would violate the First Amendment.

White House press secretary Karine Jean-Pierre declined Friday to comment on the case and directed The Star to a brief filed by the Department of Justice ahead of the arguments.

In that brief, Solicitor General Elizabeth Prelogar argued that the government was simply using its bully pulpit to convince social media companies to take action on posts that spread false information.

Prelogar also suggested that a ruling by the court could imply that the social media companies are state actors, subjecting them to additional lawsuits claiming they violated people’s First Amendment right to free speech.

Already, two federal courts have ruled against the federal government — the U.S. District Court for the Western District of Louisiana and the 5th Circuit Court of Appeals. Both courts issued rulings limiting the Biden administration’s contact with social media companies. The Supreme Court put those orders on hold until after it makes its decision.

“It will have broad implications for how the government is permitted to weigh in on matters of public concern, and the extent to which they’re able to communicate information to the platforms,” said Jennifer Jones, a staff attorney at the Knight First Amendment Institute at Columbia University.

In its ruling, the 5th Circuit Court of Appeals found that each of the governmental entities had either coerced or significantly encouraged the sites to take down or stifle posts — and in some cases it found that it did both.

In the FBI’s case, the court found the agency had both coerced and significantly encouraged the sites to take down content, reasoning that because the FBI is a law enforcement agency, any request to remove content carried the implicit threat of legal action.

Prelogar argued that the social media companies ignored the FBI’s requests about half the time.

Sen. Mark Warner, a Virginia Democrat who chairs the Senate Intelligence Committee, wrote a brief in the case urging the Supreme Court to allow the government to continue to communicate with social media companies about potential national security threats on their platforms.

“Any injunction here would prevent or limit the government’s ability to communicate with social media companies and would leave the United States vulnerable to attack. Foreign malign influence campaigns have grown in number, scope, and sophistication since 2016,” Warner wrote.

A spokesperson for Meta, the parent company of Facebook and Instagram, declined to comment on the record, and X, formerly known as Twitter, indicated it does not answer press questions.

But Facebook’s Adversarial Threat Report from November 2023 emphasizes the importance of communication with the government and law enforcement to combat election interference — and acknowledges that the federal government has been limited in its ability to share information since July 2023.

“While we’ve continued to strengthen our internal capacity to detect and enforce against malicious activity since 2017, external insights from counterparts in government, as well as researchers and investigative journalists, can be particularly important in detecting and disrupting threat activity early in its planning taking place off-platform,” the report says.

Many of the posts in question came from the edges of partisan political discourse, particularly on the right wing, including posts challenging COVID-19 health measures and the October 2020 article about Hunter Biden’s laptop. One of the plaintiffs was Jim Hoft, the owner of the St. Louis-based conservative site The Gateway Pundit. Two others were authors of the Great Barrington Declaration, which questioned COVID-19 lockdown policies.

The content moderation decisions on the Hunter Biden laptop story and the Great Barrington Declaration both took place before Biden took office.

“It ought to scare the bejesus out of every American that the government would wield this much power to silence,” Schmitt said. “Because you might like, one day, somebody being censored, because it’s the thing you don’t like, but the next day, it could be your own. And I don’t want to see that.”

Republican lawmakers frequently criticize content moderation policies, claiming that the rules stifle conservative speech. Twitter, in particular, had either suspended or banned the accounts of some prominent conservatives — like former President Donald Trump and Rep. Marjorie Taylor Greene — before it was sold to Tesla CEO Elon Musk.

Still, many academics have studied the spread of false information on social media — and Special Counsel Robert Mueller found that Russia spread disinformation on social media in the hopes of influencing the 2016 presidential election.

Schmitt has said the Biden administration uses the term misinformation “as an excuse to censor” speech and has pushed back against content moderation policies. He has said if the companies limit speech, they should be subject to lawsuits like other media companies.

“You combat speech you don’t like or that you think is misinformed or misguided with more speech,” Schmitt said. “Not suppressing or prohibiting somebody else from speaking. And that’s really at the heart of what our First Amendment protects us from and why this case is so important.”

Casey Mattox, the vice president of legal and judicial strategy at Americans for Prosperity, a conservative-leaning think tank, has said he believes Congress will have to increasingly make decisions regarding the regulation of social media companies and how they handle content.

He compared social media to the printing press rather than the town square, and said it has allowed more people to amplify their opinions and find an audience.

“Having an effective voice also means that you’re going to cause concern because of the fact that your speech is actually able to impact people,” Mattox said. “You’re going to have people who don’t like that fact, who want to regulate them.”

Monday’s oral argument will be the second case the Supreme Court has heard about content moderation. Earlier this month it heard a case challenging laws in Florida and Texas that attempted to prevent social media companies from removing certain content from their platforms.

Sen. Josh Hawley, a Missouri Republican who has pushed for tougher penalties for social media companies, fretted that the court could set up a situation where it’s difficult to challenge the power of social media companies.

“It’s a very significant case when you have a federal district court and a federal appellate court saying that the White House violated the First Amendment and did it by colluding, essentially with the biggest, most powerful media companies in the world,” Hawley said. “And I can’t think of a precedent in our history for that. So that’s a big deal.”