The disinformation war has taken a toll, but researchers feel a shift


After weathering a yearslong political and legal assault, researchers who study disinformation say they see reasons to be cautiously hopeful as their efforts heat up ahead of the 2024 election.

These researchers, along with the universities and nonprofits that they work for, have been in the crosshairs of Republicans and their allies, accused of acting as government proxies in a Biden administration plot to censor conservative speech online. Since 2021, those who worked to identify and combat disinformation around the last presidential election and Covid-19 have faced lawsuits, congressional inquiries and attacks online and in right-wing media that have threatened their reputations, careers and personal safety.

But recently, those researchers have achieved quiet but significant victories that could signal a shift in the larger war against disinformation.

On Monday, during oral arguments at the Supreme Court, most of the justices voiced some support for governments and researchers working with social media platforms on content moderation, especially related to national security, emergencies and health. After a year of sensational public hearings, a Republican-led congressional committee tasked with discrediting researchers and proving their collusion with the government and tech companies has produced little. And programs under which federal law enforcement shared information with platforms, which had been paused in response to the Republican efforts, have recently resumed.

Darren Linvill, co-director of Clemson University’s Media Forensics Hub, who testified and fielded “onerous” records requests last summer as part of Republican House Judiciary Chairman Jim Jordan’s investigation into the “weaponization” of the federal government, said he was somewhat optimistic about a cooling of the ire directed at those who study online disinformation “into something that’s more reasonable.”

“I’m not overly hopeful, given the nature of our politics today,” Linvill said. “You know, hope is a trap. But I will dare to hope.”

With less than eight months until a presidential election likely to be marred by unprecedented disinformation, the loss of coordination between government, researchers and platforms over the past several years will undoubtedly be felt. The programs most targeted by recent attacks include nonpartisan partnerships between universities and research groups.

Most notably, the Election Integrity Partnership, led by the University of Washington’s Center for an Informed Public and the Stanford Internet Observatory, identified and tracked false and misleading information online in real time. In addition to publishing findings and analysis, the groups also forwarded potentially harmful disinformation to platforms for review. That partnership is no longer active.

But other work has continued, and researchers are closely watching the case that was argued before the Supreme Court this week as a potential sign that right-wing efforts have hit a limit.

That case accuses the Biden administration of violating the First Amendment by coercing social media platforms to remove or limit the reach of posts from conservatives. It spurred a Donald Trump-appointed Louisiana judge to enjoin the government from meeting with social media companies over content, which halted information sharing.

Even some of the court’s most conservative members expressed skepticism about the censorship claim during Monday’s oral arguments, and liberal Justice Sonia Sotomayor suggested the attorney arguing for the respondents — the states of Louisiana and Missouri, and individuals including an anti-vaccine activist, an epidemiologist and the owner of a conspiracy theory news website — had misrepresented facts.

“You omit information that changes the context of some of your claims,” Sotomayor scolded Louisiana Solicitor General Benjamin Aguiñaga. “You attribute things to people who it didn’t happen to.”

The New Civil Liberties Alliance, a group representing the respondents, replied to a request for comment with a statement saying in part that should the federal government prevail, the court “will have abandoned the First Amendment.”

The researchers who were listening, some via the livestream and at least one in person at the Supreme Court, heard in the justices’ questioning a recognition of the kind of factual inaccuracies that have frustrated them.

Kate Starbird, director of the Center for an Informed Public, whose work on disinformation has been targeted by conservative activists and Republican congressional committees, noted that the justices seemed to understand “the value of information sharing between researchers, government and platforms.” Regardless of what the justices decide, she said that her work, which most recently examined narratives about immigration and “rigged elections,” will continue.

The researchers’ opponents, too, are feeling a shift. “We’ve been had,” wrote Matt Taibbi, one of the journalists whose work disseminating internal Twitter documents provided by Elon Musk had helped build, though not prove, the theory at the heart of the Supreme Court case. “The government is on the precipice of gaining explicit permission to fully re-charge its censorship machine.”

That is what Taibbi, Musk, Jordan and many others — including elected officials and private conservative groups — say they have been fighting against. And until recently, they had been overwhelmingly successful.

In the past two years, government efforts to respond to disinformation have been shuttered. Last spring, Jordan’s congressional subcommittee began an investigation into researchers and their alleged collusion with what it calls the federal government’s “censorship regime” — a quest that has hauled a score of researchers and tech workers to often confrontational closed-door depositions in Washington. Reputational and legal threats aimed at these researchers coupled with a flood of harassment have chilled their work.

Sen. Mark R. Warner, D-Va., chairman of the Senate Intelligence Committee, characterized the multipronged assault as a “concerted effort by partisan actors to intimidate and silence” researchers, an attack that he said “poses a genuine threat” to efforts that would counter foreign attempts to sow discord and disinformation in the U.S. before the next election.

“It takes a toll,” said Rebekah Tromble, director of the Institute for Data, Democracy & Politics at George Washington University. “It eats up time that can and should be spent doing the incredibly important work. Beyond that, it drains energy and takes a real emotional and mental health toll on folks.”

Nina Jankowicz, a researcher who had served as executive director of a new Department of Homeland Security advisory board on disinformation before resigning in 2022 in response to online threats and harassment, is just beginning to get back into the field. She’s working on a soon-to-be launched project addressing attacks on disinformation researchers and free expression.

“I don’t want to count chickens before they are hatched, because I still think there’s a lot of damage to be done,” she said, nodding to the still active legal cases and the prospect of a 2024 Trump win. “It took me a good year and a half to be able to even start to think proactively about doing new work and research again. And I know that it could all come crashing down. It’s a scary time.”

Still, cautiously, Jankowicz and others acknowledged that the larger war is not yet lost.

After over a year, the House committee investigating researchers and their work on disinformation — which interviewed Jankowicz, Starbird and others in closed-door sessions last April — has yet to produce tangible results. Public hearings have not yielded actionable evidence that the federal government has been weaponized against conservatives. There have been no legal wins and no legislation has been passed.

A Judiciary spokesperson defended the subcommittee’s work in a statement, saying it was providing “constitutional oversight.” The spokesperson said the inquiry had played a critical role in uncovering evidence of the Biden administration’s efforts to “censor Americans’ constitutionally protected speech.”

The statement added that the Election Integrity Partnership played “a unique role in the censorship industrial complex,” and vowed that the committee would “continue its critical investigative work.”

The credibility of the accusations at the heart of the committee's hearings and the lawsuits has been strained. Architects of the "censorship industrial complex" theory, including Taibbi and culture-war journalist Michael Shellenberger, were influenced and informed by Mike Benz, a former alt-right vlogger who turned a two-month stint in the State Department in 2020 into a claimed expertise in cybersecurity. Most recently, Benz was the originator of the conspiracy theory that Taylor Swift is part of a Pentagon psychological operation to influence the election.

Taibbi, Shellenberger and Benz did not respond to emails requesting comment.

Renée DiResta, research manager at Stanford University’s Internet Observatory, has been thrust into the center of the censorship industrial complex conspiracy theory, branded its leader. DiResta, who is named as a defendant in a civil case brought by former Trump adviser Stephen Miller’s ultraconservative legal group, was also buoyed by the Supreme Court hearing on Monday.

“I welcome a return to discussing the nuance of content moderation policy and tackling the critical global issue of election integrity,” she said. “Once again rooted in the realm of facts.”

CORRECTION (March 23, 2024, 10:11 a.m. ET): A previous version of this article misstated the circumstances of Kate Starbird’s contact with the House committee investigating disinformation research. Starbird was interviewed voluntarily at the committee’s request; she was not subpoenaed and deposed.
