
moments of truth

From conspiracy theories to anti-vaccine movements, misleading information continues to impact American society. As the country is frayed by partisanship, Swarthmore alumni and faculty take varied approaches to addressing the problem of false claims and their propagation. Brendan Nyhan ’00 is among them: His research focuses on misperceptions about politics and health care.
by Roy Greim ’14
In 2020, after placing first in a legal writing competition, Alexander “Sasha” Rojavin ’14 reached out to Professor of German and Film & Media Studies Sunka Simon to thank her. Rojavin’s article, which discussed federal counter-disinformation legislation reform, had its genesis in an independent study he took with Simon in 2014. As a film & media studies and theater double major, he was fascinated by propaganda films, and together, they designed a course to examine the intersections between the arts, humanities, and information warfare.

Six years after that independent study, the class and Rojavin’s email sparked an idea for Simon: Swarthmore could bring together academics from different disciplines to tackle the multifaceted problem of disinformation.

“In her characteristic, bombastic, take-no-prisoners style, Sunka said, ‘Let’s turn this into a class,’” recalls Rojavin. “I thought that she was joking, but she wasn’t.”

Together, the pair created the Disinformation Studies Project, which featured a symposium in the spring of 2022 and aims to offer a diversified curriculum of courses in the coming years. The symposium invited scholars in the arts and sciences from across the country to “diagnose the mechanisms and effects of information warfare across platforms, media, nations, disciplines, and cultures.”

“I think it’s really important that the liberal arts are wrestling with this in an academic fashion,” says Simon. “And then we need to begin thinking outwards, stretching our feelers into communities of content producers and policymakers so this can be a truly interdisciplinary, cross-sectoral project.”

Rojavin, who is fluent in Russian, has immersed himself in this varied approach to studying disinformation: He earned a communications master’s degree at Temple University, writing a thesis on the broadcasting landscape in Ukraine, and subsequently obtained a law degree. He currently works as an intelligence, media, and policy analyst and also lectures on post-Soviet information warfare at the Middlebury School of Russian during the summer.

Among other goals, Rojavin and Simon hope that the Disinformation Studies Project can pass on a cross-cultural media literacy that will help media consumers find reputable outlets, ones that vet their sources and have a history of upholding democratic principles in their reporting. They argue that limiting initial exposure to disinformation will reduce the need for countermeasures, which can have limited efficacy once a person has already been exposed.

“If we think of information warfare in biological terms, we can compare individual disinformation narratives to viral strains and counter-disinformation measures like improving digital literacy to vaccines, because they are preventative. And at this time, preventative measures are known to be more effective in combating disinformation spread,” Rojavin says.

The duo is currently working on a scaled-up version of the symposium that aims to bridge the implementation and knowledge gaps between government, academia, and the private sector.


a tricky needle to thread

The terms “disinformation” and “misinformation” are often used interchangeably, which can cause confusion. So what exactly is the difference, and why does it matter?

Brendan Nyhan ’00, James O. Freedman Presidential Professor in the Department of Government at Dartmouth College, says that “disinformation” refers to knowingly false claims spread with malicious intent, while “misinformation” makes no assumptions about the beliefs or motives of those who share it.

“It can be very compelling to talk about disinformation, and it would be nice if the problem were as simple as bad actors intentionally spreading false claims,” explains Nyhan. “However, there are many more people who hold these beliefs sincerely and repeat them because they think they are true. That’s why this is such a complex issue.”

Nyhan’s research focuses on misperceptions about politics and health care and has received attention for its surprising findings on misinformation: For example, a 2020 study he co-authored concluded that the effect of “fake news” on the U.S. presidential election had been exaggerated.

“As we examined the data, I was struck by the extent to which it showed that exposure to untrustworthy information was largely concentrated among the 10 percent of Americans with the most conservative news diets,” says Nyhan. “It’s still a problem because some of them are consuming quite a lot. But the story that the data told was very different from the one we were hearing in the news.”

Another study by Nyhan suggested that YouTube’s algorithm isn’t pushing users toward extremist or white supremacist channels unless they have already shown a strong interest in that type of fringe content. And though the audience for such channels was very small relative to the entire YouTube user base, Nyhan worries that such groups have grown “disproportionately visible in public life as their claims are being mainstreamed by influential political leaders.”

When it comes to solutions, Nyhan cautions against entities such as Facebook’s parent company Meta and Google having freer rein in content moderation as a way to curb misinformation on their platforms.

“Consolidating the power to regulate speech into the hands of relatively few private companies is potentially dangerous,” he says. “It’s a tricky needle to thread because efforts to clamp down on misinformation risk harming principles of free and open debate and can easily become quite illiberal themselves.”

Instead, Nyhan argues for greater transparency from tech companies, which has been the focus of concerned citizens like Brandon Silverman ’02.

opening the black box

Silverman knows better than most how stories, misinformed or otherwise, can spread like wildfire online. In 2012, the Honors philosophy graduate joined Matt Garmur ’01 (whose degree is in computer science and music) in co-founding CrowdTangle, an analytics tool that helped publishers understand what was happening on social media. After Facebook acquired the service in 2016, it gained, in Silverman’s words, “a powerful home that supercharged our work to help publishers and news outlets.”

CrowdTangle became increasingly important for groups such as think tanks, human rights activists, and academic researchers, as it grew into one of the most valuable tools for monitoring social media.

But all was not well; as the New York Times reported in January 2022, the efforts of Silverman and his team to expand transparency on the platform “had increasingly become an irritant to his bosses, as it revealed the extent to which Facebook users engaged with hyperpartisan right-wing politics and misleading health information.”

In October 2021, Silverman left Facebook and began working with legislators, both in the U.S. and abroad, on creating a robust legal framework that would require companies to share data and be more open about their algorithms, rather than allowing them to maintain inscrutable “black boxes” of information.

Reflecting on the challenges of expanding transparency from within Facebook itself, he told the U.S. Senate Judiciary Subcommittee on Privacy, Technology, and the Law on May 6, “When your data is misinterpreted to create a misleading narrative that is repeated for months on end in some of the most prominent news outlets in the country, it’s uncomfortable. … Those moments take a toll on your team.”

In the U.S., Silverman has been instrumental in helping draft the Platform Accountability and Transparency Act, a bipartisan bill proposed last December that has drawn praise for its technical savvy. Among other provisions, PATA would require social media companies to provide data to independent researchers who submit proposals to the National Science Foundation, and would allow the Federal Trade Commission to mandate regular disclosures by such companies.

Silverman has also advised lawmakers on language in the Digital Services Act, a piece of legislation approved by the European Parliament in July. Like PATA, the DSA requires greater transparency around recommendation algorithms and mandates data sharing with authorities and researchers.

“Too much of what we think about misinformation is based on anecdotal data, and we don’t have a baseline truth that is informed by peer-reviewed research,” says Silverman, who is co-teaching a Stanford University course on transparency this fall.

“Putting this data in the hands of researchers is important because we have an entire field of study that is trying to build a really important corpus of knowledge for the world, but it doesn’t have the data it needs to do it.”