
Misinformation Defense Worked in 2020, Up to a Point, Study Finds

Not long after misinformation plagued the 2016 election, journalists and content moderators scrambled to turn Americans away from untrustworthy websites before the 2020 vote.

A new study suggests that, to some extent, their efforts succeeded.

When Americans went to the polls in 2020, a far smaller share had visited websites containing false and misleading narratives than four years earlier, according to researchers at Stanford. Although the number of such sites ballooned, the average number of visits among those people dropped, as did the time spent on each site.

Efforts to educate people about the risk of misinformation after 2016, including content labels and media literacy training, most likely contributed to the decline, the researchers found. Their study was published on Thursday in the journal Nature Human Behaviour.

“I am optimistic that the majority of the population is increasingly resilient to misinformation on the web,” said Jeff Hancock, the founding director of the Stanford Social Media Lab and the lead author of the report. “We’re getting better and better at distinguishing really problematic, bad, harmful information from what’s reliable or entertainment.”


Still, nearly 68 million people in the United States visited websites that were not credible, logging a combined 1.5 billion visits in a month in 2020, the researchers estimated. That included domains that are now defunct, such as theantimedia.com and obamawatcher.com. Some participants in the study visited individual sites hundreds of times.

As the 2024 election approaches, the researchers worry that misinformation is evolving and splintering. Beyond web browsers, many people are exposed to conspiracy theories and extremism simply by scrolling through mobile apps such as TikTok. More dangerous content has shifted onto encrypted messaging apps with difficult-to-trace private channels, such as Telegram or WhatsApp.


The boom in generative artificial intelligence, the technology behind the popular ChatGPT chatbot, has also raised alarms about deceptive images and mass-produced falsehoods.

The Stanford researchers said that even limited or concentrated exposure to misinformation could have serious consequences. Baseless claims of election fraud incited a riot at the Capitol on Jan. 6, 2021. More than two years later, congressional hearings, criminal trials and defamation court cases are still addressing what happened.

The Stanford researchers monitored the online activity of 1,151 adults from Oct. 2 through Nov. 9, 2020, and found that 26.2 percent visited at least one of 1,796 unreliable websites. They noted that the time frame did not include the postelection period when baseless claims of voter fraud were especially pronounced.

That was down from 2016, when an earlier, separate report found that 44.3 percent of adults had visited at least one of 490 problematic domains.

The shrinking audience may have been influenced by attempts, including by social media companies, to mitigate misinformation, according to the researchers. They noted that 5.6 percent of the visits to untrustworthy sites in 2020 originated from Facebook, down from 15.1 percent in 2016. Email also played a smaller role in sending users to such sites in 2020.

Other researchers have highlighted more ways to limit the lure of misinformation, especially around elections. The Bipartisan Policy Center suggested in a report this week that states adopt direct-to-voter texts and emails that offer vetted information.

Social media companies should also do more to discourage performative outrage and so-called groupthink on their platforms, behavior that can fortify extreme subcultures and intensify polarization, said Yini Zhang, an assistant professor of communication at the University at Buffalo.

Professor Zhang, who published a study this month about QAnon, said tech companies should instead encourage more moderate engagement, even by renaming “like” buttons to something such as “respect.”

“For regular social media users, what we can do is dial back on the tribal instincts, to try to be more introspective and say: ‘I’m not going to take the bait. I’m not going to pile on my opponent,’” she said.

A QAnon flag on a vehicle headed to a pro-Trump rally in October.

With next year’s presidential election looming, researchers said they were concerned about populations known to be vulnerable to misinformation, such as older people, conservatives and people who do not speak English.

More than 37 percent of people older than 65 visited misinformation sites in 2020 — a far higher rate than younger groups but an improvement from 56 percent in 2016, according to the Stanford report. In 2020, 36 percent of people who supported President Donald J. Trump in the election visited at least one misinformation site, compared with nearly 18 percent of people who supported Joseph R. Biden Jr. The participants also completed a survey that included questions about their preferred candidate.

Mr. Hancock said that misinformation should be taken seriously, but that its scale should not be exaggerated. The Stanford study, he said, showed that the news consumed by most Americans was not misinformation but that certain groups of people were most likely to be targeted. Treating conspiracy theories and false narratives as an ever-present, wide-reaching threat could erode the public’s trust in legitimate news sources, he said.

“I still think there’s a problem, but I think it’s one that we’re dealing with and that we’re also recognizing doesn’t affect most people most of the time,” Mr. Hancock said. “If we are teaching our citizens to be skeptical of everything, then trust is undermined in all the things that we care about.”
