- A new survey says that people who get most of their news through Facebook may be less likely to be vaccinated against COVID-19.
- People who rely on Facebook are less trusting of the news media, according to the survey.
- Learning how to spot misinformation on social media and calling it out by providing reputable data is one way to curb it.
Facebook gives people a way to stay connected and share photos, stories, and opinions.
And according to a survey conducted in June, it’s also an avenue to influence whether people get vaccinated against COVID-19.
The survey, led by The COVID States Project, found that people who get most of their news via Facebook are less likely than the average American to be vaccinated against COVID-19.
Katherine Ognyanova, PhD, a co-author of the survey results, is an associate professor of communication at the Rutgers School of Communication and Information and part of a coalition of researchers from Rutgers-New Brunswick, Northeastern, Harvard, and Northwestern universities.
She said the findings suggest there’s a considerable group of vaccine-hesitant people who get their COVID-19 information primarily from social media.
“This could be because they encounter more bad information on those platforms. False stories can spread fast and reach large groups of people online. It could also be because Americans who do not trust traditional institutions (mainstream media, the government, health experts) rely primarily on social media for their news. Most likely, it is some combination of the two, and we need more research to better understand what’s happening,” Ognyanova told Healthline.
As part of the survey, respondents were asked about the sources they use for news and COVID-19 information, including Facebook, CNN, Fox News, MSNBC, the Biden administration, and Newsmax.
Researchers discovered that Facebook is a major source of information, comparable with CNN or Fox News.
They also found that Facebook users are less likely to be vaccinated against COVID-19 than those who get their COVID-19 information from Fox News.
Additionally, Ognyanova said that Newsmax was the only source in the survey whose viewers reported lower vaccination levels and higher vaccine resistance than respondents who turn to Facebook for health news.
“Misinformation in any form always has the potential to harm, sometimes with deadly consequences. This is especially true when we talk about misinformation that steers people away from seeking appropriate medical care,” Dr. Joseph M. Pierre, professor in UCLA’s department of psychiatry and biobehavioral sciences, and author of the column Psych Unseen, told Healthline.
As of June 2021, 99 percent of COVID-19 deaths were occurring among unvaccinated people, he added.
“Statistics like that speak for themselves,” Pierre said.
The COVID States Project survey found that respondents who rely exclusively on Facebook for pandemic information were more likely to believe misinformation, such as claims that the COVID-19 vaccines will alter DNA or that they contain microchips to track people.
“Online misinformation can increase the levels of uncertainty among people who are vaccine-hesitant, and harden the conviction of those who are vaccine-resistant. To be sure, it is only one among many factors that drive people’s decisions to get vaccinated. But it remains an important issue to tackle (along with many other logistic challenges) if we want to put the pandemic behind us,” Ognyanova said.
Mistrust of the media is another factor that makes people more vulnerable to misinformation.
According to the survey, people who rely on Facebook are less trusting of the media.
Thirty-seven percent of the people who got their news exclusively through Facebook in the preceding 24 hours said they trust the media “some” or “a lot” compared with 47 percent for everyone else.
Additionally, the survey found that:
- 37 percent of people who turn to Newsmax and 21 percent of those who rely on Fox News for COVID-19 news believed misinformation
- 7 percent of people who looked at multiple sources of information — but didn’t turn to Facebook, Newsmax, or Fox News — believed in at least one false claim
“We live in an era of rampant mistrust — of government, of the media, of scientific institutions, and of our neighbors. Within the free market of ideas that is the internet, that means that counter-information in the form of misinformation and deliberate disinformation will be there to fill the void that mistrust leaves behind,” Pierre said.
Despite efforts by platforms like Facebook to stop misinformation, it continues to spread: a false post shared by a popular influencer or in a group with millions of members can reach a huge audience before it's removed.
Pierre added that misinformation spreads faster and farther than accurate news does.
Because of this, disinformation has become a profitable industry.
“It sells. And any time something is profitable — and still mostly unregulated — it’s unlikely to stop,” Pierre said.
Ognyanova agreed, stating that misinformation is unlikely to go away anytime soon due to financial or ideological incentives to produce it.
“In the context of health, harmful claims can get elevated and distributed by people who genuinely believe they are spreading useful information,” she said.
Solutions that combine technological, social, regulatory, and educational approaches are the best way to curb misinformation, Ognyanova said.
“Misinformation corrections and general health recommendations are most persuasive when they come from a trusted party. Corporate and government actors need to work together, as well as involving researchers and teachers,” she said.
Pierre said institutions of authority have to address mistrust by being transparent and engaging the public.
The public also needs to be taught how to separate reliable information from bogus information online and in the media. This involves learning how to read past headlines, how to separate facts from opinions, how to spot bias, and basic data reasoning, said Pierre.
“That’s something that, for the most part, isn’t part of education at all. The reality is that this might take a generation to fix, assuming we got started now,” he said.
Additionally, he pointed to an ongoing debate over how society should handle misinformation.
“Should [there] be limits on the free market of ideas or what I call — because it’s so chaotic, rewarding the loudest and most outrageous voices — the ‘flea market’ of ideas?” Pierre said.
This debate brings up questions like:
- Should we encourage unregulated free speech, allowing anyone and everyone to have a podium and microphone, so to speak, if the result is that misinformation is amplified above truth?
- Is it in our best interests to live in a world where we can’t agree on something as basic as what truth and facts are?
“I say no, but that’s something we’re all going to have to decide as a society,” Pierre said.
Next time you’re scrolling through Facebook or another platform and you see a friend share misinformation, Pierre suggested that you “think before you click” and “read before you share.”
“I do think there’s a responsibility to counter misinformation in its place — that is, calling out misinformation when we see it posted online by people we know — but there’s always a risk of getting mired into unproductive debate and conflict,” he said.
Ognyanova believes misinformation corrections can be effective when they come from people who are close to us. If you're going to correct a friend, she said, providing not just evidence of the truth but also context and an accessible explanation may work best.
“Also very important: We want to do all of that without antagonizing the friend who shared the story. In the end, even if that person is not persuaded, others who see the information may be,” she said.