In a recent study published in Scientific Progress, researchers used a mathematical model to describe how the general public deviated from the best scientific guidance at the start of the coronavirus disease 2019 (COVID-19) pandemic.
They empirically mapped and quantitatively analyzed the emitter-receiver network of COVID-19 guidance among online communities on Facebook, the dominant social media platform globally, with over three billion active users across some 156 countries.
Distrust of guidance based on the best available science has reached dangerous levels. During the pre-vaccination period of 2020, a time of maximum uncertainty and social distancing, many people turned to their online communities for advice on how to avoid infection and for suggested remedies. A 13.2% jump in social media users in 2020 brought the total to 4.2 billion, corresponding to 53.6% of the world's population, and many of these users joined to seek information on how to protect themselves and their loved ones from COVID-19.
Unfortunately, these members are likely to be exposed to guidance that falls short of the best science, which can prove fatal when it leads people to reject masks or ingest bleach. This raises the question of who issues and who receives guidance, and how to intervene in current and future crises beyond COVID-19 (e.g., monkeypox or climate-change disinformation).
In this network, a node represents a Facebook page and a link represents one page recommending another. Each page unites people around a common interest, and analyzing it does not require access to personal information. A member merely mentioning another page does not create a link; however, when a Facebook page recommends another page to all of its members, those members are automatically exposed to the new content. This is how the emitter-receiver network forms.
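The emitter-receiver structure described above can be sketched as a small directed graph, where an edge from page A to page B means A recommends B to all of its members. The page names, stance labels, and edges below are hypothetical illustrations, not the study's actual data:

```python
# Minimal sketch of an emitter-receiver network of Facebook pages,
# using plain dictionaries. All pages and stance labels are hypothetical;
# the real study hand-classified 1,356 pages as pro, anti, or neutral.
recommends = {            # directed edges: emitter page -> recommended pages
    "PageA": ["PageB", "PageC"],
    "PageB": ["PageC"],
    "PageC": [],
}
stance = {"PageA": "anti", "PageB": "neutral", "PageC": "pro"}

def exposed_to(page, graph):
    """Set of stances a page's members are exposed to via its recommendations."""
    return {stance[r] for r in graph.get(page, [])}

print(exposed_to("PageA", recommends))  # members of PageA see neutral and pro content
```

A real analysis would run this over the full page-recommendation graph, but the mechanism is the same: exposure flows along recommendation links, not mere mentions.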
Although not all members necessarily pay attention to such content, a recent study showed experimentally and theoretically that a committed minority of just 25% of members can steer an online community toward an alternative point of view.
About the research
In the current study, researchers hand-searched Facebook pages created in 2018 and 2019 using keywords and phrases involving COVID-19 vaccines and verified their findings through human coding and computer-assisted filters. They then indexed those pages’ links to other Facebook pages. Finally, two independent researchers classified each identified node (or Facebook page) as neutral, pro-, or anti-vaccination by reviewing its posts, About tab, and self-described category.
A pro page carried content promoting the best scientific guidelines; an anti page, in contrast, opposed these guidelines; and a neutral page had community-level links to both pro and anti communities. Parenting pages, for example, are considered neutral because they focus on topics such as children's education, pets, and organic foods.
To make the initial seed of Facebook pages as diverse as possible, the researchers repeated the process for pages published in different languages, focused on different geographic locations, and run by managers from a wide range of countries. Additionally, the researchers developed a mathematical model that mimics the collective dynamics of these Facebook communities; its results can be checked manually using standard calculus.
The study’s classification methodology yielded a list of 1,356 interconnected Facebook pages involving 86.7 million people. Analysis of data from December 2019 to August 2020 showed that initial conversations about COVID-19 guidance began primarily among 501 anti communities, comprising 7.5 million people, well before the official declaration of the pandemic in March 2020.
The dataset also included 211 pro-vaccine communities and 644 neutral communities, comprising 13 million and 66.2 million people, respectively. The most common locations of page managers were the United States, Canada, the United Kingdom, Australia, Italy, and France.
Nearly seven million people were exclusively exposed to COVID-19 guidance from anti communities, and 5.40 million were exposed to guidance from both anti and pro communities. The imbalance was worse for people in parenting (neutral) communities, where 1.10 million were exclusively exposed to guidance from anti communities. When the researchers randomly deleted up to 15% of the COVID-19-related links from the network to mimic links missed in the Facebook data, their findings and conclusions remained robust.
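The robustness check, randomly deleting a fraction of links and re-measuring exposure, can be sketched as follows. The toy edge list, stance labels, and the simple exposure metric are illustrative assumptions, not the study's actual data or code:

```python
import random

# Toy directed edge list: (emitter, recommended page). Members of the
# emitter page are exposed to the recommended page's content.
edges = [("N1", "A1"), ("N1", "P1"), ("N2", "A1"), ("N3", "A2"),
         ("N4", "P1"), ("N5", "A1"), ("N5", "A2"), ("N6", "P2")]
stance = {"A1": "anti", "A2": "anti", "P1": "pro", "P2": "pro"}

def anti_exposure_share(edge_list):
    """Fraction of recommendation links pointing to anti pages."""
    if not edge_list:
        return 0.0
    return sum(stance[dst] == "anti" for _, dst in edge_list) / len(edge_list)

random.seed(0)
baseline = anti_exposure_share(edges)
# Randomly drop up to 15% of links, as in the study's robustness test,
# then re-measure exposure on the thinned network.
kept = random.sample(edges, k=round(len(edges) * 0.85))
print(baseline, anti_exposure_share(kept))
```

Repeating the deletion many times and comparing the distribution of the re-measured statistic against the baseline is one way to argue, as the researchers did, that conclusions do not hinge on a few missed links.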
In general, anti communities stepped in to dominate the conversation before the official announcement of the COVID-19 pandemic, and neutral communities (e.g., parenting) subsequently moved closer to extreme communities and therefore became highly exposed to their content.
Thus, parenting communities began receiving guidance on COVID-19 from anti communities as early as January 2020 and later began adding their own guidance to the conversation. In contrast, the best scientific guidance from pro communities remained scarce throughout the study period.
The combination of network mapping and the model revealed more possible approaches to turning the conversation around than simply removing all extreme elements from the system. Blanket removal may not even be the most appropriate solution: it can appear heavy-handed, contradicts the idea of open participation, and undermines a business model built on growing the number of users.
More generally, the research design could help address online misinformation beyond COVID-19 and vaccinations. It could also help predict tipping-point behavior and system-level responses to interventions in future crises.