Political Ideology and Science-Based Misinformation Through Social Media

February 27th, 2020


Dangers of Misinformation

On November 8th, 1989, British Prime Minister Margaret Thatcher delivered a speech on the floor of the United Nations General Assembly and gave a stern warning about what would happen if we, as the human population, were to ignore scientific evidence and continue with our unsustainable actions: “we are seeing vast increases in the amount of carbon dioxide reaching the atmosphere...at the same time this is happening, we are seeing the destruction of a vast scale of tropical forests which are uniquely able to remove carbon dioxide from the air” (Thatcher, 1989). Thatcher used scientific evidence in an attempt to convince global leaders to fight climate change, yet we can now see that the attempt was unsuccessful, because her speech parallels those we are hearing now. Even more than thirty years later, much of the general population continues to disregard scientific evidence. One reason for this dismissal of science-based information is the rise of misinformation on social media. As Professor James Livingston explains, the objective of social media is to provide a system of communication that supplies accurate information to the citizens of a country (Livingston, 2011). Yet, it can be argued that social media gives science-based misinformation, defined as “news stories that were fabricated (but presented as if from legitimate sources) in order to deceive the public for ideological and/or financial gain” (Pennycook et al., 2019), a platform on which it is amplified and reaches more people than ever before.

Past Research

This emerging market for the distribution of science-related misinformation has prompted political science professors and academics alike to research a connection between political ideology and susceptibility to anti-science disinformation. Some of this research concludes that conservatives are more likely to reject scientific information that conflicts with their politically charged interpretation of the world (Washburn & Skitka, 2018). However, newer research in this field indicates that conservatives and liberals are equally susceptible to the spread of anti-science misinformation (Ditto et al., 2019). The one-sided susceptibility reported in past research is instead the product of variables that those studies did not properly account for. This paper will argue that a person’s vulnerability to science-based misinformation is not correlated with political ideology, and it will attempt to explain why past studies arrived at the opposite conclusion. In addition, multiple solutions will be analyzed and assessed in order to arrive at a viable answer to this growing problem, which, if left unchecked, could produce a population that blatantly disregards factual scientific evidence.

Social Media Use

Firstly, it is important to understand the nature of social media and how it has grown into Americans’ main source of news. According to Hunt Allcott and Matthew Gentzkow, both members of the National Bureau of Economic Research, 62% of US adults get their daily news through social media, particularly through Facebook and Twitter (Allcott & Gentzkow, 2019). This contrasts sharply with the way news was spread just a few years ago, when only 47% of US adults got their news through social media sites (Gottfried & Mason, 2019). One explanation for why social media has become the biggest source of Americans’ information is the rising distrust of traditional media. From the statistics above, one can attribute this change both to the rising anti-media rhetoric commonly heard in politics today and to the fact that it is simply more convenient to get news through a social media site such as Facebook or Twitter. Whatever the case, news and information on social media reaches more people than ever before, but with this reach also comes the spread of misinformation, and more specifically, the rise of anti-science misinformation.

Creation of Misinformation

The reason false news is created has been continuously disputed by professors and academics alike; some believe misinformation exists because unorthodox news tends to attract a larger following, while others believe political ideology entices people to believe certain information. Deb Roy and his colleagues compiled 100,000 different stories shared by three million individual Twitter users and discovered a connection between misinformation and novelty. Their research concluded that novelty captures human attention and encourages the sharing of information (Roy et al., 2018). Applied to science-based misinformation, this makes clear why the problem is so large: it is simply human nature to believe news stories that are more interesting and overblown. Furthermore, due to the huge growth of misinformation through social media, many have attempted to turn the issue into a partisan one, claiming that political ideology is responsible for the spread of misinformation.

Conservative vs. Liberal Susceptibility

The conversation about science-based disinformation has more recently turned into one of conservative versus liberal susceptibility. Professors Pennycook and Rand’s research shows that conservatives “tend to be less reflective and more intuitive than liberals…[and] are more distrusting of information outlets” (Pennycook et al., 2019). By emphasizing that there is an asymmetric (one-sided) bias in the case of misinformation through social media outlets, many arrive at the conclusion that conservatives are more likely than liberals to believe science-based misinformation. Consequently, these researchers force us to consider the notion that certain political ideals can have a deep impact on a person’s susceptibility to inaccurate information. Additionally, Mark Brandt found that conservatives tend to be, on average, more traditional than liberals (Brandt, 2014). Since traditional beliefs are associated with a dismissal of new ideas, the argument goes, conservatives have a harder time accepting new scientific facts that contradict those beliefs. Brandt’s research is supported by that of journalist Chris Mooney, who stated that due to conservatives’ more traditional nature, “they may be more disproportionately inclined and/or motivated to be skeptical and distrustful of scientific evidence than liberals” (Mooney, 2012). Yet these findings raise the question: are people of a certain political party really as susceptible as some professors and academics insist?

Confirmation Bias

Many do not believe this common narrative and have produced research showing that bias toward science-oriented misinformation on social media appears in both political ideologies. They claim that liberals and conservatives are equally likely to deny scientific evidence that contradicts their own political opinions (Washburn & Skitka, 2017). Sometimes referred to as “confirmation bias,” this tendency appears in further studies showing that both political ideologies disregard the correct interpretation of information when it directly contradicts their previously held beliefs. Additionally, work done by James Kuklinski shows that both liberals and conservatives cannot differentiate between factual reporting and misinformation, and instead choose which information to consider “real” (Kuklinski, 2019). Therefore, instead of “impartially evaluating evidence in order to come to an unbiased conclusion, [those who fall subject to science-based misinformation] build a case to justify a conclusion already drawn” (Nickerson, 1998). In other words, people across the political spectrum follow the well-documented human tendency to hold onto opinions about a topic even when a contradictory fact is placed right in front of them. Through this analysis, many professors are able to disprove the common narrative that “conservatives are more likely than liberals to believe misinformation.”

Reason for Incorrect Results

If the notion that a certain political ideology makes people more susceptible to falsified scientific information has been proven untrue, then why have multiple studies yielded the opposite result? According to Andrew Guess, professors and researchers arrived at the incorrect conclusion that conservatives are more likely to believe science-based misinformation because they did not account for which political party was in power at the time of their research. Since many of these studies were conducted leading up to the 2016 presidential election, while a liberal president was in power, Guess argues that there would naturally be more falsified information aimed at conservatives (Guess et al., 2019). Consequently, because more science-based misinformation was aimed at conservatives, research on this issue was skewed against them, since they had been exposed to a larger amount of misinformation during those years. This theory is supported by Richard Hofstetter, who states that “republicans are more misinformed because there is simply more anti-liberal misinformation in the world” (Hofstetter et al., 2019).

Echo Chambers

The false notion that Republicans/conservatives are somehow more susceptible to misinformation is also fueled by the nature of social media, which creates echo chambers: spaces (in this case, social media networks) where, instead of an idea being challenged by others, it is simply acknowledged and repeated back. This is dangerous because it limits one’s ability to revise an opinion that may be based on misinformation. Edward Kessler explains that “a majority of people tend to join social networks of like-minded individuals. The overall trend is that people talk to people with whom they agree” (Kessler, 2013). Empirically speaking, Eytan Bakshy led a research team that gathered data from Facebook users and, supporting Kessler’s point, discovered that on average 80% of an individual’s social media friends were of the same political party (Bakshy, 2015). This makes for an ideal situation in which misinformation can spread, since one person falling for science-based misinformation can amplify the story by sharing it with like-minded individuals who are also prone to believe it; the toy simulation below illustrates the mechanism. This cycle, combined with the theory that more science-based misinformation was aimed at conservatives in 2016 (Guess, Hofstetter), helps show why professors arrived at the incorrect conclusion that one political party is more inclined to believe scientific disinformation.
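To make that mechanism concrete, here is a small toy simulation, written for illustration only and not drawn from any of the studies cited above. It assumes a population split evenly between two parties, gives each user a contact list that is roughly 80% same-party (echoing the Bakshy et al. figure), and lets a single false story cascade through reshares; the contact counts and reshare probability are arbitrary assumptions. Tallying who ends up resharing the story shows how exposure concentrates inside the seeding user’s own ideological bubble.

```python
# Toy cascade model (illustrative only; all parameters are assumptions).
import random

random.seed(42)                # reproducible run; numbers vary with other seeds

PARTY_SIZE = 500               # users per party (assumed)
CONTACTS_PER_USER = 20         # assumed size of each user's contact list
SAME_PARTY_SHARE = 0.8         # ~80% same-party ties, per Bakshy et al. (2015)
RESHARE_PROB = 0.2             # assumed chance a contact reshares a story
MAX_ROUNDS = 3                 # look only at the first few waves of sharing

users = list(range(2 * PARTY_SIZE))
party = {u: ("A" if u < PARTY_SIZE else "B") for u in users}

def sample_contacts(u):
    """Pick a contact list that is mostly same-party, mirroring homophily."""
    same = [v for v in users if v != u and party[v] == party[u]]
    other = [v for v in users if party[v] != party[u]]
    n_same = round(CONTACTS_PER_USER * SAME_PARTY_SHARE)
    return random.sample(same, n_same) + random.sample(other, CONTACTS_PER_USER - n_same)

contacts = {u: sample_contacts(u) for u in users}

# Seed one party-A user with a false story and let reshares cascade.
reached = {0}
frontier = [0]
for _ in range(MAX_ROUNDS):
    next_frontier = []
    for sharer in frontier:
        for contact in contacts[sharer]:
            if contact not in reached and random.random() < RESHARE_PROB:
                reached.add(contact)          # contact believes and reshares it
                next_frontier.append(contact)
    frontier = next_frontier

reach_a = sum(1 for u in reached if party[u] == "A")
print(f"Reshared after {MAX_ROUNDS} rounds: {reach_a} in party A, "
      f"{len(reached) - reach_a} in party B")
```

Because most of each user’s contacts share their party, the early waves of resharing stay almost entirely within the seed’s own side, which is the amplification loop described above.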

Fact Checking Sites

While the solution for acquiring more accurate data on this matter is relatively simple (researchers need to take into account which political party is in power, as well as the fact that echo chambers inflate the numbers), stopping or even slowing the flow of science-based misinformation through social media is far more difficult. Some simply do not grasp how hard it is to stop the spread of misinformation and believe that fact-checking sites alone could solve the problem. Yet research conducted by Jeffrey Gottfried found that almost half of all Americans believe fact-checkers to be inaccurate and biased (Gottfried & Mason, 2019), dealing a huge blow to the “create more fact-checking websites” solution. With this information, it is now easy to see the vast scope of this ever-changing problem, yet there could be a viable solution: integrate fact-checking into the social media site itself, and set government regulations for what qualifies as misinformation.

Proposed Solution

While independent fact-checking sites have existed for many years, they have always been separate from the social media sites themselves. For example, the European fact-checking site Stopfake.org is an independent organization that was launched in 2014 with the mission of combating misinformation (Norman, 2018). Sites like this are not embedded in the social media platforms directly, so they require people to open another website and try to figure out which information is real. Other countries, such as Qatar, have created government websites in order to debunk misinformation; nevertheless, such Qatari websites were used “to counter what Qatar regards as fake news distributed by geopolitical rivals” (Norman, 2018). In this situation the problem lies in the definition of misinformation, since biased fact-checking websites can create their own definition and use it to further a political narrative. This is why a solution that embeds fact-checking directly into the platform, based on the government’s definition of misinformation, could work well. Companies such as Facebook and Twitter would create their own fact-checking algorithms inside their platforms, while the U.S. government would provide the criteria for what counts as misinformation; a rough sketch of this division of responsibilities appears below. However, a solution like this would be incredibly hard to implement, because defining misinformation would become an increasingly polarized debate in American politics, with each party forming its own definition to accommodate its political interests. Additionally, this solution does not address the issues raised by Jeffrey Gottfried and Deb Roy, who found that Americans distrust fact-checking and are more attracted to novel news, which tends to be misinformation. Despite these limitations, the research shown in this paper makes it clear that something must be done to mitigate the spread of science-based misinformation through social media, and the solution stated above has the best chance of doing so.
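As a rough illustration of that division of responsibilities, the sketch below shows one way the pieces might fit together. Every name, claim, and URL in it is hypothetical, the “ruleset” merely stands in for whatever criteria a government body might publish, and a real platform would need far more sophisticated matching than simple keyword lookup; the point is only that the platform owns the checking logic while the definition of misinformation comes from outside.

```python
# Hypothetical sketch of platform-embedded fact-checking with an external ruleset.
from dataclasses import dataclass

@dataclass
class Claim:
    """One externally defined misinformation claim (hypothetical format)."""
    claim_id: str
    keywords: list[str]      # phrases that suggest the claim is being repeated
    correction_url: str      # debunking source shown to the user

@dataclass
class Post:
    post_id: str
    text: str

def load_ruleset() -> list[Claim]:
    """Stand-in for fetching the externally published ruleset; hard-coded here."""
    return [
        Claim(
            claim_id="co2-not-warming",
            keywords=["co2 does not cause warming", "carbon dioxide is harmless"],
            correction_url="https://example.gov/claims/co2-not-warming",
        ),
    ]

def check_post(post: Post, ruleset: list[Claim]) -> list[Claim]:
    """Return every claim in the ruleset that the post appears to repeat."""
    text = post.text.lower()
    return [claim for claim in ruleset
            if any(keyword in text for keyword in claim.keywords)]

def render_with_labels(post: Post, matches: list[Claim]) -> str:
    """Attach a warning label with the correction link instead of removing the post."""
    if not matches:
        return post.text
    links = ", ".join(claim.correction_url for claim in matches)
    return f"{post.text}\n[Disputed by fact-checkers: {links}]"

if __name__ == "__main__":
    ruleset = load_ruleset()
    post = Post(post_id="1", text="New study shows carbon dioxide is harmless!")
    print(render_with_labels(post, check_post(post, ruleset)))
```

Labeling rather than removing posts keeps the platform’s role limited to surfacing the external correction, which might soften, though certainly not eliminate, the distrust of fact-checkers that Gottfried documents.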