Open Access (CC BY 4.0 license). Published by De Gruyter Mouton, September 16, 2022

Biased, not lazy: assessing the effect of COVID-19 misinformation tactics on perceptions of inaccuracy and fakeness

  • Stephanie Jean Tsang

Abstract

Purpose

Given that people have more opportunities to encounter scientific misinformation surrounding the COVID-19 pandemic, this research examined how different types of misinformation affect readers’ evaluations of messages and identified the mechanisms (motivated reasoning hypothesis vs. classical reasoning theory) underlying those evaluations of message inaccuracy and fakeness.

Design/methodology/approach

This research employed data from an online experiment conducted in Hong Kong in March 2022, when the fifth COVID-19 wave peaked. The data were collected using quota sampling established by age based on census data (N = 835).

Findings

In general, the participants were not able to discern manipulated content from misinterpreted content. When given a counter-attitudinal message, those who read a message with research findings as supporting evidence rated the message as being more inaccurate and fake than those who read the same message but with quotes as supporting evidence. In contrast, one’s disposition to engage in analytical thinking and reasoning was not found to impact assessments of information inaccuracy and fakeness.

Implications

With respect to the debate about whether people are susceptible to misinformation because of cognitive laziness or because they want to protect their personal beliefs, the findings provide evidence for the motivated reasoning hypothesis. Media literacy programs should identify strategies to prepare readers to be attentive to the influence of personal biases on information processing.

Originality/value

Although many researchers have attempted to identify the mechanisms underlying readers’ susceptibility to misinformation, this research makes a distinction between misinterpreted and manipulated content. Furthermore, although the Cognitive Reflection Test is widely studied in the Western context, this research tested this disposition in Hong Kong. Future research should continue to empirically test the effects of different types of misinformation on readers and develop distinct strategies in response to the diverse effects found.

1 Introduction

During health crises, such as the ongoing COVID-19 pandemic, it is important that all citizens cooperate and follow the same set of preventive measures (e.g., social distancing, mask wearing, and handwashing) to contain the virus and its spread. However, the circulation of medical and science misinformation across social media platforms can prompt individuals to do the opposite. In this sense, the proliferation of misinformation, ranging from the spread of harmful health advice to the uptrend in conspiracy theories about the COVID-19 virus, poses a serious threat to global public health (Lewandowsky et al. 2017). Over the course of the COVID-19 pandemic, vaccine hesitancy has emerged as a worldwide concern (Lin et al. 2021), including in Hong Kong (Tsang 2021a), one of the many regions with a low vaccination rate. Despite a surplus of vaccines that were about to expire, the first-dose vaccination rate remained low, at approximately 34.8% as of July 2021 (Tsang 2022b). According to Tsang (2022b), although vaccine safety was found to be a major concern driving vaccine hesitancy, knowledge of the vaccines, which is widely cited as a reason for vaccine hesitancy, did not influence people’s willingness to vaccinate.

Table 1:

Regression results with perceived inaccuracy and fakeness as the outcome variables.

                                               Perceived inaccuracy      Perceived fakeness
                                               Model 1      Model 2      Model 1      Model 2
Gender                                         0.07a        0.07a        0.08a        0.08a
Age                                            0.11b        0.12b        0.13c        0.14c
Education                                      0.04         0.03         0.01         0.00
Income                                         0.02         0.02         0.03         0.03
Political ideology                             0.04         0.03         0.07a        0.07a
R2 change (%)                                  6.6          6.6          8.2          8.2
Pre-existing attitude                          0.32c        0.31b        0.29c        0.28c
R2 change (%)                                  6.5          6.5          5.4          5.4
Analytical thinking                            0.06         0.06         0.03         0.03
R2 change (%)                                  0.2          0.2          0            0
Distrust in science                            0.21c        0.20c        0.14c        0.14c
R2 change (%)                                  0.4          0.4          1.9          1.9
Pre-existing attitude × Distrust in science    –            0.11b        –            0.07a
R2 change (%)                                  –            1.1          –            0.5
Total adjusted R2 (%)                          16.5         17.6         14.7         15.0

Note: Cell entries are standardized regression coefficients (β). a p < 0.05, b p < 0.01, c p < 0.001.

Very often, misinformation about the science of COVID-19, whether about the transmission of the virus, the efficacy of mask wearing and social distancing, or the safety of vaccines, will likely involve either quotes from or research findings published by scientists. These can easily become the building blocks of misinformation. On the one hand, genuine research findings and quotes can be misinterpreted; on the other hand, they can be (a) deliberately manipulated, (b) erroneously published alongside the names of reputable scientists and organizations (e.g., CDC, FDA, the World Health Organization (WHO)), or (c) both. To contribute to the literature on misinformation, the current research examines how different types of misinformation (misinterpretation versus manipulation) and supporting evidence (research findings vs. quotes from scientists) impact readers’ evaluations of the accuracy or authenticity of a message.

Using COVID-19 vaccination as a case study, this research identifies the factors underlying readers’ evaluations of inaccuracy and fakeness, considering analytic thinking disposition, pre-existing attitudes toward coronavirus vaccines, and distrust in science. In fact, the debate about whether people are susceptible to misinformation because of cognitive laziness or because they want to protect their beliefs and identities is not new. Because trust in COVID-19 scientists and health agencies such as the WHO was found to be divided along party lines (Kerr et al. 2021) and science skepticism is argued to be on the rise (Rutjens et al. 2021), understanding why people are susceptible to science misinformation is critically important. With respect to the decline in trust in health agencies, the WHO has faced a “credibility crisis among Hongkongers” since the organization repeatedly sided with the Chinese government’s response to the COVID-19 outbreak (Grundy 2020).

Using data from a 2 (misinterpretation versus manipulation) × 2 (research findings versus quotes from scientists) online experiment conducted in Hong Kong, this study found that participants were not capable of distinguishing manipulated content from misinterpreted content. When confronted with a counter-attitudinal message, a message with research findings as supporting evidence generally received higher ratings of inaccuracy and fakeness than the same message with quotes as supporting evidence. Although pre-existing attitudes toward coronavirus vaccines and distrust in science positively predicted perceived message inaccuracy and fakeness, no evidence of an effect of analytic thinking disposition was found. The implications for fact checking and media literacy are discussed.

2 Misinformation about scientific research

Given how prominent social media has become, not only are younger American adults spending more time on the internet, but the share of older adults who are tech users is also growing (Faverio 2022). In Hong Kong, internet usage has been shown to be particularly high, with an internet penetration rate of 93.0% in 2022 (DataReportal 2022). During crises such as the COVID-19 pandemic, scientific content can be manipulated to encourage wide distribution on social media platforms. Indeed, misinformation related to COVID-19 can be found in many countries (Zeng and Chung-hong 2021); it includes, but is not limited to, false claims about the severity of the pandemic, prevention methods, vaccines, and the condition of confirmed cases.

In general, there is a consensus as to how misinformation is defined (Tandoc et al. 2018; Wardle 2017); here, misinformation refers to false content that is produced and shared unintentionally, in contrast to disinformation, which is published and shared with the intent to deceive or mislead. Following the literature suggesting that intent is usually not known (Krause et al. 2022), the present study treats misinformation as inaccurate information, regardless of the publisher’s intention. Moreover, although most definitions presuppose an absolute ground truth, it is vital to keep in mind that ground truth in the domain of science is often fluid (Krause et al. 2022), and the best expert consensus available at the time is often used to discern the accuracy of a piece of misinformation (Tan et al. 2015; Vraga and Bode 2020). To function as a ground truth, information should be verifiable and often supplied with some sort of evidence (Nyhan and Reifler 2010). Nonetheless, the verification of accuracy can sometimes be a subjective call (Vraga and Bode 2020). In some cases, experts have not yet reached a consensus, or it is unclear where to draw the line and assert that a consensus has been formed. To avoid these situations, the present research studied two specific types of misinformation: a message that involves the misinterpretation of scientific evidence and a message that involves manipulation (altering scientific evidence).

3 Misrepresentation versus manipulation

Studies on misinformation effects do not usually distinguish between different types of misinformation, tending instead toward the distinction of topics, such as health misinformation, science misinformation, and political misinformation (Vraga et al. 2019; Wang et al. 2022; Zeng and Chung-hong 2021). As documented by Wardle (2017), mis- and disinformation not only include fabricated content (content that is 100% false and designed with an intent to deceive), but also misleading content, content with false connections or false contexts, manipulated content, imposter content, and satires or parodies. Rather than focusing on the different misinformation types sorted by the publisher’s intention (Wardle 2017), the current study focuses on discerning the effects of misinformation employing different media tactics (misinterpreting vs. manipulating scientific research findings). Given that expert consensus is often presented in the form of findings from a scientific research report and quotations from an individual scientist or research team, the present study focuses on misinformation that involves the misrepresentation or manipulation of existing scientific evidence.

Misrepresentation involves genuine content that is warped through its placement alongside distorted contextual information. For example, it may be true that certain research was conducted and that the scientists held a press conference, but the study or quotations might then be interpreted in a way that is inconsistent with the findings of the study (HKBU Fact Check 2022a). For instance, a genuine study was cited to claim that tea can reduce COVID-19 transmission risks; the research did take place, but it was conducted outside the human body, never inside one. By quoting the genuine research article, the author generated and spread a false conclusion. Furthermore, some misinformation is manipulated content, which involves the outright alteration of the elements of a message. A common example of this tactic is when scientific research is said to be published by a credible source, such as the WHO, when in fact the WHO never conducted such research. Another example is when scientific research was conducted but the findings are altered to support a false conclusion. In other words, some aspect of the information is manipulated specifically to deceive (Wardle 2017).

4 Perceived fakeness and inaccuracy

Because fake news can be defined as intentionally publishing content with “false statements of fact” (Klein and Wueller 2017), the presence of untrue facts (i.e., inaccuracy) is a necessary element of fake news. Allcott and Gentzkow (2017) suggested a similar definition, mentioning verifiably false content being published intentionally. Altogether, fake news or misinformation can be boiled down to two components: the presence of false content and the content being produced and published strategically to serve ideological or financial purposes (Tandoc et al. 2018). From the perspective of falsity (the degree of falseness), manipulated content deviates further from expert consensus and thus involves a higher level of fault.

The present research argues for the need to distinguish between inaccurate and fake information. If one considers a statement inaccurate, it might be because of a mistake, such as a misspelling, an exaggerated claim, or perhaps facts presented in a misleading manner; however, that statement does not necessarily include claims that are completely fabricated. In other words, inaccuracy is more about a lack of exactness. There is usually some basis of fact in an inaccurate statement; the author does not deliberately spread an outright falsehood. Given that both manipulated and misinterpreted content include evidence or facts, it is necessary to examine how readers process and evaluate these two types of misinformation. Most importantly, scholars must examine the degree to which individuals can tell the difference between these types of misinformation.

H1:

Participants exposed to a fabricated article will rate the content as (a) more inaccurate and (b) more fake than those exposed to a misleading article.

5 Research articles versus quotations

Indeed, as information circulates on digital platforms in a back-and-forth manner between social media accounts and legacy media outlets, it is not surprising to see content being distorted both intentionally and unintentionally along the way (West and Bergstrom 2021). In Hong Kong, misinformation about the safety and efficacy of vaccines, the virus itself (i.e., its transmission and severity), and the prevention and treatment of those infected was identified during the pandemic (HKBU Fact Check 2022b). In fact-checking reports published about vaccinations, false scientific content was reported as mainly involving the incorrect use of data (e.g., death rates, vaccine efficacy statistics, and vaccine ingredients) and quotations from doctors, medical experts, and local and international scientists. Hence, without countervailing sources of accurate scientific information, the sharing of misinformation can hinder the containment of the pandemic (Zarocostas 2020). Recently, the use of scientific information to deceive has become distressingly widespread. In fact, many have cited the use of this strategy as undermining confidence in both scientific and governmental institutions in Hong Kong, contributing to vaccine hesitancy among the older population at the time of data collection (Hong Kong Baptist University 2021).

Scientific data and findings are regarded as “objective, precise, and replicable” (West and Bergstrom 2021). However, even objective data can be presented to tell a misleading story. West and Bergstrom (2021) illustrated this through data visualization. Using objective data and visualizations based on data, the researchers were able to adjust the axes and bin sizes of the charts, reframing the data to tell a story completely counter to that of the original. Although it is generally assumed that misleading uses of data or misinterpretations of scientific findings can be identified once one acquires sufficient training, misleading quotations are much harder to spot (Wardle 2017); whether a person actually said something or an event happened as described is very difficult to ascertain. According to Dahlstrom (2014), “logical-scientific communication (using data) is context-free,” and audiences can understand facts “independently from their surrounding units of information” (p. 13,614). In other words, readers can understand a research article and process conclusions from the article using their existing beliefs in meaningful ways. In contrast, “narrative communication (relying on expert quotations) is context-dependent” (Dahlstrom 2014, p. 13,614), such that the expert (i.e., source) holds most of the power in delivering the details of the quotations, as well as the interpretations or implications of the quote.

Following Dahlstrom’s (2014) logic, research findings presented as an objective piece of information (a research article) are likely to be verifiable, and audiences can see how well the conclusion provided generalizes to their prior beliefs and knowledge. When there is a mismatch between the two, people are likely to evaluate the information as inaccurate or fake. In contrast, research findings presented as expert quotations are often framed as expert opinion, or as special cases of some abstract truth, so discerning whether such an account holds is not an easy task. As a result, readers are expected to be fooled by the use of quotations more often than by research findings presented as a published article. Hence, it is hypothesized that participants exposed to an article that cites prior scientific studies will rate the content as more (a) inaccurate and (b) fake than those exposed to an article that cites quotes by scientists.

H2:

Participants exposed to an article that uses prior research as evidence will rate the content as (a) more inaccurate and (b) more fake than those exposed to an article that uses quotes as evidence.

6 The motivated reasoning hypothesis

The current investigation would be incomplete without taking one’s pre-existing attitudes into account, in this case, attitudes toward vaccination. Certain psychological factors that drive people to fall for misinformation have been identified. In general, people are more likely to believe information coming from credible sources, narratives that are logical, and like-minded information (Lewandowsky et al. 2012). Consistent with the literature on confirmation and disconfirmation biases, people have the tendency not to scrutinize information that agrees with their predispositions and beliefs while being critical toward information that contradicts their points of view (Lord et al. 1979). In other words, people are likely to be motivated to reason in such a way that their pre-existing attitudes and values—and often their political identity—are protected (Kunda 1990).

By relying on their political worldviews to interpret scientific information and evaluate scientists’ credibility (Nisbet et al. 2015), most people do not weigh scientific arguments carefully (Nisbet and Scheufele 2009). Therefore, the level of susceptibility to a piece of science misinformation depends on how well the information fits with the citizen’s views. Indeed, research has long supported the application of motivated reasoning in political communication (Redlawsk 2002) and, more recently, in science information processing (Ho et al. 2008; Kraft et al. 2015). Individuals holding opposing attitudes have been shown to evaluate the same piece of misinformation as fake to significantly different degrees (Tsang 2022a). These divergent assessments of a particular message, in turn, create perception gaps between opposing sides, contributing to motivated fake news perceptions (Tsang 2021b). Indeed, as Nielsen and Graves (2017) suggested, fake news tends to be news that one does not believe. When presented with scientific misinformation claiming that vaccination is harmful, those who hold more favorable attitudes toward vaccination can be anticipated to be more likely to discredit the attitude-incongruent message and, in turn, evaluate the content as more inaccurate and fake. Furthermore, pre-existing attitudes toward vaccination are expected to moderate the relationships stipulated in H1 and H2 regarding the type of misinformation and type of supporting evidence, respectively.

H3:

Pre-existing positive attitudes toward vaccination are positively associated with (a) perceived inaccuracy and (b) perceived fakeness.

H4:

Pre-existing positive attitudes toward vaccination moderate the relationships stipulated in (a) H1 and (b) H2.

7 Analytical thinking disposition

The motivated reasoning account is not without a competing hypothesis—classical reasoning theory (Pennycook and Rand 2019), which has also been identified as being responsible for individuals’ misinformation susceptibility. This line of research has indicated that people are too lazy to conduct deliberative, analytical information processing (Pennycook and Rand 2019). It is believed that the deliberative processing of information would allow people to reject misleading and inaccurate content and, in turn, identify information as false (Bago et al. 2020; Pennycook and Rand 2019). People tend to believe in false messages when they fail to reflect on the information, choosing to rely on their intuition instead (Martel et al. 2020). In fact, this hypothesis has been shown to relate closely to perceived news credibility in such a way that low levels of perceived credibility can motivate systematic processing (and vice versa among people who perceive high levels of credibility), which, in turn, impacts outcomes such as the perceived risks of air pollution (Huang 2021).

It has been argued that items from the Cognitive Reflection Test (CRT) can reflect individuals’ propensity to engage in analytical thinking and reasoning (Pennycook et al. 2015). In general, people have a tendency to think in a less effortful manner (Stanovich 1999), and the CRT can reflect the degree to which individuals are willing to engage in more effortful thinking (Toplak et al. 2014). Pennycook and Rand (2019) found evidence that individuals who were more willing to engage in analytical thinking were less likely to falsely judge fake news as accurate, whereas those whose thinking was more analytical were more likely to judge genuine news as accurate. Therefore, it is hypothesized that the disposition to engage in analytical thinking and reasoning will positively relate to the likelihood of recognizing misinterpreted and manipulated content as inaccurate and fake (H5).

Resolving this theoretical debate is vital not only because there is a gap in the literature to be addressed, but also because it has implications for how misinformation is received. If classical reasoning is the driver, with people being too lazy to critically process scientific information, practitioners should aim to prompt people with accuracy goals or even increase the costs of being susceptible to misinformation (e.g., informing people about possible negative health consequences). If, on the other hand, motivated reasoning is at work, media literacy programs (Tully et al. 2020) might be more effective.

H5:

Analytical thinking disposition is positively associated with (a) perceived inaccuracy and (b) perceived fakeness.

8 Distrust in science

Finally, the current research considers the role played by distrust in science. Because citizens usually hold little prior knowledge about science, trust becomes vital because it gives scientists “the benefit of the doubt” (Rahn and Transue 1998). In science communication, trust has often been interpreted as the extent to which a scientist is perceived to have integrity (Gauchat et al. 2017) and to work in the public’s interest with goodwill (Horton 2016). According to Renn and Levine (1991), trust is the assumption that a communicator—in this case, a scientist—is both honest and competent in conveying accurate information. Perceiving scientists as manipulating data to achieve political aims can increase levels of distrust. In the study of misinformation, audiences’ perceptions of motives, especially those that are malicious in nature, significantly discredit the work of journalists and, in turn, lower the credibility of the news (Tsang 2022a). In this way, a perceived political agenda can undermine confidence in the perceived legitimacy of science and influence whether scientific evidence is considered in a society’s decision making. According to the theoretical framework proposed by Lewicki et al. (1998), whereas trust involves hope, faith, confidence, assurance, and initiative, distrust involves fear, skepticism, cynicism, watchfulness, and vigilance. Therefore, the current research expects a positive relationship between distrust in science and perceived message inaccuracy and fakeness.

H6:

Distrust in science is positively associated with (a) perceived inaccuracy and (b) perceived fakeness.

Furthermore, because people’s distrust of science largely depends on their understanding of scientific matters (e.g., whether scientists have performed well throughout the pandemic and to what extent science and scientists have backed up good policy decisions on vaccination and pandemic recovery), individuals’ levels of distrust also depend on their attitudes toward the message they are about to read (i.e., vaccination). Such an expectation resonates with what the National Academies of Sciences (2015) have suggested: “When there is societal debate, public trust [in science] often becomes a function more of political identity than of scientific fact” (p. 21). Hence, RQ1 is raised.

RQ1:

Do pre-existing attitudes toward vaccination moderate the relationships stipulated in (a) H5 and (b) H6?

9 Methods

A 2 × 2 experimental survey[1] was conducted online in Hong Kong in March 2022, at a time when the Omicron outbreak had peaked (RTHK 2022) and the government was calling for higher COVID-19 vaccination rates to protect residents against severe disease and death from the virus (The Government of the Hong Kong Special Administrative Region 2022). The survey was executed by a survey company (Dynata) using quota sampling based on age. The participants were asked to complete a survey concerning their position on the pandemic and COVID-19 vaccination, read a piece of misinformation on scientific research regarding the controversy, and answer questions regarding the message content. They were then given a debrief clarifying that the message had been manipulated for the purpose of the study.

Given that the present study manipulated the type of misinformation and message evidence, which were not reliant on participants’ perceptions, manipulation checks of the message type were not included (see O’Keefe 2003). According to Ejelöv and Luke (2020), the inclusion of a manipulation check may prompt participants to contemplate the messages involved, leading to unwanted effects (Kidd 1976). Instead, a pilot study (N = 60) using a student convenience sample was conducted ahead of the actual data collection to ensure that participants were able to correctly identify whether a research article was cited or whether a research team was quoted as evidence. To ensure that participants paid attention to the stimuli, they were asked to identify whether the message was related to (a) the origins of COVID-19, (b) COVID-19 vaccine safety/efficacy, (c) COVID-19 test kit sensitivity/efficacy, or (d) the need for universal community testing. Only those who were able to correctly identify the message content were allowed to complete the rest of the survey. The final sample involved 835 participants, 427 of whom were female (51.1%). The participants were, on average, 39.65 years old (SD = 12.11), with ages ranging from 18 to 78. The majority were college graduates, and the average individual monthly income was HKD 30,000–39,999. One hundred and eighty-three participants (21.9%) reported leaning toward the pro-democracy camp, 121 (14.5%) leaning toward the pro-establishment camp, and 531 (63.6%) being neutral or independent.

9.1 Experimental design and stimuli

The experiment used a 2 (misinformation type: misrepresentation vs. manipulation of scientific findings) × 2 (evidence type: data from a research article vs. quotes from a research team) design. Prior to reading the misinformation, the participants were asked the extent to which they agreed or disagreed with two statements: (a) vaccination is the best way to protect Hong Kong residents against COVID-19, and (b) not getting vaccinated is harmful to society as a whole. They rated the statements from 1 (completely disagree) to 5 (completely agree), and the mean of the two items was used to measure pre-existing attitudes toward vaccination (M = 3.12, SD = 1.15, Cronbach’s α = 0.86). The participants were then randomly assigned to read one of four stimuli: 204 participants (24.4%) read misinterpreted misinformation citing data from a research article, 215 (25.7%) read misinterpreted misinformation quoting a research team, 203 (24.3%) read manipulated misinformation citing data from a research article, and 213 (25.5%) read manipulated misinformation quoting a research team (see Appendix A).

The content, source, structure, and length of the misinformation were consistent across conditions to safeguard internal validity. The misinformation was said to be circulating on Facebook. Although the original published research article claimed that the “preliminary findings did not show obvious safety signals among pregnant persons who received mRNA COVID-19 vaccines,” the misinterpreted version claimed that the research showed there were 82 miscarriages among every 100 pregnant women, citing the genuine research findings presented in either a quotation or an article. The manipulated version went further, directly claiming that the study itself found evidence of 82 miscarriages among every 100 pregnant women, citing manipulated research findings presented in either a quotation or an article. To further heighten the degree of fabrication, the original source (New England Journal of Medicine) was replaced with the Centers for Disease Control and Prevention (CDC),[2] which made the message an imposter (Wardle 2017). With respect to evidence type, the misinformation either cited the findings as a published research article (research findings) or attributed them to a research team that had recently held a press conference (quotations from scientists).

The content was adapted from misinformation and journal articles circulating on Facebook[3] in Hong Kong to safeguard external validity. Because Hongkongers have reached a consensus on whether COVID-19 vaccines cause side effects and deaths, the safety of COVID-19 vaccines during pregnancy was investigated instead. Taking the most controversial subject into account and referencing posts circulating on Facebook around that time, a post regarding pregnant women was chosen. Facebook was used because it is the single most popular online platform among Hongkongers. As of 2020, there were 5.95 million Facebook users in Hong Kong (Statista 2021), meaning that approximately 76.7% of the Hong Kong population is active on the platform. In Hong Kong, Facebook has been not only a battlefield for political actors but also a common venue for news consumption, which makes it a suitable platform for studying the mechanisms underlying the processing of scientific misinformation.

9.2 Measures

9.2.1 Perceived fakeness

Adopting Tsang’s (2021b) measure of perceived news fakeness, the participants reported to what extent they thought the Facebook message (a) was invented, (b) was fabricated, and (c) could be considered fake news with a 1 (completely disagree) to 5 (completely agree) Likert scale (M = 2.90, SD = 1.02, α = 0.92).

9.2.2 Perceived inaccuracy

Adopting Tsang’s (2021b) measure of perceived inaccuracy, the participants reported the extent to which they thought the Facebook message (a) was misleading, (b) contained exaggeration, (c) involved serious errors, and (d) was inconsistent with the facts using the same 5-point scale. The mean of all four items was used to measure perceived inaccuracy (M = 2.98, SD = 0.99, α = 0.93).

9.2.3 Analytical thinking disposition

Adopting Pennycook and Rand’s (2019) use of Frederick’s (2005) CRT, the participants’ disposition to engage in analytical thinking was measured by asking them three questions: (a) A is 20 years older than B. The sum of their ages is 28. How old is B? (b) In a loaf of bread, there is a patch of mold. Every day, the patch of mold doubles in size. It takes 40 days for the mold to cover the entire loaf. How many days would it take for the mold to cover half of the loaf? (c) A bat and a ball cost $12 in total. The bat costs $10 more than the ball. How much does the ball cost? Participants received one point for each correct answer; higher scores indicate a stronger analytical thinking disposition (M = 1.26, SD = 1.17, α = 0.72).
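For reference, the intended answers follow from simple algebra rather than intuition, which is exactly the contrast the CRT is designed to exploit: B is 4 years old, the mold covers half the loaf on day 39, and the ball costs $1. A short worked derivation for the bat-and-ball item, added here purely for illustration, is:

```latex
% Let b be the price of the ball; the bat then costs b + 10.
\begin{aligned}
  b + (b + 10) &= 12 \\
  2b &= 2 \\
  b &= 1 \quad \text{(the intuitive but incorrect answer is } 2\text{)}
\end{aligned}
```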

9.2.4 Distrust in science

Adopting items from the Trust in Science and Scientists Inventory developed by Nadelson et al. (2014), distrust in science was measured using 6 of the 21 items in the inventory: (a) when scientists change their mind about a scientific idea, it diminishes my trust in their work; (b) scientists ignore evidence that contradicts their work; (c) scientists intentionally keep some of their findings secret when sharing their discoveries (this item was edited to fit this study context); (d) scientists do not value the ideas of others; (e) we cannot trust scientists because they are biased in their perspectives; and (f) scientific theories are weak explanations. The six items were combined to measure distrust in science (M = 3.09, SD = 0.80, α = 0.88).
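To make the scale construction concrete, the sketch below shows how composite scores and reliability coefficients of the kind reported in this section could be computed. It is a minimal illustration rather than the authors’ analysis code; the file name and the item column names (distrust_1 … distrust_6) are hypothetical.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of scale items (rows = respondents, columns = items)."""
    items = items.dropna()
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical survey file and item names for the six distrust-in-science items
df = pd.read_csv("survey.csv")
distrust_items = [f"distrust_{i}" for i in range(1, 7)]

df["distrust_science"] = df[distrust_items].mean(axis=1)     # composite score (scale mean)
print(f"alpha = {cronbach_alpha(df[distrust_items]):.2f}")   # reported as 0.88 in the paper
```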

10 Results

To test H1, H2, and H4, two series of three-way analyses of covariance were conducted, with pre-existing attitudes, misinformation type, and evidence type as the predictors of perceived inaccuracy and perceived fakeness (Table 1). Because pre-existing attitudes were measured rather than manipulated and thus not subject to random assignment, all analyses included gender, age, education, income, and political identification as covariates. In both cases, female participants rated the messages as more inaccurate (F(1,823) = 4.85, p = 0.028, η2 = 0.006) and more fake (F(1,823) = 6.22, p = 0.013) than male participants. Age also positively predicted perceived inaccuracy (F(1, 823) = 15.20, p < 0.001, η2 = 0.018) and fakeness (F(1, 823) = 21.46, p < 0.001): older participants were more likely than younger participants to identify the messages as fake. Participants exposed to a manipulated message were not found to rate the content as more inaccurate (F(1,823) = 0.77, p = 0.380) or fake (F(1,823) = 0.43, p = 0.510) than those exposed to a misinterpreted message. Therefore, H1 was not supported. H4a was also not supported because no interaction effects between pre-existing attitudes and misinformation type were found for perceived inaccuracy (F(1,823) = 0.001, p = 0.982) or fakeness (F(1,823) = 0.122, p = 0.727).

Furthermore, participants exposed to an article citing data from a research article did not rate the content as more inaccurate (F(1,823) = 3.82, p = 0.051) or fake (F(1,823) = 3.11, p = 0.078) than those exposed to an article with quotes. Therefore, H2 was not supported. Nonetheless, significant interactions were found: the relationship between evidence type and perceived inaccuracy depended on the participants’ pre-existing attitudes (F(1, 823) = 7.52, p = 0.006, η2 = 0.009), as did the relationship between evidence type and perceived fakeness (F(1, 823) = 8.09, p = 0.005, η2 = 0.010). Pairwise comparisons suggest that among those who were unfavorable toward vaccination, perceived inaccuracy did not differ between a message citing a research article (M = 2.70, SD = 0.83) and one quoting a research team (M = 2.73, SD = 0.83, p = 0.565). However, among those who were favorable toward vaccination, those reading a message citing a research article (M = 3.43, SD = 1.00) rated the message as more inaccurate than those reading a message quoting a research team (M = 3.15, SD = 1.10, p = 0.001). A similar pattern was found for fakeness: among those unfavorable toward vaccination, the message citing a research article (M = 2.63, SD = 0.87) did not differ significantly from the message quoting a research team (M = 3.06, SD = 1.12, p = 0.429). H4b was supported, but H4a was not (see Figure 1a and b).
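The covariance analyses reported here could be specified along the following lines. This is a hedged sketch using statsmodels with illustrative variable names (perceived_inaccuracy, attitude, misinfo_type, evidence_type, and so on), not the authors’ actual code or coding scheme.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("experiment.csv")  # hypothetical tidy data set, one row per participant

# Three-way ANCOVA: pre-existing attitude (continuous) x misinformation type x evidence type,
# with the demographic covariates named in the paper. Repeat with perceived_fakeness as the outcome.
model = smf.ols(
    "perceived_inaccuracy ~ attitude * C(misinfo_type) * C(evidence_type)"
    " + C(gender) + age + education + income + political_id",
    data=df,
).fit()

print(anova_lm(model, typ=2))  # F tests for main effects and interactions (Type II sums of squares)
```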

Figure 1: The interaction effect between pre-existing attitudes toward vaccination and evidence type on (a) perceived inaccuracy and (b) perceived fakeness.

To test the relationships between perceived inaccuracy and fakeness and pre-existing attitudes toward vaccination (H3), analytical thinking disposition (H5), and distrust in science (H6), a series of regression analyses were conducted, controlling for gender, age, education, income, and political ideology. Consistent with H3, pre-existing attitudes were positively related to perceived inaccuracy (β = 0.31, p < 0.001, R2 change = 0.065) and perceived fakeness (β = 0.28, p < 0.001, R2 change = 0.054). Distrust in science was also positively related to perceived inaccuracy (β = 0.20, p < 0.001, R2 change = 0.040) and perceived fakeness (β = 0.14, p < 0.001, R2 change = 0.019). However, analytical thinking disposition was not related to either perceived inaccuracy (β = 0.06, p = 0.098) or fakeness (β = 0.025, p = 0.471). Because H3 and H6, but not H5, were supported, an interaction term was included in the model. Distrust in science interacted with pre-existing attitudes on perceived inaccuracy (β = 0.11, p = 0.001, R2 change = 0.011) and perceived fakeness (β = 0.07, p = 0.035, R2 change = 0.005). Among the control variables, political ideology was positively associated with perceived fakeness (b = 0.07, p = 0.044) but not perceived inaccuracy (b = 0.03, n.s.).
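A hierarchical (blockwise) regression of the kind summarized in Table 1 and in this paragraph can be sketched as follows, with R2-change values obtained by comparing successive models. Variable names are again illustrative, and predictors would need to be standardized (z-scored) beforehand to reproduce β coefficients.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("experiment.csv")  # hypothetical data with illustrative column names

blocks = [
    "gender + age + education + income + political_id",  # Block 1: demographics
    "attitude",                                           # Block 2: pre-existing attitude
    "crt_score",                                          # Block 3: analytical thinking
    "distrust_science",                                   # Block 4: distrust in science
    "attitude:distrust_science",                          # Block 5: attitude x distrust interaction
]

rhs, prev_r2 = "", 0.0
for block in blocks:
    rhs = f"{rhs} + {block}" if rhs else block
    fit = smf.ols(f"perceived_fakeness ~ {rhs}", data=df).fit()
    print(f"+ {block}: adj. R2 = {fit.rsquared_adj:.3f}, R2 change = {fit.rsquared - prev_r2:.3f}")
    prev_r2 = fit.rsquared
```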

To better understand the interaction effect, a figure was produced by grouping the participants according to their pre-existing attitudes toward vaccination. The two groups were split at the mean (3.12), with 449 participants (53.8%) rating vaccination below average (unfavorable) and 386 (46.2%) rating it above average (favorable). Furthermore, the sample was divided into two groups according to low and high levels of distrust in science, again splitting at the mean.

As shown in Figure 2a and b, among the participants who were favorable toward vaccination, those with higher distrust in science rated the message as more inaccurate (M = 3.50, SD = 1.01) and more fake (M = 3.32, SD = 1.04) than those with lower distrust (M = 3.11, SD = 1.08, p < 0.001; M = 3.10, SD = 1.13, p = 0.036, respectively). Among the participants who were unfavorable toward vaccination, the level of distrust in science did not produce significant differences in perceived inaccuracy (p = 0.305) or fakeness (p = 0.501). The significance tests presented here were adjusted using the Bonferroni method.
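The mean splits and Bonferroni-adjusted comparisons described above can be approximated with simple two-sample tests, as in the sketch below. The paper’s comparisons are model-based pairwise contrasts, so this is only an approximation under stated assumptions; the column names are hypothetical.

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("experiment.csv")  # hypothetical data; column names are illustrative

# Mean splits on attitude and distrust, as described in the text
df["favorable"] = df["attitude"] >= df["attitude"].mean()
df["high_distrust"] = df["distrust_science"] >= df["distrust_science"].mean()

n_tests = 4  # Bonferroni adjustment: multiply each raw p-value by the number of comparisons
for favorable in (True, False):
    group = df[df["favorable"] == favorable]
    high = group.loc[group["high_distrust"], "perceived_inaccuracy"]
    low = group.loc[~group["high_distrust"], "perceived_inaccuracy"]
    t, p = stats.ttest_ind(high, low, equal_var=False)  # Welch's t-test
    p_adj = min(p * n_tests, 1.0)
    print(f"favorable={favorable}: t = {t:.2f}, Bonferroni-adjusted p = {p_adj:.3f}")
```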

Figure 2: The interaction effect between pre-existing attitudes toward vaccination and distrust in science on (a) perceived inaccuracy and (b) perceived fakeness.

11 Discussion

The circulation of medical and scientific misinformation on social media platforms has been shown to be a threat to public health (Lewandowsky et al. 2012), particularly during the COVID-19 pandemic, when governments and medical experts were trying to minimize infection risks. In Hong Kong, the scientific misinformation that circulated during the pandemic was mainly related to the safety and efficacy of the two vaccines available on the local market (HKBU Fact Check 2022b). Among the verified misinformation, very few items involved scientific research conducted by local scientists or institutions; the majority drew on scientific articles published in a nonlocal context (HKBU Fact Check 2022b), often citing foreign experts and institutions as sources. Taking a Facebook post in Hong Kong as a reference, the current research tested how people process misinformation employing different media tactics (misinterpretation vs. manipulation) and different types of supporting evidence (research data vs. quotes from scientists).

As noted earlier, manipulation involves the fabrication of published findings or quotes, whereas misinterpretation involves the presentation of the original findings and quotes with an incorrect, added-on interpretation of the genuine evidence. The findings show that people did not consider misinterpreted content to be less inaccurate or fake than manipulated content, even though manipulation is “more faulty” because it deviates further from the “ground truth.” By ground truth, the present research refers to the expert consensus available at the time of data collection (Tan et al. 2015; Vraga and Bode 2020). In this sense, people appear incapable of distinguishing between the two types of misinformation. In fact, because people are seldom experts in scientific matters, they might not have the appropriate background and knowledge to make accurate judgments. This is particularly the case when numerous scientists are still investigating the effects of COVID-19 vaccines on people, including pregnant women. The implication is that scholars and practitioners should not expect individuals to be competent readers who can identify the media tactics used to create a piece of misinformation and, in turn, call out the information as false. This finding is unsurprising because misinformation is often presented as news. Because news is supposed to convey new information to the public, people should not be presumed to have the ability to recognize the ground truth (expert consensus) and identify suspicious content on social media platforms without external assistance (e.g., search engines, consultation with friends, and cross-checking media reports).

Further, the findings suggest that the choice of supporting evidence has little to do with people’s perceptions of message inaccuracy and fakeness. Among all the factors taken into consideration, pre-existing attitudes toward vaccination played the largest role in predicting judgments about inaccuracy and fakeness. In this sense, with respect to the debate about whether people are susceptible to misinformation because of cognitive laziness or because they want to protect their personal beliefs, support was found for the motivated reasoning hypothesis (Nisbet et al. 2015; Tsang 2021b). When given a research finding that supports one’s antivaccination stance, the mismatch between one’s prior belief and an incoming message makes people with positive prior attitudes toward vaccination more likely to see it as inaccurate and fake.

In contrast, scoring highly on the CRT did not make people more likely to reject misinterpreted and manipulated content. Conflicting with previous studies using the CRT to reflect individuals’ propensity to engage in analytical thinking and reasoning (Pennycook et al. 2015), the disposition to engage in analytical thinking and reasoning was not found to impact assessments of information inaccuracy and fakeness. Although Pennycook and Rand (2019) explored whether people are lazy when it comes to deliberative information processing, the current research found that the same mechanism does not apply to the rejection of misleading and inaccurate content, at least in this specific context. This could be because studies finding support for classical reasoning theory have used fairly obvious misinformation (Bago et al. 2020; Pennycook and Rand 2019); when it comes to discerning nuanced scientific information (misinterpretation vs. manipulation), the motivated reasoning account holds better than the classical reasoning account. It might also be due to how controversial and political the topic was, driving people to protect their personal beliefs and identities. Overall, one’s pre-existing beliefs about the subject at hand matter more than one’s capability to perform deliberative, analytical information processing.

Because misinformation often involves the use of evidence, either in the form of an article or a quotation, the content is verifiable and, in turn, fact-checkable. However, under the motivated reasoning account, purely feeding scientific knowledge to people might not be sufficient (Krause et al. 2022). According to previous research on corrective interventions, partisan individuals often have a hard time accepting new counter-attitudinal messages (Nyhan and Reifler 2010). Even when misinformation is instantly corrected, it is highly unlikely that corrections will revert people’s beliefs to the pre-exposure stage (Thorson 2016). As a result, to reduce the harm created by scientific misinformation, scholars and practitioners should explore media literacy programs as a way to combat the effects of personal biases on information processing (Tully et al. 2020). Instead of educating the public about scientific knowledge, these programs should help people recognize how their predispositions, such as partisanship and issue stance, impact the way they assess news. This is certainly not an easy task because, as cited in Tully et al. (2020), “Personal bias is ‘perhaps one of the most intractable barriers for news consumers to overcome’ (Klurfeld and Schneider 2014: 12)” (p. 212). While prior research has shown that media literacy interventions can successfully mitigate selective exposure (Vraga and Tully 2017) and biased perceptions of news (Vraga and Tully 2015), the current research advocates applying these media literacy interventions and messages to combat scientific misinformation, particularly when a subject is highly polarized and controversial.

It should be noted that, in the current study, even though evidence type did not have a main effect on judgments about inaccuracy and fakeness, those who felt favorably toward vaccination rated the counter-attitudinal message citing a research article as more inaccurate and fake than an otherwise identical message quoting a research team. People who are exposed to a counter-attitudinal message tend to be more critical toward the use of data than the use of quotations and, in turn, will find ways to discredit the content. In this sense, presenting the findings as a published journal article allows people to scrutinize the data more closely. As Dahlstrom (2014) suggested, “logical-scientific communication” (p. 13,614) allows audiences to see how well the arguments supplied can be generalized to their prior beliefs and knowledge. The current research further suggests that misinformation with the characteristics of “logical-scientific communication” is more likely to drive motivated reasoning than misinformation with the characteristics of “narrative communication” (Dahlstrom 2014, p. 13,614). When people are supplied with a quote from a scientist or research team, there is less room for them to refute the arguments, making it harder for them to spot misinformation presented in the form of expert quotations as inaccurate and fake. Expert quotations are likely to sound or look informative and to be taken as based on the experiences of one or more scientists, which are harder to refute.

Practically, it is important for individuals reading information that aligns with their existing views to be more critical of the information they receive. However, the findings show the opposite: those who read information they did not agree with were more critical and more alert to the supposed presence of inaccuracy and fakeness. In general, individuals’ perceptions of inaccuracy and fakeness are heavily shaped by their pre-existing attitudes, supporting the motivated reasoning hypothesis. This is consistent with what Tully et al. (2020) have suggested, specifically that literacy interventions should be designed with caution because solely encouraging people to identify biases in information could drive them to be cynical toward news. Instead, ideal interventions should assist people in reflecting on their personal biases.

Finally, the current study has shown that distrust in science was a significant predictor of the perception that messages were inaccurate and fake. Trust is particularly important for public compliance with pandemic recommendations because it gives scientists “the benefit of the doubt” (Rahn and Transue 1998, p. 543). As suggested, distrust means that audiences are likely to perceive scientists as dishonest and incompetent and, thus, less credible. Since the beginning of the COVID-19 pandemic, Hong Kong citizens have shown decreased levels of trust in science and in individual scientists and organizations (Grundy 2020), which, in turn, may distort the processing of vaccination information. On the more positive side, the perceptions that scientists keep some of their findings hidden, that they are self-interested and biased, and even that they discredit established scientific theories were found to heighten the awareness of news readers when analyzing information; these readers were better able to catch and be alert to misinformation, whether in the form of misinterpretation or manipulation. The challenge is that, although citizens are encouraged to trust science and experts (National Academies of Sciences 2015), they are also urged to be critical information consumers. Media literacy programs should aim to nurture individuals who are aware of potential false information but, at the same time, maintain trust in science, especially given the polarization in society regarding various health and well-being matters, including climate change, mandatory vaccination, DNA cloning, and so forth.

11.1 Limitations

Because people with different backgrounds can be expected to define what inaccurate and fake mean differently, the current findings, derived solely from data gathered in Hong Kong, should be interpreted carefully. Future studies should replicate these findings in a variety of contexts. Furthermore, the CRT is not a common measure in Hong Kong. Even though the CRT items in the present study were carefully drafted to accommodate the local language, further tests should be executed to better grasp the application of the CRT in Chinese-language contexts. Moreover, although previous studies have relied on the CRT to measure analytical thinking (Pennycook and Rand 2019), measures of analytical thinking, logical-mathematical thinking, and scientific thinking should all be explored. In addition, because the difference between manipulated content and misrepresented content in the current research may have been indiscernible, future research should seek to compare the effects of different forms of misinformation with more noticeable contrasts. Finally, the current study took one specific piece of misinformation as a reference for stimuli creation. Scholars should continue to investigate the effects of different types of misinformation on readers’ processing and evaluations. In fact, there are many more ways to classify misinformation (Kapantai et al. 2021). Although the current study identified and tested the effects of two types of misinformation, more should be done to determine how the public can be protected from a wide variety of scientific misinformation (Van der Linden et al. 2017).


Corresponding author: Stephanie Jean Tsang, Department of Communication Studies, Hong Kong Baptist University, Room 916, Communication & Visual Arts Building, 5 Hereford Road, Kowloon Tong, Hong Kong
Article Note: This article underwent double-blind peer review.

Funding source: UGC General Research Fund, Hong Kong SAR

Award Identifier / Grant number: Project No. 12602820

Research funding: This work was funded by the General Research Fund of the Hong Kong Research Grants Council (https://doi.org/10.13039/501100002920), Project Number: 12602820.

Appendix A

Experimental Stimuli

Misinterpretation of research article findings

Manipulation of research article findings

Misinterpretation of quotes by a research team

Manipulation of quotes by a research team

References

Allcott, Hunt & Matthew Gentzkow. 2017. Social media and fake news in the 2016 election. The Journal of Economic Perspectives 31(2). 211–236. https://doi.org/10.1257/jep.31.2.211.Search in Google Scholar

Bago, Bence, David G. Rand & Gordon Pennycook. 2020. Fake news, fast and slow: Deliberation reduces belief in false (but not true) news headlines. Journal of Experimental Psychology: General 149(8). 1608–1613. https://doi.org/10.1037/xge0000729.Search in Google Scholar

Dahlstrom, Michael F. 2014. Using narratives and storytelling to communicate science with nonexpert audiences. Proceedings of the National Academy of Sciences 111(4 Suppl). 13614–13620. https://doi.org/10.1073/pnas.1320645111.Search in Google Scholar

DataReportal. 2022. Digital 2022: Hong Kong. https://datareportal.com/reports/digital-2022-hong-kong#:∼:text=Hong%20Kong’s%20internet%20penetration%20rate,percent)%20between%202021%20and%202022 (accessed 8 August 2022).Search in Google Scholar

Ejelöv, Emma & Timothy J. Luke. 2020. Rarely safe to assume”: Evaluating the use and interpretation of manipulation checks in experimental social psychology. Journal of Experimental Social Psychology 87. 103937. https://doi.org/10.1016/j.jesp.2019.103937.Search in Google Scholar

Faverio, Michelle. 2022. Share of those 65 and older who are tech users has grown in the past decade: Pew Research Center. https://www.pewresearch.org/fact-tank/2022/01/13/share-of-those-65-and-older-who-are-tech-users-has-grown-in-the-past-decade/ (accessed 30 April 2022).Search in Google Scholar

Frederick, Shane. 2005. Cognitive reflection and decision making. Journal of Economic Perspectives 19(4). 25–42. https://doi.org/10.1257/089533005775196732.Search in Google Scholar

Gauchat, Gordon, Timothy O’Brien & Oriol Mirosa. 2017. The legitimacy of environmental scientists in the public sphere. Climatic Change 143(3). 297–306. https://doi.org/10.1007/s10584-017-2015-z.Search in Google Scholar

Grundy, Tom. 2020. Coronavirus: World health organization faces credibility crisis among Hongkongers – survey: Hong Kong Free Press. https://hongkongfp.com/2020/02/25/coronavirus-world-health-organization-faces-credibility-crisis-among-hongkongers-survey/ (accessed 14 August 2022).Search in Google Scholar

HKBU Fact Check. 2022a. False: Is there a 92.3% spontaneous abortion rate among those who receive an mRNA COVID-19 vaccine before 13 weeks of gestation? Fact Checks. Available at: https://factcheck.hkbu.edu.hk/home/2022/02/25/miscarriage/.Search in Google Scholar

HKBU Fact Check. 2022b. HKBU fact check: COVID-19 misinformation Hub. Available at: https://factcheck.hkbu.edu.hk/covid19-hub/.Search in Google Scholar

Ho, Shirley S., Dominique Brossard & Dietram A. Scheufele. 2008. Effects of value predispositions, mass media use, and knowledge on public attitudes toward embryonic stem cell research. International Journal of Public Opinion Research 20(2). 171–192. https://doi.org/10.1093/ijpor/edn017.Search in Google Scholar

Hong Kong Baptist University. 2021. Dispelling concerns, countering misinformation vital to combat vaccine hesitancy among elderlies in Hong Kong: Mingpao.com. https://bit.ly/3CjFlkc (accessed 30 April 2022).Search in Google Scholar

Horton, David M. 2016. Leading school teams: Building trust to promote student learning. USA: Corwin Press.10.4135/9781506344904Search in Google Scholar

Huang, Qing. 2021. Exposure to online news about air pollution and public trust in regulators in China: A moderated mediation analysis of perceived risk and perceived news credibility. Asian Journal of Communication 31(2). 144–159. https://doi.org/10.1080/01292986.2021.1892787.

Kapantai, Eleni, Androniki Christopoulou, Christos Berberidis & Vassilios Peristeras. 2021. A systematic literature review on disinformation: Toward a unified taxonomical framework. New Media & Society 23(5). 1301–1326. https://doi.org/10.1177/1461444820959296.

Kerr, John, Costas Panagopoulos & Sander van der Linden. 2021. Political polarization on COVID-19 pandemic response in the United States. Personality and Individual Differences 179. 110892. https://doi.org/10.1016/j.paid.2021.110892.

Kidd, Robert F. 1976. Manipulation checks: Advantage or disadvantage? Representative Research in Social Psychology 7(2). 160–165.

Klein, David O. & Joshua R. Wueller. 2017. Fake news: A legal perspective. Journal of Internet Law 20(10). 6–13.

Klurfeld, James & Howard Schneider. 2014. News literacy: Teaching the internet generation to make reliable information choices. Brookings Institution research paper. https://www.brookings.edu/wp-content/ (accessed 14 August 2022).

Kraft, Patrick W., Milton Lodge & Charles S. Taber. 2015. Why people “don’t trust the evidence”: Motivated reasoning and scientific beliefs. The Annals of the American Academy of Political and Social Science 658(1). 121–133. https://doi.org/10.1177/0002716214554758.

Krause, Nicole M., Isabelle Freiling & Dietram A. Scheufele. 2022. The “infodemic” infodemic: Toward a more nuanced understanding of truth-claims and the need for (not) combatting misinformation. The Annals of the American Academy of Political and Social Science 700(1). 112–123. https://doi.org/10.1177/00027162221086263.

Kunda, Ziva. 1990. The case for motivated reasoning. Psychological Bulletin 108(3). 480–498. https://doi.org/10.1037/0033-2909.108.3.480.

Lewandowsky, Stephan, Ullrich K. H. Ecker, Colleen M. Seifert, Norbert Schwarz & John Cook. 2012. Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest 13(3). 106–131. https://doi.org/10.1177/1529100612451018.

Lewandowsky, Stephan, Ullrich K. H. Ecker & John Cook. 2017. Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition 6(4). 353–369. https://doi.org/10.1016/j.jarmac.2017.07.008.

Lewicki, Roy J., Daniel J. McAllister & Robert J. Bies. 1998. Trust and distrust: New relationships and realities. Academy of Management Review 23(3). 438–458. https://doi.org/10.5465/amr.1998.926620.

Lin, Cheryl, Pikuei Tu & Leslie M. Beitsch. 2021. Confidence and receptivity for COVID-19 vaccines: A rapid systematic review. Vaccines 9(1). 16. https://doi.org/10.3390/vaccines9010016.

Lord, Charles G., Lee Ross & Mark R. Lepper. 1979. Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology 37(11). 2098–2109. https://doi.org/10.1037/0022-3514.37.11.2098.

Martel, Cameron, Gordon Pennycook & David G. Rand. 2020. Reliance on emotion promotes belief in fake news. Cognitive Research: Principles and Implications 5(1). 1–20. https://doi.org/10.1186/s41235-020-00252-3.

Nadelson, Louis, Cheryl Jorcyk, Dazhi Yang, Mary Jarratt Smith, Sam Matson, Ken Cornell & Virginia Husting. 2014. I just don’t trust them: The development and validation of an assessment instrument to measure trust in science and scientists. School Science & Mathematics 114(2). 76–86. https://doi.org/10.1111/ssm.12051.

National Academies of Sciences, Engineering, and Medicine. 2015. Trust and confidence at the interfaces of the life sciences and society: Does the public trust science? A workshop summary. Washington, DC: National Academies Press. https://doi.org/10.17226/21798.

Nielsen, Rasmus Kleis & Lucas Graves. 2017. “News you don’t believe”: Audience perspectives on fake news (Reuters Institute for the Study of Journalism factsheets). Reuters Institute for the Study of Journalism.

Nisbet, Erik C., Kathryn E. Cooper & R. Kelly Garrett. 2015. The partisan brain: How dissonant science messages lead conservatives and liberals to (dis)trust science. The Annals of the American Academy of Political and Social Science 658(1). 36–66. https://doi.org/10.1177/0002716214555474.

Nisbet, Matthew C. & Dietram A. Scheufele. 2009. What’s next for science communication? Promising directions and lingering distractions. American Journal of Botany 96(10). 1767–1778. https://doi.org/10.3732/ajb.0900041.

Nyhan, Brendan & Jason Reifler. 2010. When corrections fail: The persistence of political misperceptions. Political Behavior 32(2). 303–330. https://doi.org/10.1007/s11109-010-9112-2.

O’Keefe, Daniel J. 2003. Message properties, mediating states, and manipulation checks: Claims, evidence, and data analysis in experimental persuasive message effects research. Communication Theory 13(3). 251–274. https://doi.org/10.1111/j.1468-2885.2003.tb00292.x.

Pennycook, Gordon, Jonathan A. Fugelsang & Derek J. Koehler. 2015. What makes us think? A three-stage dual-process model of analytic engagement. Cognitive Psychology 80. 34–72. https://doi.org/10.1016/j.cogpsych.2015.05.001.

Pennycook, Gordon & David G. Rand. 2019. Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition 188. 39–50. https://doi.org/10.1016/j.cognition.2018.06.011.

Rahn, Wendy M. & John E. Transue. 1998. Social trust and value change: The decline of social capital in American youth, 1976–1995. Political Psychology 19(3). 545–565. https://doi.org/10.1111/0162-895x.00117.

Redlawsk, David. 2002. Hot cognition or cool consideration? Testing the effects of motivated reasoning on political decision making. The Journal of Politics 64(4). 1021–1044. https://doi.org/10.1111/1468-2508.00161.

Renn, Ortwin & Debra Levine. 1991. Credibility and trust in risk communication. In Communicating risks to the public, 175–217. The Hague: Kluwer. https://doi.org/10.1007/978-94-009-1952-5_10.

RTHK. 2022. Omicron outbreak in HK may have peaked, says CHP. https://news.rthk.hk/rthk/en/component/k2/1638242-20220310.htm (accessed 14 August 2022).

Rutjens, Bastiaan T., Sander van der Linden & Romy van der Lee. 2021. Science skepticism in times of COVID-19. Group Processes & Intergroup Relations 24(2). 276–283. https://doi.org/10.1177/1368430220981415.

Stanovich, Keith E. 1999. Who is rational? Studies of individual differences in reasoning. Mahwah, NJ: Erlbaum. https://doi.org/10.4324/9781410603432.

Statista. 2021. Number of Facebook users in Hong Kong from 2017 to 2020 with a forecast until 2026. Social Media & User-Generated Content. https://www.statista.com/statistics/558226/number-of-facebook-users-in-hong-kong/.

Tan, Andy S. L., Chul-joo Lee & Jiyoung Chae. 2015. Exposure to health (mis)information: Lagged effects on young adults’ health behaviors and potential pathways. Journal of Communication 65(4). 674–698. https://doi.org/10.1111/jcom.12163.

Tandoc, Edson C., Zheng Wei Lim & Richard Ling. 2018. Defining “fake news”: A typology of scholarly definitions. Digital Journalism 6(2). 137–153. https://doi.org/10.1080/21670811.2017.1360143.

The Government of the Hong Kong Special Administrative Region. 2022. Government adjusts vaccination requirements of Vaccine Pass. https://www.info.gov.hk/gia/general/202203/20/P2022032000438.htm.

Thorson, Emily. 2016. Belief echoes: The persistent effects of corrected misinformation. Political Communication 33(3). 460–480. https://doi.org/10.1080/10584609.2015.1102187.

Toplak, Maggie E., Richard F. West & Keith E. Stanovich. 2014. Assessing miserly information processing: An expansion of the cognitive reflection test. Thinking & Reasoning 20(2). 147–168. https://doi.org/10.1080/13546783.2013.844729.

Tsang, Stephanie Jean. 2021a. COVID-19 vaccine hesitancy and perceptions of fake news. BU Audience Research. https://sites.google.com/hkbu.edu.hk/buar/research/covid-19-research/covid-03 (accessed 11 August 2022).

Tsang, Stephanie Jean. 2021b. Motivated fake news perception: The impact of news sources and policy support on audiences’ assessment of news fakeness. Journalism & Mass Communication Quarterly 98(4). 1059–1077. https://doi.org/10.1177/1077699020952129.

Tsang, Stephanie Jean. 2022a. Issue stance and perceived journalistic motives explain divergent audience perceptions of fake news. Journalism 23(4). 823–840. https://doi.org/10.1177/1464884920926002.

Tsang, Stephanie Jean. 2022b. Predicting COVID-19 vaccine hesitancy in Hong Kong: Vaccine knowledge, risks from coronavirus, and risks and benefits of vaccination. Vaccine X 11. 100164. https://doi.org/10.1016/j.jvacx.2022.100164.

Tully, Melissa, Emily K. Vraga & Anne-Bennett Smithson. 2020. News media literacy, perceptions of bias, and interpretation of news. Journalism 21(2). 209–226. https://doi.org/10.1177/1464884918805262.

Van der Linden, Sander, Anthony Leiserowitz, Seth Rosenthal & Edward Maibach. 2017. Inoculating the public against misinformation about climate change. Global Challenges 1(2). 1600008. https://doi.org/10.1002/gch2.201600008.

Vraga, Emily K. & Leticia Bode. 2020. Defining misinformation and understanding its bounded nature: Using expertise and evidence for describing misinformation. Political Communication 37(1). 136–144. https://doi.org/10.1080/10584609.2020.1716500.

Vraga, Emily K., Sojung Claire Kim & John Cook. 2019. Testing logic-based and humor-based corrections for science, health, and political misinformation on social media. Journal of Broadcasting & Electronic Media 63(3). 393–414. https://doi.org/10.1080/08838151.2019.1653102.

Vraga, Emily K. & Melissa Tully. 2015. Media literacy messages and hostile media perceptions: Processing of nonpartisan versus partisan political information. Mass Communication & Society 18(4). 422–448. https://doi.org/10.1080/15205436.2014.1001910.

Vraga, Emily K. & Melissa Tully. 2017. Engaging with the other side: Using news media literacy messages to reduce partisan selective exposure. Paper presented at the National Association for Media Literacy Education 2017 Conference, Chicago, IL. https://doi.org/10.1080/19331681.2019.1572565.

Wang, Xiangyu, Min Zhang, Weiguo Fan & Kang Zhao. 2022. Understanding the spread of COVID-19 misinformation on social media: The effects of topics and a political leader’s nudge. Journal of the Association for Information Science and Technology 73(5). 726–737. https://doi.org/10.1002/asi.24576.

Wardle, Claire. 2017. Fake news. It’s complicated. First Draft News. https://firstdraftnews.org/articles/fake-news-complicated/ (accessed 11 August 2022).

West, Jevin D. & Carl T. Bergstrom. 2021. Misinformation in and about science. Proceedings of the National Academy of Sciences 118(15). e1912444117. https://doi.org/10.1073/pnas.1912444117.

Zarocostas, John. 2020. How to fight an infodemic. The Lancet 395(10225). 676. https://doi.org/10.1016/s0140-6736(20)30461-x.

Zeng, Jing & Chung-hong Chan. 2021. A cross-national diagnosis of infodemics: Comparing the topical and temporal features of misinformation around COVID-19 in China, India, the US, Germany and France. Online Information Review 45(4). 709–728. https://doi.org/10.1108/OIR-09-2020-0417.

Received: 2022-04-30
Accepted: 2022-08-21
Published Online: 2022-09-16

© 2022 the author(s), published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.
