Preregistration

Determinants of health-related misinformation sharing on social media - a Scoping Review

Author(s) / Creator(s)

Jones, Christopher Martin
Jahnel, Tina
Egharevba, Gabriel
Schüz, Benjamin

Abstract / Description

The effectiveness of public health responses to acute crises often relies on population-level changes in individual health-related behaviours (Glanz & Bishop, 2010). Many potential measures to achieve this aim depend critically on the information available to the target population and on whether that information provides trustworthy and reliable guidance (De Vries, 2017). As exemplified during the COVID-19 pandemic, the proliferation of false information may pose a key threat here, as false information has been shown to reduce adherence to behavioural guidelines, promote engagement in false prevention measures, and encourage hate and exclusion. Tackling its spread has thus been established as a research priority in the WHO Response Strategy (WHO, 2020). Because the rapid distribution of dis- and misinformation (disinformation: incorrect information intended to deceive; misinformation: incorrect information not intended to deceive) largely occurs through social media, government efforts have focused on reducing the spread and availability of mis- and disinformation on various platforms (Pennycook & Rand, 2021). These efforts have mostly targeted misinformation, which makes up the far larger share, and have relied heavily on partnering with tech companies to fact-check, offer more trustworthy information, or remove false information (Pennycook & Rand, 2021). This work has almost exclusively focused on deliberate and reflective mental processes. However, other potential determinants (e.g., the fast, reactive and impulsive processes that drive most of individuals’ decision making, as well as contextual effects) are also likely to affect how users judge the trustworthiness of information and to influence their decision to share false information. As a result, interventions have mostly failed to considerably reduce misinformation sharing (Pennycook & Rand, 2021). It is thus crucial to better understand misinformation sharing as functional behaviour, which implies that its determinants go beyond cognitive judgements about whether information is correct or incorrect. Viewing the sharing of misinformation through a behavioural lens opens different avenues to study its determinants comprehensively and integratively. One especially promising avenue is the use of behavioural frameworks such as the Theoretical Domains Framework (TDF; Cane et al., 2012), which includes both cognitive-deliberative as well as contextual and non-deliberative determinants. The TDF is a continuously updated integrative framework that summarizes evidence-based determinants of behaviour (in 14 key domains) and of behaviour change (i.e., facilitators and barriers), allowing researchers to systematically understand mechanisms of change and to inform intervention design and evaluation.

Keyword(s)

misinformation; social media; sharing; fake news; health

Persistent Identifier

https://hdl.handle.net/20.500.12034/4278.2
https://doi.org/10.23668/psycharchives.12607

PsychArchives acquisition timestamp

2023-03-23 12:01:25 UTC

Publisher

PsychArchives

Citation

  • Version 2 (2023-03-23)
    We received community feedback on two potential errors in our search strategy. In response, we have corrected both errors and aligned the overall strategy with the search strategy of a second scoping review on users' ability to identify misinformation.
  • Version 1 (2021-05-17)
  • Made available on
    2021-05-17T07:03:50Z
  • Made available on
    2023-03-23T12:01:25Z
  • Date of first publication
    2023-03-23
  • Publication status
    other
  • Review status
    notReviewed
  • Sponsorship
    This project is funded by a “Talent Funds” grant of the Leibniz ScienceCampus awarded to Christopher M. Jones.
  • Language of content
    eng
  • Dewey Decimal Classification number(s)
    150
  • DRO type
    preregistration
  • Leibniz subject classification
    Psychologie