Determinants of health-related misinformation sharing on social media - a Scoping Review
Author(s) / Creator(s)
Jones, Christopher Martin
Jahnel, Tina
Egharevba, Gabriel
Schüz, Benjamin
Abstract / Description
The effectiveness of public health responses to acute crises often relies on population-level changes in individual health-related behaviours (Glanz & Bishop, 2010). Many potential measures to achieve this aim depend critically on the information available to the target population and on whether that information provides trustworthy and reliable guidance (De Vries, 2017). As exemplified during the COVID-19 pandemic, the proliferation of false information may pose a key threat here, as false information has been shown to reduce adherence to behavioural guidelines, promote engagement in ineffective prevention measures and encourage hate and exclusion. Tackling its spread has thus been established as a research priority in the WHO Response Strategy (WHO, 2020).
As the rapid distribution of dis- and misinformation (disinformation: incorrect information intended to deceive; misinformation: incorrect information not intended to deceive) occurs largely through social media, government efforts have focused on reducing the spread and availability of mis- and disinformation on various platforms (Pennycook & Rand, 2021). These efforts have mostly targeted misinformation, which makes up the far larger share, and have relied strongly on partnering with tech companies to fact-check, offer more trustworthy information or remove false information (Pennycook & Rand, 2021). This work has almost exclusively focused on deliberate and reflective mental processes. However, other potential determinants (e.g., the fast, reactive and impulsive processes that drive most individual decision-making, as well as contextual effects) are also likely to affect how users judge the trustworthiness of information and to influence their decision to share false information. As a result, interventions have mostly failed to considerably reduce misinformation sharing (Pennycook & Rand, 2021).
It is thus crucial to better understand misinformation sharing as functional behaviour, which implies that its determinants go beyond cognitive judgements about the correctness or incorrectness of information. Viewing the sharing of misinformation through a behavioural lens opens different avenues to comprehensively and integratively study its determinants. One especially promising avenue is the use of behavioural frameworks such as the Theoretical Domains Framework (TDF; Cane et al., 2012), which include both cognitive-deliberative as well as contextual and non-deliberative determinants. The TDF is a continuously updated integrative framework summarizing evidence-based determinants of behaviour (across 14 key domains) and behaviour change (i.e., facilitators and barriers), allowing researchers to systematically understand mechanisms of change and to inform intervention design and evaluation.
Keyword(s)
misinformation; social media; sharing; fake news; health
PsychArchives acquisition timestamp
2023-03-23 12:01:25 UTC
Publisher
PsychArchives
Citation
-
study_protocol_prereg_scoping_review_misinformation_sharing_jones_etal__032023.pdf (Adobe PDF, 272.63 KB, MD5: 6121ac41335bc698b906be1ca338e039)
2023-03-23: We received community feedback on two potential errors in our search strategy. In response, we have corrected both errors and also aligned the overall strategy with the search strategy of a second scoping review on users' ability to identify misinformation.
Made available on
2021-05-17T07:03:50Z
2023-03-23T12:01:25Z
Date of first publication
2023-03-23
Publication status
other
Review status
not reviewed
Sponsorship
This project is funded by a "Talent Funds" grant of the Leibniz ScienceCampus awarded to Christopher M. Jones.
Persistent Identifier
https://hdl.handle.net/20.500.12034/4278.2
https://doi.org/10.23668/psycharchives.12607
Language of content
eng
Dewey Decimal Classification number(s)
150
DRO type
preregistration
Leibniz subject classification
Psychologie