Misinformation Spreading on Facebook

Zollo, Fabiana; Quattrociocchi, Walter
2018-01-01

Abstract

Social media are pervaded by unsubstantiated or untruthful rumors that contribute to the alarming phenomenon of misinformation. The widespread presence of a heterogeneous mass of information sources may affect the mechanisms behind the formation of public opinion. Combined with functional illiteracy, information overload, and confirmation bias, such a scenario is fertile ground for digital wildfires. In this essay, we focus on a collection of works aiming to provide quantitative evidence about the cognitive determinants of misinformation and rumor spreading. We account for users' behavior with respect to two distinct narratives: (a) conspiracy and (b) scientific information sources. In particular, we analyze Facebook data spanning five years in both the Italian and the US context, and measure users' response to (1) information consistent with one's narrative, (2) troll content, and (3) dissenting information, e.g., debunking attempts. Our findings suggest that users tend to (a) join polarized communities sharing a common narrative (echo chambers), (b) acquire information confirming their beliefs (confirmation bias) even when it contains false claims, and (c) ignore dissenting information.
Published in: Complex Spreading Phenomena in Social Systems. Influence and Contagion in Real-World Social Networks (2018)
Files for this item:
No files are associated with this item.

Documents in ARCA are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10278/3701770
Citations
  • PMC: ND
  • Scopus: ND
  • Web of Science (ISI): 32