Me, myself and I: Filter Bubbles & Echo Chambers

September 11th, 2020

‘Filter bubbles’ and ‘echo chambers’ are popular terms to describe the phenomenon social scientists call ‘selective exposure’. The theory of selective exposure (Klapper, 1957) states, in brief, that people tend to select information that is in accord with their existing views and, consequently, avoid information that contradicts their beliefs and values.

Various digital tools, algorithms and behaviours rely on the collection of personal data to filter and/or rank items in the daily information stream, creating ‘filter bubbles’ and ‘echo chambers’. The result is greater personalisation, but also decreasing diversity of information. Diversity of information may refer to either source or content. Source diversity means the inclusion of a multitude of information sources by a news outlet, as well as the variety of news outlets consumed by a recipient. Content diversity means the range of topics and perspectives on a given topic (Haim, Graefe, & Brosius, 2018).
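To make the filtering mechanism concrete, here is a deliberately simplified, hypothetical sketch of preference-based ranking; the item data, tags, and scoring rule are illustrative assumptions of ours, not the actual algorithm of any platform or of EUNOMIA:

```python
from typing import Dict, List, Set

# Hypothetical toy model of personalised feed ranking: score each item
# by its overlap with an interest profile inferred from personal data.
# All titles, tags, and the scoring rule are illustrative assumptions.

def rank_feed(items: List[Dict], interests: Set[str]) -> List[Dict]:
    """Order items by how many tags they share with the user's interests."""
    return sorted(items, key=lambda item: len(item["tags"] & interests),
                  reverse=True)

items = [
    {"title": "Climate policy debate",   "tags": {"politics", "climate"}},
    {"title": "Local sports results",    "tags": {"sports"}},
    {"title": "Party rally live report", "tags": {"politics", "rally"}},
]
interests = {"politics", "climate"}  # profile built from collected data

for item in rank_feed(items, interests):
    print(item["title"])
# Items matching existing interests float to the top; fed back over many
# cycles, this is the narrowing dynamic behind the 'filter bubble'.
```

Real recommender systems are of course far more sophisticated, but the feedback loop is the same: what you engaged with before determines what you are shown next.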

Despite describing similar phenomena, ‘filter bubbles’ and ‘echo chambers’ are not the same concept. ‘Echo chambers’, on the one hand, describe the phenomenon of being surrounded by like-minded contacts, which might lead to an amplification or reinforcement of pre-existing beliefs. ‘Filter bubbles’, on the other hand, refer to the algorithmic filtering of information to match a user’s needs (Haim, Graefe, & Brosius, 2018). However, the two terms are not used consistently; for example, Lewandowsky et al. (2017) describe ‘echo chambers’ as the space where “most available information conforms to pre-existing attitudes and biases” (p. 359).

Studies have shown that people are more likely to share articles with which they agree (An, Quercia, & Crowcroft, 2013) and that social media expose the community to a narrower range of information sources compared to a baseline of information-seeking activities. Research has also shown that the diversity of social media communication is significantly lower than that of interpersonal communication, at both the individual and the collective level (Nikolov, Oliveira, Flammini, & Menczer, 2015).
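As a rough illustration of how such diversity can be quantified, the sketch below computes the normalised Shannon entropy of the distribution of outlets a user consumes. This is one common operationalisation of source diversity, not necessarily the exact metric used in the studies cited above:

```python
import math
from collections import Counter

# Minimal sketch of source diversity as normalised Shannon entropy of
# the outlets a user consumes. Illustrative only; the cited studies
# define their own metrics and data collection.

def source_diversity(visited_outlets):
    """Normalised entropy: 0 = a single outlet, 1 = perfectly even spread."""
    counts = Counter(visited_outlets)
    total = sum(counts.values())
    probs = [c / total for c in counts.values()]
    entropy = -sum(p * math.log2(p) for p in probs)
    max_entropy = math.log2(len(counts)) if len(counts) > 1 else 1.0
    return entropy / max_entropy

balanced = ["outlet_a", "outlet_b", "outlet_c"] * 2
bubble = ["outlet_a"] * 5 + ["outlet_b"]

print(round(source_diversity(balanced), 2))  # -> 1.0 (even spread)
print(round(source_diversity(bubble), 2))    # -> 0.65 (skewed to one outlet)
```

A feed dominated by a single outlet scores well below an evenly spread one, which matches the intuition behind measuring online ‘bubbles’.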

But why do people surround themselves with like-minded contacts, and why do they choose information that confirms what they already believe? There are different answers to this question. The theory of cognitive dissonance (Festinger, 1957) explains this phenomenon by arguing that individuals strive for consistency (or consonance) among their beliefs, attitudes, knowledge, etc. Inconsistencies cause psychological discomfort, which Festinger calls dissonance. Another answer is that surrounding oneself with familiar information helps to cope with, or even overcome, information overload (Pariser, 2011). A third answer refers to the social aspect of social media: because of its sharing mechanisms, discovering information becomes a social endeavour rather than an individual process.

In the context of social media, ‘filter bubbles’ and ‘echo chambers’ therefore allow users to avoid psychological discomfort and information overload and to make information seeking a social endeavour. However, they pose serious risks, leading to self-reinforcement and reduced information diversity (Haim, Graefe, & Brosius, 2018). The tendency to surround oneself with like-minded opinions might also prevent engagement with other ideas, which can facilitate confirmation bias and polarisation (Nikolov, Oliveira, Flammini, & Menczer, 2015; Haim, Graefe, & Brosius, 2018).

But it’s not all bad news: ‘echo chambers’ seem to centre mainly on political discourse (Nikolov, Oliveira, Flammini, & Menczer, 2015), whereas other topic areas are less affected. Furthermore, there are tools that encourage and enable users to seek information beyond their ‘bubble’. The EUNOMIA feature illustrating how information has been spread and changed by different users aims to show exactly how similar information is discussed in different ‘bubbles’.

References

An, J., Quercia, D., & Crowcroft, J. (2013). Fragmented Social Media: A Look into Selective Exposure to Political News. In WWW 2013 Companion, May 13–17, 2013, Rio de Janeiro, Brazil. ACM.

Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press.

Haim, M., Graefe, A., & Brosius, H.-B. (2018). Burst of the Filter Bubble? Effects of personalization on the diversity of Google News. Digital Journalism, 6(3), 330-343. https://doi.org/10.1080/21670811.2017.1338145

Klapper, J. T. (1957). What We Know About the Effects of Mass Communication: The Brink of Hope. The Public Opinion Quarterly, 21(4), 453-474. https://doi.org/10.1086/266744

Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond Misinformation: Understanding and Coping with the “Post-Truth” Era. Journal of Applied Research in Memory and Cognition, 6(4), 353–369. https://doi.org/10.1016/j.jarmac.2017.07.008

Nikolov, D., Oliveira, D. F., Flammini, A., & Menczer, F. (2015). Measuring online social bubbles. PeerJ Computer Science, 1, e38. https://doi.org/10.7717/peerj-cs.38

Pariser, E. (2011). The Filter Bubble: How the New Personalized Web is Changing What We Read and How We Think. New York: Penguin.

It’s not me, it’s you: The Third Person Effect

July 16th, 2020

A well-documented phenomenon in communication science is that individuals tend to think others are more susceptible to media effects than they are themselves. We tend to assume that commercials, misinformation, or other manipulation by and via the media affect ‘the others’ more than ourselves. This is called the third-person effect (Davison, 1983).

Because of this, individuals also tend to overestimate the influence of the media on the attitudes and behaviour of others, and to underestimate the influence on themselves. Both tendencies can be harmful: when individuals underestimate the influence of misinformation on their own attitudes and behaviour, they may become more susceptible to it. And when they overestimate the effects on others, they might take action based on their expectation of other people’s reaction.

Expertise also plays a relevant role in this process, especially an individual’s own (perceived) expertise as compared to others’:

“In a sense, we are all experts on those subjects that matter to us, in that we have information not available to other people. This information may not be of factual or technical nature; it may have to do with our own experiences, likes, and dislikes. Other people, we reason, do not know what we know. Therefore, they are more likely to be influenced by the media.” (Davison, 1983, p. 9)

Other relevant factors are in-group and out-group effects. Researchers have observed differences between in-group and out-group members (Jang & Kim, 2018), depending on how ‘the others’ are conceived, e.g. as an individual, an average person, ‘people just like me’, etc. (Conners, 2005). Jang & Kim (2018) also reported a positive relationship between media literacy and third-person perception, stating that media literacy education could minimise the potential harm of false information. The third-person effect is also related to self-enhancing tendencies, such as the belief that one has a greater ability to learn from an experience than others, which seem to be cross-cultural (Johansson, 2005).

In the context of misinformation, this is an important issue: if people do not think they are influenced by false information (as opposed to others), “they may develop the false impression that information shared among them, regardless of their actual accuracy, is perceived true” (Jang & Kim, 2018, p. 299). Previous results from EUNOMIA co-design research showed that misinformation was claimed to be dangerous because it might have great power over the population and influence their acts and/or decisions.

Sources

Conners, J. L. (2005). Understanding the Third-Person Effect. Centre for the Study of Communication and Culture (CSCC).

Davison, W. P. (1983). The Third-Person Effect in Communication. The Public Opinion Quarterly, 47(1), 1-15.

Jang, S., & Kim, J. (2018). Third person effects of fake news: Fake news regulation and media literacy interventions. Computers in Human Behavior, 80, 295-302.

Johansson, B. (2005). The Third-Person Effect. Only a Media Perception? Nordicom Review, 26(1), 81-94.

Understanding the behaviour of social media users

June 29th, 2020

Fill in the EUNOMIA survey on social media user behaviour and help us further develop the EUNOMIA toolkit, which assists users in practising information hygiene.

Social media users adopt different norms of behaviour, spoken and unspoken rules, and patterns and forms of communication on different social media platforms. How misinformation spreads via different channels is, at least partially, connected to such platform-internal norms and logics: different types of users ascribe trustworthiness in different ways, and they differ in how they seek and share information.

EUNOMIA is developing a set of tools that aims to support users in assessing the trustworthiness of information, offering them a variety of indicators. Understanding (norms of) user behaviour is crucial for this: EUNOMIA follows a co-design approach that puts the user at the centre. We are therefore inviting every social media user, as well as traditional media journalists and social journalists, i.e. professional and citizen journalists who carry out journalism via online media, to fill out our survey and support this important task.

It will only take 15 minutes of your time!

Link to the survey: https://www.surveyly.com/p/index.php/687178?lang=en

Challenging Misinformation: Exploring Limits and Approaches

June 23rd, 2020

We invite researchers and practitioners interested in aligning social, computational and technological approaches to mitigating the challenge of misinformation in social media to join us at the SocInfo 2020 Workshop!

The last weeks and months of the COVID-19 pandemic have shown quite drastically how crucial access to accurate information is. So many rumours, hoaxes, fake news and other forms of dis- and misinformation have been spread that the term ‘infodemic’ was coined. To counteract this wicked problem, tools and online services are continuously being developed to support different stakeholders. However, limiting the harm caused by misinformation requires merging multiple perspectives, including an understanding of human behaviour in judging and promoting false information, as well as technical solutions to detect and stop its propagation. To date, all solutions are notably limited: misinformation is a multi-layered problem, and no single, comprehensive solution is capable of stopping it. Existing approaches are limited for different reasons: the way end-users are (or are not) engaged, limited data sources, the subjectivity associated with judgments, etc. Furthermore, purely technical solutions that disregard the social structures leading to the spread of misinformation, or the different ways societal groups are affected, are not only limited; they can also be harmful by obfuscating the problem. A comprehensive and interdisciplinary approach is therefore needed.

So, how can we battle misinformation and flatten the curve of the infodemic? Members of the EU-funded projects Co-inform and EUNOMIA have joined forces to organise a hands-on workshop at the SocInfo conference on 6 October 2020. The aim of the workshop is to unpack the state of the art in social science and technical solutions, challenging participants to critically reflect upon existing limitations and then to co-create a future that integrates both perspectives. The workshop intends to bring together data scientists, media specialists, computer scientists, social scientists, designers, and journalists. Participants are invited to discuss challenges and obstacles related to online misinformation from human and technical perspectives; to challenge existing approaches and identify their limitations in terms of technology and targeted users; and to co-create future scenarios building on existing solutions.

Find out more

About the workshop, topics of interest, the submission process, and the organisers: http://events.kmi.open.ac.uk/misinformation/

About SocInfo 2020: https://kdd.isti.cnr.it/socinfo2020/

About Co-inform: https://coinform.eu/

Eat, Sleep, Trust, Repeat or The Illusory Truth Effect

June 23rd, 2020

Have you ever noticed how a vaguely familiar statement, something you remember hearing before, makes you think “there has to be something to it” or “this has to be true, I have heard it before”? This is what social psychology calls the ‘illusory truth effect’: a person attributes higher credibility and trustworthiness to information to which they have been exposed before. In a nutshell, repetition makes information more credible and trustworthy. Frequency, it seems, serves as a “criterion of certitude” (Hasher, Goldstein, & Toppino, 1977).

What is really interesting is that this effect has been demonstrated not only for plausible information but even for statements initially identified as false. And the effect is immediate: reading a false statement just once is enough to increase later perceptions of its accuracy. An experiment carried out by researchers in the US showed the truth effect even for highly implausible, entirely fabricated news stories (Pennycook, Cannon, & Rand, 2018). The researchers were further able to show that the effect holds even if participants forget having seen the information previously. Even when participants disagreed with the information, repetition made it more plausible.

But why does this effect exist? Research offers several answers. First, familiarity with information leads to faster processing of that information. Second, recognition of information, and the coherency of statements with previous information in a person’s memory, affect our judgement. Third, people learn that faster, more fluent processing of information and the truthfulness of information tend to be positively correlated.

When it comes to misinformation and fake news, the importance of this effect cannot be stressed enough. Even warnings by fact-checkers or experts cannot counter it; the only way of protecting yourself and others from misinformation is to avoid sharing any piece of dubious information in the first place. So, one of the most important rules we identified as part of our guidelines for information hygiene is simple: if in doubt, don’t share.

References

Bacon, F. (1979). Credibility of repeated statements: Memory for trivia. Journal of Experimental Psychology: Human Learning and Memory, 5(3), 241–252.

Hasher, L., Goldstein, D., & Toppino, T. (1977). Frequency and the conference of referential validity. Journal of Verbal Learning and Verbal Behavior, 16(1), 107-112.

Pennycook, G., & Rand, D. (2018). Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity, and analytic thinking. Journal of Personality.

Pennycook, G., Cannon, T. D., & Rand, D. G. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General, 147(12), 1865-1880. https://doi.org/10.1037/xge0000465

Unkelbach, C. (2007). Reversing the Truth Effect: Learning the Interpretation of Processing Fluency in Judgments of Truth. Journal of Experimental Psychology: Learning, Memory, and Cognition, 33(1), 219-230.

Unkelbach, C., Koch, A., Silva, R. R., & Garcia-Marques, T. (2019). Truth by Repetition: Explanations and Implications. Current Directions in Psychological Science, 28(3), 247-253.