become arguably more apparent. In recent years, social networks have displayed
symptoms of ideological polarization (Dylko et al., 2017; Stroud, 2010) and the
formation of so-called filter bubbles, associated with the emerging phenomenon
of fake news. Selective exposure, confirmation bias, and availability bias, which
make us more likely to interact with content that confirms our pre-existing
views, are the most likely triggers of ideological polarization, both offline and
online (Frey, 1986; Klapper, 1957; Stroud, 2008).
Although the term ‘filter bubble’ was popularized by Pariser (2011), who
illustrated the phenomenon of polarization on social media platforms, the
phenomenon itself is not new. As early as 1996, Negroponte predicted a world in
which information technologies become increasingly customizable; he “envisioned
a digital life, where newspapers tailor content to your preferences and media
consumption becomes a highly personalized experience” (Gil de Zuniga and Diehl,
2017, p. 3). He based his vision on the fact that even in the pre-digital era,
audiences seemed to prefer reading specific media outlets and gathering
information from sources that were close, or at least closer, to their beliefs.
They also tended to be influenced by their own networks of friends and other
individuals; in a way, they lived inside bubbles. Interestingly, Negroponte’s
vision has become reality in our news feeds and
overall online experiences both on social media and the Internet, but it appears that
many individuals do not consider the implications of their highly customized online
experience. More specifically, one study investigating users’ beliefs about the
Facebook News Feed found that most users are not particularly aware of the
algorithm behind it (Rader and Gray, 2015, pp. 177-178). This finding illustrates
that when users interact with content on social media, they are likely unaware
that this content is part of their filter bubble and, therefore, that other
information is also available online but has been filtered out by bots and
algorithms. As Burkhardt (2017) underlines,
the manipulation of computer code on social media sites allows fake news to
proliferate and shapes what people believe, often on the basis of stories that
are never read beyond the headline or caption. This conclusion clearly
demonstrates the significant role that bots, algorithms, and the ‘filter bubble’
effect play in the production and distribution of fake news on social media; it
is therefore crucial for users to
be aware of the role these mechanisms play in order to counter the spread of
misleading information online.