BCM 312

Unfollow, Unfriend, Delete, Repeat

When it comes to voting today, many people look for information about the candidates online. But how do we know whether the information we’re seeing is accurate? For democracy to work effectively, everyone needs to be equally informed. Filter bubbles distort how we see vital information, which can have a detrimental effect on democracy itself.

Democracy works on the understanding that citizens have the opportunity and the right to voice their opinions and vote for what they believe in. Shared facts are also a key component of democracy, but filter bubbles threaten this. Instead of exposing people to a common set of facts about different political groups, filter bubbles sort users into two distinct groups that sit parallel to each other but rarely interact.

This separation leads to political polarisation: the drift of mainstream political attitudes towards ideological extremes. When this happens, each group becomes an echo chamber, whose members are unable and unwilling to accept the opposition’s views.

Have you ever unfollowed or unfriended someone over a political or ideological disagreement? Perhaps you’ve hidden suggested posts about something you don’t believe in or agree with? These are both examples of user-driven personalisation: your deliberate choices shape what you are shown, much like liking lots of funny cat videos and then having them show up consistently in your news feed. Excluding certain content from your social media in this way can contribute to polarisation, especially when that content is political.
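To see how this kind of filtering narrows a feed, here is a minimal sketch in Python. The post data, the hidden-topics set, and the engagement-based ranking rule are all made up for illustration; this is not any real platform’s algorithm, just a toy version of the mechanism described above.

```python
# Toy news feed: drop topics the user has hidden, then rank what's left
# by how often the user has engaged with each topic. Purely illustrative.

from collections import Counter

posts = [
    {"id": 1, "topic": "cats"},
    {"id": 2, "topic": "politics-left"},
    {"id": 3, "topic": "politics-right"},
    {"id": 4, "topic": "cats"},
    {"id": 5, "topic": "politics-right"},
]

# User-driven personalisation: actions the user took deliberately.
hidden_topics = {"politics-right"}                      # "hide posts like this"
engagement = Counter({"cats": 5, "politics-left": 1})   # likes per topic

def personalised_feed(posts, hidden_topics, engagement):
    """Remove hidden topics, then rank the rest by past engagement."""
    visible = [p for p in posts if p["topic"] not in hidden_topics]
    return sorted(visible, key=lambda p: engagement[p["topic"]], reverse=True)

for post in personalised_feed(posts, hidden_topics, engagement):
    print(post["id"], post["topic"])
# Only cat and politics-left posts remain; politics-right never appears,
# so the feed keeps reinforcing what the user already engages with.
```

Even in this simplified version, every “hide” or “unfollow” permanently removes a whole category of content, which is exactly how a feed drifts towards an echo chamber.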

Dylko et al. (2017) ran an experiment built on a similar premise. Participants completed a political preference survey and then, several weeks later, were asked to interact with a news website the researchers had created. The participants were split randomly into four groups: a control group, a user-driven group, a system-driven group, and a hybrid group combining user-driven and system-driven customisation. The website each participant interacted with was personalised according to their group and their initial survey responses.

The control and user-driven groups’ websites were not personalised before the interaction, but the user-driven group was allowed to select the views they preferred to see (either liberal or conservative). The system-driven group’s websites were personalised according to the preferences they had reported in the survey. The hybrid group’s sites were also customised beforehand, but this group then had the option to accept or reject the system’s customisation.

The study found that participants whose websites used system-driven customisation displayed a much higher level of selective exposure than those in the other groups. This study, along with others, suggests that filter bubbles play a more significant part in politics than we might think.


Feature Image by Thought Catalog on Unsplash

