BCM 312

Filter Bubbles 101

Have you ever noticed Facebook suggesting posts or groups related to something you’ve recently interacted with? Maybe you’ve been flooded with memes about the Game of Thrones finale after liking the official page. Or you’re being suggested pages for a brand whose products you liked pictures of. Or, if you’re like me, you’re being shown suggestion after suggestion for cat groups because you keep liking funny cat videos.

These are all examples of personalisation algorithms at work. Facebook and other social media platforms use these algorithms to show you content that you’re more likely to interact with and enjoy. The suggestions can be beneficial, surfacing content that keeps you engaged with the platform. However, they can also be undesirable when they lead to filter bubbles.

A filter bubble does just what its name suggests: it creates a bubble out of filtered content. More technically, a filter bubble is the state of ‘intellectual isolation’ that can occur when personalisation algorithms assume what content a user wants to see (Techopedia 2019). The algorithms use a user’s ‘click history’ to work out what interests them and produce suggestions based on that data.
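To make that concrete, here’s a tiny Python sketch of how click-history ranking might work. It’s purely illustrative: the post format, the topic labels and the scoring are all invented, and a real platform would weigh far more signals than simple topic counts.

```python
from collections import Counter

# A minimal, hypothetical sketch of click-history personalisation.

def rank_feed(click_history, candidate_posts):
    """Order candidate posts by how often the user clicked their topic."""
    # Tally how many past clicks fall under each topic.
    affinity = Counter(post["topic"] for post in click_history)
    # Score each candidate by the user's affinity for its topic;
    # topics the user never clicks score zero and sink down the feed.
    return sorted(candidate_posts,
                  key=lambda post: affinity[post["topic"]],
                  reverse=True)

clicks = [{"topic": "cats"}, {"topic": "cats"}, {"topic": "got-memes"}]
feed = [{"id": 1, "topic": "news"},
        {"id": 2, "topic": "cats"},
        {"id": 3, "topic": "got-memes"}]

print([post["topic"] for post in rank_feed(clicks, feed)])
# ['cats', 'got-memes', 'news']: clicked topics float to the top
```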

On a social media platform like Facebook, a filter bubble looks like an ordinary news feed on the surface. But look a little closer and you begin to see the Game of Thrones memes and cat group suggestions multiply. The algorithms consider not only what you interact with, but also what you don’t. Eli Pariser, who first identified this concept in his 2011 book The Filter Bubble and a now-famous TED Talk, describes this cycle best:

“The challenge with these kinds of algorithmic filters, these personalised filters, is that, because they’re mainly looking at what you click on first, it can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.”
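You can simulate the cycle Pariser describes in a few lines of Python. Again, the setup is made up; the point is only that one small early bias compounds once the feed keeps rewarding it:

```python
from collections import Counter

# A toy simulation of the feedback loop: each round the user clicks
# whatever the feed shows first, and the algorithm records the click.

topics = ["cats", "news", "science", "got-memes"]
affinity = Counter({topic: 1 for topic in topics})  # balanced starting diet

for round_no in range(1, 6):
    # The feed shows the user's strongest topic first...
    feed = sorted(topics, key=lambda topic: affinity[topic], reverse=True)
    clicked = feed[0]         # ...users mostly click what appears first...
    affinity[clicked] += 1    # ...and the algorithm remembers the click.
    print(f"round {round_no}: {feed}")

# After a few rounds one topic monopolises the top slot: the balanced
# information diet has collapsed into "information junk food".
```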

Critics are also concerned about how personalisation filters work: the filters are optimised for engagement, not accuracy, so they take no account of whether the information they promote is actually true.

We must remember that personalisation algorithms are not inherently evil; used well, they can be convenient and helpful. The problem begins when the personalisation starts filtering out the views that oppose a user’s own.


Feature Image by Ilya Pavlov on Unsplash
