
How filter bubbles confirm our biases and what we can do about it

by Kashish Masood
4 min read


Balancing personalized content with other sources of information

Last week my sister showed me a cute cat video on Instagram (because what else do you use social media for?). She then proceeded to scroll down her explore page and got lost in a sea of photos, videos and stories that caught her interest. Beyond initially squealing over the cute cat video (don’t judge!), I realized from this short interaction how different my own explore page looked.

Having written before on the topic of personalization, I’m not surprised by how our online activity is used to shape the user experience. Our interactions provide personalization algorithms with data, which in turn provide us with tailored content.
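To make that loop a little more concrete, here is a minimal sketch in Python of the kind of logic involved: count the topics someone has engaged with and rank new content by those counts. The topics and data are made up, and real recommender systems are far more sophisticated, but the basic feedback principle is the same.

```python
from collections import Counter

# Hypothetical interaction log: topics of posts a user has liked or watched.
interactions = ["cats", "cats", "travel", "cats", "cooking"]

# Build a simple interest profile from interaction counts.
profile = Counter(interactions)

# Candidate posts, each tagged with a single topic (illustrative data only).
candidates = [
    {"id": 1, "topic": "cats"},
    {"id": 2, "topic": "politics"},
    {"id": 3, "topic": "travel"},
    {"id": 4, "topic": "cooking"},
]

# Rank candidates by how often the user engaged with their topic;
# topics the user never touched ("politics") sink to the bottom.
ranked = sorted(candidates, key=lambda post: profile[post["topic"]], reverse=True)
print([post["topic"] for post in ranked])  # ['cats', 'travel', 'cooking', 'politics']
```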

This experience made me wonder what type of information we are NOT presented with and how that influences the lens with which we view the world.

In this article, I’ll dive into how personalization can limit the type of information we’re exposed to. I’ll also explore ways in which we can achieve a balanced media diet while reaping the benefits of personalization.

Introducing filter bubbles

The concept of having a personalized space online is not new. It was first introduced as the ‘filter bubble’ by Eli Pariser in his famous TED talk back in 2011. Pariser describes the filter bubble as a private space consisting of things and ideas we like based on our interactions with the Web.

Filter bubbles exist for a practical reason. We are constantly bombarded with information from different channels, which makes it harder to act on that information and make a choice. For example, when you’re planning a vacation to Spain, information about vacation deals in other countries won’t help you much. That’s where filter bubbles come in: by presenting you only with information that is relevant to your situation, they create an environment that suits your needs.

So, what’s the problem here?

When you are only presented with facts that align with your views, it’s easy to end up with a biased worldview, because alternative perspectives are never explored. Online, this bias is more subtle, as you don’t actively choose what is shown in your social media feeds and Google searches. Since the personalization algorithms work ‘behind the scenes,’ you are less likely to realize you are in a filter bubble at all.

For instance, when UK citizens cast their votes in the Brexit referendum, many older citizens voted to leave the European Union. This caught the younger generation by surprise: they were in an online bubble where the sentiment ran the other way, so they were unable to consider the less visible views of older citizens.

On top of that, consuming information this way can lead to a snowballing confirmation bias over time: the digital content you interact with generates further, similar content for you to interact with in the future, amplifying the effect of the bias.
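To see how quickly that snowball can grow, here is a toy simulation in Python. The numbers and topics are invented and don’t reflect any real platform, but the mechanism is the one described above: the feed shows more of whatever was clicked before, the clicks reinforce the weights, and one topic gradually crowds out the rest.

```python
import random
from collections import Counter

random.seed(0)

# Toy filter-bubble simulation (illustrative numbers only).
topics = ["cats", "politics", "travel", "cooking"]
clicks = Counter({t: 1 for t in topics})  # start with no strong preference

for week in range(5):
    # The feed samples 20 posts in proportion to past clicks.
    weights = [clicks[t] for t in topics]
    feed = random.choices(topics, weights=weights, k=20)
    # The user mildly prefers cats; every click feeds back into the weights.
    for topic in feed:
        if random.random() < (0.8 if topic == "cats" else 0.3):
            clicks[topic] += 1
    share = clicks["cats"] / sum(clicks.values())
    print(f"week {week + 1}: cats make up {share:.0%} of engagement")
```

Run it a few times and the share of cat content tends to keep creeping up, even though the underlying preference never changes.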

Where does that leave us?

Since filter bubbles have a functional purpose, getting rid of them does not seem like an effective solution. At the same time, the example above demonstrates the need to expand our personal bubble with more diverse perspectives. So how do you balance personalized content with other sources of information?

1. Help users recognize when they are in a filter bubble

The first step to avoiding the downsides of filter bubbles is for users to recognize when they are inside one. This could be done, for instance, by indicating that the recommended content users are viewing doesn’t present a balanced view. Some applications already try to tackle this issue. The newsletter Knowhere uses AI to write unbiased news stories, although this approach removes filter bubbles completely rather than making them visible. Gobo, an MIT Media Lab project, helps users see what gets hidden on their social networks, creating awareness of which personalization algorithms are at work. This shows users to what extent their content consumption is based on their specific interests.
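As a rough illustration of what such a ‘balance indicator’ could look like under the hood, the sketch below scores the topic mix of a feed with Shannon entropy and flags a narrow mix. This is a simplified assumption of mine, not how Knowhere or Gobo actually work.

```python
import math
from collections import Counter

def topic_entropy(feed_topics):
    """Shannon entropy of a feed's topic mix: a low score means a narrow bubble."""
    counts = Counter(feed_topics)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Two hypothetical feeds with hand-picked topics, for illustration only.
narrow_feed = ["cats"] * 9 + ["travel"]
mixed_feed = ["cats", "politics", "travel", "cooking", "science"] * 2

for name, feed in [("narrow", narrow_feed), ("mixed", mixed_feed)]:
    score = topic_entropy(feed)
    note = "show a 'limited perspectives' hint" if score < 1.0 else "looks balanced"
    print(f"{name} feed: entropy = {score:.2f} -> {note}")
```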

2. Create multiple ways of exploring content

Ideally, users should be able to explore content through routes other than recommendation lists. This can be achieved by creating multiple ways for users to explore content on platforms and apps. For example, even though Netflix is well known for its personalization algorithms, it still lets users search for new content by genre and by what’s trending. Apps could even add a ‘Surprise me’ button for users who want to be exposed to content outside their filter bubble.
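One simple way to implement that combination, sketched below in Python with made-up item names, is to reserve a slice of every feed for items from outside the user’s profile and let a ‘Surprise me’ mode flip that balance entirely. This is an illustration of the idea, not any platform’s actual logic.

```python
import random

random.seed(1)

# Hypothetical candidate pools (names are illustrative only).
personalized = ["cat video", "cat meme", "kitten livestream",
                "cat compilation", "cat care tips"]
catalogue = ["politics explainer", "jazz playlist", "woodworking how-to",
             "travel vlog", "cooking show"]

def build_feed(size=5, explore_ratio=0.2, surprise_me=False):
    """Mix recommended items with items from outside the user's bubble.

    explore_ratio reserves a slice of the feed for non-personalized picks;
    'Surprise me' mode flips the balance entirely.
    """
    ratio = 1.0 if surprise_me else explore_ratio
    n_explore = round(size * ratio)
    feed = random.sample(catalogue, n_explore)          # outside the bubble
    feed += random.sample(personalized, size - n_explore)  # familiar picks
    random.shuffle(feed)
    return feed

print(build_feed())                  # mostly personalized, with one wildcard
print(build_feed(surprise_me=True))  # entirely outside the bubble
```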


The main takeaway

Filter bubbles exist for a practical reason, and because of that practicality it would be illogical to remove them completely. But it is important to give users the choice to view content the way they want. There are moments when users are completely fine with listening to curated music playlists, and there are times when they want to actively explore new music without any recommendations.

Catering to both types of experience by combining personalized content with other sources provides a much richer and more valuable experience for people.

Kashish Masood

Kashish is a UX researcher and strategist, based in the Netherlands. She is passionate about creating unique experiences by exploring the intersection of future trends, tech and human behaviour. 
