Algorithms, filter bubbles, and how personalization can change your perception

Personalization is usually a good thing, but it can limit the type of information you’re exposed to. Here’s why.

There’s a lot of information on the Internet – some of it good, some of it bad, and some of it not really relevant to you. Sifting through all that noise and surfacing the information you’re looking for has become a pressing need for many Internet services. The usual solutions rely on content filtering and personalization algorithms.

Websites, online stores, and search engines monitor users’ activity with the goal of providing content that best meets that user’s needs. To do this, they use algorithms that filter out items that seem irrelevant to that user’s activity. The performance of these algorithms depends on:

  • Developers’ intentions
  • User behavior
  • How multiple algorithms interact (e.g., how one algorithm uses another algorithm’s output)
  • The content to which the algorithms refer
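To make the filtering mechanism concrete, here is a minimal sketch of how such a personalization algorithm might work. All names and data are hypothetical, not taken from any real system: a user’s browsing history is distilled into an interest profile, and new items are ranked by how well their tags overlap with it. Items below a threshold never reach the user – which is exactly how a bubble forms.

```python
from collections import Counter

def build_profile(history):
    """Count how often each tag appears in the items the user has viewed."""
    profile = Counter()
    for item in history:
        profile.update(item["tags"])
    return profile

def personalize(candidates, profile, min_score=1):
    """Keep only candidates whose tags overlap the profile; rank by overlap."""
    scored = []
    for item in candidates:
        score = sum(profile[tag] for tag in item["tags"])
        if score >= min_score:  # items with too little overlap are filtered out
            scored.append((score, item["title"]))
    return [title for _, title in sorted(scored, reverse=True)]

# Hypothetical user who has been researching Samsung smartphones
history = [
    {"title": "Galaxy S24 review", "tags": {"samsung", "smartphone"}},
    {"title": "Best Android phones", "tags": {"smartphone", "android"}},
]
candidates = [
    {"title": "Galaxy S24 vs Pixel 8", "tags": {"samsung", "smartphone"}},
    {"title": "Samsung hair dryer launch", "tags": {"samsung", "appliance"}},
    {"title": "Gardening tips", "tags": {"gardening"}},
]

print(personalize(candidates, build_profile(history), min_score=2))
# → ['Galaxy S24 vs Pixel 8']
```

Note what the threshold does: the hair dryer article shares a brand tag with the user’s history, but not enough overlap to survive the filter – useful when shopping for a phone, limiting when the discarded item might have been worth seeing.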

Algorithms and filter bubbles

The concept of the filter bubble was introduced by Eli Pariser in his 2011 book, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think; his TED presentation brought the term into public consciousness.

Pariser describes the filter bubble as a private and personalized space that every person has. This space is not shared with others and mostly consists of things and ideas we like and know well. It also contains variants of those familiar and expected things, as presented to us by various Web entities.

Our information bubble is also present in the offline world, but we can more readily see the mechanisms of its activity online. This is because the virtual context allows user reactions to be magnified and because the space in which events take place is clearly delineated.

However, being surrounded by content tailored to our personal tastes deprives us of significant amounts of information that algorithms have classified as potentially unwanted. This reinforces the confirmation bias most of us unconsciously employ. This bias prompts us to instinctively search for facts (real or imagined) that confirm our previously established views. It’s this bias that can lead us to, for example, look for negative product reviews when we’ve already assumed that a product won’t work for us.

Filter bubble pros and cons

A filter bubble has pros and cons. On the plus side, filter bubbles are great when you need to narrow your choices down to a few options. If you’re, say, researching the newest Samsung smartphone, information about Samsung’s equally new hair dryer will not help you much.

That is why mechanisms creating filter bubbles are frequently applied in e-commerce: they enhance the process of matching buyers to potential products and sellers. In essence, they establish an environment conducive to closing deals.

The disadvantages of the filter bubble become clear when we want to foster creativity or consider controversial and complex ideas. Within the filter bubble, there is no room for “meaning threats”, i.e. things that provoke our anxiety and curiosity and thus encourage us to discover different points of view.

Can we step out of the bubble?

So here’s the question: Can we put an end to the filter bubble? Not really. What we can do is expand our personal bubbles and make them more diverse. Completely destroying the filter bubble stands in opposition to the concept of the bubble itself.