It was Mark Zuckerberg who said that “a squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.” This blunt remark is closely tied to the Internet filter bubble, and today we explain what that is.
Do you open Facebook and find only articles that match your ideological line? Does that shirt you glanced at on Amazon chase you around the web in the form of a banner? Does the newsfeed of the outlet you subscribe to only confirm your opinions? It is time to talk about the filter bubble, a concept coined by Eli Pariser, activist and founder of the viral-content website Upworthy.
The author explored the term in depth in his book The Filter Bubble, recently translated, which examines changes in algorithms such as Google's, which tend increasingly toward ubiquitous personalization of results. The book analyzes the consequences this phenomenon has on us and on the very workings of democracy.
What is the filter bubble, as defined by Eli Pariser?
As its name suggests, a filter bubble is the universe that surrounds us when we search online, the result of personalization mechanisms and algorithms that select results according to the information the user has previously provided. The Internet giants use our personal data – the products we have searched for, the political content we view, the websites we visit – to tailor our browsing.
The filter bubble is thus the result of a personalized search that takes into account the user's past clicks, search history, and geographical location. Two phenomena follow: we drift away from information we do not sympathize with or that contradicts our views, and we isolate ourselves in a cultural and ideological bubble, since we receive only content adjusted to our preferences and interests.
The network's potential as a tool for empowerment, criticism, and exposure to diversity is thus squandered – a downside Eli Pariser warns about, one that “makes us more closed to new ideas, subjects and important information,” hiding results and creating the impression that nothing exists beyond our narrow interests.
The author warns that the Internet filter bubble is potentially harmful to both the individual and society, since we are less exposed to conflicting views and become intellectually isolated. “A world constructed from the familiar is a world in which there's nothing to learn, since there is invisible self-propaganda, indoctrinating us with our own ideas,” says the cyber-activist.
Other authors have described these bubbles as “ideological frames.” Figures as prominent as Sir Tim Berners-Lee, creator of the World Wide Web, have also criticized the Internet's drift toward becoming a gigantic, powerful mechanism for spying on and monitoring everything we do. Together with a group of experts, his goal is to create a different web, decentralized and independent of control by governments and companies. In an article published in The Guardian, he laid out the keys to achieving it.
Likewise, Adam Curtis, in his documentary HyperNormalisation, explains how, in his view, this filter bubble is inherent in today's computer systems – inherent in the development of Big Data and behavior prediction – generating tiny “pockets of reality” that reaffirm pre-established beliefs again and again.
How does the filter bubble work?
Although Pariser's book introduces this thorny topic by focusing on how Google's search engine works – tracking some 200 variables, such as IP address, software used, or location, to offer each user unique results – it is Facebook that takes the cake: the platform that receives the most criticism for excessive personalization and for making the perspectives we do not share disappear. A very clear example came from the activist Tom Steinberg the day after the Brexit vote: although the result favored the United Kingdom's departure from the EU, he could not find a single post on Facebook of anyone celebrating it.
The filter bubble simply relies on algorithm-curated tools and analysis of user behavior to offer users increasingly attractive, personalized products. Facebook does it in the newsfeed and Google in its search engine. They are not alone: Bing, Yahoo, YouTube, and even the New York Times use automation to “improve the user experience.” In cases such as Spotify's personalized playlists the phenomenon is fairly harmless – and in fact has advantages – but when it comes to searching for information, the filter bubble produces an increasingly polarized public opinion.
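None of these companies publish their ranking code, but the feedback loop described above can be sketched with a toy ranker that scores each candidate item by the share of the user's past clicks on that item's topic. All names and data below are invented for illustration; real systems weigh hundreds of signals, not one.

```python
from collections import Counter

def rank_feed(items, clicks):
    """Rank candidate items by the user's topic affinity.

    items:  list of (title, topic) tuples.
    clicks: Counter mapping a topic to how often the user clicked it.
    """
    total = sum(clicks.values()) or 1
    # Score = share of the user's past clicks that fell on this topic.
    return sorted(items, key=lambda item: clicks[item[1]] / total, reverse=True)

# Simulated history: the user has mostly clicked one ideological topic.
history = Counter({"left-politics": 8, "sports": 2})
candidates = [
    ("Tax plan analysis", "right-politics"),
    ("Rally recap", "left-politics"),
    ("Match highlights", "sports"),
]
feed = rank_feed(candidates, history)
# The already-favoured topic rises to the top, so it gets clicked again,
# and each click strengthens its score: the filter-bubble feedback loop.
```

Note that the never-clicked topic scores zero, so it sinks to the bottom of the feed no matter how newsworthy it is – a minimal version of the invisibility Pariser describes.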
Take the example of the Wall Street Journal: the newspaper launched a simulator in its digital edition to show what the newsfeeds of a progressive and a conservative might look like on various current issues. The screenshots showed a Democratic voter seeing Trump criticized for not releasing his tax returns, while a Republican supporter received news about the leaks of Hillary Clinton's emails.
The filter bubble also amplifies the spread of fake news, since users not only share the content they most agree with, but confirmation bias leads them to validate headlines that fit their worldview.
Is it possible to escape the algorithms?
Despite its meticulous filter bubbles, the Internet offers countless possibilities to broaden perspectives, learn about different ideas, and share opinions. Eli Pariser believes it is companies like Facebook and Google that should push for greater openness and transparency, allowing users to access the data held about them.
For their part, users should strive to seek out new ideas and new people; to consult and delete the data that technology platforms hold about them; and to clear cookies and browse in incognito mode to “mislead the algorithm” as far as possible – all while exploring different topics and points of view with a proactive attitude and broader search habits.