Filter Bubbles: Retrain Yourself

by Aristee Georgiadou March 2, 2018


Facebook recently announced changes to the algorithm behind its newsfeed, a feature since September 2006. How true is the better experience it promises? And how much of our privacy is being ceded for a more sophisticated involvement? The AI processes billions of data points yet publishes a mere 0.2% of the stories it considers: about 60 out of 30,000 possible candidates. Whom you friend, which publishers you follow, how often you interact, what kind of content you prefer, how many interactions that content attracts, and how recent it is are all parameters of a relevancy score that ranks what appears in your newsfeed. The score is informed by a feed quality program built around a panel that rates stories through a survey conducted in 30 languages. Your control over it amounts to friending/unfriending, following/unfollowing, and hiding content that doesn't interest you.
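To make the idea concrete, here is a minimal sketch of how a relevancy score might combine the parameters listed above. The weights, the half-life, and the field names are all hypothetical; Facebook's real model is far more complex and its parameters are not public.

```python
# Hypothetical weights for the signals the article lists; illustrative only.
WEIGHTS = {"affinity": 0.5, "interactions": 0.3, "recency": 0.2}
HALF_LIFE_HOURS = 24.0  # assumed recency half-life

def relevancy_score(story):
    """Combine affinity, interaction volume, and recency into one score."""
    recency = 0.5 ** (story["age_hours"] / HALF_LIFE_HOURS)  # exponential decay
    interactions = min(story["interactions"] / 100.0, 1.0)   # cap runaway virality
    return (WEIGHTS["affinity"] * story["affinity"]
            + WEIGHTS["interactions"] * interactions
            + WEIGHTS["recency"] * recency)

def rank_feed(candidates, top_n=60):
    """Publish only the top slice -- e.g. 60 of 30,000 candidates (0.2%)."""
    return sorted(candidates, key=relevancy_score, reverse=True)[:top_n]

stories = [
    {"id": "old_viral",    "affinity": 0.2, "interactions": 500, "age_hours": 72},
    {"id": "close_friend", "affinity": 0.9, "interactions": 5,   "age_hours": 2},
    {"id": "stale_page",   "affinity": 0.1, "interactions": 2,   "age_hours": 96},
]
feed = rank_feed(stories, top_n=2)
# A fresh post from a close friend outranks an old viral story.
```

Even this toy version shows the trade-off the article describes: the score rewards what you already engage with, so what you see is a function of what you have already done.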

 

Beyond the obvious concern of internet platforms keeping detailed tabs on us, our online behavior comes with more serious consequences in the name of personalization. EdgeRank is the Facebook algorithm that ranks the summary of our friends’ actions (called edges). Personalization sends users into “filter bubbles” by noting which sites you visit and which links you click. By following your web history, this structure gradually limits your exposure to opposing viewpoints, much as we choose to friend like-minded people to avoid upsetting our nervous system with heated political debates, for example. And while with mass media, television, or newspapers you can actively select what to see and read or not, “with personalization algorithms…many consumers don’t understand, or may not even be aware of, the filtering methodology”. As Google’s Jake Hubert put it, a foodie ends up seeing more apples instead of Apple computers.
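The narrowing mechanism can be sketched in a few lines: rank candidate articles by how often the user has already clicked their topic. The topic labels and article titles below are invented for illustration; no real system is this crude, but the feedback loop is the same.

```python
from collections import Counter

def personalize(candidates, click_history):
    """Naive personalization: articles on topics the user has clicked
    before rise to the top; topics never clicked sink to the bottom,
    so exposure to other viewpoints gradually shrinks."""
    seen = Counter(click_history)
    return sorted(candidates, key=lambda a: seen[a["topic"]], reverse=True)

# Hypothetical user who mostly clicks food stories.
history = ["food", "food", "food", "tech"]
articles = [
    {"title": "Best apple pie recipes",    "topic": "food"},
    {"title": "Apple unveils new laptop",  "topic": "tech"},
    {"title": "Election debate recap",     "topic": "politics"},
]
ranked = personalize(articles, history)
# The foodie sees apples first; politics lands last and, in a feed that
# shows only the top slice, may never be seen at all.
```

Each click on a top-ranked story feeds back into `click_history`, which is exactly how a filter bubble tightens over time.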

 

In his May 2011 book “The Filter Bubble: What the Internet Is Hiding From You”, Eli Pariser was the first to highlight that the algorithms of internet companies such as Google have, since 2009, tended to spread falsehood and bias: they either leave unchecked the groups who manipulate social media trends, shifting the opinions of undecided voters, or leave the rest of us at the mercy of partisan extremists who game SEO through traffic. As a rule, every user sees different search results, the outcome of Google’s personalization. Google was forced to tweak its system after rampant misinformation about the Holocaust surfaced at the top of results. Framing content with facts is one way to enhance credibility and move up the search rankings.

Building our digital democracy in the era of the Internet of Things is a difficult task that demands the wisdom of pluralism. Historically, reduced pluralism increases conflict and undermines stability by narrowing the information that shapes people’s decisions. To some, however, the concern about virtual echo chambers is overstated, as a 2017 study of 14,000 users in seven countries suggests: users check sources, burst filter bubbles, and open echo chambers, it maintains. Although people search for news online, they verify it against traditional media as well, which leaves only the least skilled exposed to fake news. Yet by relying on convincing peers and referrals, we are victimized by the comfort zone of our newsfeed. One way to insulate ourselves is to sign up to less filtered platforms, like Twitter.

 

The recent US elections (November 2016), with 62% of Americans getting their news from social media, led Facebook and Google to restrict advertising on fake-news sites in an attempt to offset interference. Still, over-believing is a problem for a society that celebrated Obama’s clever use of Facebook to win elections. Educating an entire generation to discern reliable information from misinformation is one way to burst the bubble, as is subscribing to reputable news sources with traditional gatekeeping. The Facebook newsfeed first trained us to scroll down, then retrained us to wait for the news to come to us instead of seeking it out; on top of that, we read nothing but the headlines.

“Facebook has centralized attention typically spread across the web”

thus taking up the role of a news medium as well. Hijacking advertising from the news sites themselves is an added bonus. By removing 20% of news from its newsfeed in favor of meaningful, interactive content between users, Facebook gains more engagement, as users are again retrained to skip less of what they see. Retraining number four is local news, which will eventually hijack local publishers unless they prepare against the native marketplace. Training your own readers, through blogs and newspapers, is the answer.


1 thought on “Filter Bubbles: Retrain Yourself”

  1. The term “filter bubble” was introduced by the Internet activist Eli Pariser in 2010. So what is a filter bubble? A very brief definition is “intellectual isolation”! So, is it a contemporary phenomenon? No! The ancient Greek philosopher Plato, in his famous philosophical text “Symposium”, was the first to mention the adage “Όμοιος ομοίω αεί πελάζει”, meaning “people tend to approach people with the same social characteristics and ideology”; a similar English adage is “birds of a feather flock together”. So in layman’s terms, members of a filter bubble group tend to live in an ideological feedback loop which extends both online and offline.

    Let’s examine the offline system! The key term is “gatekeeping”, and especially “news media gatekeeping”. Offline, people inside a particular filter bubble are still exposed to evidence that contradicts and disproves their ideology, hence they are provided with the information needed to reject a certain extreme ideology they practice, for example racism. Unfortunately, the opposite happens online! The main reason is the algorithmic design of SNS, websites, and search engines, which selectively guess what information a user would like to see based on information about the user, such as location, past click behavior, and search history.

    The best solution for the online system is for SNS, websites, and search engines to assume the role of information and news gatekeeper! Retraining, as the initial post suggests, is a valid solution, but it is not applicable to the majority of isolated filter bubbles; therefore, since most users get their news online, news gatekeeping is the massively applicable online solution!
