by Aristee Georgiadou, March 2, 2018
Filter Bubbles: Retrain Yourself
Facebook recently announced changes to the algorithm behind its newsfeed, a feature since September 2006. How true is the better experience it promises? And how much of our privacy are we ceding in exchange for a more sophisticated experience? The AI processes billions of data points yet publishes a mere 0.2% of the stories it considers: about 60 out of 30,000 possible candidates. Whom you friend, which publishers you follow, how often you interact, what kind of content you prefer, how much interaction that content attracts, and how recent it is are all parameters of a relevancy score that ranks what appears in your newsfeed. The score is informed by a feed quality program, in which a panel of users rates stories through surveys run in 30 languages. Your control over it amounts to friending or unfriending, following or unfollowing, and hiding content that does not interest you.
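Facebook has never published the actual model or its weights, but a minimal sketch in Python can illustrate how such a weighted relevancy score might whittle thousands of candidate stories down to a few dozen. Every factor name, weight and decay rate below is a hypothetical stand-in, not Facebook's formula.

```python
import math
import time

# Hypothetical signal weights; the real model and its values are proprietary.
WEIGHTS = {
    "friend_affinity": 3.0,     # how close you are to the poster
    "publisher_followed": 1.5,  # whether you follow the publisher
    "content_type_pref": 2.0,   # how much you favor this type of content
    "interaction_count": 1.0,   # likes, comments and shares on the story
}

def relevancy_score(story, now=None):
    """Weighted sum of engagement signals, discounted by story age."""
    now = now or time.time()
    base = sum(WEIGHTS[k] * story[k] for k in WEIGHTS)
    age_hours = (now - story["posted_at"]) / 3600.0
    return base * math.exp(-0.1 * age_hours)  # recency: newer scores higher

def build_feed(candidates, limit=60):
    """Rank ~30,000 candidate stories and publish only the top ~0.2%."""
    return sorted(candidates, key=relevancy_score, reverse=True)[:limit]
```

The exponential decay is just one plausible way to encode recency; whatever the real function, the effect is the same: recent, high-interaction stories from close friends crowd out everything else.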
Beyond the obvious concern of internet platforms keeping detailed tabs on us, our online behavior carries more serious consequences in the name of personalization. EdgeRank is the Facebook algorithm that ranks our friends’ actions (called edges). Personalization steers users into “filter bubbles” by noting which sites you visit and which links you click. By following your web history, this machinery gradually limits your exposure to opposing viewpoints, much as we choose to friend like-minded people to spare our nervous systems heated political debates. And while with mass media, television or newspapers, you can actively select what to watch and read, “with personalization algorithms…many consumers don’t understand, or may not even be aware of, the filtering methodology”. As Google’s Jake Hubert put it, a foodie ends up seeing more apples instead of Apple computers.
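EdgeRank, as publicly described around 2010 (Facebook never released the exact formula), summed affinity × edge weight × time decay over all the edges attached to a story. A minimal sketch, with invented weights:

```python
# EdgeRank as publicly described: score = sum over a story's edges of
# affinity * edge_weight * time_decay. The numbers below are invented.
EDGE_WEIGHTS = {"comment": 4.0, "share": 3.0, "like": 1.0, "click": 0.5}

def edgerank(edges, now):
    """edges: (edge_type, affinity_to_creator, timestamp) tuples."""
    score = 0.0
    for edge_type, affinity, ts in edges:
        decay = 1.0 / (1.0 + (now - ts) / 3600.0)  # fades with age in hours
        score += affinity * EDGE_WEIGHTS.get(edge_type, 0.0) * decay
    return score
```

Note the feedback loop: affinity rises the more you interact with someone, which raises the score of their next edge, which makes you more likely to interact again. That is exactly the mechanism that inflates a filter bubble.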
In his May 2011 book “The Filter Bubble: What the Internet Is Hiding From You”, Eli Pariser was the first to highlight that the algorithms of internet companies such as Google have, since 2009, tended to spread falsehood and bias: they either leave unchecked the groups that manipulate social media trends, shifting the opinions of undecided voters, or leave the rest of us at the mercy of partisan extremists who game SEO through traffic. As a rule, every user sees different search results, the outcome of Google’s personalization. Google was forced to tweak its system after rampant misinformation about the Holocaust surfaced in its results. Framing content with facts is one way to enhance credibility and move to the top of search results.
Building our digital democracy in the era of the Internet of Things is a difficult task guided by the wisdom of pluralism. Historically, reduced pluralism increases conflict and undermines stability by narrowing the information that shapes people’s decisions. For some, however, the concern about virtual echo chambers is overstated, as a 2017 study of 14,000 users in seven countries showed. Users check sources, burst filter bubbles and open echo chambers, it maintains. Although people search online for news, they verify it against traditional media as well, leaving only the least skilled exposed to fake news. Relying on persuasive peers and referrals, we fall victim to the comfort zone of our newsfeed. One way out of that comfort zone is to sign up for unfiltered platforms, like Twitter.
The recent US elections (November 2016), in which 62% of Americans got their news from social media, led Facebook and Google to restrict advertising on fake-news sites in an attempt to offset interference. Still, over-believing is a problem for a society that celebrated Obama’s clever use of Facebook to win elections. Educating an entire generation to discern reliable information from misinformation is one way to burst the bubble, as is subscribing to reputable news sources with traditional gatekeeping. The Facebook newsfeed first trained us to scroll down, then retrained us to wait for the news to come to us instead of seeking it out; on top of that, we read nothing but the headlines.
“Facebook has centralized attention typically spread across the web”
thus taking up the role of a news medium as well. Hijacking advertising from the news sites themselves is an added bonus. By removing 20% of news from its newsfeed in favor of meaningful, interactive content between users, Facebook gains more engagement, as users are again retrained to skip less of what they see. Retraining number four is local news, which will eventually hijack local publishers unless they prepare against the native marketplace. Training your own readers to blog, and steering them back toward newspapers, is the answer.