
How to Think for Yourself When Algorithms Control What You Read



With the flick of a switch, a handful of tech giants can change the nature and extent of humanity's information diet. In 2013, Google took a step toward understanding the intent behind its users' searches with the Hummingbird algorithm. Twitter replaced most-recent with most-important tweets when it introduced its algorithmic timeline in 2016. And earlier this year, Facebook said it would replace clickbait with more meaningful interactions in its news feed. These changes are almost always met with public uproar for a few weeks, after which humanity acquiesces. The ability of a small elite to instantly alter the thoughts and behavior of billions of people is unprecedented.

This is all possible because of algorithms. The personalized, curated news, information, and learning feeds we consume several times a day have all been through a process of collaborative filtering: the principle that if I like X, and you and I are similar in some algorithmically determined sense, then you will probably like X too. Everyone gets their own mass-personalized feed, rationed by the machines.
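To make the principle concrete, here is a minimal, illustrative sketch of user-based collaborative filtering in Python. The users, articles, and scores are invented for the example, and real platforms use vastly larger datasets and proprietary models, but the underlying idea is the same: score items you haven't seen by how much users similar to you liked them.

```python
# Minimal sketch of user-based collaborative filtering (illustrative data only).
from math import sqrt

# Toy interaction matrix: user -> {item: rating}
ratings = {
    "alice": {"article_a": 5, "article_b": 3, "article_c": 4},
    "bob":   {"article_a": 4, "article_b": 3, "article_d": 5},
    "carol": {"article_b": 2, "article_c": 5, "article_e": 4},
}

def cosine_similarity(u, v):
    """Similarity between two users, computed over items both have rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    norm_u = sqrt(sum(u[i] ** 2 for i in shared))
    norm_v = sqrt(sum(v[i] ** 2 for i in shared))
    return dot / (norm_u * norm_v)

def recommend(user, k=2):
    """Rank items the user hasn't seen by similarity-weighted ratings from others."""
    scores = {}
    for other, other_ratings in ratings.items():
        if other == user:
            continue
        sim = cosine_similarity(ratings[user], other_ratings)
        for item, rating in other_ratings.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("alice"))  # articles Alice hasn't read, favored by users like her
```

Run at scale across billions of interactions, this same logic keeps pushing you toward whatever people "like you" already consume, which is exactly how a mass-personalized feed narrows.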

The consequences are serious and wide-ranging. Fake news and misinformation are pervasive. Young kids are being subjected to algorithmically generated, algorithmically optimized pernicious content. Perhaps the least obvious implication is that there is systemic bias in our information feeds: we operate in, and are informed by, tiny echo chambers. It is a grotesque irony that our experience of the World Wide Web today is actually pretty local, despite warnings from the likes of Eli Pariser back in 2011.

What can be done? While data scientists, policymakers, and ethics boards work on large-scale, long-term fixes, it's incumbent on us, as individual agents, to make sure we seek out and learn what we really need to know. Against the technological backdrop described above, it's more important than ever that the modern knowledge worker make good business decisions based on good information: factual, unbiased, and broad-based. Of course, resisting the algorithms is difficult, because we're up against a sophisticated, secretive system. But it's not futile, yet. Here are five practical steps you can take right now.

First, be aware of what's going on (a lot of decisions are being made for you by invisible code) and what's at stake (the consequences of a very specific, narrow worldview). Reading up on the issue is a good start. This awareness may prompt you to question the veracity and comprehensiveness of your feeds, and to find solutions more pertinent to your situation than can possibly be listed here.

Second, help the algorithm. Or game it. Change your settings to allow some randomized recommendations (if the system has this feature). Deliberately follow people with contrarian views. Proactively explore your chosen social media platforms rather than just passively consuming the content meted out; such exploration will be picked up by the algorithm and reflected in future recommendations. Force the system to cast its net wider.
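For a sense of what "casting the net wider" means mechanically, here is a hypothetical sketch of an epsilon-greedy feed: most slots are filled by personalized picks, but with some small probability a slot goes to an item from outside your usual interests. The item lists and the epsilon value are invented for illustration, and real platforms' exploration strategies are proprietary, but deliberately engaging with unfamiliar content effectively raises your own exploration rate and gives the system new signals to learn from.

```python
# Hypothetical sketch: blending exploration into a personalized feed (epsilon-greedy).
import random

personalized = ["ai_news_1", "ai_news_2", "startup_tip", "ai_news_3", "vc_take"]
out_of_bubble = ["poetry_review", "local_politics", "marine_biology", "opera_recap"]

def build_feed(slots=5, epsilon=0.2, seed=None):
    """Fill each slot with a top personalized pick, or, with probability
    epsilon, with a random item from outside the user's usual interests."""
    rng = random.Random(seed)
    ranked = iter(personalized)
    feed = []
    for _ in range(slots):
        if rng.random() < epsilon:
            feed.append(rng.choice(out_of_bubble))                 # exploration
        else:
            feed.append(next(ranked, rng.choice(out_of_bubble)))   # exploitation
    return feed

print(build_feed(seed=7))
```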

Third, get off the radar. From browsing privately in incognito mode to searching anonymously with a search engine that doesn't track you, such as DuckDuckGo, there are a host of methods at your disposal. This needn't be a permanent or absolute switch, but knowing when and how to get an unfiltered view of the web can be useful.

Fourth, consciously decide how much human influence you want. Personalized email digests and social media feeds are algorithmically determined; traditional editorial is still picked by the human hand. Of course, human curation is subject to bias too. But it is also subject to scrutiny from a larger audience, clearer lines of responsibility (the buck stops with the editor-in-chief), and established ethics and standards, and some publications at least aspire to deliver comprehensive rather than merely subjectively relevant coverage.

Fifth, step out of the digital echo chamber by stepping out of digital altogether. The physical world is unfiltered and reliably chaotic. Pay greater attention to the feelings, observations, musings, and conversations you have in real life.

The influence of algorithms is immense and double-edged. Some of the toxic effects have been alluded to here. But the benefits are also substantial: from thousands, millions, even billions of pieces of mostly irrelevant content, algorithms serve up a feed of compulsive, inspiring nuggets. Fix your feed to optimize your perspective on the World Wide Web as well as the big wide world.


