Apps track what we click, watch, like and follow.
The data gathered from our past behavior is used to filter content and generate a feed tailored to our preferences.
What do you like? Cat or dog videos? True crime or history podcasts? Fútbol or football highlights?
The algorithm knows how to capture our attention and keep us occupied. So, how is this bad? If I don’t care about car videos, I am glad my app removes them. Algorithmic curation is ideal for entertainment, but regrettably it filters the news and educational media we see in the same way.
If a feed is built to show only the content we already prefer, then any information we gather from it should be viewed through the lens of confirmation bias.
In a way, our timelines keep us trapped in single-minded thinking and eliminate our freedom of thought. Each topic, ranging from political ideologies to celebrity scandals, is served to us on a golden platter.
Social media has provided a platform to share knowledge that was previously out of reach. But when we are fed only one opinion, it becomes impossible to reason with others who hold different beliefs.
Undeniably, society is reluctant to listen to opposing arguments. Apps let us swipe past, dislike, and even tell the algorithm to stop showing similar media.
So, what do we do? Succumb to doom scrolling or change our way of thinking?
Reshaping an algorithm built on years of past data takes time, and surrendering to the echo chamber seems easier. Instead of treating the algorithm as the problem, turn the scrutiny on yourself.
Out of habit, we accept content as truth. Change this practice by researching the information presented to you. Documents, research papers and articles are available for you to read and form your own ideas.
Challenging our own beliefs is the path to finding compromise with others. When you talk to real people and have healthy debates, you begin to realize that everyone shares some common ground.
And that is hope that the world can get better.