Algorithms

Algorithms: hard to spell, and even harder to understand, since the word suggests a complex formula that most people neither grasp nor care to learn about. An algorithm is ‘a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer’, explains Google.

The algorithms I am interested in are the ones based on my online data and behaviour, the ones that pre-select what information I see. They suggest which movies I should watch on Netflix based on what I have already seen, decide what appears in my Facebook and Twitter feeds, and choose the advertisements that creepily relate to whatever I last searched for on Google.
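
None of these services publish their actual formulas, but the basic idea of behaviour-based recommendation can be sketched in a few lines. The titles and genre tags below are made up, and real systems are vastly more sophisticated; this is only a toy illustration of how past choices become future suggestions.

```python
# A toy sketch of behaviour-based recommendation (not any real service's algorithm).
# Unseen titles are scored by how many genre tags they share with what I have
# already watched, and the highest-scoring ones are shown first.

watched = {
    "The Matrix": {"sci-fi", "action"},
    "Blade Runner": {"sci-fi", "noir"},
}

catalogue = {
    "Alien": {"sci-fi", "horror"},
    "Notting Hill": {"romance", "comedy"},
    "Ex Machina": {"sci-fi", "drama"},
}

# My "profile" is simply the set of tags from my viewing history.
profile = set().union(*watched.values())

# Score each unseen title by how many tags it shares with my profile.
scores = {title: len(tags & profile) for title, tags in catalogue.items()}

for title, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(title, score)
# Alien and Ex Machina rank above Notting Hill, purely because of my past choices.
```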

Facebook tested an algorithm on some of its users (without their knowledge), exposing them to predominantly happy status updates in their feeds. Those users ended up posting happier updates themselves (and vice versa when exposed to sad content). Spotify, for its part, has figured out how to generate excellent playlists based on individual preferences.

‘We decide what gets attention based on what we give our attention to. That’s how the media works now. There’s all these hidden algorithms that decide what you see more of and what we all see more of based on what you click on.’ – Sally Kohn 

Sally Kohn reminds us, in less than five minutes, that we help shape our own online culture through what we click on. If we click on tabloid headlines and cute kittens, well, then that is exactly the kind of ‘news’ we will see more of.
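
The feedback loop she describes is easy to simulate. The sketch below is entirely hypothetical (the item names, weights and the 1.5 multiplier are my own invented numbers, not any platform’s ranking code), but it shows how clicking only on certain content makes that content crowd out everything else.

```python
# A toy simulation of the click feedback loop: items I click on get a higher
# weight, so the feed keeps showing me more of the same.

import random

weights = {"investigative report": 1.0, "tabloid headline": 1.0, "cute kitten": 1.0}

def next_item():
    """Pick the next story, with probability proportional to its current weight."""
    items, w = zip(*weights.items())
    return random.choices(items, weights=w, k=1)[0]

def click(item):
    """Each click nudges the weight up, so the item is shown more often."""
    weights[item] *= 1.5

# Simulate a reader who only ever clicks kittens and tabloid headlines.
for _ in range(50):
    item = next_item()
    if item in ("tabloid headline", "cute kitten"):
        click(item)

print(weights)  # the kitten and tabloid weights dwarf the investigative report
```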

Last week I was at a human rights forum gathering 700 participants around the topics of refugee protection, social inclusion and the digital age. As expected, refugees’ rights took up much of the space, attracting many compelling speakers, experts and testimonies. It surprised me, though, that the digital age was such unknown territory, both for the organisers and for the participants. Few speakers brought in-depth technical knowledge, and the few participants attending those sessions were mostly there to learn rather than to contribute. One inspiring speaker, though, was Katarzyna Szymielewicz, President of the Panoptykon Foundation, who talked about algorithms. Ms Szymielewicz argued that as long as algorithms remain black boxes, we can never convince users that they may violate their rights. What she meant was that people in general do not seem to mind their data being used as long as they have a smooth browsing experience. Rights actors should not simply argue against ‘smooth browsing’ without really explaining the impact and cost of sharing private data; if we do, we will be seen as the ones withholding rights rather than those protecting them.

Digital rights and data privacy are things most people know little about. Some recklessly share all their personal information online and reuse one and the same password, a combination crackable in milliseconds (such as the names of their children and their birthdays). Others are suspiciously cautious and avoid any possible interaction online, especially if it asks for their personal details; they would rather spend hours on the phone or correspond via snail mail than turn to the web to access the same services. In some cases, though, it is not a matter of resistance but of accessibility, as the web is not yet inclusive of persons with disabilities. While one person is naively unaware of the algorithms that determine the information they are exposed to, another foolhardily believes they have outsmarted the algorithms by staying offline. And if the result of algorithms is our biased choices, then isn’t our brain also made up of algorithms that we are not always aware of?
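
To see why ‘child’s name plus birthday’ really is a milliseconds problem, here is a back-of-the-envelope calculation. The name-list size and the guessing rate are rough assumptions of mine, not measurements of any particular attack, but the order of magnitude is the point.

```python
# Rough estimate of the guessing space for a "name + birthday" password.
common_first_names = 1_000          # a modest name list an attacker might try (assumed)
birthdays = 366 * 100               # day of year times ~100 plausible birth years
combinations = common_first_names * birthdays

guesses_per_second = 1_000_000_000  # offline guessing against a leaked hash (assumed rate)

seconds = combinations / guesses_per_second
print(f"{combinations:,} combinations ≈ {seconds * 1000:.2f} ms to exhaust")
# 36,600,000 combinations ≈ 36.60 ms — a matter of milliseconds.
```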

‘We trust decisions made by humans more than decisions made by algorithms; we prefer the natural over the artificial. Decision makers believe they can overrule the formula based on additional information and their own intuition. Unfortunately, they are more often wrong than not.’ – read my blog post about Kahneman

We should be as informed and consciously aware as consumers of products and services online as we are offline. We have, for example, learned that buying ridiculously cheap clothes contributes to human rights violations such as unfair working conditions in production, and some therefore try to buy organic, fair-trade and/or local products instead. It is time we became equally informed about the consequences of our activities online, by claiming our digital rights, protecting our privacy and shaping our online behaviour and culture.
