The “Like” Button Isn’t As Innocent As You Might Think

If you thought the roundness of the earth was a purely self-evident fact, well, you were mistaken.

In a YouGov survey of more than 8,000 Americans, one in six admitted they were not entirely sure the earth is really round. In a similar survey among Brazilians, as many as 7% believed the earth is in fact flat.

And this is not simply about ignorance. There is a strong (and growing) movement of flat-earth believers. More than 600 people attend their annual conferences. That, of course, is no social dilemma. Let people believe what they want, because we are free.

There are people who believe guns should be legal everywhere, and abortion available to any woman who wants one. There are men who want to be treated as women today and as men again tomorrow, as they feel at any given moment.

There are people (though these are few) who still believe that witches live in the world.

No, none of this is a social dilemma. People just believe in this or that, and that freedom belongs to us – as it should.

What is a social dilemma, though, is the “like” button on Facebook.

To Like or Not to Like

Yes, I know it sounds funny. But let’s take that blue button right now as a symbol for all the social networks that have so deeply infiltrated our daily lives. Not only Facebook, but all the others: Twitter, Instagram, Snapchat, TikTok, YouTube, and so on.

What’s so disturbing about that poor “like” button?

In a word: algorithm.

Don’t be intimidated, I’m not going to explain anything technical in programming jargon. Just think about what you see on the surface.

The New Equation for Everything?

The algorithms that power social networks confront us with a huge social dilemma. Over time, we will actually be forced to make important decisions about them, at both a personal and a political level. Even a global one!

There is a great documentary called The Social Dilemma. If you have Netflix, I recommend you watch it. I saw it a year ago, so I’ve had plenty of time to think about it.

This documentary would be no different from many similar films, but here the authors succeeded in putting social media insiders in front of the cameras, including those who created or co-created the algorithms themselves.

Like Justin Rosenstein, to stay with the Facebook “like” button that I’m putting on the spot today.

He invented it, along with all the functionality the button brings. He says it was initially conceived as a tool to spread “positivity.” Of course, it’s nice to post something and then get likes for your post. What could be wrong with that?

Well, it turns out people are disappointed if they don’t get likes. As a result, they adapt their behavior to attract more likes. There’s a huge population of kids and teenagers who feel the need to get likes in any way possible.

During this period of their development, approval and validation are more or less everything they crave. Many live a nightmare if they get ‘too few’ likes or none at all – and their peers see it. Eventually, children start to dismiss what their parents tell them at home. They prefer to be cool in the eyes of their peers.

To be fair, it’s not much different for adults. Maybe a little less dramatic. However, the fact is that experts are recording a rise in depression, self-harm, and even suicide attempts precisely because of such social network mechanisms.

The “like” button is not as innocent as it seems at first sight. But that’s not all.

Not-So-Social Networks

The algorithm that works behind such a simple button is complex, and it does more than just add up and display likes.

To put it very simply, every time you click such a button (not only on Facebook, but on all the other social networks too: Twitter, Pinterest,…), this information is recorded in your social network profile.

The system that records and computes all these statistics is more or less automatic. When you like, share, or comment on a particular type of content, the system accumulates that data and evaluates it.

Yes, if you are active on social media, a lot of data has been collected about you. All the social network wants to know is what interests you most, what you like most, and where your attention goes, so it can offer you more similar content.
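I promised no programming jargon, but for the curious, the mechanism just described can be sketched as a toy model. This is purely illustrative: every name and weight below is invented, and real platforms use vastly more elaborate machine-learning systems than a simple counter.

```python
from collections import Counter

# Toy sketch of engagement-based recommendation (illustrative only).
# A real platform tracks far richer signals than likes, comments, and shares.
class ToyFeed:
    def __init__(self):
        self.interest = Counter()  # topic -> accumulated engagement score

    def record(self, topic, action):
        # Weight stronger signals more heavily: a share says more than a like.
        weights = {"like": 1, "comment": 2, "share": 3}
        self.interest[topic] += weights[action]

    def suggest(self, catalog, n=3):
        # Rank available posts purely by how much you already engaged with
        # their topic -- always more of the same, never a different view.
        return sorted(catalog, key=lambda post: -self.interest[post["topic"]])[:n]

feed = ToyFeed()
feed.record("flat_earth", "like")
feed.record("flat_earth", "share")
feed.record("gardening", "like")

catalog = [
    {"title": "Ice wall photos", "topic": "flat_earth"},
    {"title": "Tomato tips", "topic": "gardening"},
    {"title": "Satellite imagery explained", "topic": "astronomy"},
]
top = feed.suggest(catalog, n=1)[0]
print(top["title"])  # the flat-earth post ranks first
```

Notice that nothing in this sketch asks whether a topic is true or healthy; the ranking rewards past engagement, which is the whole point of the criticism that follows.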

You’d be naïve to believe they care that much about educating you thoroughly on a subject. No, no, their goal is only one:

‘xyz (insert your name) needs to spend as much time on our site as possible, because we have a whole host of advertisers who pay us dearly to show xyz ads they may be interested in.’

Google, as the largest online content provider, works similarly. We actually pay for everything we use “for free” on these high-capital platforms with our clicks and the ad views they show us. From their point of view, it would be completely pointless to suggest and offer content that may not interest you.

Why would they, if it risked you going away to look elsewhere? They would lose a lot of xyzs, and consequently the advertisers who pay them for a presence on a seemingly free system.

So if the algorithm detects that you’re interested in flat earth theory, you’ll receive a lot of similar content on your wall. You may even receive a conference invitation.

If you’re interested in guns and believe you should have at least one semi-automatic pistol under your bed, the algorithm will suggest plenty of content from people and media outlets who believe the same. Maybe even Trump will add you as a friend on Facebook.

If someone believes in fringe gender theories, in abortion on demand, or that witches live in the world, then Google and the social networks will offer them more of the same.

Don’t blame me for choosing cases like these; it is just easier to make the point that way. They are more extreme and delicate, but the same goes for every area, including those that might interest you or me.

You understand, it’s automatic. Behind the system, behind the algorithms, there’s no person sitting there thinking, “Wait, maybe it would be wise to offer xyz some other view, some different content.”

The algorithm is cold, neutral math and statistics, working so that the most benefit (read: profit) goes to the platform that developed it.

So here’s the social dilemma:

How can we become a healthy society if we each think only one way, without ever having the option to see different ideas?

Unfortunately, social networks are not helping at the moment. Quite the opposite: they help us remain even more divided and even more convinced that we are right. This is a shame, because they really have all the power, opportunity, and tools to make a positive contribution to our mutual relations.

In this way, they help frame our world and our views, which eventually become our reality. And all just to keep us in front of our screens and phones for as long as possible.

If we each consolidate only one single view, wars and conflicts, even escalating ones, are inevitable in the future, even if (more or less) all of us want peace.

There is always another view, and I personally believe that a mature person is one who can look from multiple angles and understand the views of the “other side.”

Who was it who said that the most dangerous man is the one who has read only one book?

Not only by reading, speaking, and commenting, but also through encounters and genuine, living human contact will we keep an open mind.

Someone once wittily wrote:

What is the opposite of the social network? Social life.
