Three days ago I watched Netflix’s film The Social Dilemma. The film discusses the growing problem of social media and the gradual fracturing of our society.
If you are looking out at the world today and wondering why things seem so crazy, I strongly encourage you to watch this film. You will find that social media is likely the reason, or at least the reason that is aggravating all of the others. Let’s see why.
Most of us think of design, engineering, and technology as things that make our lives more intuitive, functional, and ultimately easier. That has held true for most of our history, as new technology has enabled us to lead more enjoyable, happier, and more fulfilling lives.
If you ask a tech product designer, they will probably explain that the whole purpose of technology and design is to create happier humans by fulfilling our needs and making our daily tasks easier. But is it possible that the constant fulfillment of our moment-to-moment needs is making us less happy?
It’s the gradual, slight, imperceptible change in your behavior and perception that is the product. It’s the only possible product. - Jaron Lanier
The Social Media business model
Most of the problems with social media come down to the business model.
Why? You see, in the social media business model, the services are free. Think of YouTube, Facebook, Twitter, Snapchat, or Instagram. These are all free services. Sort of. You pay for them with your attention, and your attention is the most valuable resource in this business model. Your attention is, in fact, the product. More of your attention equals more money for social media companies in the form of advertising revenue. Because of this link, there is simply no limit to how much of our attention social media companies would like to have.
You may think, OK, but this is nothing new. This is a well-known business model. Newspapers and television companies have used it for many decades, so why is it so different this time?
Fair question. For one thing, TV and newspaper content was not specifically tailored to you. Sure, there were different channels you could watch, but the content was generated and distributed in a very different way.
If you planned to watch a TV show and then get back to some other activity, you could easily do so. You most certainly didn’t bring your TV everywhere you went. Today, we have content and social media platforms sitting in our pockets wherever we go.
And not just that. The content is shaped into endless scrolling feeds, where clusters of supercomputers, billions of times more powerful than the human brain, use enormous amounts of data on each and every scroll just to figure out the next best piece of content to show, in order to keep us engaged and on the platform.
How did it all begin?
Let’s suppose you are YouTube. You will probably add a feature to autoplay the next video. Yes, it is annoying. But let’s say this increases your average daily watch time by 3%.
And attention is obviously a limited resource. We all have a fixed number of waking hours in the day. This means that as attention starts to saturate, other big tech companies need to dial up how persuasive they are in competing for it.
So now Facebook is sitting back and thinking: we need to add video autoplay to the news feed as well. Consequently, YouTube is now thinking: OK, we could probably feed the person’s Google search data into our video prediction models and make the next-video recommendations even more accurate, engaging, and harder to resist.
This simple example illustrates how these platforms will keep evolving to capture more attention. And the result? People are spending more time on social media platforms than ever before. With each competing iteration, we reach a new equilibrium in which more human attention is consumed by social media companies.
Scheduling blocks of people’s time
Let’s imagine that you are a product designer at Snapchat. Because of how frequently people turn to their phones, you essentially have the power to schedule little blocks of people’s time.
If you notify me immediately of every Snapchat message (and Snapchat is probably one of the most abusive, manipulative technology products out there), I will see my friend’s message as urgent, and it will almost certainly cause me to swipe over. But I won’t just see that message. I will probably get sucked into other feeds and spend time on an activity I didn’t plan for. Make no mistake, this doesn’t happen by accident. You, as a product designer at Snapchat, planned this activity for me through subtle and well-thought-out manipulations.
Experiencing vs. Remembered self
Let’s say you are a product designer at Facebook, and you want to be ethical. Your decisions affect billions of people, and you need to decide between steering people toward two different timelines:
- Schedule A, in which some events will happen.
- Schedule B, in which some other events will happen.
So, should you schedule something that people might find challenging and fairly difficult in the moment, but that they will later feel was valuable and fulfilling? Or do you steer them toward something that rewards them immediately, but that they will come to regret later?
Behavioral economists are well aware of this distinction. They call it the difference between the Experiencing self and the Remembered self.
For example, if you give somebody a questionnaire asking whether their time on Facebook was well spent, you are talking to the Remembered self. If you were to experience-sample people along the way, while they are using these apps, you would be talking to the Experiencing self. And you will almost certainly get two quite different measures and responses. Why is that?
It’s simple. Facebook’s algorithms are in an arms race for attention, and Facebook needs to maximize time spent on site. That’s all that matters. And the Remembered self isn’t the one spending time on the site; it’s the Experiencing self.
So are you saying that as a product designer at Facebook I will be steering people into choices that are shallow and empty, and which their Remembered self will come to regret? Yes, I’m afraid so. This is how the attention economy works. And it’s not just Facebook; you might as well be working for YouTube, Instagram, TikTok, or Netflix. It doesn’t matter. The goal is the same.
The slot machine technique
If you use Twitter, notice that when you land on it there is a variable time delay of one to three seconds before the number shows up on top of the notification icon. That delay, in fact, makes it very similar to a slot machine. Literally, when you load the page, it’s as if you are pulling the lever without knowing what you are going to get.
The idea is that you pull the lever, and sometimes you get 3 notifications, sometimes nothing at all, and sometimes 20. This is by no means an accident. It is a simple design hack that creates variable-schedule rewards and tricks your brain with small, barely noticeable dopamine hits.
These dopamine hits are slowly forming unconscious habits by affecting the brain’s reward pathways.
Every product designer knows this very well. If you are reading this and thinking, OK, but this couldn’t affect people that much, think again. Ask yourself: could this be connected to the fact that billions of people look at their phones more than 150 times a day?
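As a toy illustration of the mechanism described above, here is a variable-ratio reward schedule sketched in a few lines of Python. All the probabilities, payout ranges, and the `refresh_notifications` function are invented for illustration; they are not Twitter’s actual numbers.

```python
import random
import time

def refresh_notifications(suspense=0.0):
    """One 'pull of the lever': refresh the feed and reveal notifications.

    The short, variable delay and the unpredictable payout are what make
    this a variable-ratio reward schedule. All numbers are invented.
    """
    time.sleep(suspense)  # stands in for the artificial 1-3 s suspense delay
    roll = random.random()
    if roll < 0.5:
        return 0                       # often: nothing at all
    elif roll < 0.9:
        return random.randint(1, 3)    # usually: a small payout
    else:
        return random.randint(4, 20)   # rarely: a jackpot

# Ten refreshes yield an unpredictable mix of zeros, small wins, and jackpots.
counts = [refresh_notifications() for _ in range(10)]
print(counts)
```

It is precisely this unpredictability, not the average payout, that makes the pattern habit-forming: a fixed "you always get 2 notifications" schedule would be far less compelling.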
Machine learning and distribution of content
There is an almost infinite supply of news and information that can be distributed. Algorithms have to decide which of it reaches whom. And in the attention economy, they have to do it in a way that maximizes time spent on the platform.
Without any real person overseeing them, today’s algorithms discover people’s invisible traits, build models and predictions of their behavior, and recursively learn what to show in their feeds and what not to show.
If an algorithm determines that showing people fake news, conspiracy theories, extremist thinking, or outrage-generating information keeps their attention, make no mistake: it will show more of it.
And it’s not generic information and content. Every feed is personalized to each user by these learning and prediction models. The algorithm “knows” whom you are most likely to hate and what you are most likely to believe. It will bring you the evidence that resonates with your current values, views, and the information you are already exposed to.
And to be clear, the algorithms are not doing this intentionally. They simply don’t know what the truth is. They do it because their only proxy for maximizing time on site is engagement metrics, and they maximize engagement by delivering personalized content to each user.
Flat Earth conspiracy videos were recommended hundreds of millions of times in the past year alone
And we are not just seeing a rise in conspiracy theories. We are also seeing a rise in fake news, polarization, and extremist thinking. Combined, these effects are gradually breaking down our shared view of reality.
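To see why outrage-heavy content floats to the top, the engagement-only objective described above can be sketched in a few lines. Everything here is hypothetical: the items, the scores, and the `predict_engagement` stand-in for a learned model are all invented for illustration, not any platform’s real code.

```python
# Hypothetical sketch: a feed ranker whose only objective is predicted
# engagement. Note that no notion of truth or long-term wellbeing appears
# anywhere in the objective.

def rank_feed(items, predict_engagement):
    """Order candidate items by predicted engagement for one user.

    `predict_engagement` stands in for the learned model; whatever
    correlates with clicks -- outrage included -- floats to the top.
    """
    return sorted(items, key=predict_engagement, reverse=True)

# Toy data: suppose past behavior says outrage-tagged items get more clicks.
items = [
    {"title": "Calm explainer",  "outrage": 0.1},
    {"title": "Conspiracy clip", "outrage": 0.9},
    {"title": "Friend's photo",  "outrage": 0.2},
]
feed = rank_feed(items, lambda item: item["outrage"])
print([item["title"] for item in feed])
# → ['Conspiracy clip', "Friend's photo", 'Calm explainer']
```

The point of the sketch is that nobody wrote "promote conspiracies" anywhere; the ranking simply follows whatever signal predicts engagement.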
Are social media companies evil?
No. This doesn’t happen because someone at Facebook or Google is evil and wants to steal people’s time and attention on a scale unprecedented in human history.
It’s simply because the algorithms they have created have no real way of talking to your Remembered self. The algorithms only measure, predict, and adapt to users’ Experiencing self. And the Experiencing self is the one that spends the time, clicks, and engages with the platform.
The most common narrative we hear is that technology is neutral and it’s up to us how we use it. This argument completely misses the attention-economy effect, and I believe we are way past the point at which we are the ones using social media technology.
And what is so interesting is that when you listen to the people who make these services, there really isn’t anybody there thinking, “Yes, I want to take away people’s time from their life choices and make them less happy in the long run.”
The narrative at Facebook is: we want to make the world more connected and open. And to be fair, Facebook did make the world more connected. The problem is that this isn’t what Facebook’s algorithms and product designers measure themselves against every day.
And still, they are steering the decisions of billions of people on a scale unprecedented in human history. The effects are now showing up on the balance sheet of society as a whole: massively affecting the quality of life of younger generations, and fueling the rise of fake news, polarization, extremism, and conspiracy theories on a scale never seen before. There is already a mountain of evidence supporting this.
Will there be a solution to this problem? I don’t know, but I certainly hope so. The American presidential election is coming up in just a few short weeks, and we all know what happened last time.
So, let’s wait and see whether social media companies have learned anything from this year’s congressional hearings.