On Wednesday evening, I deleted my social media accounts.
In an attempt to better understand how I can use social media in a healthier, more sustainable way, I’m currently undergoing a 7-day ‘detox’ from social media.
I thought it would help me gain some perspective on what my addiction really looks like.
A quick look at my daily screen time showed that I had been averaging 5 hours a day on my days off - most of it on social media.
How long had this been going on?
What was I possibly doing on social media to justify that amount of screentime?
And, most importantly, what would my day look like if I were able to cut this out?
After stumbling across the Netflix documentary 'The Social Dilemma' a few months back, I became intrigued by one speaker in particular - Jaron Lanier.
A Silicon Valley veteran widely considered the founding father of virtual reality, Lanier raised concerns about 'behaviour modification' that caught my attention.
After watching some of his lectures on YouTube, I devoured his book 'Ten Arguments for Deleting Your Social Media Accounts Right Now'.
Now, before you read on, I'd just like to clarify that I have no interest in getting you to delete your social media accounts. If anything, as someone who depends on social media for growth, this entire article works against my own interests.
I do, however, feel the need to share his eye-opening arguments. Here are snippets from two of them:
"Your specific behaviour change has been turned into a product."
- Jaron Lanier
Lanier's first argument in the book is about the distinction between advertising and behaviour modification.
He believes that old-school advertising methods are relatively innocent. Share a product on the radio or in a newspaper and, however clever the pitch, people either ignore it or buy the product - and that's the end of it.
However, tech giants like Facebook and Google are now able to do something more sinister.
Through dynamic data collection on who you are, what you pay attention to and what actions you take, they can nudge your decisions based on what other, behaviourally similar people have done.
For example, if an ad featuring a woman in a red dress ever so slightly increases the chance of a sale among thousands of people, and you are behaviourally similar to them, you will be shown that same ad, in the hope that you behave the same way.
You may consider this to be a well-personalised ad (just as I did), but in reality, the algorithm builds on small psychosocial patterns to experimentally 'hack' your behaviour and get you to do the things its operators want you to do.
No one is watching over these behaviour modifications at a micro level, but on the macro, these algorithms can detect the smallest of trends through sheer volume of users.
The algorithms don’t need to understand how or why a small change works, just that it does.
If the algorithm doesn't gain a significant result, it discards the experiment. If it tries something that earns a marginal difference in favourable behaviour, it will amplify it and stack it upon its countless prior experiments.
In short, instead of simply selling you a product, they can make you into the type of person who would buy that product.
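The experiment-discard-amplify loop described above can be sketched as a toy simulation. This is purely illustrative - the numbers, the significance threshold, and the 'engagement' metric are my own assumptions, not any real platform's code:

```python
import random

random.seed(42)  # make the toy run reproducible

def toy_experiment_loop(trials=200, threshold=0.01):
    """Hypothetical sketch of the loop Lanier describes: try a small
    random tweak, discard it if the measured lift is insignificant,
    otherwise amplify it and stack it on prior kept experiments."""
    engagement = 0.10   # assumed baseline chance a user engages
    kept_tweaks = []
    for _ in range(trials):
        tweak = random.uniform(-0.02, 0.02)  # e.g. red dress vs. not
        if tweak > threshold:                # 'significant' result: keep it
            engagement += tweak              # amplify and stack
            kept_tweaks.append(tweak)
        # insignificant or negative results are simply discarded
    return engagement, len(kept_tweaks)

final_engagement, experiments_kept = toy_experiment_loop()
```

The point of the sketch is that no single tweak needs to be understood or even large; with enough users and enough trials, the kept tweaks compound.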
"It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology…"
- Sean Parker, First President of Facebook.
One of the principal aims of these ‘big tech’ algorithms is to keep people on their platform for as long as possible.
If users engage with certain posts, that's a good sign for them to boost that post to others.
The longer you stay on the platform, the more chance the platform has to sell to you and collect more data, allowing them to better target you next time.
But here lies a particularly worrying problem: content that angers us and invokes negative emotions can drive as much engagement as positive content, if not more.
"Negative emotions such as fear and anger well up more easily and dwell in us longer than positive ones. It takes longer to build trust than to lose trust. Fight-or-flight responses occur in seconds, while it can take hours to relax. This is true in real life, but it is even more true in the flattened light of algorithms."
- Jaron Lanier
The algorithms don't have empathy for the human condition.
If they detect that certain types of content get you to engage more, they'll boost that content to more and more people.
Tech giants can mute certain words and censor offending content, but even a seemingly innocent post that triggers you and sends you into a spiral of 'doomscrolling' can fly under the radar, detected only by the algorithms that cascade it to a wider audience.
Even if you think that you're strong enough to resist the emotional manipulation of the tech giants, it's difficult to deny that the world is worse off when negative emotions are exploited for increased engagement.
"An unfortunate combination of biology and math favors degradation of the human world. Information warfare units sway elections, hate groups recruit, and nihilists get amazing bang for the buck when they try to bring society down."
- Jaron Lanier
"Some have compared social media to the tobacco industry, but I will not. The better analogy is paint that contains lead. When it became undeniable that lead was harmful, no one declared that houses should never be painted again. Instead, after pressure and legislation, lead-free paints became the new standard. Smart people simply waited to buy paint until there was a safe version on sale. Similarly, smart people should delete their accounts until nontoxic varieties are available."
- Jaron Lanier
I've written a short summary along with my favourite quotes from the book. Read it here.