Tristan Harris and Time Well Spent

The recent conversation between Tristan Harris and Sam Harris on the Waking Up Podcast (“What is technology doing to us?”) was one of the most interesting podcasts I’ve listened to in a long time. Not content to be merely fascinated, down the rabbit hole I went, tracking Tristan’s numerous other digital footprints, including a TED Talk, interviews on 60 Minutes and PBS, and several essays on his own website. And no, the irony was not lost on me that I was compulsively consuming digital media that was explicitly about the ways in which we are persuaded to compulsively consume digital media. The pull of the puppet master is strong, and I’m trying to save my own soul here.

Tristan Harris worked for Google and knows a lot about tech design, the psychology of persuasion, and the attention economy. If anything is central to everything that is important in life, it is our attention. The quality of my attention, the clarity of it, is what distinguishes malaise from presence and vitality, and becoming aware of how my attention is affected by the myriad forces that seek to control its flow and its habitual patterns is perhaps the central project of my life. And while technological forces have undoubtedly been acting on my attention since my time in the womb, I am becoming acutely aware, as many of us are, of the effects that screens and social media are having on my overall state of mind.

As I’m typing this sentence it’s about 5:30pm on a Sunday, and I’ve felt compelled to consume distractotainment via one screen or another at least four or five times an hour since I crawled out of bed this morning. I’ve succumbed to the compulsion at least ten or fifteen times throughout the day, each time resulting in total derailment from the creative projects (including this blog post) I’ve been working on. While distraction is an age-old bugaboo, the problem seems to have gotten worse – a lot worse – in just the past couple of years.

Social media has certainly changed a lot since I started using it about ten years ago. Remember when we used to be able to control what we could see in our Facebook feeds? Back in the day, one could simply see the posts made by one’s friends, and see them in chronological order. Now, most of what I’m seeing are things my friends have “liked,” which of course are often the posts propped up by Facebook’s paying customers. It is now impossible, even through a deep dive into one’s account settings, to transform one’s feed into the simple configuration of “my friends’ posts in chronological order.” Social media is all business now, and who can blame these tech companies for seeking a return on their investment? In the early days of the internet, it always seemed “too good to be true” that sites like Google, Facebook, Twitter, and YouTube were simply providing everyone with free access to their services. They were playing the long game, and we are all now becoming increasingly acquainted with the losing side. We’ve been willingly attaching a string here and a string there, and now those strings are being pulled by insidious “algorithms” that lead us in directions we don’t necessarily want to go.

Here’s how Tristan Harris describes some of the symptoms of the digital disease many of us find ourselves fighting off [from “Tech Companies Design Your Life, Here’s Why You Should Care”]:

We grow less and less patient for reality as it is, especially when it’s boring or uncomfortable. We come to expect more from the world, more rapidly. And because reality can’t live up to our expectations, it reinforces how often we want to turn to our screens. A self-reinforcing feedback loop. […] And because of the attention economy, every product will only get more persuasive over time.

The attention economy tears our minds apart. With its onslaught of never-ending choices, never-ending supply of relationships and obligations, the attention economy bulldozes the natural shape of our physical and psychological limits and turns impulses into bad habits.

With design as it is today, screens threaten our fundamental agency. Maybe we are “choosing,” but we are choosing from persuasive menus driven by companies who have different goals than ours.

One of Harris’s most impactful points is that we now find ourselves in a situation where the design choices of a handful of tech nerds can profoundly influence the thoughts, feelings, and actions of literally billions of human beings. Of course, we are free to choose not to, say, own a smartphone, but how many people make that choice? (I am one of those people who chooses not to have a smartphone, but I feel as though I may be seduced into buying one sometime soon.) But can’t we freely choose how to use our smartphones, if we do own them? Of course we can, but when the “choice architecture” is specifically designed to exploit our perceptual frailties, when the full arsenal of persuasion techniques is brought to bear on human minds already easily duped by con artists and theocrats and a million other would-be puppet masters, how much freedom is there, really?

Of course, this game is hardly new. Alan Watts, prescient as he was on so many issues, recognized the attention economy – this system of creating and controlling the mental and behavioral puppet strings of an entire society – back in 1951, in his brilliant and still-relevant classic The Wisdom of Insecurity. Check this out:

Thus the “brainy” economy designed to produce this happiness is a fantastic vicious circle which must either manufacture more and more pleasures or collapse – providing constant titillation of the ears, eyes, and nerve ends with incessant streams of almost inescapable noise and visual distractions. The perfect “subject” for the aims of this economy is the person who continuously itches his ears with the radio, preferably using the portable kind which can go with him at all hours and in all places. His eyes flit without rest from television screen, to newspaper, to magazine, keeping him in a sort of orgasm-without-release […]. The literature or discourse that goes along with this is similarly manufactured to tease without satisfaction, to replace every partial gratification with a new desire.

This stream of stimulants is designed [emphasis mine] to produce cravings for more and more of the same, […] to persuade us that happiness lies just around the corner…

The trendiness of mindfulness is interesting in this regard, because the process of cultivating a deep and sustained level of presence entails a certain exposure of illusion and delusion, which can go a long way toward helping us gain greater control over our tech instead of allowing ourselves to be controlled by it. While Watts might have prescribed meditation as a way to inoculate ourselves against the digital disease, Harris recommends that we ask ourselves, “What are our goals?” and “How do we want to spend our time?” and then pay close attention to the ways that our technology and the broader attention economy actively work against our best intentions.

Just like the food industry manipulates our innate biases for salt, sugar and fat with perfectly engineered combinations, the tech industry bulldozes our innate biases for Social Reciprocity (we’re built to get back to others), Social Approval (we’re built to care what others think of us), Social Comparison (how we’re doing with respect to our peers) and Novelty-seeking (we’re built to seek surprises over the predictable).

[We must] recognize our holistic mental and emotional limits (vulnerabilities, fatigue and ways our minds form habits) and align them with the holistic goals we have for our lives (not just the single tasks), [thus] giving us back agency in an increasingly persuasive attention economy.

Harris has also created a “non-profit movement” called Time Well Spent, the mission of which goes something like this:

We live in an arms race for attention. Because we only have so much attention in our lives, everything has to fight harder to get it.

The internet isn’t evolving randomly.
We know exactly where this is going, and it will only get worse.

Our mind is our one instrument to live our lives, to be informed, to be present with each other and to solve our most important problems – and it’s been hijacked.

We can solve this problem together, but we’ll need your help.

Harris goes on to give practical advice for changing our tech habits, implores companies to adopt a design philosophy that maximizes benefit to their customers’ lives, and suggests ways that anyone interested can get involved in inventing a more human future, one that supports our deepest values and best intentions instead of undermining them.

I am totally on board, and I encourage you to give Tristan Harris a few moments of your attention (while you still can!) :o)