

Over at Current Affairs, Nathan Robinson interviews Johann Hari, author of Stolen Focus: Why You Can’t Pay Attention. Robinson writes:
Johann Hari has written multiple bestselling nonfiction books including Chasing the Scream, Lost Connections, and most recently Stolen Focus: Why You Can’t Pay Attention. All three books examine the deep causes of problems that are often blamed on individuals, and Johann is concerned with showing how we can collectively solve problems that individual willpower isn’t enough to surmount, whether drug addiction, depression, or the inability to focus.
Today we discuss Stolen Focus, and Johann tells us how people’s capacity to pay attention has been breaking down, and what the social consequences are. He explains how smartphone addiction comes about, and the bad incentive structure that social media companies have: their revenue stream starts when you start looking at their apps and stops when you stop looking, meaning they care about “one thing and one thing only: how do we get you to open the app as often as possible and stay scrolling as long as possible.” (Many in Silicon Valley, he says, are “appalled by what they’ve done.”) Johann is scathing about those who put the blame for attention problems on the people who suffer from them: “It’s like someone is pouring itching powder all over us and then saying ‘Hey buddy, you should learn to meditate, then you wouldn’t scratch so much.’”
But Johann goes deeper than just discussing smartphone addiction, talking about how the food we eat, the pace of life generally, sleep habits, and more contribute to the inability to stay focused. Johann discusses the small steps that people can take on their own to overcome the problem, but is emphatic that we need collective solutions at the policy level. We don’t need to think that there’s something wrong with us if we can’t pay attention, but nor do we need to think that we are doomed to distraction. This transcript of Hari’s appearance on the Current Affairs podcast has been edited for grammar and clarity.
Here is a taste of the interview:
ROBINSON: At Current Affairs, I’ve noticed that so much of what I’m thinking about in terms of the political issues I’m working with is determined by whatever happens to pop into my Twitter feed. So, I think about ten different issues a day because those are the things that someone said a thing about, but in order to actually work on improving the world in the ways we want, we have to be able to step back from whatever is barraging us and think about what matters and what doesn’t. What are the things I want to be talking about right now? With the magazine, we’re trying to [publish what we think] you should care about, even though you haven’t heard about it in a while.
HARI: That’s so good, Nathan. It’s so important. You’ve gotten to the core of one of the problems here. It is particularly problematic that you’re getting that feedback from Twitter in particular. It’s worth taking a step back and explaining why. I didn’t really understand this until I did the research for my book. I spent a lot of time in Silicon Valley interviewing people who’ve been at the heart of this machine. The most fascinating, striking thing, immediately, was how guilty and ashamed they feel. They are appalled by what they’ve done, in fascinating ways that I wrote about in the book. There were a few mechanisms I didn’t fully understand at first that explain why what you’re doing there, and what so many of us are doing, is so problematic and leading to such harm.
For anyone listening or reading, if you open TikTok, Facebook, Twitter, or Instagram now, those companies immediately begin to make money out of you in two ways. The first one is really obvious: you see advertising, and everyone knows how that works. The second way is much more important: everything you ever do on these apps, and everything you communicate in so-called private messages, is scanned and sorted by artificial intelligence algorithms to get a sense of who you are. Let’s say you’ve said you like Bernie: it’s going to figure out you’re probably on the left. And let’s say you’ve told your mom you just bought some diapers: it figures out that you’ve got a baby. If you’ve been using these apps for longer than a few weeks, they have thousands, if not tens of thousands, of data points about you. They know a huge amount about who you are. And they know that partly to sell the information about you to advertisers (someone selling diapers wants to know they’re marketing to people who have babies), but primarily they’re figuring out what will keep you scrolling, for a very simple reason: every time you open the app and start to scroll, they begin to make money. The longer you scroll, the more money they make because of the ads you’ll see, and every time you close the app, that revenue stream disappears. So all of this AI, all of these algorithms, all of this genius in Silicon Valley is, when it comes to social media, geared towards one thing and one thing only: figuring out how to get you to open the app as often as possible and scroll as long as possible.
I remember saying to people in Silicon Valley who’ve been in this machine, it can’t be that simple; that can’t be the only thing. They looked at me, and they looked kind of baffled: “How did you think it happened? What did you think went on?” In the same way that all the head of KFC cares about, in his professional capacity, is how often you went to KFC this week and how big the bucket you bought was, all these companies care about, and all their design is about, is getting you to open their apps as often as possible and scroll as long as possible.
But there’s a kicker that is really important to what you’re saying about getting your news from Twitter. The algorithms are set up to scan your behavior, and the behavior of everyone who uses the apps, and figure out what keeps you scrolling. And although this was not the intention of the people at any of these social media companies, they bumped into an underlying truth about human psychology that has been known for 90 years now: negativity bias. It’s very simple. Human beings will stare longer at things that make them angry or upset than they will at things that make them feel good. If you’ve ever seen a car crash on the highway, you know exactly what I mean: you stared longer at the car wreck than at the pretty flowers on the other side of the road. This is very deep in human nature. Ten-week-old babies will stare longer at an angry face than at a happy face, probably for very good evolutionary reasons: our ancestors who were not alert to angry faces probably got clubbed to death. So you can see why this is deep in our nature.
But when this quirk of human psychology combines with algorithms designed to maximize scrolling, it produces a horrendous outcome. Think about two teenage girls who go to the same party and then go home on the same bus. One of them opens her phone and posts a TikTok video saying, “What a great party, everyone was nice. We danced to Taylor Swift. What fun!” The other girl opens her phone and says, “Karen was a fucking skank at that party and her boyfriend is an asshole”: an angry, denunciatory rant. Now, the app is scanning to see the kind of language you use. It will put the first, nice video into a few people’s feeds, and put the second video into far more people’s feeds. If it’s enraging, it’s engaging. You can see how these arguments go.
Now, that’s bad enough at the level of two teenage girls on a bus. We all know what’s happening to girls’ mental health. Think about what happens when that principle (the nice, sane people are muffled, and the angry, hostile, crazy people are pushed to the front and amplified) is applied to a whole country. Except we don’t have to imagine it, because we’ve all been living it for the last decade.
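The ranking dynamic Hari describes is easy to see in miniature. Below is a toy Python sketch of an engagement-maximizing feed. To be clear, this is not any platform’s actual code: the word lists, weights, and function names are all invented for illustration. It just shows how a scorer that only predicts watch time, once it has absorbed negativity bias, ends up pushing the angry post ahead of the nice one:

```python
# Toy illustration (not any platform's real system) of the incentive Hari
# describes: rank posts purely by predicted engagement, where enraging
# content scores higher than pleasant content.

from dataclasses import dataclass

# Hypothetical word lists standing in for a trained engagement model.
ANGRY_WORDS = {"skank", "asshole", "hate", "disgusting", "liar"}
PLEASANT_WORDS = {"great", "nice", "fun", "love", "danced"}


@dataclass
class Post:
    author: str
    text: str


def predicted_watch_time(post: Post) -> float:
    """Crude stand-in for an engagement model: negativity bias means
    angry language predicts longer staring, so it gets a higher weight."""
    words = [w.strip(".,!?").lower() for w in post.text.split()]
    angry = sum(w in ANGRY_WORDS for w in words)
    pleasant = sum(w in PLEASANT_WORDS for w in words)
    # The asymmetric weights encode the bias: here, arbitrarily, anger
    # holds attention three times as effectively as pleasantness.
    return 1.0 + 3.0 * angry + 1.0 * pleasant


def rank_feed(posts: list[Post]) -> list[Post]:
    """The platform's only objective: maximize total scrolling time,
    so the feed is sorted by predicted engagement and nothing else."""
    return sorted(posts, key=predicted_watch_time, reverse=True)


if __name__ == "__main__":
    feed = rank_feed([
        Post("girl_a", "What a great party, everyone was nice. "
                       "We danced to Taylor Swift. What fun!"),
        Post("girl_b", "Karen was a skank at that party "
                       "and her boyfriend is an asshole."),
    ])
    for post in feed:
        print(f"{predicted_watch_time(post):5.1f}  {post.author}: {post.text}")
```

Even this two-line scoring rule reproduces the outcome Hari warns about: the denunciatory post outranks the pleasant one, not because anyone chose to promote anger, but because the only objective anywhere in the system is time-on-app.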
Read the entire interview here.