
Capitalism is “siphoning our attention for profit”

John Fea | March 14, 2023


Over at Current Affairs, Nathan Robinson interviews Johann Hari, author of Stolen Focus: Why You Can’t Pay Attention. Robinson writes:

Johann Hari has written multiple bestselling nonfiction books including Chasing the Scream, Lost Connections, and most recently Stolen Focus: Why You Can’t Pay Attention. All three books examine the deep causes of problems that are often blamed on individuals, and Johann is concerned with showing how we can collectively solve problems that individual willpower isn’t enough to surmount—whether drug addiction, depression, or the inability to focus.

Today we discuss Stolen Focus, and Johann tells us how people’s capacity to pay attention has been breaking down, and what the social consequences are. He explains how smartphone addiction comes about, and the bad incentive structure that social media companies have: their revenue stream starts when you start looking at their apps and stops when you stop looking, meaning they care about “one thing and one thing only: how do we get you to open the app as often as possible and stay scrolling as long as possible.” (Many in Silicon Valley, he says, are “appalled by what they’ve done.”) Johann is scathing about those who put the blame for attention problems on the people who suffer from them: “It’s like someone is pouring itching powder all over us and then saying ‘Hey buddy, you should learn to meditate, then you wouldn’t scratch so much.’”

But Johann goes deeper than just discussing smartphone addiction, talking about how the food we eat, the pace of life generally, sleep habits, and more contribute to the inability to stay focused. Johann discusses the small steps that people can take on their own to overcome the problem, but is emphatic that we need collective solutions at the policy level. We don’t need to think that there’s something wrong with us if we can’t pay attention, but nor do we need to think that we are doomed to distraction. This transcript of Hari’s appearance on the Current Affairs podcast has been edited for grammar and clarity.

Here is a taste of the interview:

ROBINSON: At Current Affairs, I’ve noticed that so much of what I’m thinking about, in terms of the political issues I’m working with, is determined by whatever happens to pop into my Twitter feed. So I think about ten different issues a day because those are the things that someone said something about. But in order to actually work on improving the world in the ways we want, we have to be able to step back from whatever is barraging us and think about what matters and what doesn’t. What are the things I want to be talking about right now? With the magazine, we’re trying to [publish what we think] you should care about, even though you haven’t heard about it in a while.

HARI: That’s so good, Nathan. It’s so important. You’ve gotten to the core of one of the problems here, and it is especially problematic that you’re getting that feedback from Twitter in particular. It’s worth taking a step back and explaining why. I didn’t really understand this until I did the research for my book. I spent a lot of time in Silicon Valley interviewing people who’ve been at the heart of this machine. The most striking thing, immediately, was how guilty and ashamed they feel. They are appalled by what they’ve done, in fascinating ways that I wrote about in the book. There were a few mechanisms I didn’t fully understand at first that help explain why what you’re doing there, and what so many of us are doing, is so problematic and causes such harm.

For anyone listening or reading: if you open TikTok, Facebook, Twitter, or Instagram right now, those companies immediately begin to make money out of you in two ways. The first one is really obvious: you see advertising, and everyone knows how that works. The second way is much more important: everything you ever do on these apps, and everything you communicate in so-called private messages, is scanned and sorted by artificial intelligence algorithms to get a sense of who you are. Let’s say you’ve said you like Bernie: it’s going to figure out you’re probably on the left. And let’s say you’ve told your mom you just bought some diapers: it figures out that you’ve got a baby. If you’ve been using these apps for longer than a few weeks, they have thousands, if not tens of thousands, of data points about you. They know a huge amount about who you are. They gather all of that partly to sell information about you to advertisers—someone selling diapers wants to know they’re marketing to people who have babies—but primarily to figure out what will keep you scrolling, for a very simple reason: every time you open the app and start to scroll, they begin to make money. The longer you scroll, the more money they make because of the ads you’ll see, and every time you close the app, that revenue stream disappears. So all of this AI, all of these algorithms, all of this genius in Silicon Valley is, when it comes to social media, geared towards one thing and one thing only: figuring out how to get you to open the app as often as possible and scroll as long as possible.

I remember saying to people in Silicon Valley who’ve been in this machine, it can’t be that simple—that can’t be the only thing. They looked at me, kind of baffled: “How did you think it happened? What did you think went on?” In the same way, all the head of KFC cares about, in his professional capacity, is how often you went to KFC this week and how big the bucket you bought was. All these companies care about, and all their design is geared towards, is getting you to open their apps as often as possible and scroll as long as possible.

But there’s a kicker that is really important to what you’re saying about getting your news from Twitter. The algorithms are set up to scan your behavior, and the behavior of everyone who uses the apps, and figure out what keeps you scrolling. And although this was not the intention of the people at any of these social media companies, they bumped into an underlying truth about human psychology that has been known for 90 years now: negativity bias. It’s very simple. Human beings will stare longer at things that make them angry or upset than at things that make them feel good. If you’ve ever seen a car crash on the highway, you know exactly what I mean: you stared longer at the car wreck than at the pretty flowers on the other side of the road. This is very deep in human nature. Ten-week-old babies will stare longer at an angry face than at a happy face. It’s probably for very good evolutionary reasons: our ancestors who were not alert to angry faces probably got clubbed to death. So you can see why this is deep in our nature.

But when this quirk of human psychology combines with algorithms designed to maximize scrolling, it produces a horrendous outcome. Think about two teenage girls who go to the same party and then head home on the same bus. One of them opens her phone, records a TikTok video, and says, “What a great party, everyone was nice. We danced to Taylor Swift. What fun!” The other girl opens her phone and says, “Karen was a fucking skank at that party and her boyfriend is an asshole”—an angry, denunciatory rant. Now, the app is scanning to see the kind of language you use. It will put the first, nice video into a few people’s feeds, and put the second video into far more people’s feeds. If it’s enraging, it’s engaging. You can see how these arguments go.

Now, that’s bad enough at the level of two teenage girls on a bus. We all know what’s happening to girls’ mental health. Think about what happens when that principle—the nice, sane people are muffled and the angry, hostile, crazy people are pushed to the front and amplified—is applied to a whole country. Except we don’t have to imagine it, because we’ve all been living it for the last decade.

Read the entire interview here.
