
Locked In, Cradle to Grave

Felicia Wu Song   |  November 1, 2021


Facebook has us where it wants us

Back in 1957, the American public was alarmed when marketing researcher James Vicary reported that moviegoers were being psychologically manipulated to hit concession stands by flashes of phrases like “Drink Coca-Cola” and “Eat popcorn.” The single-frame tactic was calibrated to be just long enough for the subconscious to register but too short for viewers to realize they were being subjected to subliminal persuasion.

The prospect of creepy tactics being used to toy with our vulnerabilities for commercial or political gain does not sit well with most Americans. So when Vicary turned out to be a fraud and subsequent studies failed to find any indication that such tactics actually worked, the American public heaved a collective sigh of relief.

And yet the idea that subliminal messaging could be insidiously embedded in the mechanisms of mass culture had sparked a pure sense of moral indignation. Anxieties about such abuses of power would intermittently rise to the level of courts of law or the Federal Trade Commission, particularly when corporations were directing such coercion toward our children. The 1980s were a decade marked by public flare-ups over the potential dangers of Judas Priest lyrics, Joe Camel cigarette ads, and playing certain pop songs in reverse on the turntable. Through it all we trusted that if ever there was verifiable evidence that subliminal messaging worked, we’d surely put a stop to it. 

Those days of lost innocence seem quaint now when set against the backdrop of today’s digital media ecology. For while Americans remain troubled by the possibility of being coerced without our knowledge, we have also become a people who hedge our bets. We are resigned to the inevitability of sustaining real costs in exchange for the delicious waves of pleasures and benefits our digital systems offer us.

Over the course of the last month, the American public has been privy to new revelations about the misdoings of Facebook. Former Facebook employee Frances Haugen testified before Congress, sat for an interview on 60 Minutes, and supplied the Wall Street Journal with internal documents revealing how Facebook executives have long known of the deleterious effects of their platforms but taken few steps to mitigate them. In-house researchers had informed them that Instagram made one in three teen girls feel worse about their bodies and that Facebook’s decisions to amplify polarizing content and weakly restrict sources of disinformation and hate speech likely played some role in encouraging both the January 6 insurrection and ethnic massacres in Myanmar. 

Amidst its subsequent denials and counter-assertions, Facebook did agree to pause its plans to roll out Instagram Kids. But this act of ritual contrition paled against Facebook’s astonishing perplexity over children’s tendency to still talk and play with each other (instead of staring at their screens together), as evidenced by this shameless question posed in its documents: “Is there a way to leverage playdates to drive word-of-hand/growth among kids?”

Facebook’s tone-deaf manner of colonizing the social space of children as the next frontier is consistent with its history of cringe-worthy decisions. Some may recall how Facebook let Cambridge Analytica collect data on users and hand it off to foreign entities suspected of sowing discord in the run-up to the 2016 American election. And then there was the revelation that Facebook had conducted a series of covert experiments on “massive emotional contagion” in 2012, targeting teen users who had included words like defeated, overwhelmed, or failure in their posts with strategic ads for a “confidence boost.”

Others might even remember when Facebook first infuriated users in 2007, when it published user activity from other websites on their social media feeds without prior consent. Back then we had let ourselves believe that Facebook’s mantra “Move fast and break things” might actually be the new path of progress and innovation. But now, with Facebook literally mediating how 1.9 billion people each day understand the realities of the pandemic, political polarization, racial and ethnic turmoil, climate change, and more, it’s hard to take its executives seriously when they deny culpability.

Increasingly pessimistic about the government’s will to stop Facebook from blundering about in its single-minded mission to win the digital race, the American public is starting to doubt if there is any entity with enough moral authority to get up in Facebook’s grill and demand: “Are we really going to do this?” 

Advertisers and marketers have long sought to achieve “cradle to grave” loyalty. If consumers become loyal to a brand by the time they start elementary school, companies can generally count on another seventy years of customer purchases, and may even see those loyalties passed on to succeeding generations. In one sense, the keen interest Facebook has in capturing our children’s loyalties is merely part of this larger tradition. In fact, cradle-to-grave loyalty works especially well with tech devices and services because our digital loyalties develop out of not only sheer familiarity and taste but also functionality. My children, growing up in a family that uses iPads, iPhones, and MacBooks, will likely keep using Apple products because of the compatibility of the operating systems. Similarly, I am reluctant to go through the trouble of moving to Spotify because, over more than a decade, my Pandora stations have become my crown jewels of musical curation.

As a result, digital loyalty works even better than, say, peanut butter brand loyalties, because the felt loyalty towards peanut butter is completely isolated in its impact: If you change peanut butter brands, there is no impact on other consumer purchases, nor any impact on what your friends do. But if you feel convicted to leave Facebook or Instagram, you are at risk of jeopardizing significant parts of your social or professional life—which today is often integrally embedded in these digital platforms. 

Computer scientist Jaron Lanier once wrote that innovation can become stunted by a structural mode of inertia called “lock-in.” When a successful design of a technology becomes the industry standard, subsequent accessories and services are created and engineered with that standardized component in mind. Even when new alternatives emerge—new iterations that produce higher quality outcomes or simply provide better choices—there is an industry-wide reluctance to change horses because the departure and re-entry costs feel too high. 

And so we stay locked in to the first-run technologies, even when they grow inferior, problematic, and even harmful. Indeed, while we might grouse about Facebook’s ill-advised actions and propensities, few of us are actually going to change our habits or do anything to alter our reliance on its services because we are locked in.

Today we know the truth about how social media platforms exploit human vulnerabilities for the sake of increasing traffic. And, unlike what we once believed about ourselves, it turns out that we don’t care if we—or even our children—are at risk of being subjected to psychological manipulation when what’s on offer is just too good to give up. Recognizing such lock-in might help us understand why, despite our agitation over Facebook’s bad behavior, there is no surge of parents, educators, or advocates protesting in the streets or flooding their representatives in Congress with calls to hold Facebook’s feet to the fire.

And so we go on in quiet desperation, watching Senate hearings that deliver verbal lashings but little legislative bite. What will it take for the American public to demand something different, something better than what we have? We’ve already shown ourselves willing to sacrifice democracies on the altar—both our own and those of more vulnerable countries. Does our reticence signal to Facebook—and all the other tech giants vying to conquer new frontiers—that we are ready to sacrifice our children, too? Are we really going to do this?

Felicia Wu Song is Professor of Sociology at Westmont College. Her most recent book, Restless Devices: Recovering Personhood, Presence, and Place in the Digital Age, is due out November 30 from InterVarsity Press Academic. She is Associate Editor of Current.
