Media of Our Discontents

All technology is a social dilemma

Instagram, which I deleted last month, was my last remaining social media account. I deleted Facebook about 3 1/2 years ago and Twitter a year after that. I won’t go into the many downsides of social media because it’s been well-covered elsewhere—the data mining, the extremism- and conspiracy-favoring algorithms, the increases in teen suicide and depression, the sheer amount of time we waste scrolling and refreshing—but I will say that through several years now of paying more attention to myself as a mind-body in the world, I’ve become acutely aware of how spending time on these platforms affects my mood and emotions (not for the better) while tricking me into thinking I’m doing something, participating in the world.

It’s this last that feels so devious and destructive. Billions of humans are on social media every day, having their emotions and opinions and beliefs manipulated by algorithms that were created with the sole goal of keeping our attention, so it’s not accurate to say that the online world is not the real world. The issue is in believing it’s the whole world, that what we see and engage with online is all of what is true about the world. That how we feel and act as we keep engaging encompasses all of what’s true about us as complex embodied beings.

Social media’s manipulation of our emotions and perception isn’t new—advertising has been selling us stories about reality forever—but it’s far more pervasive and influential than before, and the effects are so woven into our social fabric that individual changes are unlikely to make much of a difference, though doing them for our own well-being is still beneficial. (Think of it as akin to food: You can’t fix our broken agricultural systems by eschewing high-fructose corn syrup and eating more organic kale, but you’re still choosing to support a system that would be more beneficial for everyone, while keeping yourself healthier in the process. Just pay attention to the Farm Bill next time it’s up in Congress, too, if you’re in the U.S.)

I’ve been following the work of the Center for Humane Technology since it was founded a few years ago and have not yet missed an episode of their podcast Your Undivided Attention; they’ve recently come out with a documentary about social media called The Social Dilemma (only showing on Netflix) that sums up much of their work and why it’s so important. It’s being covered all over the place so I’m sure you’ve already heard of it, if not watched it. I don’t feel it fully conveyed the scope of the problems that CHT has been talking about for years, but it’s a necessary and sobering introduction if you’re new to this issue.

The movie tries to end on an optimistic note, but it didn’t leave me very hopeful. When I look at the past century of building car-centric infrastructure and how it’s landed us with fractured communities, air pollution, climate change, habitat destruction, and a host of human health consequences (including nearly 40,000 people dying in car crashes every year in the U.S. alone, over 6,000 of them pedestrians), it seems blindingly obvious that we are heading down a similar path with social media and digital technology.

The early 1900s didn’t see a society eager to turn its downtowns into car-only zones and its landscapes into highway corridors. People fought the intrusion of cars, and particularly the deaths they caused. But they lost that battle, and car companies rewrote history to make it sound like the reality we have is the one we chose.

Car companies and highway engineers didn’t foresee the consequences of what they were building for health, society, and the planet. They didn’t intend for their products to ruin communities and human health. But that’s the point. Silicon Valley insiders assuring us that they can fix the problems they didn’t foresee by creating smarter algorithms is hardly encouraging. If you couldn’t foresee the consequences of the tech you were developing, it’s your judgment that needs help, not your interface.

CHT’s founder, Tristan Harris, a former Google engineer and one-time ethicist for the company, has pointed out repeatedly that social media succeeds by hijacking our lower-order limbic system; it doesn’t need to aspire to overcoming our higher-order thinking. We have no idea what that will do to us and our societies in the long run. The situation right now isn’t good, but it could get far worse. And there’s little you can do to control it except advocate for federal, even global, policy changes and tweak your own use of digital tech—if that’s even possible. I can’t count the number of people I know who’ve said they hate how social media makes them feel and would love to give it up except that it’s required for their job.

My suggestion is to start by paying attention to how your body feels when you’re engaging online. Faster heartbeat, shorter breaths, craving for sugar, surges of anger crawling up your arms? Walk away for a minute and ask yourself who is benefiting when you feel that way. Someone is, but it’s certainly not you.

Deleting my social media doesn’t really do much. I still live in a world that is being shaped by it. But it allows me to keep myself anchored in the parts of reality that are still larger and more consequential than the internet. No matter how pervasive Facebook is, it’ll never be as large a reality as the air we breathe. At least, I hope not.

By extension, I hope I can use that grounding to help others stay anchored, and to find more people who are doing the same. We have no idea where this is all going to go. We’re going to need one another.


A lot of related stuff to read or listen to:

  • Dr. Ayanna Howard is a roboticist and former NASA engineer (she designed parts of the Mars rover!) whose audiobook about AI and bias, Sex, Race, and Robots, just came out on Audible. (Caveat: I worked as a developmental editor/book doctor with Dr. Howard on this project.) It’s an eye-opening look by a robotics insider at how bias is built into AI, and how we’ll never even notice because we so easily assume that a computer is always objective. (It’s programmed by humans with their own unexamined prejudices and decades of biased data, so . . . no.)

  • You can’t get a better critique of the weaknesses in The Social Dilemma documentary than this piece at Librarian Shipwreck (who is totally right about the bicycles and I also can’t believe those comments made it through the movie’s editing process): “You know how you could have known that technologies often have unforeseen consequences? Study the history of technology. You know how you could have known that new media technologies have jarring political implications? Read some scholarship from media studies.”

  • If you haven’t read Douglas Rushkoff’s book Team Human yet, it might be time to get on board.

  • (You can also read my book A Walking Life, which covers, among many other things, the problems created by a century of building car-centric infrastructure and how we’re setting ourselves up for similar problems by letting the tech industry shape our digital future.)

  • The Center for Humane Technology’s podcast Your Undivided Attention, in particular these episodes: Social media’s design relationship with addictive gambling (that’s episode 1 and the interview continues in episode 2); a former CIA operative’s experience working as an elections watchdog for Facebook and seeing its monumental failures (and how little the company cares about either democracy or bias); the global crisis in trust; how online conspiracies and threats lead to real-life violence.

  • The promises and pitfalls of emotional AI, from MIT Technology Review’s podcast In Machines We Trust.

  • Looking for safety in total information and getting the surveillance state instead: will we regret it? by Thomas A. Bass in The American Scholar.

  • A fascinating interview on the Futures Podcast with journalist Jenny Kleeman on her book Sex Robots & Vegan Meat about nearly-there technology like lab-grown meat and artificial wombs, and the ethical quandaries posed by their eventual realization.

  • MIT Technology Review on how online misinformation is like secondhand smoke and should be regulated accordingly.

  • Kinda FUN: I get a kick out of writing satire based on the opening to Vladimir Nabokov’s Lolita. Last week I wrote one asking the Intellectual Dark Web to come out and define what they mean by “The Left” because they spend an inordinate amount of time complaining about it. (A previous parody was on teething.)