Media of Our Discontents
All technology is a social dilemma

Instagram, which I deleted last month, was my last remaining social media account. I deleted Facebook about 3 1/2 years ago and Twitter a year after that. I won’t go into the many downsides of social media because it’s been well-covered elsewhere—the data mining, the extremism- and conspiracy-favoring algorithms, the increases in teen suicide and depression, the sheer amount of time we waste scrolling and refreshing—but I will say that through several years now of paying more attention to myself as a mind-body in the world, I’ve become acutely aware of how spending time on these platforms affects my mood and emotions (not for the better) while tricking me into thinking I’m doing something, participating in the world.
It’s this last that feels so devious and destructive. Billions of humans are on social media every day, having their emotions and opinions and beliefs manipulated by algorithms that were created with the sole goal of keeping our attention, so it’s not accurate to say that the online world is not the real world. The issue is in believing it’s the whole world, that what we see and engage with online is all of what is true about the world. That how we feel and act as we keep engaging encompasses all of what’s true about us as complex embodied beings.
Social media’s manipulation of our emotions and perception isn’t new—advertising has been selling us stories about reality forever—but it’s far more pervasive and influential than before, and the effects are so woven into our social fabric that individual changes are unlikely to make much of a difference, though making them for our own well-being is still worthwhile. (Think of it as akin to food: You can’t fix our broken agricultural systems by eschewing high-fructose corn syrup and eating more organic kale, but you’re still choosing to support a system that would be better for everyone, while keeping yourself healthier in the process. Just pay attention to the Farm Bill next time it’s up in Congress, too, if you’re in the U.S.)
I’ve been following the work of the Center for Humane Technology since it was founded a few years ago and have not yet missed an episode of their podcast Your Undivided Attention; they’ve recently come out with a documentary about social media called The Social Dilemma (only showing on Netflix) that sums up much of their work and why it’s so important. It’s being covered all over the place, so I’m sure you’ve already heard of it, if not watched it. I don’t feel it fully captured the scope of the problems that CHT has been talking about for years, but it’s a necessary and sobering introduction if you’re new to this issue.
The movie tries to end on optimism but didn’t leave me very hopeful. When I look at the past century of building car-centric infrastructure and how it’s landed us with fractured communities, air pollution, climate change, habitat destruction, and a host of human health consequences (including nearly 40,000 people dying in car crashes every year in the U.S. alone, over 6,000 of them pedestrians), it seems blindingly obvious that we are heading down a similar path with social media and digital technology.
The early 1900s didn’t see a society eager to turn their downtowns into car-only zones and their landscapes into highway corridors. They fought the intrusion of cars and particularly the deaths they caused. But they lost that battle, and car companies rewrote history to make it sound like the reality we have is the one we chose.
Car companies and highway engineers didn’t foresee the consequences of what they were building on health, society, and the planet. They didn’t intend for their products to ruin communities and human health. But that’s the point. Silicon Valley insiders assuring us that they can fix the problems they didn’t foresee by creating smarter algorithms is hardly encouraging. If you couldn’t foresee the consequences of the tech you were developing, it’s your judgment that needs help, not your interface.
CHT’s founder, Tristan Harris, a former Google engineer and one-time head ethicist for the company, has pointed out repeatedly that social media succeeds by hijacking our lower-order limbic system; it doesn’t need to aspire to overcoming our higher-order thinking. We have no idea what that will do to us and our societies in the long run. The situation right now isn’t good, but it could get far worse. And there’s little you can do to control it except advocate for federal, even global, policy changes and tweak your own use of digital tech—if that’s even possible. I can’t count the number of people I know who’ve said they hate how social media makes them feel and would love to give it up except that it’s required for their job.
My suggestion is to start by paying attention to how your body feels when you’re engaging online. Faster heartbeat, shorter breaths, craving for sugar, surges of anger crawling up your arms? Walk away for a minute and ask yourself who is benefiting when you feel that way. Someone is, but it’s certainly not you.
Deleting my social media doesn’t really do much. I still live in a world that is being shaped by it. But it allows me to keep myself anchored in the parts of reality that are still larger and more consequential than the internet. No matter how pervasive Facebook is, it’ll never be as large a reality as the air we breathe. At least, I hope not.
By extension, I hope I can use that grounding to help others stay anchored, and to find more people who are doing the same. We have no idea where this is all going to go. We’re going to need one another.
---
A lot of related stuff to read or listen to:
Dr. Ayanna Howard is a roboticist and former NASA engineer (she designed parts of the Mars rover!) whose audiobook about AI and bias, Sex, Race, and Robots, just came out on Audible. (Caveat: I worked as a developmental editor/book doctor with Dr. Howard on this project.) It’s an eye-opening look by a robotics insider at how bias is built into AI, and how we’ll never even notice because we so easily assume that a computer is always objective. (It’s programmed by humans with their own unexamined prejudices and decades of biased data, so . . . no.)
You can’t get a better critique of the weaknesses in The Social Dilemma documentary than this piece at Librarian Shipwreck (who is totally right about the bicycles and I also can’t believe those comments made it through the movie’s editing process): “You know how you could have known that technologies often have unforeseen consequences? Study the history of technology. You know how you could have known that new media technologies have jarring political implications? Read some scholarship from media studies.”
If you haven’t read Douglas Rushkoff’s book Team Human yet, it might be time to get on board.
(You can also read my book A Walking Life, which covers, among many other things, the problems created by a century of building car-centric infrastructure and how we’re setting ourselves up for similar problems by letting the tech industry shape our digital future.)
The Center for Humane Technology’s podcast Your Undivided Attention, in particular these episodes: Social media’s design relationship with addictive gambling (that’s episode 1 and the interview continues in episode 2); a former CIA operative’s experience working as an elections watchdog for Facebook and seeing its monumental failures (and how little the company cares about either democracy or bias); the global crisis in trust; how online conspiracies and threats lead to real-life violence.
The promises and pitfalls of emotional AI, from MIT Technology Review’s podcast In Machines We Trust.
Looking for safety in total information and getting the surveillance state instead: will we regret it? by Thomas A. Bass in The American Scholar.
A fascinating interview on the Futures Podcast with journalist Jenny Kleeman on her book Sex Robots & Vegan Meat about nearly-there technology like lab-grown meat and artificial wombs, and the ethical quandaries posed by their eventual realization.
MIT Technology Review on how online misinformation is like secondhand smoke and should be regulated accordingly.
Kinda FUN: I get a kick out of writing satire based on the opening to Vladimir Nabokov’s Lolita. Last week I wrote one asking the Intellectual Dark Web to come out and define what they mean by “The Left” because they spend an inordinate amount of time complaining about it. (A previous parody was on teething.)
Hi from Seattle. I walk 3 miles a day, and believe our minds are in our bellies.
That might not make sense if you are anxious or depressed, but these difficult experiences go away with daily exercise. Cars make us lazy and fat.
The problem, as you know, is that it took Standard Oil and Ford Motor Co. a hundred years to turn their collective daydreams into our nightmares.
I'd say social media is like our minds if we don't attend to our thoughts. Most people are mindless and don't realize that their thoughts are automatic and mostly negative. Meditating for six months gave me the insight that Sharon Salzberg, Jack Kornfield, and Tara Brach -- my first digital teachers -- were right: thought is as automatic as breathing and heartbeat.
Mindless me prefers anti-social media because it serves as a distraction from more difficult work, like writing a book, leading, and volunteering.
It fits my conditioned mind to a T. Ping-ponging from here to there, I don't have to sit with other people's suffering or with ways to relieve it.
Meditation me can sit and let silence sink in, then watch it open into a larger silence that connects me to all beings, to discovering again (daily) that there are no others.
For me, meditation is a starting point for our future collective larger self to grow a Stewardship Society.
What if we spend the next 200 years making amends for capitalism, or whatever you call the idea that life is resource extraction and not sacred? Restoring ecosystems, creating bike and walking cultures, designing our cities to support composting everything -- no waste. Our economic model could be one of return, like a forest floor's.
I think daily habits like your not being on anti-social media are crucial, even if at first they feel no more visible than breath. But you have a lot of influence writing and teaching this way.
You connect here, you publish your books, you teach. And the world changes in one direction through this co-creation with others.
The fact that car culture took 100 years to go from someone's daydream to our nightmare should give us insight and courage to notice how a few people organizing businesses to influence government can change the world.
I do have another side that I don't share much that says Mother Earth is really just a churning entropy organism, giving birth and dying are our translations as humans of carbon exchange on the planet.
But here we are, and we love others, so I want to make a difference in the direction of that loving and caring for others.
I may have said too much here, in what might seem like a distracted take on what you wrote. To look at tech and cars, we have to calm down enough to look at the framework of our society and the behavior modification tech companies are pursuing to make us docile consumers.
I'll leave you with a great series on capitalism -- some of the best storytelling about how market values have trumped societal values like meaningful work. It ties in with the Black Lives Matter movement and helps White viewers see the roots of racism: https://www.ovid.tv/capitalism
And looping back to your focus on tech, I have been thinking about Shoshana Zuboff's masterpiece on the surveillance society, and about Jaron Lanier on fixing the internet, which is a great model for teaching -- three short videos at the NYT: https://www.nytimes.com/interactive/2019/09/23/opinion/data-privacy-jaron-lanier.html
Feel free to let me know if this is too much of a ramble.
Peace and off to walk.
Timothy