The Chaos Machine – The Inside Story of How Social Media Rewired Our Minds and Our World by Max Fisher
Super Pared Down Précis.
- Your time & attention = profits for tech giants, and this is more important to them than any moral concerns about how they are fucking up our minds & our world.
Big Ideas
- Human Psychology: Exploiting a vulnerability.
- Moral Outrage: Drives engagement.
- Social Referents: Drive behavior.
- The Rabbit Hole: Unhook from YouTube.
- Whistleblowing: Eschewing complacency & despair.
“In summer 2020, an independent audit of Facebook, commissioned by the company under pressure from civil rights groups, concluded that the platform was everything its executives had insisted to me it was not. Its policies permitted rampant misinformation that could undermine elections. Its algorithms and recommendation systems were ‘driving people toward self-reinforcing echo chambers of extremism,’ training them to hate. Perhaps most damning, the report concluded that the company did not understand how its own products affected its billions of users.
But there were a handful of people who did understand and, long before many of us were prepared to listen, tried to warn us. Most began as tech-obsessed true believers, some as denizens themselves of Silicon Valley, which was precisely why they were in a better position to notice early that something was going wrong, to investigate it, and to measure the consequences. But the companies that claimed to want exactly such insights stymied their efforts, questioned their reputations, and disputed their findings—until, in many cases, the companies were forced to acknowledge, if only implicitly, that the alarm raisers had been right all along. They conducted their work, at least initially, independently of one another, pursuing very different methods to the same question: what are the consequences of this technology? This book is about the mission to answer that question, told in part through the people who led it.
The early conventional wisdom, that social media promotes sensationalism and outrage, while accurate, turned out to drastically understate things. An ever-growing pool of evidence, gathered by dozens of academics, reporters, whistleblowers, and concerned citizens, suggests that its impact is far more profound. This technology exerts such a powerful pull on our psychology and our identity, and is so pervasive in our lives, that it changes how we think, behave, and relate to one another. The effect, multiplied across billions of users, has been to change society itself.”
~ Max Fisher from The Chaos Machine
As per the back cover, Max Fisher is a former international reporter for the New York Times, where he contributed to a series about social media that was a finalist for the Pulitzer Prize in 2019. He previously covered international affairs at The Atlantic and the Washington Post.
I picked up this book after reading Yuval Noah Harari’s latest book, Nexus: A Brief History of Information Networks from the Stone Age to AI. Harari referenced it and, following Joseph Campbell’s admonition to read the books the writers you admire read, I ordered a copy.
It’s hard to put into words just how powerful the book is.
Booklist captures the essence of the book well in their review: “Well-researched and thoroughly unnerving. … Fisher’s lucid, clear explanations and convincing arguments are bound to leave readers questioning their own use of social media.”
The New York Times Book Review blurb on the front cover also captures it well: “Utterly convincing . . . An authoritative and devastating account of the impacts of social media.”
To put it in perspective, I don’t think I’ve EVER read a book that made me more nervous about the future of our society than this one. I like to think that I’m not easily terrified and that I’m a reasonably hopeful person, but this book was, as per some other reviews: “sobering,” “disturbing,” and “necessarily discomforting.”
If you’ve been following along, you know that I HIGHLY recommend the documentary The Social Dilemma—which revealed, for me, the scope of the unintended catastrophic consequences of attention economics and the social platforms that are driven by those economic engines.
As I just told Alexandra over sunrise coffee, this book shows that the challenges we face are 100 times worse than I thought.
If you have ever found yourself overwhelmed by a sense of moral outrage after spending time on social media platforms and/or if you have ever found yourself down a weird (and potentially dark!) rabbit hole on social media sites and/or if you have fallen prey to conspiracy theory-type misinformation and/or if you have friends/family who have gone off the rails in conspiracy theory thinking and/or if you are a parent committed to helping your kids flourish now and in the decades ahead and/or if you are a human being who wants to help create a more noble and virtuous world in which (and such that!) 51% of humanity is flourishing by the year 2051, I think you will find this book as powerful as I did.
In fact, I’d put this (and Harari’s Nexus) as close to the “must read” category as I ever get. Neither book is even remotely close to the typical “self-development” books we tend to focus on, but as citizens of the 21st century, I believe we need to understand the forces driving our culture. (Get the book here.)
We’re barely going to scratch the surface and I won’t do the power of the book justice in this quick Note but… Let’s get to work.
P.S. Jonathan Haidt’s The Anxious Generation is another book I’d put into the “must read” category to understand what’s going on with social media and how it’s affecting our lives.
“‘When Facebook was getting going, I had these people who would come up to me and they would say, “I’m not on social media,”’ Sean Parker, who had become Facebook’s first president at age twenty-four, recalled years later. ‘And I would say, “Okay, you know, you will be.” And then they would say, “No, no, no. I value real-life interactions. I value the moment. I value presence. I value intimacy.” And I would say, “We’ll get you eventually.”’
Facebook’s strategy, as he described it, was not so different from Napster’s. But rather than exploiting weaknesses in the music industry, it would do so for the human mind. ‘The thought process that went into building these applications,’ Parker told the media conference, ‘was all about, “How do we consume as much of your time and conscious attention as possible?”’ To do that, he said, ‘We need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that’s going to get you to contribute more content, and that’s going to get you more likes and comments.’ He termed this the ‘social-validation feedback loop,’ calling it ‘exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.’ He and Zuckerberg ‘understood this’ from the beginning, he said, and ‘we did it anyway.’”
That’s from chapter #1: “Trapped in the Casino,” in which, as you might have guessed, Fisher talks about the parallels between social technology and the tricks casinos use to create the addictive tendencies that get you to gamble more. Both casinos and social media sites leverage the same psychological flaws to hack our attention and create addictive behavior.
And… As per those quotes from Sean Parker (who co-founded Napster before running Facebook in the early days), THE CREATORS OF THESE PLATFORMS KNEW THEY WERE EXPLOITING A VULNERABILITY IN HUMAN PSYCHOLOGY and they did it anyway.
Vice vs. Virtue. And… Unfortunately, virtue lost. :/
Now… That’s bad enough on its own. But… Social media was still *relatively* harmless in the very early days. It wasn’t until AI advanced enough to power algorithms focused on ONE THING: INCREASING USER ENGAGEMENT, that the hackers shaping society were able to wreak global havoc.
Mapping out precisely how that all evolved, how the algorithms prioritized THE MOST EXTREME content because it elicited the moral outrage that drove the most engagement (which drove the most profit!), and all the negative downstream consequences is what the book is all about. Let’s just say my Oura sleep scores suffered significantly the nights I was reading this.
P.S. We talk about parallel wisdom in Adam Alter’s Irresistible, Cal Newport’s Digital Minimalism, and, of course, Jonathan Haidt’s The Anxious Generation. Check out those Notes for more.
Moral Outrage
“Popular culture often portrays morality as emerging from our most high-minded selves: the better angels of our nature, the enlightened mind. Sentimentalism says it is actually motivated by social impulses like conformity and reputation management (remember the sociometer?), which we experience as emotion. Neurological research supports this. As people faced with moral dilemmas work out how to respond, they exhibit heavy activity in neural regions associated with emotions. And the emotional brain works fast, often resolving to a decision before conscious reason even has a chance to kick in. Only when they were asked to explain their choice would research subjects activate the parts of their brain responsible for rational calculation, which they used, retroactively, to justify whatever emotion-driven action they’d already decided on.
Those moral-emotional choices seemed reliably to serve a social purpose, like seeking peers’ approval, rewarding a Good Samaritan, or punishing a transgressor. But the instinctual nature of that behavior leaves it open to manipulation. Which is exactly what despots, extremists, and propagandists have learned to do, rallying people to their side by triggering outrage—often at some scapegoat or imagined wrongdoer. What would happen when, inevitably, social platforms learned to do the same?”
That’s from chapter #4 called “Tyranny of Cousins” in which Fisher walks us through the evolutionary neurobiology of in-group vs. out-group aggression and how we are deeply wired to respond to perceived infractions against our in-group with moral outrage.
He tells us: “Outrage is a simple emotional cocktail: anger plus disgust. Moral outrage is a social instinct.”
Jonathan Haidt echoes this wisdom in The Righteous Mind. In fact, it’s Haidt’s research Fisher references in the footnotes to that passage above.
Haidt tells us: “The first principle of moral psychology is Intuitions come first, strategic reasoning second.”
He also says: “Morality binds and blinds. It binds us into ideological teams that fight each other as though the fate of the world depended on our side winning each battle. It blinds us to the fact that each team is composed of good people who have something important to say.”
One of the many problems with the current algorithms running our social media platforms these days (which, I repeat, are ruthlessly focused on driving user engagement!) is that engagement is best driven by serving up THE most extreme/toxic stuff, because that’s exactly what elicits the most moral outrage.
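To make that incentive structure concrete, here’s a deliberately tiny, entirely hypothetical sketch in Python. Nothing in it is Facebook’s or YouTube’s actual code; the Post fields, the weights, and the function names are all invented for illustration. It simply shows how a ranking objective that optimizes for engagement alone will surface outrage whenever outrage predicts engagement:

```python
# Toy illustration (NOT any platform's real code): when the only
# objective is predicted engagement, and outrage predicts engagement,
# the most inflammatory content rises to the top "automatically."
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    outrage_score: float    # 0.0 (calm) to 1.0 (maximally inflammatory); hypothetical
    informativeness: float  # 0.0 to 1.0; note that ranking never consults this

def predicted_engagement(post: Post) -> float:
    # A model trained purely on clicks/comments/shares could learn
    # weights like these on its own; no one has to code "promote outrage."
    return 0.2 + 0.8 * post.outrage_score

def rank_feed(posts: list[Post]) -> list[Post]:
    # The objective is engagement, full stop. Accuracy, civility, and
    # well-being simply aren't terms in the scoring function.
    return sorted(posts, key=predicted_engagement, reverse=True)

feed = rank_feed([
    Post("Local library extends weekend hours", outrage_score=0.05, informativeness=0.9),
    Post("THEY are coming for everything you love!!", outrage_score=0.95, informativeness=0.1),
])
print([p.text for p in feed])  # the inflammatory post ranks first
```

The point of the sketch: no one has to type “maximize outrage.” The objective function does it for them.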
Although, as Fisher says, it was “inevitable” that algorithms set to maximize engagement would serve up the worst, most polarizing content, it WASN’T INEVITABLE that the leaders of the social media companies would choose to prioritize engagement/profit ahead of the public good.
In Sapiens, Harari tells us: “So why study history? Unlike physics or economics, history is not a means for making accurate predictions. We study history not to know the future but to widen our horizons, to understand that our present situation is neither natural nor inevitable, and that we consequently have many more possibilities before us than we imagine.”
Know this: The state of our current social technology platforms and the world they have helped create is neither “natural nor inevitable.” With wiser and more disciplined, courageous, and loving leadership, we could have created VERY different technologies and, consequently, a very different world. Today’s the day to play our roles well in creating a better tomorrow.
Social Referents
“Most of the time, deducing our peers’ moral views is not so easy. So we use a shortcut. We pay special attention to a handful of peers whom we consider to be influential, take our cues from them, and assume this will reflect the norms of the group. The people we pick as moral benchmarks are known as ‘social referents.’ In this way, morality is ‘a sort of perceptual task,’ [Betsy Levy] Paluck [who had won a MacArthur Foundation ‘genius grant’ for her work exploring how social norms influence behavior] said. ‘Who in our group is actually popping out to us? Who do we recruit in our memories when we think about what’s common, what’s desirable?’
To test this, Paluck had her team fan out to fifty-six schools, identifying which students were influential among their peers as well as which students considered bullying to be morally acceptable. Then she picked twenty or thirty students at each school who seemed to fit both conditions: these were, presumably, the students who played the greatest role in instilling pro-bullying social norms in their communities. They were asked to publicly condemn bullying—not forced, just asked. The gentle nudge to this tiny population proved transformative. Psychological benchmarks found that thousands of students became internally opposed to bullying, their moral compasses pulled toward compassion. Bullying-related disciplinary reports dropped by 30 percent.
Social media platforms place us all in a version of Paluck’s school experiment. But, online, our social referents, the people artificially pushed into our moral fields of vision, are the superposters. Not because they are persuasive, thoughtful, or important, but because they drive engagement. That was something unique to platforms like Facebook, Paluck said. Anyone who got a lot of time on the feed became influential. ‘In real life, some people might talk a lot but not be the most listened to. But Facebook,’ she said, ‘puts them in front of you every time.’”
There’s a LOT we can talk about from that passage.
For now, know this: “Social referents” have a LOT of power over culture.
Unfortunately, on social platforms like Facebook, YouTube, Twitter, and TikTok, the LOUDEST, MOST VITRIOLIC “superposters” who create the most moral outrage are the ones the platforms’ algorithms tend to put front and center—influencing the moral decisions of the BILLIONS (!) of people who are exposed to them. (Yikes.)
It’s like the vicious version of “moral charisma” we’re always talking about. A sort of magnetically powerful IMMORAL charisma these platforms amplify.
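If it helps to see that contrast in miniature, here’s a toy, fully hypothetical sketch (invented names and numbers, not any platform’s real selection logic) of the difference between choosing “social referents” the way Paluck did, by who peers actually trust, and the way an engagement-driven feed does, by who generates the most attention-grabbing volume:

```python
# Toy contrast (invented data, not any platform's real logic): who
# becomes a "social referent" when selection is based on peer trust
# versus on raw engagement volume.
from dataclasses import dataclass

@dataclass
class User:
    name: str
    peer_trust: float        # how much peers actually listen to this person
    posts_per_day: int
    outrage_per_post: float  # how inflammatory their typical post is

def paluck_style_referents(users: list[User], k: int = 2) -> list[User]:
    # Paluck's approach: surface the people peers genuinely attend to.
    return sorted(users, key=lambda u: u.peer_trust, reverse=True)[:k]

def feed_style_referents(users: list[User], k: int = 2) -> list[User]:
    # An engagement-driven feed: whoever produces the most engagement-bait
    # "pops out" to everyone, trusted or not.
    return sorted(users, key=lambda u: u.posts_per_day * u.outrage_per_post,
                  reverse=True)[:k]

users = [
    User("thoughtful-neighbor", peer_trust=0.9, posts_per_day=1, outrage_per_post=0.10),
    User("superposter", peer_trust=0.2, posts_per_day=60, outrage_per_post=0.80),
    User("quiet-expert", peer_trust=0.8, posts_per_day=2, outrage_per_post=0.05),
]
print([u.name for u in paluck_style_referents(users)])  # the trusted voices
print([u.name for u in feed_style_referents(users)])    # the superposter tops this list
```

Same community, completely different “moral benchmarks,” depending on who does the choosing.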
From my vantage point, there are a couple of things we should consider doing. First, I think we’d be wise to spend A LOT less time on these platforms, dyeing our souls in the wrong moral colors. Second, as we discuss all the time (!), we each need to do the hard work to cultivate our moral charisma (Soul Force!) that helps qualify us to be the “social referents” who can positively influence the individuals in our community.
As we’ve discussed many times, this is what Gandhi aspired to do and what he encouraged his followers to embrace, which is exactly what Martin Luther King, Jr. aspired to do as well.
Check out this +1 on Soul Force, An Origin Story (micro-chapter #384 in Areté) and this +1 on Moral Charisma (micro-chapter #366 in Areté!) and our Notes on Trying Not to Try for more on the neurobiology of moral charisma and how to cultivate it.
P.S. As I read that passage, I also thought of wisdom from our Notes on The Captain Class by Sam Walker. Want to know what makes GREAT teams great? It’s their Captains—the ultimate “social referents.” Want to know what makes GREAT communities great? It’s their Captains—the “social referents.” I repeat: YOU are the Hero/Captain/social referent we’ve been waiting for.
P.P.S. As I read that passage, I ALSO thought of the story about the smoke in the room from Haidt’s The Anxious Generation. Short story: Bring people into a lab. Have them sit in the waiting room. Then, after a few minutes, have a strange smoke come through the vents. See how quickly they will get up and do something about it.
When individuals were ALONE, 75% of them took action, with half of them leaving the room within two minutes of noticing the smoke’s appearance. But… When a person was in a room with OTHER people, they looked around to see if anyone else appeared concerned. And when no one else did anything about it, basically NO ONE (only 3 out of 25 people!) took action until the smoke COMPLETELY covered the room.
Moral of the story: There’s smoke in the room. Get up and do something about it.
Unhook From YouTube & Exit The Rabbit Hole
“YouTube, by showing users many videos in a row all echoing the same thing, hammers especially hard at two of our cognitive weak points—that repeated exposure to a claim, as well as the impression that that claim is widely accepted, each make it feel truer than we would otherwise judge it to be. Most viewers, of course, probably reject conspiracy videos. But at a scale of billions, those methods overcome enough defenses among the susceptible to win thousands of converts to even the most ridiculous cause. Or the most dangerous.”
That’s from a chapter called “The Rabbit Hole.”
Fisher walks us through THE FRIGHTENING SPEED with which YouTube’s algorithms take you from “normal” content to CRAZY CONTENT—whether that’s Flat Earthism videos (the example he uses to make his point right before this passage) or Alex Jones videos (the sub-section from which this passage is taken is called “The Alex Jones Problem”).
Fisher walks us through the algorithmic “radicalization pipeline” and tells us: “The social platforms had arrived, however unintentionally, at a recruitment strategy embraced by generations of extremists.”
I was so horrified by the examples of teenagers going from YouTube videos on video games to life-changing CRAZY-MAKING (!) videos that I immediately turned off the “autoplay” function and removed the “Recommended videos” from YouTube on all the computers in our house. (Note: I personally used the “Unhook” extension for Google Chrome.)
Whistleblowing
“Collectively, the documents told the story of a company fully aware that its harms sometimes exceeded even critics’ worst assessments. At times, the reports warned explicitly of danger that later became deadly, like a spike in hate speech or in vaccine misinformation, with plenty of notice for the company to have acted and, had it not refused to do so, possibly saved lives. In undeniable reports and unvarnished language, they showed Facebook’s own data and experts confirming the allegations that the company had so blithely dismissed in public. Facebook’s executives, including Zuckerberg, had been plainly told that their company posed tremendous dangers, and those executives had intervened over and over to keep their platforms spinning at full speed anyway. The files, which Facebook downplayed as unrepresentative, largely confirmed long-held suspicion. But some went even further. An internal presentation on hooking more children on Facebook’s products included the line ‘Is there a way to leverage playdates to drive word of hand/growth among kids?’
As public outrage grew, 60 Minutes announced that it would air an interview with the leaker of the documents. Until that point, [Frances] Haugen’s identity had still been secret. Her interview cut through a by-then years-old debate over this technology for the clarity with which she made her charges: the platforms amplified harm; Facebook knew it; the company had the power to stop it but chose not to; and the company continually lied to regulators and to the public. ‘Facebook has realized that if they change the algorithm to be safer,’ Haugen said, ‘people will spend less time on the site, they’ll click on less ads, they’ll make less money.’”
That’s from the final chapter/epilogue called “Whistleblowing.” If you haven’t watched the 60 Minutes episode featuring Frances Haugen called Whistleblower: Facebook is misleading the public on progress against hate speech, violence, misinformation, you can check it out here.
She says: “The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money.”
Fisher walks us through a series of tragic events (from the genocides in Myanmar and Sri Lanka to the destabilization of democracies around the world) and shows us just how many times executives at Facebook KNEW what was going on and how their platforms were exacerbating the problems and DID NOTHING about it. It’s heartbreaking. And, frankly, terrifying to imagine a future with more of the same.
Neither Harari nor Fisher ends his book on a particularly positive note. But, as Harari tells us, we must “eschew complacency and despair” and do something about the challenges we face. A potential next step? I recommend investing the 32 hours and 23 minutes required to listen to this book and Nexus. We need you to KNOW what’s going on so we can do something about it.