Like nuclear energy, social media generates power, but its use also requires strict guardrails.
“Facebook has taken Big Tobacco’s playbook.” — Senator Richard Blumenthal (D-CT), September 30, 2021.
“A part of me feels like I’m interviewing the head of a tobacco company right now.” — CNN’s Brian Stelter to Facebook VP Nick Clegg, October 3, 2021.
“Facebook is the Big Tobacco of our generation.” — The Real Facebook Oversight Board, last week.
It’s understandable why politicians, journalists and advocates have started referring to Facebook, ahem, Meta, as a tobacco company. Not just because the company is trying the Philip Morris, I mean Altria, trick of renaming itself to shed its bad reputation. One of the things that Facebook shares with tobacco companies is that its product is toxic and addictive to many of its users, and the company’s leaders have tried to hide those facts.
But the metaphor is inadequate, and if we stick with it, we’ll end up with shallow answers to the deep problem presented by Facebook/Meta. For Facebook isn’t just a form of social entertainment like smoking. It’s much more like an energy company. And not just any energy. Far better to think of it like a nuclear energy company, and one that’s been built as cheaply and quickly as possible.
Facebook is Chernobyl. Allow me to offer a brief parable, based on real history.
A little more than sixty years ago, a brilliant group of academic researchers employed by the U.S. Department of Defense figured out how to connect people into virtual networks. They called their invention “social fission” because of the energy generated by connecting people together, in homage to the energy produced by nuclear fission. They began exploring how the world could be transformed by social fission networks. Knowledge could be shared more quickly, isolated people could find others with like-minded interests or needs, and collaboration could become easier. At first, progress in advancing social fission was slow, despite these advantages. People and businesses already had tools like phones and fax machines for one-to-one connections, and if they needed to reach large numbers of people they were content to use more expensive forms of connection energy like bulk mail, radio and television.
But the people working on unleashing the power of social fission toiled away. A series of technological breakthroughs eventually made it possible for more people to use it easily. In the beginning, you needed a dedicated phone line and a desktop computer to tap into social fission, plus the patience to communicate slowly by typing your messages and reading the messages of others. Early users, who, like the original researchers, were mostly white male engineers, reveled in the sparks that flew between people in these early fission networks, which took root in virtual parks hosted by universities and other nonprofit institutions. Sometimes people, particularly women and minorities, complained that they were hurt in those exchanges, which were often anonymous or poorly moderated, but their concerns were brushed aside as the early fission networks grew.
By the mid-1990s, social fission was catching on. One scientist, Tim Berners-Lee, invented a simple way to add more connections to a fission network, not just between people but also between documents and other forms of information. Another engineer, Marc Andreessen, developed a simple way for computer owners to see these networks and move about in them. Moore’s Law, which predicted that the cost per transistor on a chip would keep halving roughly every two years, took hold, and suddenly the cost of adding connections and storing the energy they generated started declining dramatically.
And then the Clinton administration made a fateful decision. Caught up in its own ideological beliefs about the free market, and under pressure from the phone companies, who saw social fission both as a threat to their existing business and as an opportunity for huge new profits, the administration ended the prohibitions on commercial use of social fission and released it into the marketplace. “Fission reactors will be built, paid for, and funded principally by the private sector,” then-Vice President Al Gore memorably said. (Well, he said “the information superhighway,” not fission reactors, but hopefully you get my point.) Not only that: to foster the growth of fission companies, they were given legal protection from being sued if the communications they enabled caused harm, and they were allowed to sell products without charging state sales taxes.
Unhindered by government regulation and free of the physical barriers that constrained other types of connection, and backed by venture capitalists who believed growing fast and dirty (or “blitzscaling”) was the best way to harness this new power, social fission companies expanded rapidly. Along the way, they disrupted and then crushed older industries and institutions that used less modern forms of power generation. Newspapers and bookstores were among the first to be undermined by the rise of social fission. Music and video stores soon followed. The world of politics was changed as well, as outsiders tapped social fission faster than incumbents. Starting in 2001, when protestors in the Philippines used their mobile phones to organize demonstrations ousting their president, movements powered by social fission toppled powerful authorities across the globe. It seemed like a new wave of people-centered democracy was on the way.
Unfortunately, neither governments nor the original cheerleaders for social fission realized that connecting people without careful controls, to moderate what happens when large numbers of them collide, would inevitably produce harmful societal effects at enormous scale. Enabling anyone to connect to anyone else, without social or physical context and without intermediaries, is a recipe for intensifying social conflict. Sociologists, psychologists, and some media scholars already knew that in-groups intensify hostile feelings toward out-groups. But most of us were blind to the danger.
Meanwhile, in just a few years, a handful of companies discovered that they could profit enormously from a key element created by social fission — personal data. Soon the leaders of Google and Facebook had amassed enormous wealth and power selling access to that data, creating a huge incentive for them to ignore how much their own systems were generating harm. And then authoritarian demagogues and other malicious actors discovered how to take advantage of the system’s lack of safeguards, and here we are, living in a toxic social landscape.
Unlike social fission, nuclear fission has been heavily controlled by governments since its invention. Here in the United States, the nuclear energy industry has a relatively benign track record (though the problem of long-term disposal of radioactive waste remains unsolved, and many existing reactors were built far too close to populated areas and are aging beyond their safety limits). Not so in the former Soviet Union, which rushed to build reactors as quickly and cheaply as it could, and rewarded engineers and bureaucrats for how much power their machines generated. That’s what led to the Chernobyl disaster.
As we awaken to the danger presented by companies like Facebook, it is critical that we not apply the wrong framework for understanding them. Tobacco is an addictive and toxic substance with little societal utility. It makes sense to prohibit its use by minors and to tax its sale prohibitively in order to limit its use. Social fission — or, to exit my parable, social networking and information sharing — is different. Its use can be of enormous value to society, but only with strict guardrails on its speed and scale. For starters, just as we require control rods to moderate nuclear fission, we probably need to tamp down the size of online groups and the speed of online communication to human levels, to limit their harmful effects. That may mean requiring professional human moderation on all platforms at a scale of something like one moderator per 1,000 users; Facebook currently employs about one moderator per 150,000 users. Engagement incentives like follower counts, which currently reward users for posting or sharing the most incendiary content, may need to be eliminated, hidden, or capped. And the legal protection from liability provided by Section 230 needs to be reexamined carefully.
We already have many examples of social media companies that connect people and knowledge in powerful ways without generating huge toxic effects. Some of their names are familiar, like Pinterest and Reddit. Others are less well known because they focus on local scale, like Front Porch Forum in Vermont. What they all have in common is leadership that prioritizes responsible community stewardship over hyper-growth and social domination. Facebook, too, could be that kind of company, but only if its leadership is changed and it is forced to adopt a more responsible business model. And if Mark Zuckerberg won’t budge, then the answer to our Facebook problem is the same as the USSR’s answer to its Chernobyl problem: it should be shut down.