Consensus not censorship

We’ve become obsessed over the past few years with the problem of misinformation. And for good reason. “Flood the zone with shit” is now standard operating procedure for a variety of interests and factions. Groups who pretend to be above that kind of thing let confirmation bias do the same work, elevating conjectures they find convenient to believe far beyond the evidentiary basis for believing them, and transmuting concurrence among prestigious groups whose biases are aligned into “authority” to which they demand deference. Casual information consumers become divided into two camps: the “do your own research” types, who imagine, mistakenly, that they are capable of seeing through all this (and so succumb to their own confirmation bias), and those who more accurately understand that they cannot reliably distinguish truth from bullshit (and so opt out of democratic deliberation with a shrug, other than perhaps to vote for the candidates whose political party they distrust less).

“Combating misinformation” has, understandably, become a prominent matter of public concern. I want to argue, however, that it’s the wrong approach. One way or another, trying to eliminate or suppress or deamplify misinformation amounts to a kind of censorship. It raises the question of who decides what qualifies as misinformation, and why we should defer to their understanding of true and false, fact and fiction. If we were all comfortable that sources branded “Harvard” or “The Washington Post” or “CDC” were capable of doing the job, and that they would always “play it straight” with the public rather than triangulating the interests of various stakeholders and insiders, then misinformation wouldn’t be a problem. We’d all happily defer to high-quality information from trusted sources. Unfortunately but not incorrectly, we are now sharply divided over whether and when traditional authorities can be trusted, and over how much or how little epistemological deference they merit. “Combating misinformation” as defined by these authorities amounts to letting sometimes untrustworthy and corrupt factions censor information that might be correct and important.

So, we are in a pickle. Our current information environment is dysfunctional. It divides and paralyzes us, and leaves us ill-informed. Our leaders, who are responsive to public opinion, make bad mistakes in order to flatter errors of constituents who have “done their own research” or who trust unworthy authorities. Suppressing misinformation could in theory lead to a correct consensus, but the very foundation of free-speech liberalism is that we have, in general, no certain basis for distinguishing information from misinformation, and therefore attempts to suppress “falsehood” are likely to repress important truths.

Free speech liberalism used to seem compatible with a functional society in a way that it now does not. Why is that? By virtue of the physical architecture of information, sources of broadly important information were much more centralized prior to the emergence of the internet and social media. In the network television age, it was a free country: you could say whatever you wanted, you could publish subversive ‘zines and stuff. But unless and until your perspectives were adopted by some gatekeeper of centralized media, they would struggle to be relevant in any systemic and politically effective way. However, unlike in, say, contemporary Russia, the gatekeepers of traditional media were themselves fairly decentralized. There were three TV networks, plus many important newspapers and mass publishing houses, each marinating within some ungated local avant-garde. Politics and culture were genuinely contestable, to a degree. Meaningfully distinct publishers competed to form the mainstream. But they were mostly corporate actors with similar interests and vulnerabilities to state and advertiser pressure, and with a shared stake in maintaining something like the status quo. The struggle in that era was to get from margin to center, and that could never be a viewpoint-neutral struggle.

Nevertheless, we had a functional polity in that era, with dissidence, yes, but also with broad consensus about what was true, false, and subject to reasonable contestation. As someone who often felt dissident, I can tell you that it sucked. Lots of important values and ideas got no meaningful hearing outside of very ghettoized information spaces. At the same time, it was a much more livable society beyond the frontiers of one’s own dissidence. There was a lot one could get away with just taking for granted, as an individual trying to make sense of the world. Collectively, politically, we were a much more capable society; we had a stronger shared basis for action in the common good. The church of network television was consistent with an era of bipartisanship, and with experiments in policy—which were often mistaken, in part due to the narrow and blinkered information environment that framed them! But at least things could be tried, which is more than we can say for our polity at present.

We cannot, and I would not, go back to the church of network television. For all the confusion and outright nightmarishness of contemporary social media, I cannot help but score as a blessing the fact that a much wider range of voices can permissionlessly publish themselves over media capable of reaching large and influential audiences. However, the lesson we should retain from the equilibrium we have left behind is that a wild-west of free speech can coexist with a functional epistemological cohesion, if there are institutions via which a widely shared consensus can somehow rise above the din.

As Martin Gurri has pointed out, the internet can be understood as a kind of solvent of authority, and of the capacity of traditional institutions to sustain the trust that undergirds it. One way traditional authorities might counter that effect is by suppression and control, limiting the internet cacophony to a chorus reinforcing the messaging and goals of those authorities. That is the approach China and Russia have taken, and it has not been ineffective. “Combating misinformation” can be understood as a variation of that approach, an adaptation of it to the formally liberal West. If internet forums can be persuaded to suppress as misinformation speech that is most at variance with traditional authorities, and to shape reach so that speech aligned with traditional authorities diffuses more quickly and more widely than alternative views, perhaps consensus around traditional authority can be sustained.

However, this approach presents two practical problems:

  1. It forfeits any opportunity to use the broader conversation as a means of informing and improving what becomes deemed authoritative. Our crisis of authority owes something to the cacophony of voices, sincere and disingenuous, that now outshout and dilute traditional authorities, but it also owes a great deal to the (reasonable!) perception that traditional authorities have performed poorly and so merit less deference. A soft censorship approach to restoring authority does nothing to remedy the sources of poor performance, while buried in the zone flooded with shit may be perspectives that are important and could contribute to wiser authority.

  2. Judging by the behavior of the Chinese and the Russians, soft censorship — encouraging important forums to suppress misinformation without actually banning it — may not be sufficient to restore consensus and then trust. “The Net interprets censorship as damage and routes around it,” John Gilmore famously said, and there is some truth to that. Relying upon suppression to sustain state authority creates a dynamic under which predictable challenges encourage ever more coercive and expansive restriction, abandoning free speech liberalism rather than saving it.

Rather than suppress or censor, it would be better if we could build new institutions of consensus, whose authority would be based on stronger, more public, and more socially dispersed evidence than the institutions that are now flailing. This may sound naive, and it may prove impossible. But it seems to me we’ve done very little that could be accused of meaningfully trying.

I don’t have a silver bullet, of course. I don’t have anything more than half-baked ideas. But half-baked is better than not baked at all, or not even attempted. Let’s actually make a concerted, society-wide effort to design new forms of authority that would be more resilient to the cacophony of an open internet.

Some half-baked ideas:

  • We could dramatically expand our use of “citizens juries” or “deliberative minipublics” to help authoritatively resolve factual disputes. Much of the reason why traditional authorities are so distrusted is that publics and factions reasonably perceive them as having particularities of interest that come unbidden with their roles and expertise. A Harvard professor may be more than qualified, may be “smart” enough, but if her interests and values are very different from yours, why should you accord any authority to her policy advice? The very expertise on which her claim to authority is based might well be used to snow you! We expect that politicians’ views will be colored by their electoral (or post-electoral) career interests, but jockeying for votes (or sinecures) and crafting policy well might call for very different choices. A citizens jury makes use of expertise (just like “expert witnesses” are called before legal juries), but vests the authority to make determinations in a “minipublic”, a group of citizens selected by lot, and so statistically likely to be representative of the public not-mini-at-all. Their role is to elicit evidence and probe experts, then deliberate directly and interpersonally in order to produce findings on behalf of the public at large. There are a lot of potential devils in the details. If a competent prosecutor can get a grand jury to indict a ham sandwich, can we come up with procedures that genuinely empower the minipublic, rather than leaving it subject to manipulation and capture by its organizers? If participation in citizens juries is not compulsory (probably it should be!), will self-selection leave us with unrepresentative, and therefore unauthoritative, minipublics? I don’t have answers to all of these questions, except to say that the more we try, the more likely we are to learn how to organize citizens juries effectively. I encourage you to read my friend Nicholas Gruen, and the wonderful Equality by Lot blog for more on the subject.
(See also a recent piece by Michael McCarthy in Noema on using minipublics to make investment decisions.)
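
The statistical logic behind selection by lot is easy to demonstrate. Below is a small Python sketch — the population and its 30/70 split of views are invented for illustration — showing that a uniform random sample of a few hundred people closely tracks the composition of the whole population, which is what underwrites a minipublic’s claim to represent the public at large.

```python
import random
from collections import Counter

random.seed(0)  # fixed seed so the illustration is reproducible

# A hypothetical population of 10,000, of whom 30% hold view "A" and 70% view "B".
population = ["A"] * 3_000 + ["B"] * 7_000

def draw_minipublic(pop, size):
    """Sortition: a simple random sample without replacement."""
    return random.sample(pop, size)

jury = draw_minipublic(population, 500)
shares = {view: count / len(jury) for view, count in Counter(jury).items()}
print(shares)  # each share lands close to the population's 0.30 / 0.70 split
```

Sampling theory puts the expected error here at around two percentage points (roughly sqrt(0.3 × 0.7 / 500)). Self-selection, by contrast, can skew a panel arbitrarily far, which is why compulsory or carefully stratified participation matters.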

  • We could integrate the community college system much more deeply into the public epistemology side of academia, reducing the degree to which academic expertise is attached to the socially narrow class of elite research faculty. Community colleges should be a bidirectional bridge — helping communicate and explain current academic consensus to America’s plural communities via direct interaction with locally trusted experts, but also ensuring that the diverse experiences and perspectives of American communities are taken into account when forming academic consensus on policy-relevant questions, which necessarily touch upon values as well as potentially objective fact.

  • We could use “permissioned blockchains” (which involve no speculative financial tokens or environmentally destructive “mining”) ubiquitously in important institutions to notarize almost everything, generating public evidence of institutional history that would be difficult to hide, repudiate, or tamper with ex post. This wouldn’t be an anticorruption panacea. Premeditatedly corrupt actors would try to circumvent a panoptic notary by falling back upon informal communication channels, the bureaucratic equivalent of turning off the bodycam. Or they might plan in advance paper trails of falsehoods, sequences of lies properly timestamped and notarized. But most corruption is not that smart, not that careful. In science class when I was a kid, I was taught that nothing should be crossed out in a lab notebook. Instead, mistakes should be struck through with a single line, permitting a reader to see both the mistake and the correction. This doesn’t prevent premeditated fraud, but it does reduce the temptation to “fix” or “fudge” things after the fact. Cryptographically attributing and notarizing everything as a matter of routine (which would not require making document contents universally public) strikes me as a similar structural encouragement of integrity.
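
To make the lab-notebook analogy concrete, here is a minimal Python sketch of the hash-chaining that underlies any such notary. Everything here is illustrative — `NotaryLog` and its field names are invented, not the API of any real permissioned-blockchain product — but the core property is real: each entry commits to the hash of the entry before it, so an after-the-fact edit anywhere invalidates every entry that follows.

```python
import hashlib
import json

def record_hash(record: dict) -> str:
    """SHA-256 over a record's canonical (sorted-key) JSON form."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class NotaryLog:
    """An append-only, hash-chained log: the cryptographic analogue of
    striking through a lab-notebook entry rather than erasing it."""

    GENESIS = "0" * 64  # placeholder "previous hash" for the first entry

    def __init__(self):
        self.entries = []

    def notarize(self, author: str, content_digest: str) -> dict:
        # Only a digest of the document is recorded, so contents stay private.
        # A real deployment would also attach a trusted timestamp here.
        record = {
            "seq": len(self.entries),
            "author": author,
            "content_digest": content_digest,
            "prev": self.entries[-1]["hash"] if self.entries else self.GENESIS,
        }
        record["hash"] = record_hash(record)
        self.entries.append(record)
        return record

    def verify(self) -> bool:
        """Recompute every hash and link; any tampering breaks the chain."""
        prev = self.GENESIS
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if entry["prev"] != prev or record_hash(body) != entry["hash"]:
                return False
            prev = entry["hash"]
        return True

log = NotaryLog()
log.notarize("clerk@agency.example", hashlib.sha256(b"minutes, v1").hexdigest())
log.notarize("clerk@agency.example", hashlib.sha256(b"minutes, v2 (correction)").hexdigest())
assert log.verify()

# Quietly rewriting history is detectable:
log.entries[0]["content_digest"] = "something more convenient"
assert not log.verify()
```

Replicating copies of such a log across independent institutions, as permissioned blockchains do, would mean no single participant could rewrite it without the others noticing.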

In addition to reforms that might harden some forms of authority against the solvent of contemporary cacophony, there are reforms that might make the cacophony a bit less indiscriminately corrosive of even reliable information.

  • As Lee Drutman has described, a two-party electoral system creates incentives for each party to undermine the authority attached to information presented by officials of the other party, indifferent to the actual truthfulness or quality of the information undermined. Our system encourages partisans to tear down virtuous authority as readily as corruption and lies, indeed to mistake the former for the latter, if the institution whose authority might otherwise be enhanced is identified with the opposing party. Multiparty democracies have much less of this dynamic: because other parties are sometimes coalition partners as well as rivals, there is not a simple zero-sum game where one party’s success is everyone else’s disadvantage. A bit less radically, Jon Haidt, in his excellent article on how the internet has undone us, points to electoral reforms within our two-party system that elevate candidates with cross-party appeal over more party-exclusive candidates to whom this zero-sum logic most applies.

  • We could try to reform the internet and social media structurally, in ways that don’t involve some superauthority making judgements about, then playing whack-a-mole with, putative disinformation. The contemporary internet’s encouragement of the divisive and salacious over less entertaining, more constructive speech plausibly has everything to do with most of that speech being hosted by gigantic businesses for whom accuracy and quality are matters of indifference, while emotional engagement drives activity and profit. I think we should seek an online civil society hosted by thousands or millions of smaller sites whose product is quality and curation for users, rather than the eyeballs of users for advertisers. I’ve suggested before that we repeal or dramatically curtail Section 230 protections, to clip the wings of the current megaforums. We could pair this with content-neutral public subsidy to people who offer microforums that would actively curate and accept responsibility for the material they host.

  • As human beings, our understandings of the world are tangled up with our interests. Upton Sinclair’s man who can’t be got to understand a thing when his salary depends on his not understanding it is, to a first approximation, all of us. We develop sincere beliefs about the world that flatter, or at least are reconcilable with, the preconditions of our own well-being. People with very divergent interests will develop very divergent beliefs. A society that made greater use of social insurance, in which personal outcomes would vary somewhat less across individuals due to political choices, in which we really would be more “all in this together”, would have an easier time finding epistemological consensus than one in which a person might make themselves unusually wealthy by accepting and promoting divergent beliefs. We’d have more consensus about climate change if there weren’t influential groups of people who benefit materially by believing and arguing that it is not a serious concern. If people in the fossil fuel industry became only somewhat better off, rather than fabulously wealthy, by persuading themselves and others that climate change isn’t real, we’d have less of such persuasion, and we’d reach a functional consensus more easily. In general, there’d be less incentive to be a “grifter”, as many online influencers are accused of being, if we were a materially more equal society.

Maybe you like my specific suggestions. Maybe you don’t. Regardless, if we want to preserve liberal free speech in form, function, and spirit, we’ll have to develop new institutions for coming to authoritative consensus that rise above a now much louder din.

It’s a very urgent task. As I write, we collectively face a delicate crisis which, if mishandled, could lead to nuclear war, millions or billions dead, the end of modernity. It is not okay that the way we are thinking together is largely via TikTok, Twitter, YouTube, Facebook, MSNBC, and Fox News. These are low quality deliberative institutions.

As Aviv Ovadya put it in a conversation with Julia Galef:

We're…living in a world now where, let's say stability isn't quite as quite where it was, where individuals can have far more influence on sort of the overall stability of the world and where you have a whole bunch of really tricky challenges up ahead within the next five to 20 years that could easily derail even a very, very well-functioning civilization. You're in this environment, and now you're making everyone dumber. You're making them less capable of handling it, both at an individual level and at a societal level.

You can think about this as, you’ve got your civilization driving its car down the road. And it's now starting to take LSD, and it's like seeing these hallucinations all over the place. And it's still trying to drive. There's going to be some level, some amount of LSD or some amount of like, of hallucination that you can still sort of drive without crashing. But there's going to be some level where you can't. We're just increasing that.

Hopefully we get lucky and muddle through our current crises. But we won’t get lucky forever. We have to develop the capacity to collectively speak, reason, and act together in ways that keep us free but also wise.

Update History:

  • 23-May-2022, 9:10 p.m. PDT: “…to render the clip the wings of the current megaforums.”
  • 23-Aug-2022, 12:00 p.m. EDT: “the ‘do you your own research’ types”; “…to people who host and curate offer microforums…”

13 Responses to “Consensus not censorship”

  1. Ed writes:

    I agree with your concerns but not your solutions. I don’t think they solve the fundamental problem that the American political system is so logjammed with structural impediments that producing paralysis and right-wing movement is too easy. It’s worth noting that almost no parliamentary democracies have this problem.

    To me, it seems like the real solution is to push to a true 50%+1 democracy. Reality has a much better chance if it has to only get to 51%, not 75% as it is in the current US system.

  2. Detroit Dan writes:

    I agree with your concerns and solutions (c:

    I also read the referenced Jonathan Haidt article and find his perspective useful, if somewhat depressing. I think the best way forward is to present concrete solutions, as both you and Haidt do. Ed’s suggestion that we switch to parliamentary democracy is also constructive.

    Each of us can help by trying to understand the “enemy”. For me, the enemy is the elite of the West which I see as being hopelessly corrupt and, consequently, just plain wrong about many facts. But the elite are, in many cases, struggling to improve the system as they see it. Censorship, secrecy, spying, manipulation of media, militarization, defending the status quo, promoting big business, and vilifying dissent may all perhaps be necessary to some extent if we are to maintain the good lifestyles to which we have become accustomed.

    Well maybe we should draw the line at vilifying dissent. Certain opinions are vile, and certain beliefs are factually incorrect. The best way to fight these, in my opinion, is to present facts as you see them and state your values and how certain opinions conflict with those values. In this vein, I am in the ‘do your own research” camp.

    Doing your own research includes deciding who you trust. Personally, I currently do not trust the elite and the conventional wisdom in the West as they have failed repeatedly throughout my lifetime — Vietnam, Iraq 2003, Afghanistan, financial crisis of 2010, COVID, etc. There is a difference between distrusting all authority and distrusting authority with a record of disinformation and failure. Other authorities have better records with regard to the facts in these matters, so I trust them more. That’s democracy as it should be.

  3. Sam Bleckley writes:

    I’m unconvinced that the problem has any relation, positive or negative, to a wider diversity of voices and opinions being written or heard.

    Would you characterize your information landscape as introducing you to many *new* perspectives on already-fraught topics? When were you last served a *new* take on abortion, or nuclear power?

    The new forms of media which were *supposed* to be firehoses of diverse opinions are, instead, largely filled one way or another with the *illusion* of popular consensus. We’re right, the other guy is an idiot, a child, mentally deficient, evil, morally turpitudinous, etc, and probably doesn’t really believe what he says anyway.

    I don’t think there’s any problem at all with one person presenting their lizardmen conspiracy online instead of in a zine. Geocities did not cause the kind of problems that twitter and facebook do! The problem now is the same as it ever was: whose opinions get amplified? And how aware are we that they are *being* amplified? There are even fewer news sources now than there were then; adding the twitter and facebook algorithms to the list doesn’t count as a major change to the size of that pool — what’s changed is our ability to *pretend* that there is no editorial control. “This is what real people really think!” we think as we scroll through posts filtered by us, by an algorithm, and by advertisers; and if they all say one thing, or two directly opposed things, we assume there’s either consensus or a two-idea fight.

    Coming up with additional new and more complicated ways of creating the *appearance* of consensus — either for topics with no clear truth, like ethics, or for topics with clear evidence-based answers — not only doesn’t seem like a solution, it seems like a complete blindness to the problem. The problem is not “misinformation” it’s *popular* misinformation: “it seems kinda silly, but everyone else seems to be buying into this idea and the people who don’t buy it seem to be the Wrong Sort Of People; maybe there’s something to it!”

  4. Detroit Dan writes:

    Sam may be on to something, but I’m not sure if I understand it clearly. At any rate, his comment has encouraged me to think more on the problem of the Internet as a “solvent of authority”. Rather than dismissing this solvent, I think that the Internet provides the ability to do your own research rather than admit you can’t distinguish truth from bullshit. I think I can distinguish truth from bullshit in most cases, given the amount of information available on the Internet. Bullshit leads to contradictions and lack of response to those pointing out the contradictions. Truth checks out and complements additional research. Confirmation bias is a natural part of the truth seeking process. Truth will prevail if we are allowed to think for ourselves.

  5. Steve Roth writes:

    Institute the draft!

  6. Detroit Dan writes:

    Some sort of compulsory national service seems like a good idea.

    More than anything, we need a freer mainstream press, in my opinion.

  7. Detroit Dan writes:

    With regard to a “freer mainstream press”, it would be good to try to achieve consensus on facts that divide us. Where reasonable differences of opinion exist as to the truth, it would be good to acknowledge this. Where the facts are known but are in contradiction to previously accepted beliefs, these should also be trumpeted. Consensus could follow from shared truths.

  8. TGGP writes:

    I’m surprised to find no mentions of Robin Hanson’s ideas like prediction markets, track records of predictions, news accuracy bonds. To the extent people actually care about accuracy, those would help foster it. And once common, disinterest in them would be indicative of disinterest in accuracy (rather than mere conformity and disinterest in innovation).

  9. Unanimous writes:

    Another suggestion: expand current laws related to misinformation. It is currently illegal to give people a false impression to get them to give you money – eg. fraud or false advertising. Just make it illegal generally to give a false impression without taking efforts reasonable in the circumstances to make the impression accurate. This of course needs to be proven beyond reasonable doubt, so any topic on which there is no truth proven beyond a reasonable doubt is not prosecutable. In the circumstances of a conversation it’s not reasonable or practical to engage in research, so conversational speculation or honest mistakes would not be prosecutable. Nor would any speculation as long as it was labelled as speculation.

  10. Mike writes:

    I think the immediate need is to train people from an early age how to think more critically, and to be self-aware regarding their biases, and how to receive new information with an open but critical mind. As Sam mentioned, we should also be aware that certain voices tend to be amplified, for good or for worse, by vested interests or by algorithms. And as Steve said, corrupt actors could be planting false paper trails, which serve to frame the collective discussions, and how people approach the discussions. Censorship will only amplify certain voices further, but self-aware citizens hopefully can recognize and counter any misinformation when they see it. Kudos to Steve, for articles that are always good resources in learning how to think.

  11. Detroit Dan writes:

    I like all the ideas here, because these are all constructive attempts to deal with a serious issue. I would vote for a candidate who raised the issue in a partisan way, and grappled with ideas such as these.

    War, of course, makes civil discussion difficult.

  12. Detroit Dan writes:

    Conflict of interest in war reporting is a major problem: The “Gentlemen’s Agreement”: When TV News Won’t Identify Defense Lobbyists. As war rages, viewers watch commercials for weapons dealers, often without knowing it.

    The corruption unfortunately is pervasive — not just restricted to the military-industrial complex. How many news shows are sponsored by Pfizer?

    So systemic changes are needed to get back on the right track. The Internet as solvent of authority is provoking some destructive backlash and we should work to channel this more constructively.

  13. David Brin writes:

    As close as the article comes to well-describing the problem, alas, this author misses the core point.

    1. The ‘disinformation war’ is nothing new. Cynical manipulators have abused every new medium of communication with lies since the invention of movable type. Radio and loudspeakers almost slew liberal civilization and likely humanity, during the 1930s, till society developed some immunity to lies carried by those media.

    2. The powerful driver is always sanctimonious tribalism, rich in emotion and oversimplification, exploiting the reflex that ‘MY kind of folk are virtuous and our opponents are venal, corrupt, stupid and scheming.’ Watch the faces of Fox-viewers and (to be frank) some of the most-passionate on the other side. You will not get volunteers to forsake that hormonal rush by calling for ‘consensus.’ (See http://tinyurl.com/wrathaddicts )

    3. Any institutions you set up to work toward ‘consensus’ agreement on facts will instantly be denounced as an incipient, Orwellian ‘Ministry of Truth’ or MoT. Much as the right pushed for an end to the old Fairness Doctrine requiring rebuttals to opinion ravings on mass media and Fox News heads today denounce every attempt at a fact-checking service. The ‘MoT!’ denunciation has proved highly effective at leveraging one of the most basic American reflexes – Suspicion of Authority. There are some institutional approaches that might bypass this highly effective retort and I offer a few here. https://www.davidbrin.com/nonfiction/factact.html

    There is one potential solution to the Disinformation Crisis that is almost never tried, or even raised, despite the fact that it is the same method that the Enlightenment Experiment used to craft Democracy and fecund-creative markets and justice courts and science. That method is regulated, flat-fair competition. An approach that I have described elsewhere. It is the root stock of all that we have… and is almost utterly ignored.

    For a rather intense look at how “truth” is determined in science, democracy, courts and markets, see “Disputation Arenas: Harnessing Conflict and Competition.” (https://www.davidbrin.com/nonfiction/disputation.html.)

    …. This early version leaves out a Fifth Arena that actually makes the point even better… sports! No league or team would survive any given weekend without benefiting from tight regulation to keep cheating to a minimum, illustrating a core truth that also applies to the other four great competitive-creative arenas: markets, democracy, courts and science….

    … that competition only delivers its cornucopia of positive-sum benefits when there is both transparency and cooperatively created regulation to deter the age-old human curse of cheating. Cooperation and competition are essential partners, not opposites.

    This early version appeared as the lead article in the American Bar Association’s Journal on Dispute Resolution (Ohio State University), v.15, N.3, pp 597-618, Aug. 2000.