This one is about conspiracy theories.
Looking back into the rosy mists of time, I imagine that there was a moment when believing the expert consensus was the sign of a reasonable person. It might have been more or less normal to hold a humble belief that people who are not climate scientists, epidemiologists, theologians, medical doctors, or lawyers would be wise to shut up and listen to the experts in those fields.
But now, it seems, the opposite is often true: only a total moron believes what he is told on the basis of the elite credentials of the teller. In the internet age, it is a sign of good discernment and wisdom to look behind the veil, to disregard the dull consensus and look for the improbable and dramatic explanation instead.
Oh, you don’t believe that Pete Buttigieg and/or his husband fixed the Iowa caucus by somehow being affiliated with a nefarious company behind the ill-fated app they used to tally votes? Wow, what a dummy.*
That’s just a fun little sample of one conspiracy theory I became aware of sometime in the very long decade that has been 2020 so far. As you probably know, there are lots more where that came from. My awareness of these theories largely comes from my inability to simply avoid reading the comments. This inability of mine lives in tandem with conspiracy theorists’ inability to avoid commenting to educate the sheeple, and, well, here we are, symbiotic and miserable.
(If I’m a little more aware of, or a little more interested in, the grimy conspiracy underbelly of the internet than you are, fear not. I’m a bit of a disaster tourist, straddling several online worlds, and gawking at still others. This is confusing, and relevant, on which more later.)
I’m not here to rehash these various conspiracy theories, because they’re mostly incredibly dumb and awful, and a lot of them are about covid, which is disheartening. Instead, I’m here to wonder why.
I think there are three main temptations to conspiracy thinking:
1. Knowledge vs. Comprehension
Others have observed that a lot of us think of experts as people who have more facts in their fact-hoard than we do. The idea is that experts are nothing more than people who know a lot of things that we don’t. If this were true, then experts would be obsolete as soon as we knew a lot of stuff ourselves. The internet gives us the illusion of knowing a lot of stuff. Thus, we no longer need experts, because we are in as good a position as anyone to collect facts, which practically makes us experts, too.
The problem is, of course, learning facts is different than understanding them. I spent three months memorizing law for the bar exam, and that process bore almost no relationship to the three-year process that made me actually understand law. I imagine the same can be said about most areas of expertise, whether they come from formal education or not.
And the bigger problem with this idea that we can all be experts if we have the facts is that “facts” is a leaky category. A lot of “facts” one might encounter are false, or we have misremembered or misunderstood them. Someone whose brain is clouded with things “people are saying,” or things “I heard,” is not necessarily someone who has any idea which of those things are accurate.
Unfortunately, usually the counterbalance to our wrongness is…an expert, parachuting in to tell us that we’ve got the wrong end of the stick. And if we’ve already decided that experts are just stuffed-shirt bozos, we’re unlikely to listen or care.
2. Desire for Enlightenment
Another temptation: the desire to see beyond the veil. The desire to transcend the mundane and get insight into the real.
Confession time: I often fly too close to the sun of wellness-Instagram. It starts with the gateway drug of well-meaning therapeutic content, then it veers into arcana, and then before you know it you’re tumbling next door to anti-vax/healing crystals content. This is not great, for me.
At that extreme edge of wellness content, these people (who trend more liberal than not, but who are probably not terribly political in the normie sense) begin to unironically entertain Infowars ideas. (If you don’t know what Infowars is, count your lucky stars and stay that way.) These tender babes will look at the notion that big governments and/or big corporations fabricated the current pandemic and go, “Wow, this is a big question Alex Jones is asking! He wasn’t afraid to go there! Boom! Fire!” and then just…leave it behind, along with all the other vaguely exciting but ultimately fruitless bits of boom/fire they have ever encountered.
And, pardon me as I rant a bit: these unnamed people are impressed when someone plays binaural beats (for the uninitiated, those are audio tones purported to stimulate good brain waves) to CBD oil. Which, it apparently needs to be said, does not have a brain. In the next breath, they talk about mainstream media selling “snake oil.” Pardon my French, but FUCK.
There is obviously nothing wrong with “asking the big questions.” The willingness to think critically is fundamental. But knowing only how to ask a big and explosive question, without knowing what to do with the answer, is a recipe for chaos. We need a framework on which to hang our big questions and big answers. This is, typically, what experts help guide us toward. But if we’ve discarded them, we are left with the vague sense that we are on our own to examine the world, and we are pretty sure that we’re going to get somewhere by poking around. And the bigger the stick we find, the harder we poke; maybe we’ll really shake things up and get somewhere! So the thinking goes, I suspect.
“Asking the big questions,” by itself, is an easy substitute for actually thinking. Just entertaining a wild idea momentarily provides a feeling of having broken through, having obtained some kind of enlightenment, leaving all the dummies behind, and then nothing further is required. No examination or follow-through is needed, because (to quote Jesus, perhaps ironically), one has received one’s reward in full. This kind of conspiracy thinker may not even remember the conspiracy; they may not engage with it beyond the first time they go “whoa, shit’s wild.”
But what this kind of experiential conspiracy-theorizing does do is leave a trail. It amplifies conspiracy theories for people who aren’t just looking for a little high. Unfortunately, we all share one internet, and this kind of behavior is sort of like lighting a firecracker in a dry forest. The bang might be a lot of fun, but it’s over in a second, and the fire it starts can do damage long after.
3. Desire for Belonging
A third temptation: the desire to belong. For a social animal like a human, the fear of being ostracized for being a total dummy is a very effective policing mechanism. I mean, do you want your community to think you’re a dummy? I rest my case.
Here’s an example of how this works. The tweet is about the second Democratic debate including Mike Bloomberg, who had totally bombed the first time around but was getting a surprising amount of positive crowd reaction. (Note: I’ll return to this later, but for now just look at the fear of being ostracized:)
Notice how this language activates your emotions: if you don’t think the situation is sketchy, you have a baby brain. A brain a literal tiny stupid baby would have. It’s just name-calling, but it certainly makes an adult brain defensive, and if you care about what this person thinks of you (I have no idea who they are), you might find yourself agreeing with them reflexively for fear of being a dumbass.
Now, this is where my internet dilettantism gets confusing. Because I’m an internet tourist, as I said above, my sense of what’s acceptable is always yanked into confusion: on the one hand I get too close to healing-crystals Instagram, where everything is chiropractic and holistic remedies and there’s an overlap with “5G caused coronavirus”/antivax stuff (yikes). Other pockets of my internet experience bring me adjacent to prissy religious types, communists, hyper-normal political types, aggressively woke YA book twitter, and archaeologists.
I like the archaeologists the best.
But it’s impossible to keep up with all of these groups. I don’t belong to any of them, let alone all of them. I think this is what keeps me sane: if I got too deep into healing crystals, I would immediately be shamed out of it by my next foray into a different corner of the internet. Thank God the filter bubble has not yet managed to pin down my core beliefs to the point that it can feed me more and more extreme versions of what I already think.
Because that’s how it happens, folks: you see more and more extreme versions of whatever activates your emotions and your opinions. That’s how they make their advertising money.** There is a community there. To belong to it, very naturally, you begin to adhere to its beliefs.
And once you’re inside, it doesn’t seem weird anymore. The things that might have made you raise an eyebrow before just seem like givens. Things that make you raise an eyebrow now are things you should probably consider believing, because God knows everyone else does, and you don’t want to be the lone weirdo on the outside who’s slow on the uptake, do you?
Bonus: When Broken Clocks Are Right
I’ve come this far with my nose in the air, making it sound as though I’m immune to conspiracy theories. But I’m totally not! And here’s the thing: it’s not a witch hunt if there are actual witches! Every so often, completely wild and nefarious dealings are going on! The only question is, which ones are true?
I happen to believe in various semi-conspiracy-theory views about the military-industrial complex, which I won’t get into here (for obvious reasons!) And I believe, without any caution at all, that tech companies are basically intentionally radicalizing us all—making us into conspiracy theorists, in many cases—to keep us clicking long enough to see plenty of ads.
Remember that tweet above about Mike Bloomberg’s sympathetic audience in his second debate? I was watching that debate live. And at the time, I shared that suspicion. It just made sense to me that Bloomberg had quite possibly paid a bunch of people to pack the audience and boo/clap on cue. It fit with his entire strategy in his brief campaign, which consisted of throwing small-island-nation-GDPs-worth of money into TV ads and astroturfy outreach efforts. Why not hire people to sit in the audience? It didn’t seem that outlandish.
As it turns out, while the tickets to the debate were very expensive (meaning the audience was not representative of the population), there is no evidence that Bloomberg conspired to fill the chairs with paid supporters. And, by the way, I came into writing this post with a false memory of it having been more or less confirmed that Bloomberg did pay people to sit in the seats. Then I googled it in the course of writing this, and it wasn’t so. That doesn’t exactly shake me to my core, but it gives me pause, because I bet I have a lot of other false facts, squirmy memories, and untested beliefs swimming around upstairs.
I don’t know how to end this post except by more or less saying “good luck out there.” But honestly, good luck out there. Our brains were not made for all this. We’re struggling.
*A note: if he was nefarious enough to do this, how was he not nefarious enough to do…more than this? To win more states, and to clearly win Iowa to start? This is my main objection to most conspiracy theories, honestly: they both over- and underestimate the power of a conspiracy. Somehow Hillary Clinton was able to manufacture an international conspiracy or five against Donald Trump, but wasn’t able to win just a handful more electoral votes? I don’t understand the world in which that is possible. But alas, I’m a sheeple. A sherson.
**Do I sound like a bit of a conspiracy theorist myself, here? Keep reading.