Who among us can say that they don’t get really mad when people refuse to use their brains properly? Especially strangers on the internet. There is an irresistible temptation to shame the person into thinking better, often by trying to show them how their actions or beliefs or words are harmful or hypocritical or just intensely moronic.
But how often does this work, really? Basically zero percent of the time. People often dig in even deeper, taking the attempted shaming as proof of their original point.
Here’s the thing: shame is externalized guilt. That’s why it doesn’t work to convince people to change their minds. Allow me to explain.
When I regret that Ian has no more Oreos to eat because I ate the last one, I feel guilt. But when Ian goes “Wow, how could you do such an unkind thing as eating all the Oreos so that I have none to eat?” I feel shame.*
For shame to work, I have to agree with the premise. I have to be guilty about the thing and then have someone else also reflect that guilt at me. I have to agree that it would be good for Ian to have any Oreos, and I have to agree that it was unkind of me to scarf them all down in one sitting. (As it happens, I agree with those premises. For now, at least.)
If, on the other hand, it is my firm belief that the Oreos were mine to begin with, and he deserved none, and it was actually right and good for me to eat them all less than 24 hours after the biweekly grocery shop, then I would simply toss my head back and laugh at his attempt to shame me.
See the problem? I have to agree with a premise before I can be shamed by it.
Purity. What happened in your brain and your body when you read that word? For me, it’s a bit of a recoil. The word is laden with a kind of strangling Puritan sexual morality.
This is why, when I was learning about the moral tastebuds and learned that my own morality is based strongly around purity, it was a bit uncomfy. Surely I am no Puritan!
But purity in that moral sense is not (only) about sex. It’s about the very human reaction against the dangerous. The same urge that would make you avoid drinking a cup of water with a cockroach in it. A good urge.
When this urge to avoid physical contamination becomes symbolic, it teaches us to avoid mental and spiritual contamination, to seek out the beautiful. Also a good urge.
But if you look, you’ll see it metastasizing everywhere.
Exhibit A: I used to take all kinds of things from the law school cafeteria in my pockets. Tea bags, an uneaten half of a bagel, single-serving peanut butters, fruit. This was technically not allowed, but the thrill was worth the risk. My little law-school dorm room desk filled up with crumpled tea bags I kept on hand in case of any caffeine emergencies. Just to have. For later.
A brief post on procrastination, unrelated (I’m sure) to my ongoing attempt to write three novels at once:
I’ve always been a procrastinator. In high school, after I got home I wouldn’t dream of starting my homework before chatting with everyone for a while (this was during the great age of AIM), putting up a perfectly despairing away message, and probably also spending some time surfing the dearly departed pre-social-media Internet. As a direct consequence of this behavior, I would end up having to stay up past my bedtime to do my actual homework, and the next day always started the same way: bleary-eyed at 6:30 with twenty minutes to leave the house on six hours of sleep. Not ideal for a teenager.
Similar patterns followed me all through college, law school, and into my working life, even after the sad demise of AIM. Even if I had a morning off from class, I didn’t think to treat it like work time: it was time for running, rambling, or even watching TV guiltily in my room. No, work time was that brutal late-night race against my body’s ability to stay awake. Later, in my office jobs, the best hour of my day was always the last one, when I felt as under the gun as possible.
(Now, part of this might just be that chronobiologically I’m more of an afternoon/evening type than a morning type, and that would be fine, to the extent I’m not actively sleep-deprived, which I nearly always have been.)
I came to think of myself as someone who has some weakness of will preventing her from just doing things at the right time, whose only hope was a tight enough deadline that things would actually happen eventually.
But let’s go back to that first thought: that I’ve “always” been a procrastinator. Logically, “always” can’t start in high school.
I’ve truly always been someone who does things–but only certain things. Things I want to do. I procrastinate only sometimes, on some projects, and only in some circumstances.
I notice: there are things I jump at doing and don’t put off. And then there is everything else, for which I drag my feet.
Procrastination was, I thought, a problem I needed to fix. Simply figure out the right way to coerce myself into doing stuff earlier.
But recently, with some assistance, I turned my attention to the why of it all. Why do I delay some things, sometimes? Why is it that, sometimes, I can get started right away on a task, whereas other times I sit around until the eleventh hour to begin?
The problem of procrastination turns out to be a problem of the assignment itself. I found that I can fix it by tasking myself differently. Two steps here have helped immensely:
Make sure it’s the right task. Is it something I want to do at all? This has been helpful in creative writing. I sometimes think I know what sort of writing I “should” be doing, based on so-and-so’s recommended method, but if I find that I’d rather reorganize my sock drawer than try so-and-so’s method, maybe it’s just not the right task. Is there a way of altering the task so it actually appeals? On the other hand, if the task is something I must do (say, filing taxes, or work for my employer), making sure I really believe in the “why” behind it is helpful for motivation.
Make it small. The smaller the task, the more likely I will be to actually do it. This has always seemed counter-intuitive to me: I think whatever causes my last-hour productivity panic also causes me to chronically underestimate how long something will take. It then seems reasonable to think I could write, say, 2,000 words of a new novel a day. (Note: It is not.) Setting a daunting task like that for myself strains credulity. My brain inherently knows that ain’t happening. So it doesn’t. But if I set out a miniature task, one that seems far too small to even worry about–that will get done. And, to state the obvious, a lot of complete little tasks over time are better than even one incomplete big one.
This is not to say that I’ve got it all figured out, or that I don’t still find myself in a distraction stupor while the day speeds away around me. So when all else fails, well, I’ve made it this far on eleventh-hour panic. I guess that’s good enough, even if I never learn how to fully get rid of the little rebel inside me that just loves to watch me sweat near the deadline.
I’ve been thinking about stories: how ubiquitous they are, and how terribly important to making it through the winter.
The flip side of this innate bias toward stories is that we turn everything into a story, don’t we? When we’re telling loved ones about our day, we try hard to turn it into a proper narrative with a rise and fall. And especially when we go further back, the episodes we recall most from our deep pasts are those that have some great punchline or a deep emotional resonance, using the same language and tugging at the same feelings as a good fictional story does.
Is this why I am often fighting my own anxiety about the future? I’m seeing my own future in the same terms as I see the unread bulk of a novel I’m just beginning. In that novel, in those pages that fit in my hand, something is going to happen. Someone made it happen, and it’s only going to go one way. It’s going to turn out one way or another, and then it will be over.
Plus, any novel worth its salt will lay out all the threads of that story early on, and they will twist and braid until they come to an appropriate conclusion.
That isn’t at all what life is. It’s not foreordained to turn out one way or another. It’s not even foreordained to turn out at all, except in the certainty of eventual death 💀 (Oh God, it’s turning dark, stay with me). But until then, there’s never a final word.
So this difference between real life and story life—the fact that one has an author and an arc and a tidy ending, and the other has who knows what—makes it hard for me to remember that stories can actually taint our view of our own lives. They can make us overinterpret the past, picking through it like story weavers for the threads that prove why the path we’re on now is the right one and always has been, or was never the right one from the start, or for proof that so-and-so has always been trustworthy or has always been a rat.
And we overpredict the future, or at least I do: we get a thread going about how it’s set to work out, and then we just think all we have to do is follow that thread until we reach the end of the spool.
And we see signs everywhere along the way, like mystery readers looking for the keys planted by the author about whodunnit.
But life is episodic, not advancing toward anything in particular. It waxes and wanes, and some things happen for no apparent reason at all. To the extent there’s meaning, it’s earned through reflection and by applying lessons learned to our future behavior.
This Sunday marks the end of the liturgical year, for those who use the Christian lectionary. At the end of the year, right before we go into Advent, the readings get into the Apocalypse: the terrifying visions of the prophet Daniel, and a bit from Revelation.
The word “apocalypse” itself, in Greek, means something like “revealing” (which is why the book of Revelation is called that: in Greek, it’s “apokalypsis”). It’s when the curtain is drawn back, and we can see the truth that has always been present, waiting for our attention. Apocalypses can happen in our own lives whenever our patterns are disrupted, whenever tragedies (large or small) strike and shake us. Whenever something forces us to reckon with truths we hadn’t wanted to face.
In ordinary speech, “The Apocalypse” is a single event that some people think will happen, just like an earthquake or a war may happen. But there’s another way to look at it, and it’s one that makes quite a lot of sense along with the fact that the lectionary has us read about it every year at the end of the church year: it’s episodic. It’s repeating. Like the seasons, it recurs regularly.
But recurrence isn’t enough to destroy surprise. Every winter we’re surprised by the cold and dark, and we remark on it in conversation: “I can’t believe how dark it is so early!” Every year it happens, literally like clockwork, and yet we never remember how it feels. It never loses the capacity to shock. It’s just like watching a show over and over. Even though we know the characters will eventually fall in love, say, some part of us can still be on the edge of our emotional seats, not knowing whether they will or won’t.
Likewise, the Apocalypse is something we can feel in our own lives all the time. It would be one thing to wait for a single event, a cataclysm that will happen once in time, in someone else’s distant future. But don’t we all feel the rise and fall of our own story lines, of our own expectations that are met or not met, or are met in surprising and possibly mindbending new ways? When things are revealed, and the curtain is drawn back, and we can be so surprised by our own lives. That is apocalypse.
Our lives are all a craving for story, because we long for it all to make sense. And the beautiful thing about stories is—they do make sense, more than basically anything else. They survive so long, long after the linear events are over. They thrive, and take on new meaning, and keep us warm even in the surprising dark of the longest nights of the year. They remind us of the promise of the recurrence of longer days, of spring, of rebirth, of respite. Of revelation.
We funneled into 100 Hutchins Hall for orientation. A few hundred first-year law students (1Ls, in law-school jargon) lined the rows of the auditorium, each clutching a slim blue copy of the U.S. Constitution. We were to sign it, signifying our commitment to a future of ethical lawyering.
I only half-knew what I was getting into. Like so many bachelors of arts, I had few concrete skills but highly satisfactory verbal abilities, and there wasn’t much else to do in a down economy but go to law school. But unlike a lot of my new classmates, I didn’t already know much about it. I knew next to nothing about the curriculum or the legal field or how to get a law job or how to prepare for a law school final. I frequently had to double-check the names of the current Supreme Court justices to avoid severe embarrassment.
In other words, as I still often do, I had one foot in and one foot out of the place I am and the thing I’m doing, with often frustrating results. But that’s a different topic.
Anyway, the presentations started. Various deans introduced us to our new career and shared anecdotes of when they were in our shoes. They told us something that surprised me: they were not there to teach us the laws. Oh, no. Instead, their planned instruction was simple, and they repeated it over and over:
“Thinking like a lawyer.”
“You will learn to think like a lawyer.”
“It’s amazing, the most important part of law school is how you begin to think like a lawyer…”
Think like a lawyer.
And then there we were, tender 1Ls, beginning to repeat this line, like the hypnotized diners Eddie Izzard imagines beginning to request a certain salad dressing:
Where the heck did balsamic vinaigrette come from? … Balsamic fucking vinaigrette. How long ago? Ten years? Ten years ago?
“Would you like a dressing? We have thousand island, we have 970 island, we have 400 island, we have 3-mile island,—or balsamic vinaigrette, balsamic vinaigrette, balsamic vinaigrette…”
“I would like the balsamic vinaigrette, balsamic vinaigrette…”
It was just some suggestive thing.
Eddie Izzard, “Sexie” (2003)
Helpless in the face of this prophecy, we took our oath and then set out from that auditorium, freshly on our path to thinking like lawyers. We tucked into our textbooks and took assiduous notes in lectures and awaited the change.
I didn’t know what it meant until I was studying for finals that first semester, and I found my trusty brain working differently. The transformation is now complete enough that I can barely remember my former, unconverted state, but I do recall the sensation of changing: I was beginning to feel a bit like a computer.
Things that used to be connected no longer were. There were great chasms between concepts that used to feel related, and relations between concepts that had never seemed comparable before. But the more notable sensation was of a linear progression I hadn’t been aware of before. Questions were all multi-part: there were steps everywhere. Every possible inquiry had a number of associated sub-inquiries, which had to be separated and arranged in a certain order to arrive at the right answer. The only alternative was a dreadful, murky chaos.
But this didn’t just extend to the concepts I needed to study for that semester’s grades. It infiltrated even my personal life, even emotional matters. Not only could I now take a far more rational approach to any question, but I could not not. I began to say “actually” a lot more.
And here I am, several years on from that mental conversion. I want to connect with my fellow humans when they discuss a matter of law or policy or the Constitution, but they just do not understand! Their eyes glaze over when I explain to them the all-important steps. Poor souls, they think of a matter all at once: they see it as an item to be discussed and to have feelings about. It saddens me, and I live in hope for their timely conversion.
And yet, something troubles me. In moments of dark doubt, I begin to worry about this setup: those of us who have been taught to think like lawyers can indeed think like lawyers—but no one else can, not without three years of pricey study at an accredited institution. But law is not like, say, engineering, which (I assert with blind confidence) can be left to the professionals. I can know blissfully nothing about how bridges are constructed, and yet generally drive over them fearlessly and without consequence.
But you non-lawyers, with your unconverted brains—we ask you to live in our law world at all times, and it isn’t as simple as driving over a bridge. You’ve got to avoid violating the countless thousands of civil and criminal statutes that apply to your every move. You don’t know about most of them, but if you violate them, we will generally pretend that you did know. Or, perhaps to put it more precisely, it doesn’t matter if you did or not, for various very-good reasons that we all learned about…in law school.
Does that seem unfair? Like you might need some assistance in that perilous field? If so, you might want to seek a lawyer’s counsel. Yes, it will cost money (how much? It depends) but it is a much better idea than waiting for the other side (that spiteful neighbor who hates your fence; the copyright holder of the TV show from which you borrowed a reaction GIF; the federal librarian who looked askance at you for bringing your kazoo to the Library of Congress) to lawyer up. Your lawyer will get you into the least-worst position against their lawyer, on your behalf—and yes, legally, it will be you doing it, because it was your responsibility to follow the laws all along.
Which—doesn’t that basically sound like a protection racket? I’m not trying to talk myself out of a job here, but I get a little nervous when it feels like I am trained to be some kind of mental bodyguard, ready to size up the other guy’s mental bodyguard, especially when we start growing our team because the other guy started growing his team, and now we’re basically in a mental bodyguard arms race, which obviously is really good for the bodyguard industry, although admittedly the excess bodyguards do nothing for the body-guarded.
Plus, I have a theory—on which more later, perhaps—that this setup fuels kooky DIY law theories like sovereign citizens. After all, if we make law seem arcane and volatile, we can hardly blame people for thinking that all they need to do to beat the system is to get a little arcane and volatile.
Okay, I hear you, lawyers: my argument teeters on that dreaded slippery slope to the suggestion that we eradicate lawyers, which doesn’t work for several reasons, a few of which are: (1) vigilante justice will be the remedy to future disputes (imagine: IRS agents beating up tax evaders); (2) who if not lawyers will enforce the ban on lawyers? One shudders to think; and (3) I’m not here to take jobs away from hardworking Americans who aren’t prepared to do much else. So let’s not even go down that path; it’s not what I mean.
What I do mean is: I’m troubled by the feeling that there is a wall between the people—to whom the laws apply and belong, whom the Constitution protects—and those laws and the Constitution. To the extent that you can’t really understand the laws or the Constitution without a law degree (if then), that is a failure on the part of those documents. Meanwhile, those who write new laws, regulations, and legal rulings interpreting them (🙋🏻‍♀️) are almost always lawyers. We understand each other, and we trust that the gatekeepers and interpreters of what we write will always be others from our guild, other future people who have sat in 100 Hutchins and been told that their brains were about to change. We’re basically a club with boring initiation rituals and steep annual dues, but we’re everywhere, and we don’t let you ignore us.
I suppose I could start to be the change. I could be the lawyer who practices in a way that is more accessible to non-lawyers. I could say “actually” less. I could dig deep, see if it’s still possible to access that pre-transformation brain that was able to think unlike a lawyer, even about law and its practical side, the way it actually affects people constantly.
So from my side of the chasm to yours, non-lawyers, I’d like to request some thoughts and prayers for the recovery of my old brain.
As is probably evident from the content of this blog, I’ve gotten in pretty deep with self-improvement-type topics. My Instagram feed is increasingly full of coaches, therapists, and spiritual writers. It’s a positive, empowering space (peppered with the occasional millennial-despair meme account I still follow).
But it is impossible to even glance at this cozy corner of the internet and not notice the glaring truth that it’s populated mostly by women. It’s not anywhere near an even split. Men are an endangered species around there.
Now, perhaps it’s not surprising that women are more drawn than men are to internet spaces for reflection, self-improvement, empowerment, for reasons we needn’t bother going into here. But don’t men need something like it?
I mean, glancing generally at the news and the unfiltered spaces of the Internet, you might well ask: are men okay?
Jordan Peterson is fairly well-known, for a clinical psychologist. He’s also a voluminous YouTuber/podcaster, posting lengthy lectures about Jungian psychology and how to apply ancient wisdom from the Bible and the Epic of Gilgamesh to modern life.
He’s also famous for some political stands he’s taken in recent years against what he sees as the creeping dangers of rabid progressivism. Google it if you’re interested.
His star rose due to the political stands (because conflict pays, y’all), but he would rather think of himself first and foremost as a public professor. On the basis of his heightened name recognition, he published 12 Rules for Life: An Antidote to Chaos. As it promises, it’s 12 rules that Peterson believes are the key to developing character and living a worthy life.
As a whole, I found the book (like Peterson himself) full of contrasts. Deeply embarrassing, but also kind of sweet and avuncular. Fascinating and challenging but also terribly dense, as though he’s missing a chunk of every point he’s trying to make.
In the grand scheme of things, his rules, once he gets around to listing them, are great:
1. Stand up straight with your shoulders back.
2. Treat yourself like someone you are responsible for helping.
3. Make friends with people who want the best for you.
4. Compare yourself to who you were yesterday, not to who someone else is today.
5. Do not let your children do anything that makes you dislike them.
6. Set your house in perfect order before you criticize the world.
7. Pursue what is meaningful (not what is expedient).
8. Tell the truth – or, at least, don’t lie.
9. Assume that the person you are listening to might know something you don’t.
10. Be precise in your speech.
11. Do not bother children when they are skateboarding.
12. Pet a cat when you encounter one on the street.
We could meditate on those alone and have a good time. Except for the skateboarding one, on which more later, the rules make sense on their face, and all seem like pretty good guideposts for living a good life.
But of course, he doesn’t stop there. Each chapter contains Peterson’s explanation of what he means by a rule, in prose that is personal and rambling and cerebral. He alternately gives his evidence in the form of personal memories, clinical anecdotes, and Jungian interpretations of stories from the Bible. Sometimes it takes many pages for him to explain what he’s getting at. It’s tons of generalizing, and much of it would be tagged [citation needed] on Wikipedia. But for all that griping, the book had me fully engaged and thinking the whole time, typing notes out on the silly little Kindle keyboard, and that’s got to count for something.
What we have, by the end, is a guide for how to be a particular kind of good person: one who stands up for what he believes in, who works hard and succeeds at his endeavors, who maintains a pragmatically positive outlook, who is honest and unafraid to probe his own flaws and improve them. Who takes care of himself and those around him. Who demands excellence of himself, as a moral duty.
Why do I use male pronouns? It all strikes me as such masculine—even macho—self-help. Which isn’t to denigrate it, but just to wonder: is this the men’s version of my self-improvement internet? Is this what the fellas turn to when they feel a need to grow and change?
If it is, Peterson doesn’t know it. I’ve heard him say (in my own [citation needed] moment, I can’t be arsed to find the link) that it is a mystery to him why his following is so crowded with young men, and not with women. But he must know how unbalanced his ideas are, if attracting a mixed crowd is his goal. Right out of the gate in Rule 1 (“Stand up straight with your shoulders back”) he comes out swinging with anecdotes about how lobster brains work, how dominance hierarchies are embedded into them, how subduing or being subdued by another male lobster will, through the workings of serotonin, change the lobster brain, rendering loser lobsters depressed and laconic. All this to say that you, human reader, will have a better life if you stand up straight with your shoulders back, like a serotonin-rich alpha lobster.
Oof. Is this what men like? I wouldn’t say it resonated with me (as they say on my side of the Internet gender partition).
I don’t begrudge men this, for a moment. Even if we can’t share our wellness spaces (and even if it’s because they actively stay away from my side of the therapy economy), God knows they need something.
But here’s the rub. Peterson can’t stop himself, even when he’s ahead, even when he’s ten rules deep and going strong, and we’ve learned a good deal about how one might interpret the story of the Garden of Eden, and we’ve convinced ourselves to treat each other and ourselves with dignity and respect. No, he has to get into it. The conflict. Just as you get the sense he’s convinced himself that he deserves to be famous for his depth psychology and not for those shouty viral videos…he does a thing.
Rule 11, purportedly about not bothering children while they are skateboarding, is about how we ought to think about “patriarchy” so as to better the lives of both men and women, both boys and girls. I dare to summarize it thusly: adults (read: women, as women are all the examples he gives) do far more harm than good by attempting to protect children (that is, evidently, boys) from the suffering and danger inherent in a life worth living. Moreover, the feminism Peterson imagines, the one that demands the diminishment of masculinity to make room for the empowerment (masculinization, he coughs) of women, ends up harming women in the long run: it fills the world with bad men.
This chapter is Peterson at his dense worst. He doesn’t bother to understand what he’s refuting. But worse, it reminds you, just as you begin to trust him, that even though Peterson was doing Jung and the Bible for years, it’s taking a public stand against progressives that made him a star. Again, conflict, deployed well, is a ticket to success. Even if he fancies himself the reasonable guru, his numerous followers found him on the parts of YouTube with all the all-caps titles. No matter how much he pretends it isn’t the case, he is a star because of the brand of male resentment that burbles everywhere these days. All that to say: it’s a little hard to take him seriously when he tries to don the hat of objective critic of gender relations.
This brings up something that troubles me about therapy for the fellas: has it always got to resist women? My female corner of the wellness internet has nothing negative to say about men. But it feels to me, in my casual (and very much [citation needed]) observation, that men’s self-improvement, self-actualization, what have you, always boils down to being about not women. As though men were the negative space made up of whatever is not feminine, and their very existence depends on separating from the feminine. (Although Peterson would be the first to tell you that it is the feminine that is typically represented as the dark, the night, the passive. Again, I say, gesturing generally at all of this: citation needed.)
All that aside for now. If it weren’t for rule 11, and for Peterson’s reputation in general, I wouldn’t hesitate to recommend the book. Not to say that it’s without its flaws, but it’s a thought-provoking read all the way through, even if you end up tearing at your hair a bit. I think we have to understand this moment that Peterson is a big part of.
But we might be able to understand it a little more concisely from Olive, a sturdy character from Elizabeth Gilbert’s recent novel City of Girls:
“The field of honor is a painful field,” Olive went on at last, as though Peg had not spoken. “That’s what my father taught me when I was young. He taught me that the field of honor is not a place where children can play. Children don’t have any honor, you see, and they aren’t expected to, because it’s too difficult for them. It’s too painful. But to become an adult, one must step into the field of honor. Everything will be expected of you now. You will need to be vigilant in your principles. Sacrifices will be demanded. You will be judged. If you make mistakes, you must account for them. There will be instances when you must cast aside your impulses and take a higher stance than another person—a person without honor—might take. Such instances may hurt, but that’s why honor is a painful field. Do you understand?”
I think Peterson would understand Olive, even if it would be slightly ironic, all things considered, that a lesbian character from a book called City of Girls basically could scoop him in a paragraph.
Conflict has been on my mind a lot recently, and not just because the world seems to be so full of constant knives-out energy (although that doesn’t help). The very idea of conflict is central to my Enneagram type: 9s are among the more conflict-averse types, and my own conflict aversion was a huge wake-up call when I started learning about the Enneagram.
But lately, I’ve heard from a lot of people who are close to me that they don’t see me as particularly conflict-averse, or prone to merging with others to the point of disappearance, or unwilling to state an opinion, which are all ways that I have described myself. This might mean a few things:
First, I might be falling into the confirmation bias trap that lurks in all models, and certainly in the Enneagram. Especially when people talk about the Enneagram in a way that focuses on behavior rather than motivation, it can become simplistic to the point of pure falsehood. If you believe the memes, 9s are always buried under a blanket watching TV and tipping over into a fugue state when someone requires them to make a decision. So I may well be ascribing habits to myself that aren’t really as consistent as all that, falling into the gravitational pull of the stereotypes.
(I don’t think that’s the reason that I come across as less conflict-averse than I profess, though. I suspect the others:)
Second, I might have a more developed 8 wing than I realize. Like many 9s, I identify with nearly every type, often thinking I am all of them–except 8. Reading about 8s is, for me, like reading about aliens. That’s not me, at least, I can say defiantly. But don’t I have lots of rage, often internalized? Uh, yeah. And like to poke at people’s lazy thinking? Yes. And don’t I deeply resent being controlled (even if I am more prone to react passive-aggressively than properly aggressively)? Totally. But I’m only recently seeing these traits, because I think ordinarily I suppress noticing them. They don’t fit with the shallow version of myself I historically tried to inhabit: the unobtrusive, kind, peaceful, dreaming sort. (Some other time I might tell the story of intentionally throwing the Myers-Briggs test to empirically be as wood-nymph-like as possible.)
Third, and most important, at the end of the day the fear is not so much of conflict itself, but of disconnection. Conflict is a quick ticket to disconnection in a weak relationship, so avoiding it can be a shorthand for avoiding disconnection. I still get a little stomach ache thinking about the driver with whom I exchanged fingers a few weeks ago: I was walking; she nearly ran me over then flipped me off; I lost it at her quite impotently then fretted for a full day about how someone who doesn’t know me at all could have such malice toward me, and whether she’s out there thinking I’m the asshole, as though it really matters.
The thing is, instinctively, I’d rather hang on to the hollow shell of a relationship than risk losing it. So that’s when the hiding, the aversion to difficulty, is helpful.
But in a strong relationship where connection is plentiful, where I feel secure that conflict won’t lead to disconnection, I can let myself show more. I can be a bit of a pain. I can needle people into refining their opinions. I can feel, and show, my frustration. This allows me to work through it, get past it, rather than simmering internally. The phoniness drops. And it’s simplistic to call all of this “conflict,” and to say that I hate it, because it’s a part of the big complex tapestry that any relationship is.
All that to say, your girl is still trying to get comfortable with the idea that not everyone has to like me, especially if it comes at the cost of having been myself. And this is the kind of stuff that the almighty models can do well, at their best: show us the ways we might be hiding from the truth about why we do the things we do. This is when we have a chance to change those things, if they’re not working out for us.
As my pastor recently observed, it’s the new year. Sort of. At least it is in the sense that kids are going back to school and the blazing hot summer is starting to break. There’s an opportunity for a fresh start–which is to say, even though a fresh start is always available, right now it might feel a little more possible than it often does in the middle of things.
A few days ago, I was introduced to Gretchen Rubin’s Four Tendencies, a framework meant to help with procrastination and follow-through by sorting people according to the kind of accountability that works on them. Are you accountable to your own expectations, or other people’s expectations of you, or both, or neither? You can take her quiz here if you’re curious.
But the quiz didn’t really illuminate anything for me, because I’m so inconsistent and ambivalent. I’m accountable to others–and I’m not. I’m accountable to myself–and I’m not. I’m often saying yes to invitations and requests when I’d rather say no. This builds resentment on my part. It also may cause resentment on the part of those who kindly asked me to participate but who wouldn’t have asked if they’d known I would be sullen and difficult, or half-present.
And when it comes to commitments to myself, I’m the same way: I over-commit, always making ten plans when one would do, and then when I only end up completing one, I irritate myself. Or I burn the candle at both ends striving to get all ten done just to say I did. Plus, the weight of the ten expectations I set makes me want to rebel by procrastinating like a toddler on strike from naptime, with more or less the same predictable results.
Still, for an optimistic committer such as myself, the lure of the new plan is often irresistible. I use Google Keep to store my dozens of to-do lists, containing everything from the grocery list to the daily to-do list to the weekly to-do list to the prioritized list of movies I want to watch soon (I know) and, of course, the long-term to-do list that is the only barrier between me and fatal inertia. A lot of this in Enneagram terms is down to my type’s tendency to struggle with priorities: all priorities seem equally urgent, so I put them all off equally, which causes total chaos. My two most productive times are the panicked hour before I must leave work, and the panicked hour that straddles my planned bedtime.
As an experiment, I’m taking a break from the to-do lists. We’re going to see what happens. May need to schedule a wellness check just in case this experiment results in me starving to death because the list didn’t tell me to shop for groceries.
But contrast this list-making, task-obsessed, rebellion-inducing behavior with the following:
Right around the new calendar year, I printed out a page with three Greek words on it, that I hoped would set the tone for my year. It hangs on my fridge thusly:
Eudaimonia: literally, “well-daemoned,” “well-spirited.” Happy, virtuous, excellent, living well. For me, this meant aiming for the things that create that sense of joyful ease, creative flow, and peaceful purpose.
These words are a guidepost, not a to-do list. This means they don’t provide the tremendous satisfaction I associate with the conquered to-do item (seriously, nothing like hitting that check box on Google Keep; how I miss it…) But I see them every day, and every day I’m a little more likely to think about what I have to be grateful for, or how to lean further into the things that give me joy, away from the things that don’t. Or to reflect on the ways transformation is always possible.
I think it’s helping, even if it’s hard to know without being able to cross it off as complete. I still have a long way to go with the transformation part. But for today, I’m sitting outside with the birds and the breeze, and that is hard to beat.
Perhaps as is evident from my current fixation on the Enneagram, I’m drawn to personality metrics, and all kinds of explanatory models for why people act so damn different from each other (which is to say, from me) all the time.
One fascinating example is the “moral foundations” quiz (link below), which is based on the work of Jonathan Haidt and his colleagues. Haidt is a psychologist who studies the psychology of morality, and if that seems a little odd to you, then I’d highly recommend his book The Righteous Mind. It is completely fascinating.
When I read it in 2017, it blew my mind–essentially because I had become so unaccustomed to hearing the word “morality” used in a secular context. I think many of us have a strong distaste for that word, preferring a term like “ethics,” which seems a little more civilized and rational, a little less feral, than “morality.” But the book forces you to look right at that word, and to examine what your own morality might be. “Haven’t got one,” I thought, slightly tongue-in-cheek, for the first hundred pages or so. But I was wrong.
Haidt identifies six moral foundations, which he calls “tastebuds.” Each of us cares more or less about each of the six, resulting in our own individual morality, which we use constantly and unconsciously to evaluate the world. Each of the six is a pair of opposing concepts, as follows:

Care vs. Harm
Fairness vs. Cheating
Loyalty vs. Betrayal
Authority vs. Subversion
Sanctity vs. Degradation
Liberty vs. Oppression
The degree to which each person feels each of those concepts strongly will determine how they evaluate a situation. For example, most of us can readily understand care vs. harm. Just imagine a situation in which someone or something vulnerable–a child or a puppy–is in need of care, or worse, is being mistreated. The feeling that stirs in you or the action it might drive you to, so the theory goes, shows your care/harm tastebud activating. And if the bare mention of the idea that such a thing could happen got you squirming, congratulations: your care/harm tastebud is alive and well.
But what about the others? The one that fascinates me the most is sanctity vs. degradation, also known as “purity.” The most obvious applications of this tastebud are in sexual morality. But it’s also about food, and personal hygiene. Imagine a clean, cool glass of water. Now imagine someone drops a harmless, sterile cockroach in it. Would you drink it? Your rational mind may accept that the cockroach is sterilized, but your sanctity/degradation tastebud will violently protest if you try to drink it. Even though that may not sound like “morality” at first, consider how religions have almost always set standards for cleanliness of body and of food. We must conclude that food and cleanliness can be deeply moral to humans. If Haidt is right, this tastebud demonstrates the influence of purity on your morality, even at an unconscious level, and even when the tastebud reacts to circumstances that don’t appear to have a moral component at all, like the cockroach.
In the last portion of The Righteous Mind, Haidt goes on to apply his findings about the moral tastebuds to the modern* political climate. He found experimentally that American liberals and American conservatives have recognizable patterns that show up in their tastebuds: liberals tend to weigh care/harm and fairness/cheating very highly, and weigh the rest of the tastebuds less strongly. Conservatives weigh all six somewhat evenly, meaning that their care/harm and fairness/cheating tastebuds come out a little lower than liberals’ did, but the others are higher.
You can easily imagine how that would play out in a political argument: person A might be making excellent points about how Policy X will harm someone, while person B is making excellent points about how Policy X diminishes oppression. They may quickly begin talking directly past each other, unable to hear the value in the other’s point, or see the weaknesses in their own.
(You definitely have something in mind for what Policy X is, don’t you? You even know what you think about Policy X, and I haven’t said what it is.)
Anyway, if you’re interested, take the test and find out where you come out.
*A note on the “modern” political climate: The Righteous Mind was published in the salad days of 2012, back when politics were all about the Obama era and the Tea Party moment. It feels like so long ago. And even way back in 2017, I thought Haidt’s descriptions of what makes someone “liberal” or “conservative” were pretty outdated. Things have changed, folks.