Maybe not “conflict”: or, further Enneagram thoughts.

Conflict has been on my mind a lot recently, and not just because the world seems to be so full of constant knives-out energy (although that doesn’t help.) The very idea of conflict is central to my Enneagram type: 9s are among the more conflict-averse types, and my own conflict aversion was a huge wake-up call when I started learning about the Enneagram.

But lately, I’ve heard from a lot of people who are close to me that they don’t see me as particularly conflict-averse, or prone to merging with others to the point of disappearance, or unwilling to state an opinion, which are all ways that I have described myself. This might mean a few things:

First, I might be falling into the confirmation bias trap that lurks in all models, and certainly in the Enneagram. Especially when people talk about the Enneagram in a way that focuses on behavior rather than motivation, it can become simplistic to the point of pure falsehood. If you believe the memes, 9s are always buried under a blanket watching TV and tipping over into a fugue state when someone requires them to make a decision. So I may well be ascribing habits to myself that aren’t really as consistent as all that, falling into the gravitational pull of the stereotypes.

(I don’t think that’s the reason that I come across as less conflict-averse than I profess, though. I suspect the others:)

Second, I might have a more developed 8 wing than I realize. Like many 9s, I identify with nearly every type, often thinking I am all of them–except 8. Reading about 8s is, for me, like reading about aliens. That, at least, is not me, I can say defiantly. But don’t I have lots of rage, often internalized? Uh, yeah. And don’t I like to poke at people’s lazy thinking? Yes. And don’t I deeply resent being controlled (even if I am more prone to react passive-aggressively than properly aggressively)? Totally. But I’m only recently seeing these traits, because I think ordinarily I suppress noticing them. They don’t fit with the shallow version of myself I historically tried to inhabit: the unobtrusive, kind, peaceful, dreaming sort. (Some other time I might tell the story of intentionally throwing the Myers-Briggs test to empirically be as wood-nymph-like as possible.)

Third, and most important, at the end of the day the fear is not so much of conflict itself, but of disconnection. Conflict is a quick ticket to disconnection in a weak relationship, so avoiding it can be a shorthand for avoiding disconnection. I still get a little stomach ache thinking about the driver with whom I exchanged fingers a few weeks ago: I was walking; she nearly ran me over then flipped me off; I lost it at her quite impotently then fretted for a full day about how someone who doesn’t know me at all could have such malice toward me, and whether she’s out there thinking I’m the asshole, as though it really matters.

The thing is, instinctively, I’d rather hang on to the hollow shell of a relationship than risk losing it. So that’s when the hiding, the aversion to difficulty, is helpful.

But in a strong relationship where connection is plentiful, where I feel secure that conflict won’t lead to disconnection, I can let myself show more. I can be a bit of a pain. I can needle people into refining their opinions. I can feel, and show, my frustration. This allows me to work through it, get past it, rather than simmering internally. The phoniness drops. And it’s simplistic to call all of this “conflict,” and to say that I hate it, because it’s a part of the big complex tapestry that any relationship is.

All that to say, your girl is still trying to get comfortable with the idea that not everyone has to like me, especially if it comes at the cost of having been myself. And this is the kind of stuff that the almighty models can do well, at their best: show us the ways we might be hiding from the truth about why we do the things we do. This is when we have a chance to change those things, if they’re not working out for us.

…depending on how you count.

Tonight someone who had just read my book asked me how long it had taken to write. And as I generally do, I responded: “It depends on how you count.”

Really, how long does it take to write a novel?

All I have is my own minimal experience, of course, but let’s try to tabulate. Our counting options are below:

  1. Hours. I wrote for an estimated average of thirty minutes a day for roughly 182 days or six months–at least for the second draft. This works out to about 91 hours, or a hefty two-week timesheet.*
  2. Months. I wrote consistently for six months in one year, then let it sit for a while, then wrote for another six months, during which time I ended up rewriting virtually the entire thing. So that’s either six months or twelve, depending on whether I credit that first draft with really being part of the finished** draft.
  3. Years. I started seriously planning this book a little over three years ago. Serious planning involved making outlines and character sketches, and doing some research. This planning bears almost no relationship to what’s in the draft now, but it was a start. In short: planning in 2016, first-drafting in 2017, introspection and re-planning in 2018, and second-drafting in 2019 (with further edits TBD, perhaps also in 2019)?
  4. Decades. This story has been banging around my head as a little novel kernel since I was a kid. It was pretty insistent on getting out one of these days.

So there you have it, folks. If you’re looking for anecdata about how long it takes to write a novel, it’s somewhere between two work weeks and your whole life.

*I work in the government, after all.

**Ha.

Three Greek words for a fresh start.

As my pastor recently observed, it’s the new year. Sort of. At least it is in the sense that kids are going back to school and the blazing hot summer is starting to break. There’s an opportunity for a fresh start–which is to say, even though a fresh start is always available, right now it might feel a little more possible than it often does in the middle of things.

A few days ago, I was introduced to Gretchen Rubin’s Four Tendencies, a framework meant to help with procrastination and follow-through depending on one’s type of accountability. Are you accountable to your own expectations, or other people’s expectations of you, or both, or neither? You can take her quiz here if you’re curious.

But the quiz didn’t really illuminate anything for me, because I’m so inconsistent and ambivalent. I’m accountable to others–and I’m not. I’m accountable to myself–and I’m not. I’m often saying yes to invitations and requests when I’d rather say no. This builds resentment on my part. It also may cause resentment on the part of those who kindly asked me to participate but who wouldn’t have asked if they’d known I would be sullen and difficult, or half-present.

And when it comes to commitments to myself, I’m the same way: I over-commit, always making ten plans when one would do, and then when I only end up completing one, I irritate myself. Or I burn the candle at both ends striving to get all ten done just to say I did. Plus, the weight of the ten expectations I set makes me want to rebel by procrastinating like a toddler striking from naptime, with more or less the same predictable results.

Still, for an optimistic committer such as myself, the lure of the new plan is often irresistible. I use Google Keep to store my dozens of to-do lists, containing everything from the grocery list to the daily to-do list to the weekly to-do list to the prioritized list of movies I want to watch soon (I know) and, of course, the long-term to-do list that is the only barrier between me and fatal inertia. A lot of this in Enneagram terms is down to my type’s tendency to struggle with priorities: all priorities seem equally urgent, so I put them all off equally, which causes total chaos. My two most productive times are the panicked hour before I must leave work, and the panicked hour that straddles my planned bedtime.

As an experiment, I’m taking a break from the to-do lists. We’re going to see what happens. May need to schedule a wellness check just in case this experiment results in me starving to death because the list didn’t tell me to shop for groceries.

But contrast this list-making, task-obsessed, rebellion-inducing behavior with the following:

Right around the new calendar year, I printed out a page with three Greek words on it, that I hoped would set the tone for my year. It hangs on my fridge thusly:

Eudaimonia

Eucharistia

Metanoia

Eudaimonia: literally, “well-daemoned,” “well-spirited.” Happy, virtuous, excellent, living well. For me, this meant aiming for the things that create that sense of joyful ease, creative flow, and peaceful purpose.

Eucharistia: gratitude.

Metanoia: transformation.

These words are a guidepost, not a to-do list. This means they don’t provide the tremendous satisfaction I associate with the conquered to-do item (seriously, nothing like hitting that check box on Google Keep; how I miss it…) But I see them every day, and every day I’m a little more likely to think about what I have to be grateful for, or how to lean further into the things that give me joy, away from the things that don’t. Or to reflect on the ways transformation is always possible.

I think it’s helping, even if it’s hard to know without being able to cross it off as complete. I still have a long way to go with the transformation part. But for today, I’m sitting outside with the birds and the breeze, and that is hard to beat.

Moral Tastebuds

Perhaps as is evident from my current fixation on the Enneagram, I’m drawn to personality metrics, and all kinds of explanatory models for why people act so damn different from each other (which is to say, from me) all the time.

A fascinating one is the “moral foundations” quiz (link below), which is based on the work of Jonathan Haidt and his colleagues. Haidt is a psychologist who studies morality, and if that seems a little odd to you, then I’d highly recommend his book The Righteous Mind. It is completely fascinating.

When I read it in 2017, it blew my mind–essentially because I had become so unaccustomed to hearing the word “morality” used in a secular context. I think many of us have a strong distaste for that word, preferring a term like “ethics,” which seems a little more civilized and rational, a little less feral, than “morality.” But the book forces you to look right at that word, and to examine what your own morality might be. “Haven’t got one,” I thought, slightly tongue-in-cheek, for the first hundred pages or so. But I was wrong.

Haidt identifies six moral foundations, which he calls “tastebuds.” Each of us cares more or less about each of the six, resulting in our own individual morality which we use constantly, unconsciously, to evaluate the world. Each of the six is a set of concepts, as follows:

  1. Care/Harm
  2. Fairness/Cheating
  3. Loyalty/Betrayal
  4. Authority/Subversion
  5. Sanctity/Degradation
  6. Liberty/Oppression

The degree to which each person feels each of those concepts strongly will determine how they evaluate a situation. For example, most of us can readily understand care vs. harm. Just imagine a situation in which someone or something vulnerable–a child or a puppy–is in need of care, or worse, is being mistreated. The feeling that stirs in you or the action it might drive you to, so the theory goes, shows your care/harm tastebud activating. And if the bare mention of the idea that such a thing could happen got you squirming, congratulations: your care/harm tastebud is alive and well.

But what about the others? The one that fascinates me the most is sanctity vs. degradation, also known as “purity.” The most obvious applications of this tastebud are in sexual morality. But it’s also about food, and personal hygiene. Imagine a clean, cool glass of water. Now imagine someone drops a harmless, sterile cockroach in it. Would you drink it? Your rational mind may accept that the cockroach is sterilized, but your sanctity/degradation tastebud will violently protest if you try to drink it. Even though that may not sound like “morality” at first, consider how religions have almost always set standards for cleanliness of body and of food. We must conclude that food and cleanliness can be deeply moral to humans. If Haidt is right, this tastebud demonstrates the influence of purity on your morality, even at an unconscious level, and even when the tastebud reacts to circumstances that don’t appear to have a moral component at all, like the cockroach.

In the last portion of The Righteous Mind, Haidt goes on to apply his findings about the moral tastebuds to the modern* political climate. He found experimentally that American liberals and American conservatives have recognizable patterns that show up in their tastebuds: liberals tend to weigh care/harm and fairness/cheating very highly, and weigh the rest of the tastebuds less strongly. Conservatives weigh all six somewhat evenly, meaning that their care/harm and fairness/cheating tastebuds come out a little lower than liberals’ did, but the others are higher.

You can easily imagine how that would play out in a political argument: person A might be making excellent points about how Policy X will harm someone, while person B is making excellent points about how Policy X diminishes oppression. They may quickly begin talking directly past each other, unable to hear the value in the other’s point, or see the weaknesses in their own.

(You definitely have something in mind for what Policy X is, don’t you? You even know what you think about Policy X, and I haven’t said what it is.)

Anyway, if you’re interested, take the test and find out where you come out.

*A note on the “modern” political climate: The Righteous Mind was published in the salad days of 2012, back when politics were all about the Obama era and the Tea Party moment. It feels like so long ago. And even way back in 2017, I thought Haidt’s descriptions of what makes someone “liberal” or “conservative” were pretty outdated. Things have changed, folks.

More or less without comment…

The Lectionary is the daily set of Bible readings many Christian churches use: each day there’s a passage from the Old Testament, one from a Psalm, and one from the Gospels. On Sundays, there’s a reading from the epistles.

I read it most days, because it fascinates and inspires me. It also so often speaks directly to something I or the collective “we” are going through. Uncannily, sometimes.

Today, the Old Testament reading is from Judges. Speaking the first parable of the Bible, Jotham cries out on the mountaintop against the kingship of Abimelech:

All the citizens of Shechem and all Beth-millo came together and proceeded to make Abimelech king by the terebinth at the memorial pillar in Shechem.

When this was reported to him, Jotham went to the top of Mount Gerizim and, standing there, cried out to them in a loud voice:
“Hear me, citizens of Shechem, that God may then hear you!
Once the trees went to anoint a king over themselves.
So they said to the olive tree, ‘Reign over us.’
But the olive tree answered them, ‘Must I give up my rich oil,
whereby men and gods are honored,
and go to wave over the trees?’
Then the trees said to the fig tree, ‘Come; you reign over us!’
But the fig tree answered them,
‘Must I give up my sweetness and my good fruit,
and go to wave over the trees?’
Then the trees said to the vine, ‘Come you, and reign over us.’
But the vine answered them,
‘Must I give up my wine that cheers gods and men,
and go to wave over the trees?’
Then all the trees said to the buckthorn, ‘Come; you reign over us!’
But the buckthorn replied to the trees,
‘If you wish to anoint me king over you in good faith,
come and take refuge in my shadow.
Otherwise, let fire come from the buckthorn
and devour the cedars of Lebanon.’”

For context, Abimelech was a power-hungry would-be king of Shechem, a city in Israel. He killed all of his brothers but one, Jotham, to seize the throne.

Stan Patterson writes:

A dominance orientation is always rooted in an exaggerated opinion of self and a marginalization of others. It opens the door for coercive behavior that engenders fear and force limited only in terms of what the character of the person will allow. In his bid for dominance, Abimelech’s character allowed the most extreme coercion–deception and murder. The reward was his coronation beside the “oak of the pillar which is at Shechem” and the title of king.

Does this remind you of anyone yet?

Jotham’s parable of the buckthorn (or “bramble”), which he shouted from the mountaintop, criticizes the people (the mighty Cedar of Lebanon) for choosing a scheming, parasitic bramble of a man to crawl all over them and declare himself king. How can a bramble, which is not even a tree, but which may constrict and suck the life from a tree, be king of the trees, by the trees’ own choosing? Woe to the trees. Woe to any of us who let ourselves be ruled by a bramble man.

And then…this happened:

Never say the lectionary isn’t relevant, kids.

Swamp lessons.

Something else I’m always talking about in meatspace (perhaps “complaining about” is more apt), which is especially relevant in August in DC:

The dew point.

Bear with me.

Most of us are accustomed to talking about the humidity percentage. But as I laze here indoors on this mid-August mid-Atlantic free sample of purgatory with a lingering migraine, it’s 93 degrees out and just 50% humidity. So why does it “feel like” 100 degrees? Why, when one steps outside, does the sweet release of death suddenly seem so appealing, whereas 93 degrees in my arid hometown would be patio weather for the semi-hardy? 50% humidity doesn’t sound terrible.

And how can Olympia, Washington, be one of the most humid cities in the country without anyone there sweating profusely while cursing existence?

As this article from the Washington Post’s Capital Weather Gang explains, humidity percentage is an expression of relative humidity—how much water vapor is in the air relative to the maximum the air could hold at its current temperature. You could have 90% humidity at 32 degrees, or 90% humidity at 90 degrees, and there is a very different amount of water vapor in those two air situations. Hot air can hold far more water vapor than cold air, so higher relative humidity in warmer air is bad news for people who don’t like their air drinkable.

The relative humidity percentage, then, doesn’t tell you what you want to know, which is how miserable you’ll feel if you dare to venture out. Weather nerds prefer using the dew point, which is the temperature to which the air would have to cool for its water vapor to condense into dew. Think of a cold drink brought outside on a hot day: how long will it take for water droplets to form on the outside of the bottle? A higher dew point means that the droplets will form quickly, because the air touching the bottle won’t have to cool very far to form dew. A higher dew point also means there is more water vapor in the air generally, because the air is closer to saturation. And more water in the air means more discomfort.

For those new to dew points, the number will mean almost nothing to you at first. But if you memorize a quick set of reference points about the dew point (perhaps from this handy chart) you can quickly get a sense of the system. For example: during my walk earlier today, the dew point was 74, which is abominable. At the time, the temperature was 81 or so, which gives us a relative humidity of about 80%. I was…not thriving. And I knew it, because the dew point was 74, and I looked like this:
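For the fellow weather nerds: here is a minimal Python sketch of that math, using the Magnus approximation for saturation vapor pressure (the coefficients 17.625 and 243.04 are one commonly used pair; other sources differ slightly), with the numbers from my walk plugged in:

```python
import math

def relative_humidity(temp_f, dew_point_f):
    """Rough relative humidity (%) from air temperature and dew point,
    both in Fahrenheit, via the Magnus approximation."""
    t_c = (temp_f - 32) * 5 / 9        # air temperature in Celsius
    td_c = (dew_point_f - 32) * 5 / 9  # dew point in Celsius
    a, b = 17.625, 243.04              # one common pair of Magnus coefficients
    # Relative humidity is the vapor pressure at the dew point divided by the
    # saturation vapor pressure at the air temperature, as a percentage.
    return 100 * math.exp(a * td_c / (b + td_c)) / math.exp(a * t_c / (b + t_c))

print(round(relative_humidity(81, 74)))  # prints 79, i.e. the ~80% from my walk
```

The exact coefficients don’t matter much. The point is that the same relative humidity percentage corresponds to very different amounts of water vapor depending on the temperature, which is why the dew point is the more honest number.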

Now, the dew point is supposed to tell us how our delicate human bodies will feel when we are exposed to a certain amount of water vapor. The Capital Weather Gang helped us out with this handy graphic:

But, it seems, even though no one ever says they get used to this humidity, we are somewhat accustomed to it. I mean, look at how much more delicate people are in Idaho:

“West is best,” I scream into the swamp as I melt into a puddle of sweat and tears and the whisper of a dream of autumn.

Indecision fatigue.

It’s a time of plenty, friends, by just about any measure (which isn’t to say that this plenty is properly distributed, or that there is no room for improvement, but that’s another matter). And yet I notice quite often that my mind has a habit of seeing scarcity everywhere. See, for example, my almost perilous aversion to wasting food, culminating in narrowly avoided tragedy in the glass-shard soup incident of 2015. My grandparents lived through the Depression, yes, but why is its hold so strong on me still, when I grew up wanting for nothing?

Scarcity looms everywhere, though, probably as a form of the common human negativity bias. It tries to keep me alive by telling me to make hay while the sun shines (seriously, get out there, it’s sunny, what are you doing, the sun will go away and you won’t have enough hay, dummy!) It keeps me alert to potential danger by reminding me that a bird in the hand is worth two in the bush (and, hey, you’ve only got one bird in that hand; what are you going to do when you’re next in need of a bird? You think birds grow on trees?)

Pretty obnoxious, really. And one place it comes up in this kind of meta way is when I am working on an idea, and some little voice in the back of my head pipes up: “Hey, what about the next idea? Is this the only one you’ve got? You might never have another. Just thought you should know!”

It is, to say the least, hardly helpful. There’s nothing less creativity-inspiring than worrying about the potential future scarcity of creativity, just as there’s nothing less satisfying to the stomach than fretting about where one’s next meal will come from. Yet even as I’m working on picking what I like and seeing how it grows, the little scarcity gremlin looks over my shoulder and warns me that I’m probably going to run out of ideas soon, and probably half of the ones I’ve got are going to fizzle out before I can finish…just trying to be helpful!

So there I am, clinging white-knuckled to the little list of ideas I’ve got, squeezing the life out of the poor things. But in Big Magic, which I read recently (and you should too, if you want to live a more creative life and could use a little boost), Elizabeth Gilbert has an interesting idea that turns my scarcity gremlin on his head: she pictures inspiration as a sentient force, a personified entity, that goes where it likes, and stays if you entertain it generously. With a straight face, she tells the story of the time that the idea for a novel visited her for a while, then once she’d demonstrated that she didn’t have the space in her life to tend to it fully, entered the body of Ann Patchett the day that they met. Ann Patchett ended up writing this book that so badly wanted to be written. Which is not to say “cling harder and panic when you have an idea, lest it enter the body of a more talented nearby artist.” (That would be our neighborhood scarcity gremlin talking again). I suppose it’s just to say: be grateful when it’s there. Welcome it. Don’t squeeze the life out of it.

Now I’m doing better, taking some deep cleansing breaths and welcoming the friendly inspiration spirit in. I’m living in abundance mode, not scarcity mode. I’m countering that negativity bias. Noticing the plenty, not the lack.

But then–oh no–such plenty. Ideas coming so hot and heavy that I have several different digital to-do lists, like so many crumpled purse-bottom post-it notes, cluttering up the joint. Ideas coming so fast and incomplete that they talk over each other and my mind starts to feel all jerky and fuzzy like it does when I’ve been visiting the portal too much. Each one is a welcomed guest but, “well,” I begin to say to the throng of them in my living room, “I do visit the portal a fair amount, and I have my day job to consider, and my social life, and I’ve got to stay showered and need to shop for groceries now and then, and as it is I’m already halfway through two of you, so I might not get to you” (pointing to one of them in the back, a new arrival staring at me slackjawed) “until halfway through next year, at the earliest, I’m sorry to say.”

How do I pick what I like and see how it grows when I struggle to follow a thread long enough to plant it, and when I like quite a lot of things?

Don’t be fooled. This is scarcity talking. Scarcity of time. Like any other scarcity fear, there’s a grain of truth within it, but it’s hardly helpful to let it run the show.

So you have to conquer the first scarcity, the fear that ideas won’t come at all; perhaps you do this by trusting that the idea has its own will, and will find you if and when it pleases, no matter how tightly you squeeze or how much you suffer in the meantime, so you may as well relax. Then, when you’ve caught one, you have to conquer the second scarcity fear: the fear of scarcity of ability, the fear that you won’t do the idea justice and shouldn’t even try. You do this by learning to entertain it gently, and just follow it where it leads.

But how do you conquer the third scarcity, of time to follow the many possible leads you’ve got? You’re indecisive. You don’t know which one is top priority. (After all, if everything is top priority, nothing is, and there we are in the bowels of the couch playing Flash games until long after our bedtime). Okay, so just do something. Do literally anything. Print out a list and throw a dart at it, see which one it lands nearest. Use a random number generator. Just start. Just for ten minutes. Just mess around with it on your phone while you wait in line at the pharmacy.

But the fear is still there. The gremlin hasn’t left. It says: “okay, you’ve got a lot for now, that’s great. But what will you do in five years? You haven’t really got five years’ worth on that list, do you? Aren’t you worried that you’re going to flame out in a while, even if you do find the time to do any of them?”

That’s when I look that gremlin in the eye and go, “listen, buddy, I know this game.” I know all about trying to pin my future down and buy it an insurance policy. It’s a fool’s errand. I look back to what I was dead certain I wanted my long-term future to look like when I was 15, 18, 20, 22, 25, 28…every single time, if I could have locked it in, of course I would have. But every single time, I either didn’t get it, or I got some very different version of it, and thank God for that, because I can’t imagine much worse than being trapped in a life younger-me chose. She doesn’t know anything about me.

So ideas will come. And I’ll entertain them as best I’m able. And I’ll get new ones, or I won’t, because maybe me in several years won’t even want them, and that’s totally fine. And I’ll find the time, here and there, or maybe future me will. She can figure it out.

Reith Lectures. A history of histories.

We live in a time of many, many podcasts. But while we’re here, one benefit is the ready availability of recordings that weren’t even made to be podcasts. Recently, I’ve been meandering through the archives of the Reith Lectures, an annual series of five lectures by a single speaker each year, exploring some topic of interest. The BBC has been putting the series on since 1948 before a small, invited live audience, but the whole catalog is available online for free, for posterity.

Some of my favorites have been Bertrand Russell speaking on “Authority and the Individual” and the need for a world order to prevent the recurrence of world war (1948), Dr. Jonathan Sacks on “The Persistence of Faith” in a secular world (1990), John Keegan on “War in Our World” (1998), Niall Ferguson on the enemies of the rule of law (2012), Kwame Anthony Appiah on culture (2016), and Hilary Mantel on historical fiction (2017). Other lecturers include Stephen Hawking and Aung San Suu Kyi.

The archive of these lectures is how the internet shows the best version of itself: it acts as a leveler to share humanity’s knowledge somewhat evenly. Without mass communication, these lectures would have been heard only by those elite few who were invited to the talk. The BBC was able to tear down that barrier by broadcasting the talks on the radio. But the internet sends the talks even further, into the future and into great distances of space, lasting however long the digital archives of the BBC do.

Beyond these recommendations, I have some observations.

First, are we now living through a new age of audience participation (and can we make it stop?) In the Reith Lectures, at some point around the late 20th century, audience questions and answers show up. I assume that the audience was allowed to ask questions before then (although it would be fascinating to find out that they weren’t!), but it was only relatively recently that the audience participation was included as part of the lecture recording. As someone who has sat through several too many “more a comment than a question” sessions, I’m never stoked about the opportunity to be held hostage to the often egomaniacal whims of whoever runs fastest to the public mic. But sometimes, miraculously, the commenter does have something of interest to say, despite my impatience. Anyway, I wonder if the inclusion of these questions in the recording signals a recent phenomenon–a symptom of some deep societal change opening us up to the thoughts of the rando, or the decreasing cost of data storage, or who knows what. In any event, this has made the actual substance of the lectures shorter, but the recordings stay the same length to make time for the almighty Q&A, and I suppose I have to live with it.

Second: in the last 60 years of lectures, the history of the period echoes a bit in the choices of lecture topics. Some topics are of continual interest (scientific advancement is a perennial favorite), whereas others dwindle: there have been very few lectures on religion in the last 30 years, for example. This reminds me of a story about the making of Downton Abbey: allegedly, the audience never sees the family begin eating a meal, because historical accuracy would also require showing them praying, and the modern audience would find that too spooky. So too, perhaps, with the lecturers: we are lately in an age in which religion is not a topic for lectures of general interest.

Another fun aspect of a dive in the archive is that you get to hear the accents mutate and diversify from the absolute lunacy of the mid-20th century upper-class pinched aristocrat voice to a spectrum that includes all manner of Brits, international voices, and even the occasional Canadian.

A final observation: there is a strange middle ground here between enjoying something old as a curiosity, and enjoying something recent by learning from it at face value. A work of scholarship that is very old is a historical document. We’d study it to find out more about the period in which it was written, but we are less likely to rely on it for the substance of what it says. In other words, it ages until it becomes a primary source document–but only a primary source document. On the other hand, very recent scholarship is just scholarship. But where is the line? How old does something have to be before it becomes a primary source for a historian studying the time it was written, or a curiosity?

The first lecture, from 1948, is from Bertrand Russell, talking about his views on the need for a world authority to prevent the proliferation of nuclear weapons and, with it, another world war. His views sound essentially quaint. They would be quite useful in assembling a history of the early anti-nuclear movement. But no one would use them now in a paper about nuclear policy or international relations.

More recent lectures felt, to me, to be in a funny kind of limbo in that regard. Lectures from the ’80s and ’90s are not yet far enough away in time to be quaint or revelatory (“my goodness, I didn’t realize people thought like that way back then!”). Instead, they’re just outdated. I guess they’re waiting in the vault, aging like a fine wine, but somewhat awkwardly. (Perhaps my bias is showing: I’ve never been very interested in history of the recent past. My interest in history has always been next door to my interest in fiction, trying to find other worlds to visit.)

As an example of this phenomenon, let’s briefly compare Bertrand Russell in 1948 to military historian John Keegan fifty years later in 1998. Both speakers saw a strong UN or other world body as the key to eliminating war. Russell thought it should do so through disarmament, eliminating nuclear weapons. In his day, scarcely out of World War II, it must have seemed quite possible that nuclear wars would soon become commonplace. But Keegan, speaking after the end of the Cold War, is adamantly against disarmament: he argues that the dangerous knowledge of how to make nuclear weapons can never leave humanity; therefore, responsible actors (the United States!) must retain them. For Keegan, nuclear weapons still looked like the future of war.

Twenty years on, as a lay observer of war, I’m not so sure. Yes, we have scares about nuclear Irans and North Koreas. But we also see something akin to “war” happening purely online through hacking and disinformation campaigns, and through the quiet misappropriation of information. These are “war” in the sense that the security state turns its full attention to them. But these appearances of “war,” in which nuclear weapons sit basically forgotten (we pray), where actual combat occurs only with unsophisticated non-state actors and technology otherwise is the most feared threat, would be unrecognizable to an ancient person. Or even to Russell.

Book Two.

I recently finished a draft of a novel. It was a meandering process, guided by occasional spurts of planning, and then re-planning around the total chaos that ensued when I actually sat down to write.

What’s more, it was based (semi-closely, in places) on true events, so I didn’t have to make everything up on my own. I also couldn’t make everything up on my own, to the extent that I felt bound to be loyal to the true events. Sometimes this took away from my ability to follow the ideals of story crafting. (Or, isn’t it pretty to think so, a nice excuse.)

Now that I’m done with that draft, I’m going to try to do another one. Different genre, not based on true events this time, no longer historical fiction. In other words:

The grand plan is to spend the next six months or so planning Book Two, and then hammering out my SFD (“shitty first draft,” for anyone who is rusty on their Anne Lamott.)

I come prepared so far with a very rough outline: as I see it now, it’s three mini-novels, unified by a common story that underlies each part. I have the loose sense of where I want each part to go, plot-wise, and the feel I want each part to have. I know, for now, where each part is set and a lot of what might happen. But what I lack right now is something tremendously fundamental to a readable novel:

Characters.

Sure, I have the vaguest sense of a few of them. One of them looks like Bebe Neuwirth in her Frasier years. I know their Enneagram types, because naturally I would. And that actually is something: it gives a framework to start building on, both with what the behaviors might be and with the directions these people might go when distressed, or when secure (about which more later.) But I don’t know much else about them, and that won’t do.

A tool that is often helpful in building characters is an exercise that I have been internally calling, grotesquely, a “character dump,” but Google is giving me no reassurance at all that this is a phrase anyone else uses. Could it be that I, on my own motion, started calling it that? Horrors. In any event, this exercise is one in which you sit down and start writing down a bunch of details about a character. When were they born? What do they like to eat? What do they wear? What was the saddest thing that happened in their childhood? What makes them laugh? Who is their best friend? What dreams do they have? Do they think a hot dog is a sandwich? And on and on.

Now, you may say, “you just said a few paragraphs up that you don’t know much about your new characters yet. How could you possibly know all these details?” Gentle reader, I don’t, and I probably won’t even after I finish this exercise. I’ll make some stuff up, like a kid half-assing an English Lit final, and maybe some of it will resonate, but most of it won’t be helpful. That’s okay. This process isn’t what you might call efficient.

Then, later, I’ll do other exercises relating to plot. They might similarly be inefficient, but then I’ll have more words down, about both these half-formed people and the half-seen things that will start happening to them. Then I’ll sit down and start writing the thing. Every 200 words or so, I’ll realize I’ve got a plot hole, or my character isn’t wanting to behave the way I expected. Fine. I’ll make new stuff up. I’ll do some tailoring at the edges of the plan. I’ll start a new plan. I’ll write more. I’ll find some massive hole. I’ll give up for a day or two, binge some romantic comedies on STARZ. I’ll come back to it and realize I have a way out that is just brilliant. I’ll write in a delighted frenzy for a week. Then I’ll realize I created an even bigger hole than before, and I’ll fix it. Someday I’ll have a draft, and I’ll start over and write draft 2.

It’ll be fun.

Brain molds.

When I was young, my parents occasionally threw what they called an “Ugly Food Party.” The concept (patently my father’s brainchild) was a potluck, to which each guest should bring a dish that was both delicious and ugly. The more delicious, and the uglier, the better. At the end of the evening, first-, second-, and third-place winners would be announced. Memorable dishes ranged from the cartoonish (bean dip served in a diaper) to merely unusual-in-my-neighborhood (a whole sheep’s head). One was an opaque greyish jello served in a brain mold.

It was the jello brain mold that got me here, because this is how my actual brain works: I came here to write about neuroplasticity, which is to say, brain molding, hence brain mold, hence ugly food party. My brain did this little hopscotch in about four seconds. This is, perhaps, why I am almost constantly distracted. 


Speaking of habits of the mind, neuroplasticity has been on my mind lately because I just finished reading The Brain That Changes Itself, which I would highly recommend. Dr. Norman Doidge writes accessibly about how the brain works, and the history of our growing understanding that the brain is constantly changing and adapting. Our thoughts and behaviors quite literally physically change the brain, because the brain adapts to the tasks we put it to frequently. Thoughts we often think together become physically linked, until we make a conscious effort to disentangle them.

The book is full of fascinating examples of people who have been through tremendous physical or emotional trauma, and whose brains have adapted incredibly to overcome it, often with the help of therapy. The final chapter is about a woman who was born with literally only half of her brain (the other half never developed, after some unknown trauma in utero), and yet she was able to read, speak, and socialize, and was excellent at some kinds of math. Her half-brain had adapted to work as a whole one, finding space for all of her needs. 

To summarize the conclusion of the book: you get better at what you practice. This is not what you might call groundbreaking. But to understand that it is true on a physical, neurological level is pretty amazing. Like a muscle, the brain becomes competent at, and then efficient at, doing what is repeated–whether these habits are harmful or helpful.

The book came out over a decade ago, and straddles the market between popular science and self-help. It offers very little in the latter category, but it is popular in those circles no doubt because it provides such a clear illustration of what needs to be done to improve oneself. Self-improvement on the order of increasing positivity, decreasing anxiety, or improving at an artistic task, begins to look like a physical reality, like straightening one’s posture or practicing piano scales. The prescription may remain the same as before (do more of the thing you want to do; do less of its opposite), but the explanation for why this helps makes it feel a little more tangible.

Even as I’m writing this now, I’m imagining my brain getting just that much better at associating the motor skills and verbal skills together that are needed to type while I think. And on a larger scale, I think back to a few weeks ago when I first decided to start this blog. At first, I had one or two ideas. Then, as soon as I articulated the thought to myself, “I will have a blog,” before I knew it I had quite literally dozens of ideas listed in a document. The neuroplastic explanation for that might be that even in imagining writing in this format, my brain strengthened the pathways between thoughts. I’m drawing connections between things, and drawing connections between those things and the ideas and actions of writing them down, more quickly and efficiently. It’s just practice. It’s picking what you like, and seeing how it grows, on a neurological level. It’s pretty neat. 

Yesterday, I listened to a recent episode of the “Typology” podcast, which is about the Enneagram. The guest was Dr. Richard Lindroth, a 5 who was willing to come on the show. (Type 5 is the “observer” or the “investigator.” They are typically introverted and analytical, both of which traits might explain why it would be the rare 5 who would want to come talk about him- or herself on a podcast about something rather woo-woo like the Enneagram). Dr. Lindroth quoted the famous statistician George Box: “All models are wrong, but some are useful,” and that is precisely how I think about the Enneagram. I decided I quite liked him.

He also struck me by mentioning neuroplasticity in connection with the Enneagram (about ten minutes into the interview). As a 5, one of his struggles is in allowing himself to feel his emotions. His inclination is to tamp them down and think them, rather than to feel them. Since learning about the Enneagram, he says:

“One of my practices for the last couple of years has been to reduce that distance [between myself and my emotions]…When I am experiencing an emotion [I] take the time and really experience it in the moment, hold it in my head and really experience it for 20 seconds. That’s my neuroplasticity rule…If I can engage with this thought or engage with this emotion for 20 seconds, it’s going to help rewire my brain to accommodate emotions probably in a more healthy way.”

The imperfect model of the Enneagram can hold up a mirror, showing us aspects of ourselves that we had no idea were unusual. Learning that we do these things (and not everyone does, and we needn’t keep doing them), can point out some areas where we can constructively change and grow. Thinking about this in the context of neuroplasticity, one might say that the brain of someone deep in their Enneagram type has grown and adapted and developed efficient patterns of that type: Dr. Lindroth has learned to distance himself from his emotions, and I have learned to associate conflict with danger, and to shut down at the first sign of it.

But by noticing that our brains have that habit, and by imagining doing things differently, and by engaging for 20 seconds with an emotion or by staying mentally present when interpersonal things get rough, we can start to weaken the associations our brain has made, and to strengthen new ones.