Ye Olde Sexism, Ye Newe Presentism

Who would lose: a twelfth-century sexist or a twenty-first-century presentist?

The book I’m currently writing is set nearly 900 years ago. This poses certain difficulties. Last night, I missed my bedtime falling down an internet rabbit hole trying to figure out whether a specific woman rode her horse side-saddle, or astride.

If you choose to go back this far into history, you’ll find that it’s clouded by all the time that’s passed since. You’re trying to imagine a woman riding a horse in 1140, but you’re actually picturing virginal twentieth-century dorm-room posters reprinted from Victorian paintings imagining late-medieval scenes.

Putting you on blast, John William Waterhouse.

This is not evidence of what the middle ages were like.

If you begin to research a question like “did women in 1140 ride side-saddle?,” you’ll find a lot of websites with vague, unattributed statements like: “Women weren’t allowed to be independent or wear pants until the twentieth century because they were supposed to be demure and chaste.”

Citation needed! It’s not categorically wrong, but it’s wrong categorically!

This kind of assumption obscures the past. It flattens the past into a single “bad old times.”

And—it makes us lazy. There is a common temptation to think about time as a steady march of progress, like so:

But things can, and often do, get worse. New systems of oppression are created. Life-sustaining creations are destroyed.

Women are allowed to ride horses in the most efficient way, until they are told that they instead have to ride in a way that only goes slowly and probably doesn't feel great for the horse. From my digging, it looks like that happened sometime after 1140.

So I’m letting my gal swing up on her palfrey, get her skirt bunched up under her. Letting her feel the breeze on her shins.

Follies, Ruins, and Palimpsests

On All Saints’ Day, we look back.

Looking backward is as familiar to me as breathing is, which is to say, I often fail to notice I’m doing it. (After all, my type’s orientation to time is sometimes summarized as “preserve the past,” which is the kind of impulse one needs to keep a wary eye on.)

From childhood I’ve had a near-obsession with the past. This obsession led me by the hand through a lifelong historical-novel habit, a history degree, and a tendency to ruminate. To a panicked feeling of things always going too quickly. A pang that I’m not quite done with chapters of my life which have ended without my permission.

It also means I love old buildings. Before that history degree, in my foolish youth, I loved any old-looking building indiscriminately. But education has led me out of this darkness. I now realize there are, broadly speaking, three categories of old-looking buildings: follies, ruins, and palimpsests. Let’s explore them, shall we?

A palimpsest is a manuscript that has been wiped clean to have other writing put on it—or, more broadly, any object that has been reused for some new purpose. I’m abusing this word slightly to refer to old buildings that have been long in use. You often hear, for example, that old houses in this area of the East Coast are log cabins surrounded by newer and newer rooms, built up and out. My dad’s friend had a house like that: a modern enough house, but with one room with a dirt floor that once was the entire house. It was a palimpsest: something new built right up inside and on top of something old, until the two became one.

Palimpsest buildings like this are disappointing to a past-looker like myself. They seem to cover the best bits up, hiding them in modern taste or functionality. After all, very-old buildings have to be maintained. This means new workmanship, new materials, replacement walls and doors.

Look at the amazing Taos Pueblo, which is one of the oldest inhabited buildings in the country, over a millennium old.

Taos Pueblo, NM

Does it look precisely as it did a thousand years ago, asks my past-loving heart? Of course not. It is a home, a city. It has had to withstand the weather, the climate, wars and famines and droughts and population changes, and dozens of generations of children clambering around it. People live in it. They maintain it as their house. They build it and go on building it.

Bummer, sighed the past-lover in me. I wanted to see literal millennium-old adobe, untouched.

Palimpsests are the realest kind of old building, but they disappoint. They are buildings—houses or churches or offices or shops or whatever they want to be—rather than reverent monuments to the past.

Give me a reverent monument to the past, I cry!

Here, have a folly.

Follies are fake old buildings built as decoration. (Now we're cooking with gas.) You might be fooled by them if you aren't on guard. You might be wandering around some estate which belonged to someone with vastly too much wealth, and, oh my God, is that a castle? Is that a ruined Roman amphitheater?

No, dear, it’s a folly, from the French folie. Crazy.

Roman folly at Audley End, Essex, UK

Follies are like expensive jeans: often either neat as a pin or stylishly, intentionally weathered. They’re like catnip for people like me who watch a lot of costume dramas. And then, once you figure out their fakery, they’re pretty embarrassing.

A tidy-jeans folly. Photo by David Evans – Paxton’s Tower – Carmarthenshire, CC BY 2.0, https://commons.wikimedia.org/w/index.php?curid=42421316
A folly of the ripped-jeans kind at Mount Edgcumbe House, Cornwall, UK. Photo by Mark A Coleman, CC BY 3.0, https://commons.wikimedia.org/w/index.php?curid=55152363

Now that we’ve learned to spot a fake, let’s move on to the really real old building. The one that isn’t bastardized by modern hands. Let’s look at some ruins.

Ruins have beauty and tragedy. They look great in the rain. They’re really romantic. You can imagine having some very strong emotions there, growing your hair long and getting a little windswept. And the fact that they’re dead makes them extremely fun for a past-looker: they’re pure in some way that a palimpsest or a folly could never be. They’re like an above-ground time capsule.

Machu Picchu, Urubamba Province, Peru

Until you realize that ruining doesn’t just happen. Not usually. It’s more natural for buildings to become palimpsests over time, if they’re any good, because people naturally want to keep using what they’ve got. Ruins, I’m finding more and more, are often on purpose.

In preparation for Book Three, I’m researching a lot of 12th-century castles and abbeys in France and England. The ones in France are often still there, or parts of them that haven’t been repurposed. But many of those in England were ruined intentionally. Henry VIII sacked the monasteries to get Anglicanism off to a proud start, and Oliver Cromwell “slighted” (cannon-balled and pulled down) many castles to deprive his enemies of a foothold.

Me at Fountains Abbey, North Yorkshire, UK–a ruin you can blame on Henry VIII

This makes me shake my fist at them, both for being such intolerant dipshits (pardon), but also for making it hard for me to know the precise dimensions of some of these buildings. Yes, this is about me!

So I’ve come full circle: I turn my nose up at follies, now, and ruins make me a bit sad. I see them more as lost information, lost usefulness. They are buildings that didn’t get a chance to live on as palimpsests, to be useful to people.

Ian at the Seneca Quarry, Montgomery County, MD, a ruin you can blame on Victorian architecture going out of style

But the irony even there is: when these armies knocked down an abbey or a castle, people (being resourceful) used the rubble. They picked up the bits of the fallen walls and used them to patch their houses, their bridges, their crumbling garden retaining walls. These old ruins are living on as palimpsests, but spread out all over the countryside.

Take it one step further: after all that slighting, the rich started to love the aesthetic of the ruined buildings everywhere, looking rather elegant and skeletal, picked clean of rubble. They built some of their own in the backyard so they could stare at it over breakfast. Isn’t that a palimpsest of a kind, the repurposing of the very idea and function of a ruined building into a piece of artwork?

And take it another step: during some famines in Ireland, the wealthy landlords didn’t necessarily want everyone to starve to death, but couldn’t abide the idea of simply giving away cash or food. Instead, they gave the suffering masses construction jobs building “Famine Follies,” which sometimes were actual follies and sometimes were simply roads to nowhere. Unnecessary manual labor, I guess, rendered people deserving of food.

Picture that: starving people put to work hauling stone around the countryside to build a pretend ruin, which is to say, a building of no practical use masquerading as a building that once had (but no longer has) a practical use.

By now the whole idea of looking backward at pretty old buildings is collapsing in on itself. The idea of gazing longingly backward at all is foolish: uninformed at best; reactionary at worst.

Much better to love the idea of the palimpsest. To love the building that many generations have adapted and molded and fitted to their needs. To love the stones that fell out of the wall and ended up filling the gap in someone’s chimney.

For All Saints’ Day, better to stop looking back with regret and desperation, trying to freeze it in place, trying to see it in clear focus. Better to know that just like seeds in winter, what is dead still has a future of its own strange kind.

In the long meantime, everything is recycled. Nothing will be stagnant. Nothing will be resurrected whole. Old buildings are resurrected in others or returned to the earth. Old chapters of our lives will not come again, but they take on new resonances with every year, like the turn of a kaleidoscope.

Such a long time to be gone, and a short time to be there.

Euphemism drift

Today, a few examples of a linguistic phenomenon that delights and vexes me, which I am calling “euphemism drift.”

Full disclosure: when I sat down to write this post, I thought it was an original idea. But Google corrected that impression. So here is a Wikipedia segment on more or less the same topic. And apparently Steven Pinker has named this concept the "euphemism treadmill."

Still, I think being deterred by unoriginality is a coward’s game, so on I press.

For the record, this post is going to use the “r word,” but not (as I hope you’ll find) in a derogatory way. In fact, I want to show that the word has never been the problem. 

Yikes, right?

But let's start with a fun example first: "happy hour." Now, what is happy hour? Taken literally, it's an hour that is itself happy, or more likely, an hour during which the people spending it are happy. But we all know that's not what it means. No one talks about "pre-dinner drinking time"; instead, we have chosen the euphemism "happy hour."


Thoughts on “content”

If you’ve been on the internet in the last five years, you may have noticed that all of a sudden, “content” is everywhere. There’s new content, great content, content overload, and there are content creators keeping the whole operation going.

The word “content,” of course, has many meanings across all parts of speech, but the one I’m referring to is the third noun entry here: “the principal substance (such as written matter, illustrations, or music) offered by a website.”

Let me concede off the bat that “content” in its internet usage is a “real word.”*

Look upon my content, ye Mighty, and despair!

HOWEVER.


History of words, words about history…

Still on the subject of podcasts in the midst of a Busy Holiday Season®️, there’s another one I feel completely compelled to share, even if absolutely none of you will be interested in joining me: The History of English podcast (recently misheard rather intriguingly as the “History of English Podcasts”) is completely wonderful.

The show, running since 2012, appears to be an extracurricular passion project of a solo practitioner lawyer from North Carolina, who says absolutely nothing about himself on the show. But he has quite a lot to say: he presents the history of English in meticulous hourlong increments, starting from the absolute dawn of the knowable history of human speech all the way up to—God knows, because seven years in, he’s only gotten a little beyond Sir Gawain and the Green Knight, a few hundred momentous years shy of Shakespeare.

Anyone interested in etymology or English history or both would almost certainly enjoy the show. One thing I find tremendously charming is the way that Kevin (for that is the host’s name) delivers etymological facts by theme as he marches forward in time. An episode documenting the messy bloodline of King Alfred the Great, for example, provides him an opportunity to talk about Old English words for family and inheritance. But just when it veers close to feeling like a lengthy fact dump, the show manages to keep moving along narratively.

But anyway, enough about the show. Let’s talk about me.

I’ve had a long-simmering interest in the history of languages. Before the internet, I remember staying up late with my parents’ encyclopedia, reading the cross-references to work out how languages are related to and descended from each other. I briefly flirted with the idea of majoring in linguistics, before realizing that (at my university, at least) the subject was a great deal more medical, more wetly throaty, than I’d anticipated.

But there’s no shame in being a dilettante, I hope, and Kevin from the podcast gives me hope about even the prospect of being a devoted learner and teacher in one’s spare time around a busy lawyer’s schedule.


Okay, actually, enough about me. Back to the show.

The early episodes of the podcast go way, way back. By episode 7, we’re still in the land of Proto-Indo-European, which is the language that gave birth to most European and some Asian languages. It was spoken so long ago, by people who did not write, that all we know about it has been reconstructed by linguists working backward from modern languages like forensic analysts, finding traces of ancient words in the similarities and gaps between current words.

This absolutely blows my mind, and always has. Not only do linguists figure out little clues about dead languages by finding commonalities between their daughter languages; they also bring in geography and botany and biology and genetics to connect the dots. For example, we find some clues about where these Proto-Indo-Europeans lived by analyzing which words they had, and didn’t have, to describe the world around them—no words for “monkey” or “palm tree,” so not the jungle, and none for “olive” or “grape,” so a colder climate. Words for certain kinds of sheep only, which tells us something about what kinds of animals they could have raised, and that in turn tells us something about what their world looked like.

This kind of thing is completely bananas to me: can you imagine doing this as a job? Can you imagine tracing the spoken words of people who died five thousand years ago, and also getting to learn a lot about sheep in the process? Goals, I tell ya.

Anyway.

Something I find so fascinating about the history of words is that it traces the history of thought, and the history of sound. These are things that don’t tend to leave impressions in the archaeological record, and they can be obfuscated in written histories. But words can’t help but shift and change with use, like a well-worn pair of jeans thinning around the wearer’s knees.

And one thing that language doesn’t lie about is the thought process that goes into the mundane everyday choices of words that average people make. Despite the best intentions of grammarians and usage experts everywhere, language never has been primarily about perfection. It’s about communication. It does its job to the extent that people can understand what others want to say, and can make themselves understood in the process.

People of all stripes are natural geniuses at inventing new, easier, and more nuanced ways of saying what they mean. Sometimes they borrow and break old words to do so. Sometimes, this way, words come to mean their opposite: pairs like “guest” and “host,” “give” and “take,” and “black” and “white” come from the same Indo-European root word. Through the messy process of speech occurring over generations of people delicately navigating their societies, these words took on seemingly nonsensical new meanings. And just like we’re all writers now, we’re all the masters of how to communicate our meaning, our humor, and our nuances exactly how we please.


Okay, now back to me.

In Book One, I indulged myself by writing a little sub-subplot about linguistic history. (This is the pleasure of writing a book: no one can stop you). I imagined a pair of late-Victorian scholars chasing a theory about how one might get to know the ancient inhabitants of Europe by looking at the words they borrowed from each other. As it happens, I think the theory as presented in the book is wrong, but the great thing about fiction is that, again, no one can stop you. I can do that on purpose and no one is allowed to criticize me!

I imagine most of you are either long gone or reading out of mere politeness by this point. But to sum it all up: I think there’s something tremendously beautiful about how language can pry open our deep history. Every time we open our mouths to speak, we’re not only articulating our own present thoughts—we’re also building upon the feelings and frustrations and joys and creativity of millions of people over thousands of generations. All the people that came before us still live through us in this little way, carved into our bodies in our DNA and carved into our brains with the words we keep shifting and borrowing and laughing and shouting.

Further recommended reading if you are interested: John McWhorter’s piece, which includes a fascinating idea that the weird way English uses the verb “to do” (as in: “do you like me?” Where every other self-respecting language would say: “Like you me?”) actually comes from Celtic languages.

And on that note, Merry Christmas to all!

Time, as a symptom.

(Pair this post, if you dare, with a listen to my absolute queen Joanna Newsom’s album Divers, which is all a meditation on time and what it means to love another person in the face of the temporary span of a life. It’s a ton of fun. Here’s a sample:)

And what lies under now the city is gone. Look, and despair.

“Sapokanikan,” Joanna Newsom

I recently spent a week with my parents in some of the National Parks of the Southwest. We went to the Grand Canyon (very grand indeed), Zion, and Bryce Canyon. These are all fantastic places to spend time, and I would highly recommend them. The views, man.

Between my parents and me, two of us arrived in a new decade in the last year. We thought, in not quite so many words, that a trip to the parks would be a good opportunity to mark the passage of time. Our trip also happened to coincide with All Saints' Day (or All Hallows') and All Souls' Day, when quite a lot of people are considering all those who have come before us.

So that’s fun.

But you don’t have to be in landscapes like these long to understand that the kind of time you can count in birthdays, or even in entire human civilizations, is nothing in comparison with the kind of time that is cleaved open and on display in a canyon.

You learn at the Grand Canyon that, even though you’re looking a mile down into a few billion years of rock history, which the river has carved through in the last five million years (give or take!), there are a casual 270 million years of rock history that eroded clean away before the river had a chance to cut into them. Just—270 million missing years, and what you’re looking at is the rest.

You see these amazing landforms that basically defy logic. It’s rock behaving as rock has no business behaving. You know better (because you watched the video in the rangers’ station and you read the plaques) but it looks like rocks are growing like trees. It looks like rocks are flowing like molasses. It looks like rocks are flopping like pancakes, one on top of the other.

The whole thing is just time and mutation. Volcanoes beget flatlands. Marshes give way to oceans, which give way to deserts, which rise thousands of feet to become mountains, and then rivers file them down into chasms, revealing the history from within. This happens not at all silently, but wordlessly.

And even though it all took a few million or billion years, depending on how you count, it's also changing every year. The spooky hoodoos of Bryce Canyon are inherently temporary. Every winter, ice pushes them further apart, and every summer their rocks fall. I saw it happen: a dozen or so rocks the size of my fist tumbled quite mundanely off a cliff as I hiked below. This, day after day, is how plateaus become walls, walls become windowed, windowed walls become towers, and towers crumble into hillocks, then into flatness. The high land will keep eroding back, one to four feet every century, who knows how far? Eventually the infrastructure of the park, which sits atop the plateau looking down over the canyon, will be eaten away and gone.

We happen to be able to see it now, but it’s anything but permanent.

It’s enough to make you wonder what “conservation” is all about. We wouldn’t be wise to attempt to conserve a hoodoo. We’d do more damage to the park by trying to freeze it in time than by letting it be. What it is is something that exists for a time, maybe a few decades, and then collapses. It wouldn’t be conservation to turn it into something else entirely—something that lasts forever. That would be transformation. (Appropriate for Halloween, perhaps, when zombies and monsters of all kinds roam, but not for All Saints’ Day, when we peacefully remember those who have come and gone before us.)

Maybe conservation is, instead, giving all the entities that make up the Earth a chance to make their own story, in rock or tree or fur or desert.

So it would seem to be true:

when cruel birth debases, we forget.

When cruel death debases,

we believe it erases all the rest

that precedes.

But stand brave, life-liver,

bleeding out your days

in the river of time.

Stand brave:

time moves both ways,

“Time, As a Symptom,” Joanna Newsom

On long walks out there, I started to think about sandstone. You’re surrounded by it in that environment. Sand is everything: it’s the desert, and it’s the ocean floor, and now it’s the canyon. But it’s just sand. Walking along the canyon bottom you feel it underneath your boots, just as slow and yielding as a beach. What’s beneath your feet used to be the canyon walls above you. It’s the bits that have disintegrated lately.

If you touch the walls as you pass, you might rub off some sand. It comes easily, when it’s ready. But underneath the loosened part, there’s hard sandstone. It’s not ready, not yet. But give it a year or a thousand and it will blow away too.

That’s stone, and we’re people. We live on different time frames, by a factor of many zeroes. But like rock we always change, even until we die. That’s the wonderful thing about being alive. The change doesn’t all happen at once, and it doesn’t happen in an orderly way. We’ll find that there are places that are a little looser, a little more ready to give. We can be grateful for those. The rest might be a little tougher. That’s okay. Give it time. Because the loosening of what’s easy, the letting go of the stone that’s ready to be sand already, makes room for more change. And the loosening of that loose sand is what slowly, imperceptibly, loosens the hard stuff.

And Time, in our camp, is moving

as you’d anticipate it to.

But what is this sample proving?

Anecdotes cannot say what Time may do.

“Anecdotes,” Joanna Newsom

A history of old things, part 3: on pedigrees.

To recap: I’ve been addressing the interesting but often unverifiable claims that the Enneagram is quite old. Often, this claim is equal parts squishy definition and wishful thinking.

Now I turn to my final chapter on this journey: why is it wishful thinking? Why is being old desirable, let alone desirable enough to get us all to do some dubious accounting?

There’s the undeniable romance of it, of course. Compare this snippet of a description of the Enneagram’s history:

Variations of this symbol also appear in Islamic Sufi traditions, perhaps arriving there through the Arabian philosopher al-Ghazzali. Around the fourteenth century the Naqshbandi Order of Sufism, variously known as the “Brotherhood of the Bees” (because they collected and stored knowledge) and the “Symbolists” (because they taught through symbols) is said to have preserved and passed on the Enneagram symbol.

Speculation has it the Enneagram found its way into esoteric Christianity through Pseudo-Dionysius (who was influenced by the neo-Platonists) and through the mystic Ramon Lull (who was influenced by his Islamic studies.)

On the frontispiece of a textbook written in the seventeenth century by the Jesuit mathematician and student of arithmology Athanasius Kircher, an Enneagram-like figure appears.

https://www.enneagramspectrum.com/173/history-of-the-enneagram/

With the pedestrian-sounding contention that it’s an amalgamation, created in the 1970s, out of bits of various ideas.

Wouldn’t you rather have that Indiana Jones trek through the sands of time, with a soupçon of a Da Vinci Code-style coverup, a conspiracy of ancient and esoteric brotherhoods?

Who wouldn’t? Especially when your alternative is that people in the 20th century, many of whom are still living, just sort of…made something up.

We have a deep craving for authenticity. This is good. We are naturally skeptical. Also good. So especially when it comes to something as sensitive as a system that will purport to turn a dark mirror on our subconscious motivations, we may well take a step back and demand to see some badges. Ancient brotherhoods are decent badges to flash, quelling at once our skepticism and our thirst for intrigue.

Or, those of us who have found this system helpful, and who also thrill to old stuff, get deeply excited when we see a glimmer of a comparison in something old: Look, the stops on Odysseus’s journey in the Odyssey appear to align with the Enneagram types! Look, there are seven or nine deadly sins, give or take! And we may skip over the task of actually finding a credible connection that goes beyond coincidence, flapping our hands instead at the unknowability of ancient wisdom traditions.

But here’s my theory: saying “it’s new” isn’t all that much more accurate than saying “it’s old.” Was it made up in the 1970s? In my opinion, yes: anything recognizable as the Enneagram of Personality was. But I don’t think we need to stop there, because those people in the 1970s (principally Ichazo and Naranjo) didn’t make up this system out of new cloth. They made it out of lots of bits of old cloth.

And new stuff out of old cloth is as close as we often get in this world to “old cloth.”

Here's an analogy: even as the genetic testing industry continues to grow (and even though I very much enjoy genealogy), it's clear that you don't have to go back very far before the distinction between your-family and not-your-family breaks down.

An example from the tremendously interesting A Brief History of Everyone Who Ever Lived by geneticist Adam Rutherford:

One fifth of people alive a millennium ago in Europe are the ancestors of no one alive today. Their lines of descent petered out at some point, when they or one of their progeny did not leave any of their own. Conversely, the remaining 80 percent are the ancestor of everyone living today. All lines of ancestry coalesce on every individual in the tenth century.

Rutherford at 162.

Does that seem impossible–that every living person of European descent is descended from every European in the year 1000 or so who has any living descendants? He explains further:

[A]ccept that everyone of European descent should have billions of ancestors at a time in the tenth century, but there weren’t billions of people around then, so try to cram them into the number of people that actually were. The math that falls out of that apparent impasse is that all of the billions of lines of ancestry have coalesced into not just a small number of people, but effectively literally everyone who was alive at that time. So, by inference, if Charlemagne was alive in the ninth century, which we know he was, and he left descendants who are alive today, which we also know is true, then he is the ancestor of everyone of European descent alive in Europe today.

Rutherford at 162.

So all of us who have European descent are related to Charlemagne. All of us who have European descent are, equally, related to Kurt the Pig Boy who lived just down the hill from Charlemagne’s palace, as long as Kurt has any living descendants.

This kind of math takes a bit of the wind out of the sails of genealogy: those of us who are able to trace our ancestry back several generations often feel proud if we find someone notable in the genetic heap, perhaps forgetting how many hundreds, or thousands, or even millions of others can also claim the same heritage. Charlemagne is indeed in many of our bloodlines, but in quantities so minuscule that it's hardly worth mentioning. He belongs to history far more than he belongs to our genealogy.
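Rutherford's "apparent impasse" is arithmetic at heart, and you can sketch it in a few lines of Python. (The generation length and medieval population figure below are rough illustrative assumptions of mine, not Rutherford's numbers.)

```python
# Naively, your ancestor slots double every generation: 2 parents,
# 4 grandparents, 8 great-grandparents... 2**g slots g generations back.
generations_back = 35  # roughly 1,000 years at ~29 years per generation
ancestor_slots = 2 ** generations_back

# Europe around the year 1000 held on the order of tens of millions
# of people (a rough order-of-magnitude assumption for illustration).
europe_population = 40_000_000

print(f"Ancestor slots 35 generations back: {ancestor_slots:,}")
print(f"People actually alive then:         {europe_population:,}")
print(f"Slots per living person (floor):    {ancestor_slots // europe_population:,}")
```

With tens of billions of slots and only tens of millions of people to fill them, the same individuals must appear in your family tree over and over, which is how every line of descent ends up running through effectively everyone of that era who left descendants at all.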

Rutherford shows how this logic goes if we zoom out from Europe to the entire world: at least one researcher has estimated that “the most recent common ancestor of everyone alive today on Earth lived only around 3,400 years ago.” Rutherford at 164.

If this sounds too recent, or baffling because of remote populations in South America or the islands of the South Pacific, remember that no population is known to have remained isolated over a sustained period of time, even in those remote locations. The influx of the Spanish into South America meant their genes spread rapidly into decimated indigenous tribes, and eventually to the most remote peoples. The inhabitants of the minuscule Pingelap and Mokil atolls in the mid-Pacific have incorporated Europeans into their gene pools after they were discovered in the years of the nineteenth century. Even religiously isolated groups such as the Samaritans, who number fewer than 800 and are sequestered within Israel, have elected to outbreed in order to expand their limited gene pool.

Rutherford at 164.

So go back less than 4,000 years, and there’s some anonymous man or woman from whom every person alive can claim descent. It’s kind of mind-blowing.

There is no simple, linear descent of humans. Human genealogy is inherently a net, a web, that reaches all around the world much more speedily than we tend to assume.

Likewise, to bring it back around to the Enneagram, there is no pure, arcane, secret tradition. People, and ideas, don’t work like that. There is change and exchange and learning and borrowing and mixing and syncretizing. And that’s generally good. 

Just as “no population is known to have remained isolated over a sustained period of time,” ideas don’t tend to idle intact within secret brotherhoods, nor do languages sit immobile in faraway mountain hollers. Change, not stability, is the story.

And in my view, the fact of the Enneagram of Personality being relatively new is what makes it so valuable. If the system is fixed, then it is no one’s–it is unaccountable and inflexible to new insights and new generations’ shifting perspectives. If it refers back to an ancient brotherhood, who’s to say what it is? What it’s not?

Instead, I vote that we recognize–and celebrate–the new origins of this old thing. This new quilt made from various semi-old rags. It’s a solution that gives us some of the romance of the old, and some of the novelty of the new. That’s about as good as we can do.