I thoroughly enjoyed this challenging but very interesting chat with Simeon Burke on faith, motherhood, feminism, why I don’t believe in progress and why the term ‘post-liberal’ doesn’t really make sense because all politics is post-liberal now.
Had the most wonderful epic chat with the delightful Benjamin Boyce, where we roamed across such terrain as the psychotic side-effects of postmodernism, why nihilism isn’t the answer, why I don’t believe in progress and what’s left out of the internet’s parody of the social. It’s on YouTube:
Scuba diving is both magical and terrifying. Put on your gear, slip under the surface, and find yourself freed from gravity. In the glory days Before Coronavirus, I remember diving through the clear waters of coastal Turkey, drifting on warm currents and rolling to stare at the sunshine playing on the surface, from underneath.
But even as I rippled through the deep, marvelling at flashing schools of fish, there was a trade-off: constant self-control. Don’t breathe out through your nose. Don’t sneeze. Never, ever panic. For a short while it’s possible to pretend that you have the freedom of such an alien world, but in truth you’re only ever a tourist, granted safe passage thanks to technology, training and self-discipline.
Something about this sense of crossing an uncrossable threshold surely also powers our obsession with mermaids. And it is an obsession: mermaids are everywhere. Monique Roffey’s novel The Mermaid of Black Conch: A Love Story recently won the Costa Book Prize, while “mermaiding” — swimming in the sea wearing a “mermaid tail” — has gained a cult following in Australia. And you only need to browse the girls’ clothing selection in a high-street shop to find countless cartoon girls with fish-tails, sequinned and sparkly, smiling at you from t-shirts, dresses, wellies, duvet sets, pencil cases and the like.
As a parent of a four-year-old, I’m more familiar than I’d like with mermaid content, and Disney is a rich source. Sofia the First: A Mermaid Tale is a favourite with my daughter, who is entranced by the moment when Sofia is magically transformed into a mermaid and dives underwater. There, she swims in circles exclaiming: “This is incredible!”. And it is. The rest of the story is almost an afterthought, with the whole narrative punch condensed into that moment of metamorphosis, and the dive into a new and mysterious realm.
If mermaids offer an enchanting dream of transformation, perhaps it’s no surprise that the transgender movement enthuses about the special place mermaids have in their iconography. Activist Janet Mock links this to Ariel, heroine of the 1989 Disney film The Little Mermaid, who chafes at her underwater life and longs to visit the world beyond.
Ariel falls in love with a human, Prince Eric, and persuades the sea-witch Ursula to give her human legs, in exchange for her voice. Of course, being Disney, it all ends happily: Ariel gets her transformation at the end and marries the prince. It’s an elegant, arresting fantasy of pursuing and realising a seemingly impossible vision, and encapsulates perfectly the Disney motto: “Where Dreams Come True”.
Today, it’s increasingly accepted that we should support each individual in pursuit of their dreams — even to the extent, as in Ariel’s case, of accommodating those who radically alter their bodies to align with inner identity. So perhaps we shouldn’t be surprised that in the 31 years since The Little Mermaid was released, the association between mermaids and those who pursue an identity at the cost of physical transformation has only deepened. Six years after the film’s release, a charity was founded with the aim of supporting transgender youth — and given the name Mermaids. Meanwhile, Starbucks (whose logo is a mermaid) ran a 2019 campaign in partnership with Mermaids, celebrating the moment a young transgender person hears their preferred name spoken by a Starbucks barista and, for the first time, identity supersedes body.
But just as scuba divers gain the enthralling freedom of the deep only via technology and absolute self-control, a delve into the deeper iconography of mermaids suggests that crossing a threshold as un-crossable as that between air and water isn't a straightforward matter of making Dreams Come True. As T.S. Eliot hinted in 1911, the dream of oneness with the ocean always comes with a price, or else comes to an end:
We have lingered in the chambers of the sea
By sea-girls wreathed with seaweed red and brown
Till human voices wake us, and we drown.
The modern mermaids of pre-teen iconography are both ultra-glam and sexless, sporting revealing, shimmery shell bikinis and jewelled hair — even as the iconography swims smoothly past the question of what’s going on below the waist. As transgender icon Amiyah Scott puts it to Janet Mock: “With mermaids, the bottom is kind of like an unknown and I like that.” It’s not really done to speculate about how mermaids make more mermaids.
For a culture that simultaneously offers pre-teen girls Playboy-branded merch and rages about paedophilia, this style of mermaid perfectly combines an alluring, hyper-feminine aesthetic with a convenient evasion of the sexual dynamic that hyper-femininity is meant to evoke in adults. But the deep history of mermaids — and their element, the ocean itself — tackles those far darker and more turbulent feminine sexual associations in a way that’s far less sanitised.
Once you’re out on the open sea of unbounded female desire, the mermaids of legend aren’t pretty and sexless at all. They’re alluring, slippery and apt to steal your loved ones. In one Cornish folk tale, chorister Matthew Trewalla followed a mysterious woman out of the Sunday service in the mining town of Zennor, straight to the ocean where he vanished.
As the story goes, Trewalla was never seen again — until spotted by a ship’s captain some years later. He had transformed to a merman, swimming alongside his mermaid wife and mer-children. Zennor’s mermaid tempts Trewalla to turn his back not just on friends and family but on land itself. She’s far closer to the temptress of seafaring lore, who sings to passing ships and causes them to run aground.
In Old Norse mythology, sea-maidens are more menacing still: the nine daughters of Aegir the sea-god and Rán, goddess of the drowned, are waves on the ocean. Each of these nine sea-maidens has a different aspect — such as the frothing one, the billowing one, the welling one — “through which one can see heaven”. The seafaring Vikings who told these stories were intimately familiar with, and healthily afraid of, an ocean seen as both feminine and deeply dangerous.
This hostile undercurrent to the association between women and the sea comes out even today. There’s no shortage of not very polite modern euphemisms for women’s genitalia referencing seafood, for example, while in drag culture someone is said to be ‘fishy’ if they pass as a woman.
So is crossing to the other side desirable or detestable? And what’s the price of a visit? In the movie of The Little Mermaid, there is no price: Ariel’s father King Triton uses his magical trident to transform his daughter permanently into a human, whereupon she leaves for the land and marries her prince. There’s no sense in the movie that this is anything other than an unambiguously happy ending. But while older mermaid tales evoke that same longing to cross the boundary, either seaward or landward, they usually carry a far greater sense of loss or danger than this “Dreams Come True” retelling.
The hauntingly sad Hans Christian Andersen story that inspired Disney couldn’t be further from wish-fulfilment. As in the film, the mermaid falls in love with a prince she rescues from a storm. But in exchange for giving her legs, the sea-witch doesn’t just steal her voice but cuts out her tongue. Even as the magic grants her a pair of legs, walking on them is agony. And though the prince is fond of the transformed mermaid, he loves someone else. There’s no happy ending: the mermaid knows his marriage will break her heart, but though her sisters beg her to break the spell by murdering the prince, the mermaid loves him too much to save herself in this way. Instead, she throws herself into the water and dissolves into foam.
From the Disney perspective, this is all a bit grim. After all, we can all be whatever we want if only we believe. Can’t we? Far from offering a happy tale of dreams that come true, though, Andersen’s story reads like a bleak cautionary tale about struggling against your own natural limits.
Disney animations have a way of crowding out earlier and more ambiguous fairy tales, and it’s a safe bet the founders of Mermaids hadn’t read Andersen’s story when they named their organisation. In their search for a positive depiction of youth gender transition, it seems unlikely they had in mind constant physical pain, the loss of one’s authentic voice and a lifetime of being passed over as a sexual partner.
So even as the ancient history of mermaids has mixed feelings about the beauty and peril of femininity, the modern mermaid reboot is just as ambivalent about what’s real and what’s artificial — and just how far artifice can help us realise our desires. Perhaps it’s fitting that even as we’ve filled our real-life oceans with plastic, we should make such a concerted effort to give the archetypal oceanic feminine a plastic-toy — or plastic-surgery — makeover.
But despite this embrace of the mermaid as a poster-girl for a consumer approach to identity, the mermaid as symbol isn’t so easily sanitised, or persuaded of her own happy ending. Some mermaids decide they don’t like the land after all, even if they’re no longer quite at home in the sea either. Some find new voices, and use them.
For despite all the toys, t-shirts and upbeat Disney stories, darker, older currents still roll beneath the safe and sparkly modern mermaid. These currents invite us to wonder: maybe some dreams bring no relief, even when they come true. Maybe some kinds of restlessness can’t be cured, only navigated.
The Motherhood Blind Spot
The text below is my opening remarks from Res Publica’s 17 December 2020 seminar on post-liberal feminism, with Kathleen Stock, Louise Perry, Nina Power and Nimco Ali. Watch the full video below:
Before I had a baby, I believed all the usual liberal things about men and women. We’re all basically the same apart from our genitals. We all aspire to freedom and want to choose our relationships and values rather than have them imposed on us. A successful career is something everyone aspires to. Unequal career outcomes are the result only of sexism. Women can do anything men can: we just need the freedom to try.
Then I got pregnant, and found I was no longer a free individual as before. Instead, I was something my liberalism had no language for: a person in a symbiotic relationship. But my symbiote wasn’t some kind of parasite, she was a longed-for baby. She was loved and wanted as well as dependent, and her wellbeing was more important to me than pretty much anything else.
Then I found out this feeling of symbiosis didn’t end when I gave birth and was physically separate from my baby. I regularly woke in the night a few seconds before she started crying for milk. I’d lose the ability to think clearly when she needed food. The only time I’ve ever damaged a car in 20 years’ driving was trying to get it round a tight corner with a hungry baby screaming in the back.
All these things get less overwhelming as a baby gets older, but talking to other mums my sense is that feeling of being not totally separate from your kids never really goes away. I’m 41 now, and my mum still often phones moments after I’ve thought of her. I call this the Mum Bluetooth. We have no language for talking about it. This blind spot has political repercussions for women.
Babies are weirdly missing from mainstream feminism except as a problem to be solved. Either they’re an unwanted pregnancy, or they’re holding your earning potential back, or they’re causing ‘unpaid labour’ (also known as caring work) which isn’t shared equally by men.
The unstated premise behind all this is that individual freedom is the highest aspiration for all humans, and inasmuch as female biology pushes against individual freedom, it needs to be overcome.
The Economist, writing about the loss of earnings that accompanies taking a career break to care for children, calls this the ‘motherhood penalty’. That is, for a feminism that’s premised only on freedom and individualism, motherhood is not a superpower. It’s a punishment.
This grudging relationship of femaleness to the ideal liberal subject goes all the way back to the first liberal thinkers. Jean-Jacques Rousseau, one of the foundational thinkers of modern liberalism, didn’t even believe women could be free in this way, and envisaged an education for men that trained them to be independent liberal subjects, while women should be raised as charming, compliant support humans.
It wasn’t long before Mary Wollstonecraft challenged the idea that liberalism was just a boys’ club. She claimed education, freedom and emancipation for women on equal terms with men, kick-starting the movement that eventually became feminism.
But here I’m going to be provocative and suggest that actually, in a way, Rousseau was right. Women are less well-suited to liberal autonomy than men. But this isn’t an argument against women, or motherhood. It’s an argument against liberalism.
If we believe the ideal human condition is autonomy, we have no way of thinking about humans as interdependent. And motherhood is the most concrete example of interdependence. An unborn baby is not a separate individual, but nor is it a parasite, or merely a thing.
Even after a baby is born, it’s not really a separate person. The paediatrician Donald Winnicott famously said ‘There is no such thing as a baby, only a baby and someone’. I wasn’t imagining that feeling of being merged, that was so strong when my daughter was tiny. It was an accurate understanding of her condition. If I, or someone else, didn’t love and care for her, she’d die.
In the framework of freedom and individual rights, we have no language for this interdependence.
Liberalism is a doctrine that gives a good account of human society only if you airbrush out all states of dependence. That means that to make the privileging of freedom work, you have to look away from childhood. From old age. From illness. From disability. And if you base your worldview only on freedom, you’ll also end up scribbling out the other side of dependency, which is care.
So it should come as no surprise that we have more freedom than ever before, but we also have a care deficit that no one knows how to address. We clapped for carers during the lockdown, then went straight back to underpaying them. We wince at every nursing home scandal, but have no idea what to do about them, because ignoring dependency and undervaluing care is baked into the liberal worldview. And women, whose biological superpower is the ability to create new humans through a process of symbiosis followed by years of loving care, find that superpower treated as though it’s in fact a handicap.
The sociologist Catherine Hakim has argued that developed-world women’s working preferences actually break down roughly as follows: 20% of mothers prefer to spend all their time with kids, 20% prefer to focus mainly on career, and the remaining 60% prefer a balance of the two. That certainly accords with my anecdotal experience.
But what this means is that 80% of women prefer to make some space in their lives for priorities associated with caring. And yet, because feminism has the liberal blind spot around dependency and care, we find the preoccupations of feminism heavily skewed toward the priorities of that 20% of women whose main priority is individual self-actualisation. That is, the 20% who want a career on the same terms as men. So we have a feminism of childcare, pay gaps, workplace etiquette, celebrating the achievements of successful women and so on. What about the other 80% though? Are we not also women? When I tell people I don’t want to work any harder because I prefer to make some time for family, I sometimes feel vaguely as though I’m letting the side down. But loving your kids shouldn’t be a source of shame.
To be clear, I’m not arguing for sending women back to the kitchen. What I’m saying is that a number of key issues for women can’t easily be addressed unless we stop pretending it’s possible to worship individual freedom and also advocate for women.
If we privilege freedom over biology, we end up writing female bodies out of feminism altogether. That means obstetric care, reproductive healthcare and family issues are no longer specific to women.
It also means even where sex segregation is in place for women’s safety, this becomes difficult to defend. Likewise, if we see males and the careerist female 20% as the workplace default, we’ll struggle to rethink work in ways that meet the needs and preferences of the 80% of women who prefer a balance.
That in turn means a huge proportion of women will end up spending their working lives either having fewer kids than they’d like, which is now the norm all over the West, or else chronically guilty and burned out trying to live up to feminist ideals that were supposed to free us.
To repeat: this is not an argument that there’s something wrong with women. It’s that there’s something wrong with worshipping freedom and calling it feminism. The feminisms that reject this privileging of freedom alone and seek to re-centre the women’s movement on female bodies are diverse, and there’s plenty to disagree on. This is a space where conservative Christian thought overlaps with radical feminism, as well as with others such as me who don’t fit neatly in either of those groups.
My aim here is just to name the blind spot. To create more space for acknowledging the overlapping themes of women’s bodies, motherhood as a superpower, and the politics of love and interdependence.
Nothing makes it more self-evident than gestating a baby that we belong to each other, not just to ourselves. That’s an idea I’d like to see embraced not just by feminists, not even just by women, but by everyone. It’s sorely missing from our atomised and adversarial politics.
The airhorn blows. A cry goes up. The field, scattered across streets from Brixton Water Lane to Poets’ Corner, converges toward the sound. The hounds are in full cry. A triumphant ululating from the lead riders, thin tracksuits flapping as they pedal toward Mayall Road. The quarry has been sighted. On bikes, skateboards, scooters or just on foot, the field streams after the leaders.
Overhead on grey terraced rooftops, a scatter of parkour scouts scampers across the sloping tiles, whooping and gesturing. Below them, the quarry flashes in and out of the cover of back-garden bike storage, patio furniture and shrubbery. A few scouts are down from the rooftops now, vaulting fences between gardens. Close pursuit. A delivery van honks. Residents peer out of windows as the hunt streams up the Saturday morning street.
Moments later it’s flushed out: a glimpse of red-brown, the hunt stampeding after it down the tarmac. Then it’s cornered in a newsagent doorway, the hounds swarming. The inevitable end. A Brixton schoolboy, eyes shining with the joy of exertion and bloodshed, is gifted the brush. He holds it aloft in one blood-smeared hand, russet against Herne Hill’s leaden sky.
When the ban was repealed, there were demonstrations throughout the English countryside. It was grossly unjust, the Telegraph howled, yet another sign of government bias toward the cities, that foxhunting was now legal in urban areas but not the countryside. The Johnson government replied serenely that foxes were a predominantly urban pest in 21st-century Britain. Also, as county lines operations had spread city-style drug-dealing throughout rural England, it was only fair in return to encourage outdoor rural pursuits to flourish in the city.
Horrified Guardian editorials inveighed against the education in brutality that would now be coming to London’s already violent youth. But the columnists fell silent when the season started, and knife crime abruptly dropped. United against the mangy pests that raided bins, terrorised domestic cats and occasionally mauled a baby, a critical mass of Londoners embraced the hunt.
Hunts formed along postcode lines, and initially when a hunt crossed multiple postcodes there were stabbings. But the gangs’ youthful energy, physical fitness and fondness for casual violence catapulted them to the heart of London’s great pest control project. Finding themselves suddenly lionised instead of stopped-and-searched, a newfound sense of civic participation put a spring and swagger in their step, and inter-gang rivalry waned.
There was a minor furore shortly after the Repeal Bill passed when, having voted against the Bill, Dr Rosena Allin-Khan MP (Labour) was photographed at the Boxing Day Tooting meet. Polly Toynbee accused her of ‘cheap populism’, while one snarky Spectator columnist noted how clean her Nikes remained even after a gruelling back-garden chase.
The issue split the Labour Party down the middle. On one side stood those who saw the benefits in terms of public health, pest control, crime reduction, race relations, and young males having a healthy outlet for their aggression. On the other stood those appalled by the cruelty meted out to the fox. Innocent animals should not be hunted for fun, they protested. The repeal was emblematic of a culture that had turned its back on progress and was disintegrating into barbarism.
Their opponents replied that the riotous pursuit and bloody death of the odd manky fox was a small price to pay for a reduction in youth knife crime, and that objectors were white middle-class snobs who wanted to keep London’s multicultural youth in a state of dependence and misery. Would they rather see machete-wielding gangs pursuing foxes or teenagers? The statistics showed it was a straight swap.
The antis retorted that this revolting weaponisation of tragic deaths among troubled urban youth was the first move in a base and bloodthirsty effort to take modern Britain back to the Dark Ages. The next step in the Tories’ grim plan would be tagging further vulnerable groups for torture and sacrifice. Ken Livingstone popped up from somewhere to remind us who else murdered vulnerable groups in order to create a sense of belonging forged in bloodshed.
Jolyon Maugham became, unexpectedly, an anti-hunt sensation, when after months of silence he prioritised his Lib Dem sympathies over past association with urban pest control and wrote a heartfelt op-ed for The New European, explaining why he should have called the RSPCA on that hungover New Year’s Day. Floral kimonos became, briefly and surreally, a symbol for militant veganism.
But with Labour now a rump party of urban liberals, and city hunting wildly popular, the electoral calculus was inexorable. Pollsters nodded sagely when Allin-Khan’s popularity rocketed. The #KillerKhan tweetstorm never got off the starting blocks.
As the wind picks up on Dulwich Road, hunt followers are still milling, elated. The crowd passes hip flasks, relives highlights. Young people mix across culture, ethnicity and caste. Paleo-and-Crossfit machos swap hunt stories with Asian wideboys. Shaven-headed teenagers in tracksuits laugh uproariously with a knot of tweed-wearing neo-trads, the men extravagantly moustachio’d. Locals sidle uncertainly past the panting hounds.
The parkour crew are all here now. The hunt’s athletic elite. Stripped to the waist, defined even in dull autumn daylight, they draw admiring glances but talk mainly to each other. A chill rain begins to spatter. It’s still early; the fitness hardcore moves on toward Parkrun. Knots of people disperse in pursuit of brunch, showers or the visceral pleasures of a post-hunt shag.
In an apartment window above the vintage furniture shop, someone spots a sign hand-stencilled on a sheet. FOR FOX SAKE BAN THE HUNT. Scattered jeers. No one performs compassion for status points these days: that generation is sliding into middle-aged irrelevance. Vandalising monuments is so last year. All of bleeding-edge young London is here, at its most energised and diverse, to thumb their noses at public displays of empathy.
Veganism is tired. Bloodsport is a human instinct. Better to hunt foxes than each other.
Originally published by The Fence
When I was about 12, I discovered The Belgariad, perhaps the high point of 1980s fantasy fiction in the po-faced medieval style. David Eddings’ shepherd-boy-discovers-hidden-magic-and-saves-the-world format may seem hackneyed now, but my tween self was entranced: siblings would try and fail to get my attention before shouting “Fire! Death! Mary!” into my ear to wrest me from that world of infinite possibility and high adventure.
Back then, fantasy was a nerd hobby. Today, though, both fantasy and the moral policing of fantasy seems increasingly mainstream. Fantasy novels can be pulled for wrongthink and even wildly successful fantasy authors such as JK Rowling get dogpiled. Not even actors representing fantasy characters are safe: witness the treatment meted out to Mandalorian actress Gina Carano after being judged by the court of social media to be guilty of heresy.
To those uninterested in fantasy “fandoms”, these may seem absurd dramas. After all, when we’re talking about imaginary worlds populated by imaginary people, who cares about the private opinions of those who create the stories, or represent them on TV?
This dismissive attitude is a mistake. The truth is that a culture’s ideals are always delivered via stories, and in most cultures telling and re-telling these has been taken very seriously indeed. It’s only in our modern world that tales of gods and monsters, rather than taking centre stage in our shared cultural life, have been shoved off into a box called “fantasy fiction” and treated as a mezzobrow hobby for the incorrigibly childish.
It should be clear by now that I have a personal stake here: I’m a full-bore fantasy fiction nerd. Whether it’s solemn sword-and-sorcery, the comic adventures of Terry Pratchett, or the weird worlds of China Miéville or Josiah Banks, if it’s even half-decently written and concerns fantastic kingdoms and impossible adventures I’m there. But I’m also, to a modest extent, interested in literary history. And it’s striking how fantasy fiction popped into existence as a Western genre more or less at the exact point where epic poetry in the classical style stopped being taken seriously.
Rewind a few hundred years, and everyone writing in English sprinkled references to the Greek and Roman gods into their stories and poems, while the Homeric myths occupied a place in the Western imagination almost as central as the Bible. It’s difficult to imagine today but figures such as Athena, goddess of war and wisdom, Circe the sorceress and snake-headed Medusa were common cultural reference points for the educated class.
And it wasn’t just the mythic memes of antiquity that larded our literature — it was the forms as well. The hero’s quest, as set out in The Iliad and The Odyssey, and later in Virgil’s Aeneid, formed a template for heroic narrative that continued almost unchanged into the 18th and early 19th centuries.
Countless authors borrowed, imitated, translated and ironically reworked the epic mode, from Spenser’s hallucinatory Faerie Queene (1590) to Milton’s barnstorming retelling of the Christian story, Paradise Lost (1667), to arguably the last effort at epic poetry, Byron’s Don Juan (1819). But it couldn’t last. In Don Juan, Byron captured something sad but unmistakably true: the classical epics were losing their aura, leaving behind only a sense of lost grandeur. As Byron’s hero laments, while trapped on a Greek island (with only a nubile pirate princess for company):
THE isles of Greece! the isles of Greece
Where burning Sappho loved and sung,
Where grew the arts of war and peace,
Where Delos rose, and Phoebus sprung!
Eternal summer gilds them yet,
But all, except their sun, is set.
Byron’s 16,000 lines of satire, sex and mourning for the vanished glories of classical antiquity put paid to epic as a usable form for anyone with a desire to be taken seriously as a writer. Or was it science? Imagine a looming form that by candlelight seems a shadowy, terrible monster — then turns out, with the lights switched on, to be just a coat-stand. With scientists explaining away ever more of the world’s mysteries by the light of reason, maybe the old gods just started to look a bit silly.
So a few short decades later, the epic style migrated into the first fantasy novels: George MacDonald’s Phantastes: A Faerie Romance for Men and Women (1858), and the children’s adventure The Princess and the Goblin (1871). Even as epic poetry died off, modern fantasy fiction was born.
The high point for our rational world — Peak Reason, if you will — was probably the end of the Cold War, when it seemed all the grand battles between good and evil had been won. I don’t think it’s a coincidence that it was in the 1980s that the fantasy market went supernova.
Remembering my adolescence in the 1990s, the everyday world seemed flat, dull and stripped of all enchantment. But fantasy blossomed: after my baptism-by-Belgariad, I fell feet-first into the then freshly-published greats of modern fantasy, by now-classic authors such as Robin Hobb, Tad Williams and of course JK Rowling. Epic imagination scaled new heights, even as geopolitics sought perpetual peace under the “liberal international order”, the Communist threat evaporated, and the scope for heroic achievement in the “real” world seemed as vanished as Byron’s vision of classical Greece.
Those who went on dreaming of earth-shattering battles or heroes plucked from obscurity to save the universe have spent the Age of Reason with their heads down. Tolkien’s classic of 20th-century fantasy, The Lord of the Rings, is one of the best-selling books of all time; but in one 1955 letter to WH Auden, Tolkien ruefully describes being “scourged with such terms as ‘pubescent’ and ‘infantilism’”. And everyone loves dragging millennials about the Harry Potter thing.
But while science has plenty to say about the “what” and “how” of our world, it has far less to offer on “why”. We may laugh at the tweeness of being a Harry Potter obsessive in your thirties, or someone with a mortgage and two kids cosplaying at a Star Wars convention. But we’re not so different from our fireside-storytelling ancestors, in craving stories that help us get at the dark, strange questions — or the big ones about power, empire, good and evil.
And while mythology seemed temporarily defeated by the End of History, today successive crises of finance, terrorism and pandemic have shown our world to be far more dangerous and unpredictable than we once imagined. So even as history has come roaring back, we shouldn’t be surprised to see gods and monsters doing the same. They might take the form of Baby Yoda memes or emetic Gryffindor avatars, but they’re playing the same role as the Greek and Roman pantheon centuries ago: providing a common narrative language for debates about the big questions.
From this perspective it’s easier to see how quarrels over whether or not an actress thinks you should wear a mask, or what JK Rowling thinks about transgender women, aren’t ludicrous culture war sideshows at all. Rather, they’re border skirmishes over the content of our moral operating system. The woke grasp this fact instinctively, which is why they reserve a special fury for policing heresy in our emerging new mythologies.
But the new would-be guardians of our epic mythologies are likely to find in time that their subject has a habit of escaping their control. The long history of stories that survive and replicate tells us it’s not the morally correct ones that make the cut, but the ones that ring true. No one today reads that classic of 17th-century woke literature, John Bunyan’s The Pilgrim’s Progress, unless they absolutely have to. Meanwhile, it’s a testament to the continued power of the old pantheons, that having disappeared from highbrow literature, they’ve since reappeared in (among other things) PlayStation games and the Marvel universe.
And the power of such gods lies partly in their refusal to be domesticated: they’re two-faced, ambivalent, bloody, capricious and awe-inspiring. They’re not at all inclusive. They carry a payload of intuitions about — for example — the persistence of power, violence and hierarchy, the often-untidy dynamics between the sexes, and the obnoxiousness of heroic personalities, that don’t sit comfortably with the sanitised modern imagination. Stories are too unruly to be easily contained by moral correctness.
It was the pursuit of Reason that chased the old gods into the shadows of children’s literature. But today, our faith in Reason is well on its way to collapsing. David Goodhart wrote recently about how, as media control has decentralised in the internet age, what looked like a consensus on “objective” discourse has been upended by a tidal wave of emotionally inflected personal testimony. Even the New York Times, which has for decades styled itself as the objective ‘paper of record’, has been convulsed by civil war over whether it should instead embrace more polemical, politicised stances.
As the lights go out and we see the world by firelight again, expect to discover the old gods striding, full-sized, back across our imaginations. We may find their return a mixed blessing.
London ravaged by disease. Social and sexual mores collapsing. Shifting political alliances and a wobbling constitution. A Babel of competing voices vying to dominate new media channels, driving public discourse to fever pitch. It’s not the first time we’ve been here.
Today our artists embrace (and sometimes accelerate) the vibe. Sculptors are more interested in subverting statuary than glorifying anything; painters warn of an oncoming apocalypse in two-storey murals and most music is about getting laid. But back at the dawn of the modern world, when politics, culture, mores and faith were as much in flux as they are today, the 18th century’s artists took a more aspirational approach.
The cultural sphere they depicted was every bit as harmonious as the world that produced it was volatile. But while today we still listen to the measured strains of Handel, and marvel at the elegant proportions of a building by Inigo Jones, the poets of the same era are ignored. Of these, the most criminally underrated is also, perhaps, the one whose work offers the most intriguing clues for the modern world: Alexander Pope.
Pope was born the same year as modern Britain: 1688, when a group of English statesmen deposed James II as King of England, in favour of his son-in-law William of Orange. The reasons for James’ deposition were complicated, but included his Roman Catholicism as well as his insistence on the king’s divine right to abolish Parliament and govern centrally via decree.
Unenthusiastic about absolute monarchy, and nervous of future kings trying it on again, Parliament slapped new constraints on royal power — and the upshot was the constitutional monarchy we’ve lived with ever since.
As the old order liquefied at the end of the 17th century, and the fight for power began in earnest at the beginning of the 18th, aristocrats and a new class of emerging industrialists poured in to fill the vacuum left behind by an absolute ruler. These politicos increasingly split between “Tory” defenders of James II, and “Whig” proponents of Protestantism, in a political configuration that gradually took the form of our modern adversarial Parliament.
This binary antagonism, every bit as values-driven and visceral as the Leavers and Remainers of today, drove a febrile “us and them” political discourse. And in a forerunner of today’s clickbait-for-profit content machine, the flames were fanned by advances in printing technology, that made the written word suddenly cheap and plentiful. Presses sprang up like mushrooms, and publishers grew rich selling the scandals, libels and “fake news” of the day.
Modern politicos blame social media for a decline in public civility. But compared to the grotesque caricatures, insulting posters, inflammatory street speakers and assassination plots against senior Tories that characterised politics in the early 18th century, what gets painted today as declining standards of politeness appears more like a return to form.
New governing elites, having displaced an absolute monarch less than a generation before, were sharply aware of how fragile public consent was for their newfangled constitutional monarchy — and how much potential hostile presses had to shatter that consent. In a move that foreshadows modern drives worldwide to regulate social media, the law was pressed into service to suppress dissent: under the doctrine established in the 1606 Star Chamber case De Libellis Famosis, accusations against the monarch or government could constitute seditious libel even if they were true.
Pope was in many ways an outsider, a condition that today we associate with a subversive mindset. Like the deposed James II, he was Catholic, and also a Tory in a hegemonically Whig era. But he was as preoccupied with order and stability as the Star Chamber, and — albeit in a different way — every bit as critical as they were of the newly democratic world of letters.
Rather than the law, though, Pope’s battleground was literature, where he emerged as a fierce defender of high culture and classical tradition against the pandemonium of “Grub Street”. First published in 1728, The Dunciad pillories its hacks, in ironically high style, as a throng of “Dunces” under the Queen of Dullness herself.
It’s perhaps the most barbed and sparkling feature-length piece of literary shade ever thrown, by turns cultivated and scabrous. Where Twitter today might just call someone a shit writer, Pope depicts one rival as powered by the spatterings of Jove’s own chamberpot:
Renew’d by ordure’s sympathetic force,
As oil’d with magic juices for the course,
Vig’rous he rises; from th’effluvia strong;
Imbibes new life, and scours and stinks along; (Dunciad II, 103-6)
Without a working knowledge of Pope’s political and literary world, getting The Dunciad’s jokes is a bit like someone from the year 2320 trying to follow the jokes on Have I Got News For You. But it’s hard not to see an echo in it of our access-to-all digital “publishing” environment, and the impact it’s had on the contemporary discourse:
‘Twas chatt’ring, grinning, mouthing, jabb’ring all,
And Noise, and Norton, Brangling, and Breval,
Dennis and Dissonance; and captious Art,
And Snip-snap short, and Interruption smart. (Dunciad II, 231-34)
Pope’s blend of wit, erudition and waspishness made him a sharp satirist of contemporary chaos, but his happier visions were of tradition and harmony. London, in Windsor-Forest (1713), was envisioned as a gilded, ordered place and the rightful heir of antiquity. Faced with its glory, the Muses would quit singing about ancient Rome, and praise England’s capital instead:
Behold! Augusta’s glitt’ring Spires increase,
And Temples rise, the beauteous Works of Peace. (Windsor-Forest, 377-8)
“Augusta”, a Roman name for London, gives Pope and his contemporaries the name by which we know them today: the Augustans. And yet London in Pope’s day was not a vision of order and beauty at all, but famous for slums, licentiousness, corruption and STDs.
The print boom extended to a flourishing trade in porn, with smutty publications bought not just for private consumption but to read aloud in pubs and coffee houses. And prefiguring Frank Ski by some centuries, there really were whores in all kinds of houses: Covent Garden was a byword for the sex trade, from the low-class “flash-mollishers” and theatre-visiting “spells” to brothel-operating “bawds” and “Covent Garden Nuns”. Prominent prostitutes, such as Sally Salisbury (1692-1724), became celebrities: Salisbury’s noted clients included Viscount Bolingbroke, and even (according to rumour) the future George II.
Against this gossipy, salacious and politicised backdrop, urban living conditions in the city were filthy and disease-ridden: throughout the 1700s, more people died in London each year than were baptised. The century was also characterised by near-continuous military engagement. So on the face of it, nothing makes sense about Pope’s depiction in the 1733 Essay on Man, of all the cosmos as “the chain of Love/Combining all below and all above”, in which “Whatever IS, is RIGHT”.
This seems especially strange today, in the light of our modern preference for art that’s “representative” of demographics or otherwise reflective of “the real world”. But Pope’s fixation on order, hierarchy and beauty make sense, because he feared that the alternative to an idealised order would be infinitely worse:
Let Earth unbalanc’d from her orbit fly,
Planets and Suns run lawless thro’ the sky,
Let ruling Angels from their spheres be hurl’d,
Being on being wreck’d, and world on world,
Heav’n’s whole foundations to their centre, nod,
And Nature tremble to the throne of God:
All this dread ORDER break – for whom? For thee?
Vile worm! Oh Madness, Pride, Impiety! (Essay on Man, Ep. I, 251-8)
Modern tastes run more to deconstructing than glorifying canonical art or the social hierarchies it idealises. Today we’re all about writing doctorates on marginalia, humanising a stammering monarch, or revealing the sexual licence beneath the aristocratic facade. But from Pope’s perspective, it was order that needed defending, as the only real defence against tyranny:
What could be free, when lawless Beasts obey’d
And ev’n the Elements a Tyrant sway’d? (Windsor-Forest, 51-2)
Read against the corruption, volatility and rampant, clap-infested shagging of Georgian high society, the restrained vituperation, classical learning and formal orderliness of Pope’s writing could be seen as a paradox. Or, perhaps, a state of denial. But what if it was more a set of aspirations that succeeded — just not straight away?
The ensuing century, dominated by Victoria and Albert, is perhaps Peak Order for modern Britain. If Boswell’s diaries, in the latter half of the 18th century, record 19 separate instances of gonorrhea, Victoria’s ascent to the British throne in 1837 was characterised by a society-wide backlash against the excesses of the preceding era.
Whether methodically colouring the globe in red, or imposing strict codes of sexual conduct, public-spiritedness and emotional reserve at home, the Victorians reacted against the perceived licentiousness of the Georgian era — by delivering the kind of order that Alexander Pope both depicted in his writing and also, in his own political era, never saw realised.
In the time since Peak Order we’ve all become somewhat more free-and-easy again. But we should be wary of viewing this either as evidence of moral progress, or (depending on your outlook) of a decline that’s likely to continue indefinitely. Our age has its digital Grub Street, its own pandemic, its unstable political settlement, and its patronage politics. So perhaps it may yet produce its own Alexander Pope, and with it a new poetics of order — for a future none of us will live long enough to see.
Originally published at UnHerd
I talked about relational meaning and its implications for how we understand and order human interactions with Palladium’s Wolf Tivy. Have a listen here.
Earlier this year, mining company Rio Tinto dynamited a 46,000-year-old Aboriginal sacred site in the Pilbara region of Australia, in pursuit of the iron ore deposits that lay beneath the sacred caves. The decision triggered outrage from Aboriginal communities and the wider world alike. Pressure from investors concerned about the resulting PR disaster eventually forced the CEO to resign.
But that’s not much of a victory for those to whom the now-destroyed site was sacred. As a place of pilgrimage, continuously inhabited since before the last Ice Age, its religious significance had accumulated over millennia of repeated visits, inhabitation and ritual. The holiness of Juukan Gorge lay in the unimaginably long-term accretion of memories, social patterns, and shared cultural maps by countless generations of the Puutu Kunti Kurrama and Pinikura peoples.
Strip mining, the method of resource extraction used to reach much of Pilbara’s iron ore, was the subject of a blistering 1962 Atlantic essay by Harry Caudill. Titled ‘Rape of the Appalachians,’ it describes a process as violent as the analogy suggests, in which entire mountaintops are removed in search of coal deposits. But when you consider the role played by commerce, it’s more accurate to describe the process as prostitution.
It’s not unusual for those looking to destigmatize prostitution to argue that selling sexual access to one’s own body should be morally acceptable, precisely because it’s no worse than coal mining. So here we have two sides of a disagreement, both of whom see commonalities between prostitution and mining, even as they disagree over whether the action itself is good or bad.
How would we characterize what prostitution and mining have in common? Resource extraction, perhaps. Dynamiting Appalachian mountaintops has obvious tradeoffs, but on the upside you get to extract coal from the exposed rock, which you can then use to generate electricity. We accept the environmental destruction, deterioration in air quality, and changed landscape contours (or at least mostly choose to overlook them), because the alternative—no electricity—appears worse.
Selling access to female bodies is also a form of resource extraction. The product may be subtler—orgasm, the illusion of intimacy, darker types of wish-fulfilment or, in the case of commercial surrogacy, a human baby—and the tradeoffs less visibly destructive than a landscape reshaped. But the dynamic is similar. In each case, a woman rents access to something that we consider to belong to each individual alone—her body—and earns money in return. The American Civil Liberties Union, which has supported the decriminalization of prostitution since 1975, recently argued for de-stigmatizing “those who choose to make a living by self-governing their own bodies.” Earning money independently is good. Self-government over our own resources is good. So on what basis can we criticize people who choose to sell access to their own bodies?
In his 1953 lecture ‘The Question Concerning Technology,’ Martin Heidegger argued that when we organize life under the rubric of technology, the world ceases to have a presence in its own right and is ordered instead as ‘standing-reserve’—that is, as resources to be instrumentalized. Coal and iron ore, the products of technology themselves, and even human sexual desire then come to be seen as part of the standing-reserve. It becomes increasingly difficult to see reasons why there should exist any limits on extracting such resources.
Today, it feels as though we’ve always been engaged in this inexorable onward march. From a more mainstream perspective, what Heidegger is describing is simply the process we now call economic development. It is the transition from pre-industrial societies—characterized by primitive and localised forms of exchange, low workforce mobility, and in many cases by extreme poverty—to longer and more complex supply chains, technological innovation, more trade, more stuff, more wealth, and more personal freedom.
But as Austro-Hungarian economist Karl Polanyi argued in The Great Transformation, for much of human history trade occupied a much less central place in human relations than it does today: “man’s economy, as a rule, is submerged in his social relationships.” Polanyi showed how in Britain, economic development and the emergent market society was driven by the Enclosure Acts between 1750 and 1860. Prior to enclosure, much of Britain comprised subsistence-farming peasants living on land held and worked in common.
Enclosures, justified by a need for greater farming efficiency, stripped the peasantry of any right to common land in favor of a private property model. Pre-enclosure, the necessities of life might have been bare, but many of those necessities existed outside the realm of ownership and trade. A peasant might spend his or her whole life in a tied cottage, with a right to common land, working to produce food but with very little need to buy or sell anything. Enclosure reordered whole swathes of human life from the shared, social realm to that of standing-reserve: that is, the realm of private property and transactional exchange.
Post-enclosures, what had previously been held in common—whether land or labor—was now privatized as standing-reserve for exploitation by free individuals. In the process, millions of human lives were arguably made much freer. The working poor were liberated from the feudal ties often implied by subsistence farming, free to move if they pleased, and free to sell their own labor for money.
But this development was never simply the voluntary spread of a new, enlightened way of making everyone better-off. Like mining, it came with tradeoffs: peasant resistance to the Enclosure Acts suggests that for those people, at least, something was lost. And if enclosure opened up domestic markets in goods such as housing and food, it did not rely on the consent of those British peasants forcibly displaced from subsistence lifestyles into waged factory work.
The violence involved in opening up colonial markets likewise gave the lie to the benign invisible hand. In February 1897, for example, not long after the completion of the enclosures in Britain itself, British imperial officials responded to the Oba of Benin’s refusal to open up trade in palm oil and rubber from his thriving city-state on the Niger Delta. Their answer was the Punitive Expedition, in which 5,000 British troops armed with machine guns razed Benin, massacring its inhabitants, flattening its temples, and looting the bronzes that inscribed its most treasured cultural memories. A month after the Punitive Expedition, a golf course had been laid over the city’s site, with the ninth hole where the most sacred tree had stood.
Most histories of the present characterize the story of economic development as an upward one of human progress, that has liberated millions from indentured labour into greater agency as free individuals in market society. And there’s something in this story of freedom; I wouldn’t swap my life today for that of a medieval subsistence peasant. But, like the extraction of Appalachian coal, nothing comes without tradeoffs. And while it’s easy enough to describe historical events in our transition from a largely relational society to a largely transactional one, the cost of moving to a market society is more difficult to count.
It’s perhaps easier to find a way into this blind spot via a more recent large-scale displacement of humans from a relational to a market existence. The migration of women from the domestic sphere to the workplace began in earnest in the 20th century, and it’s perhaps not a coincidence that it gathered pace around the time the economic gains available via overseas colonial expansion began to falter. I’ve never been a subsistence peasant or Aboriginal nomad, but for a few years I did step a small distance outside the market society as a full-time mother. And what I learned there about how, and why, this form of work is invisible today helps to illuminate the tradeoffs demanded by the market society. It also offers clues as to how we might yet stand for things crucial to humans but indefensible within a transactional worldview, such as ecosystems, sacred places, or even a view of dating that isn’t a sexual marketplace.
For something to be treated as standing-reserve, it must be possible to own it. Our social norms demand that we claim ownership of a resource before exploiting it. Selling my labor in the marketplace presumes that I can dispose of my time as I see fit, that no one else has a claim on my time or my body—in short, that I’m a free individual.
But to be a mother is quintessentially to experience not entirely belonging to yourself. It begins in pregnancy, with the realization that what you eat or drink affects your unborn child; it continues with breastfeeding, as you make the food that nourishes your child with your own body; it goes on far beyond infancy, in the need your children have to be physically close to you. When you see how powerfully your small child craves your presence, it’s very difficult to sustain the illusion of belonging only to yourself.
To the extent that something belongs to others as well as to ourselves—such as common land in 18th century Britain—it will resist being privatized for use as standing-reserve. So caring for my child can’t easily be viewed as a transaction, because it’s a relationship in which we aren’t exactly individuals. That is, we don’t belong only to ourselves, but to each other as well. And when you don’t belong solely to yourself, work can be understood not as a transaction—my labor exchanged for your money—but as relational. In other words, it is oriented less toward resource extraction and exchange than toward sustaining interdependent patterns of life.
This in turn helps explain why the politics of our market society has such a blind spot where motherhood is concerned: the market society’s notion of liberation into the standing-reserve is deeply at odds with the work of caring. Sustaining interdependency isn’t about fleeting transactional logic. It’s about maintaining a valuable relationship. I don’t care for my child or my partner because I have a utilitarian goal in mind, but because we belong to each other and that makes caring for them a necessity for my existence too.
Despite being in a sense repetitive and goal-less, caring is also pregnant with meaning. As the pioneering biosemioticist Wendy Wheeler puts it in Information and Meaning, repetition and pattern are central to communication throughout the organic and even the inorganic world. Organisms and natural systems don’t just respond to one-off signals, but rather exist in emergent, interdependent dialogue with the signals sent by other organisms and environmental factors around them—what Jakob von Uexküll calls an organism’s Umwelt. Thus, information in the natural world does not exist in some abstract sense, but only in the context of how it’s received within larger feedback loops. From the smallest microbiota to complex human civilisations, meanings are fundamentally relational, contextual, and pattern-based.
Seen this way, it’s easier to understand why non-transactional, relational spheres of life, and particularly family, remain Americans’ most potent sources of meaning. For individuals, meaning is to be found less in peak experiences, one-offs, the exceptional or abstract; it hides in the repetitive, the everyday, and the relational. At a collective level, meaning coils through those pattern-languages transmitted via tradition, whether in vernacular architecture, folk music or oral histories. It lies thick in sacred places: humans have long used pattern, repetition, and the expected as the core of ritual religious and spiritual practices.
The philosopher Adam Robbert connects meaning-making with askēsis, a Greek term that refers to the act of practice and discipline as itself a form of extended cognition, one that enables the expansion of meaning-making beyond the rational sphere by bringing together attention and repetition. We can understand motherhood as a kind of relational askēsis, whose core is the attentive, attuned pattern-work of sustaining a child’s Umwelt while they are too young to do it themselves. This is a central reason why many women are willing to sacrifice social status and earning power to work part-time or stay at home with young children: it’s as satisfyingly rich in meaning-as-pattern as it is starved of social status and pecuniary reward.
But mothering’s central concern with pattern, sameness, and contextual meaning, as opposed to information, devalues it in the order of standing-reserve, even as it delivers untold riches on its own terms. Information theory, a core science underpinning much of our technology, explicitly excludes the realm of pattern and sameness as ‘redundancy,’ preferring to focus on the unexpected. Our contemporary culture is quintessentially one of information theory: we celebrate the new, the innovative, the individual who doesn’t follow the rules. I can’t think of many movies where the hero defies calls to go his own way and instead saves the world by embracing convention.
And yet meaning, as Wheeler emphasizes, “is made up of pattern, repetition, the expected.” Information theory is thus blind to it, as she further points out: “What information engineers count as redundancy, living organisms in their systems count as meaning.” In this worldview, the tradeoff between motherhood and the workplace is a brutal one. No matter how meaningful life with a baby seems in its relational context, we have no vocabulary for understanding that, save as redundancy. It’s no surprise to discover that market society frames caring for children as a punishment: “the motherhood penalty.”
The transactional world has little facility for repetition, pattern, or the expected; this is ‘redundancy’ to be dismissed in pursuit of the special, the distinct, the signal. This blindness to meaning-as-pattern, visible in the devaluation of motherhood and trust relationships, is similarly evident in contemporary architecture’s indifference to the vernacular pattern-languages in local built environments that encode ways of life specific to different places. You can see it again in the treatment of folk music as second-class and unoriginal, the dismissal of religious practice as dogma, or the indifference to accumulated sacredness that allowed the destruction of Juukan Gorge.
Within the worldview that reads motherhood as a punishment, ecologies of meaning accumulated via everyday pattern, human relationship, or religious ritual are at best yet-to-be-monetized resources. If they resist this transformation, they are obstacles to be ignored or dynamited. Bringing these pieces together, it’s now easier to see what’s lost under the rubric of information theory and standing-reserve. To see the world in terms of standing-reserve means seeing it as transactions rather than relationships, and information rather than meaning: as Heidegger puts it, “shattered,” and confined to a “circuit of orderability.”
This shattered world is the same one the market society mindset calls ‘open’: openness to new forms, after all, means weak adherence to existing ones. To borrow Oscar Wilde’s famous phrase, then, seeing the price of everything by definition means seeing the value of nothing. Reframing the world in transactional terms, as ‘open’ resources that can be instrumentalized, necessitates the destruction of its meanings. Strip-mining self-evidently degrades the environment being mined. After demutualization, it took less than two decades for Britain’s building societies to go from embedded, localized community micro-lenders to casino-banking basket cases. And people who sell sexual access to their own bodies find it difficult to form and maintain intimate partner relationships.
Likewise, treating human gestation as a service in commercial surrogacy interrupts the biologically-based symbiosis between mother and child that makes such relationships resistant to marketization. Instead, surrogacy contracts treat the baby as separate from its mother, a product that can be commissioned. Humans are thus shattered and reordered as objects, as in this case of a disabled child rejected both by her commissioning ‘parents’ and also by her Ukrainian gestational mother, as though she were a faulty smartphone.
Here we begin to see more clearly who pays when we replace meaning with information and relationship with transaction: anyone in need of care, and anyone leading an ordinary life. The winners in the information world are those whose lives are oriented toward peak experiences, agency, variety, surprise, and control. To the extent that you find fulfilment in pattern, repetition, and the quotidian, a technological and economic order blind to meaning-as-pattern and hyperfocused on the unexpected will be, by definition, unable to see you.
But we’re running out of relational resources to convert and consume. Much as on current trends many key natural resources will be exhausted within a few decades, there are signs that in our civilization, the relational commons that underpins ordinary human life is approaching a point so shattered that the capacity of society to function is increasingly compromised. Certainly where I live in Britain, the weak institutional response to COVID-19 has revealed a nation in which social solidarity may be present on a local level, but is increasingly, acrimoniously, absent at scale.
Pursuing resilience in this context means seeking out the relational, and looking to strengthen it: that means standing up for the interests of women, babies, the everyday, the natural world—and the value of norms, custom, and religious faith. From this, it follows that defending women and the environment means not embracing but resisting the logic of transaction. In that case, communities with some religious basis for sustaining relational resources as a sacred domain will prove more resilient than the ‘liberatory’ vision of market society and standing-reserve—precisely because they reject the appetitive logic of transaction.
From a transactional point of view, this is at best a romanticization of some imaginary lost Eden, and at worst a manifesto for ending innovation and demanding a return to pre-industrial society. But a defense of ordinary-ness, pattern and repetition does not imply turning back the clock, or levelling all humans to identical cellular automata. Nor is it a case against extraordinary people: the natural world, after all, has megafauna as well as microbiota.
Making the case for meaning as well as information is not to claim that we should revert to Tudor times, all be the same, or all spend our lives raising children. But it’s to defend pattern, repetition, and ordinariness as valuable in their own right, whether as the medium for future rituals and sacred places to emerge, as the domain of social life, or simply as bulwarks against the voracity of a transactional worldview that would commodify even our deepest social instincts. It’s to argue for our radical interdependence with our Umwelt. And it’s to affirm that in order for a society to thrive, sacred things must not just be defended as exempt from standing-reserve, or moved to a museum like the looted Benin bronzes, but continually and actively re-consecrated.