I thoroughly enjoyed this challenging but very interesting chat with Simeon Burke on faith, motherhood, feminism, why I don’t believe in progress and why the term ‘post-liberal’ doesn’t really make sense because all politics is post-liberal now.
Had the most wonderful epic chat with the delightful Benjamin Boyce, where we roamed across such terrain as the psychotic side-effects of postmodernism, why nihilism isn’t the answer, why I don’t believe in progress and what’s left out of the internet’s parody of the social. It’s on YouTube:
Scuba diving is both magical and terrifying. Put on your gear, slip under the surface, and find yourself freed from gravity. In the glory days Before Coronavirus, I remember diving through the clear waters of coastal Turkey, drifting on warm currents and rolling to stare at the sunshine playing on the surface, from underneath.
But even as I rippled through the deep, marvelling at flashing schools of fish, there was a trade-off: constant self-control. Don’t breathe out through your nose. Don’t sneeze. Never, ever panic. For a short while it’s possible to pretend that you have the freedom of such an alien world, but in truth you’re only ever a tourist, granted safe passage thanks to technology, training and self-discipline.
Something about this sense of crossing an uncrossable threshold surely also powers our obsession with mermaids. And it is an obsession: mermaids are everywhere. Monique Roffey’s novel The Mermaid of Black Conch: A Love Story recently won the Costa Book Prize, while “mermaiding” — swimming in the sea wearing a “mermaid tail” — has gained a cult following in Australia. And you only need to browse the girls’ clothing selection in a high-street shop to find countless cartoon girls with fish-tails, sequinned and sparkly, smiling at you from t-shirts, dresses, wellies, duvet sets, pencil cases and the like.
As a parent of a four-year-old, I’m more familiar than I’d like with mermaid content, and Disney is a rich source. Sofia the First: A Mermaid Tale is a favourite with my daughter, who is entranced by the moment when Sofia is magically transformed into a mermaid and dives underwater. There, she swims in circles exclaiming: “This is incredible!”. And it is. The rest of the story is almost an afterthought, with the whole narrative punch condensed into that moment of metamorphosis, and the dive into a new and mysterious realm.
If mermaids offer an enchanting dream of transformation, perhaps it’s no surprise that the transgender movement enthuses about the special place mermaids have in their iconography. Activist Janet Mock links this to Ariel, heroine of the 1989 Disney film The Little Mermaid, who chafes at her underwater life and longs to visit the world beyond.
Ariel falls in love with a human, Prince Eric, and persuades the sea-witch Ursula to give her human legs, in exchange for her voice. Of course, being Disney, it all ends happily: Ariel gets her transformation at the end and marries the prince. It’s an elegant, arresting fantasy of pursuing and realising a seemingly impossible vision, and encapsulates perfectly the Disney motto: “Where Dreams Come True”.
Today, it’s increasingly accepted that we should support each individual in pursuit of their dreams — even to the extent, as in Ariel’s case, of accommodating those who radically alter their bodies to align with inner identity. So perhaps we shouldn’t be surprised that in the 31 years since The Little Mermaid was released, the association between mermaids and those who pursue an identity at the cost of physical transformation has only deepened. Six years after the film’s release, a charity was founded with the aim of supporting transgender youth — and given the name Mermaids. Meanwhile, Starbucks (whose logo is a mermaid) ran a 2019 campaign in partnership with Mermaids, celebrating the moment a young transgender person hears their preferred name spoken by a Starbucks barista and, for the first time, identity supersedes body.
But just as scuba divers gain the enthralling freedom of the deep only via technology and absolute self-control, a delve into the deeper iconography of mermaids suggests that crossing a threshold as uncrossable as that between air and water isn’t as straightforward a matter as making Dreams Come True. As T.S. Eliot hinted in 1911, the dream of oneness with the ocean always comes with a price, or else comes to an end:
We have lingered in the chambers of the sea
By sea-girls wreathed with seaweed red and brown
Till human voices wake us, and we drown.
The modern mermaids of pre-teen iconography are both ultra-glam and sexless, sporting revealing, shimmery shell bikinis and jewelled hair — even as the iconography swims smoothly past the question of what’s going on below the waist. As transgender icon Amiyah Scott puts it to Janet Mock: “With mermaids, the bottom is kind of like an unknown and I like that.” It’s not really done to speculate about how mermaids make more mermaids.
For a culture that simultaneously offers pre-teen girls Playboy-branded merch and rages about paedophilia, this style of mermaid perfectly combines an alluring, hyper-feminine aesthetic with a convenient evasion of the sexual dynamic that hyper-femininity is meant to evoke in adults. But the deep history of mermaids — and their element, the ocean itself — tackles those far darker and more turbulent feminine sexual associations in a way that’s far less sanitised.
Once you’re out on the open sea of unbounded female desire, the mermaids of legend aren’t pretty and sexless at all. They’re alluring, slippery and apt to steal your loved ones. In one Cornish folk tale, chorister Matthew Trewalla followed a mysterious woman out of the Sunday service in the mining town of Zennor, straight to the ocean where he vanished.
As the story goes, Trewalla was never seen again — until spotted by a ship’s captain some years later. He had transformed into a merman, swimming alongside his mermaid wife and mer-children. Zennor’s mermaid tempts Trewalla to turn his back not just on friends and family but on land itself. She’s far closer to the temptress of seafaring lore, who sings to passing ships and causes them to run aground.
In Old Norse mythology, sea-maidens are more menacing still: the nine daughters of Aegir the sea-god and Rán, goddess of the drowned, are waves on the ocean. Each of these nine sea-maidens has a different aspect — such as the frothing one, the billowing one, the welling one — “through which one can see heaven”. The seafaring Vikings who told these stories were intimately familiar with, and healthily afraid of, an ocean seen as both feminine and deeply dangerous.
This hostile undercurrent to the association between women and the sea comes out even today. There’s no shortage of not very polite modern euphemisms for women’s genitalia referencing seafood, for example, while in drag culture someone is said to be ‘fishy’ if they pass as a woman.
So is crossing to the other side desirable or detestable? And what’s the price of a visit? In the movie of The Little Mermaid, there is no price: Ariel’s father Triton uses his magical trident to transform his daughter permanently into a human, whereupon she leaves the sea for good and marries her prince. There’s no sense in the movie that this is anything other than an unambiguously happy ending. But while older mermaid tales evoke that same longing to cross the boundary, either seaward or landward, they usually carry a far greater sense of loss or danger than this “Dreams Come True” retelling.
The hauntingly sad Hans Christian Andersen story that inspired Disney couldn’t be further from wish-fulfilment. As in the film, the mermaid falls in love with a prince she rescues from a storm. But in exchange for giving her legs, the sea-witch doesn’t just steal her voice but cuts out her tongue. Even as the magic grants her a pair of legs, walking on them is agony. And though the prince is fond of the transformed mermaid, he loves someone else. There’s no happy ending: the mermaid knows his marriage will break her heart, but though her sisters beg her to break the spell by murdering the prince, the mermaid loves him too much to save herself in this way. Instead, she throws herself into the water and dissolves into foam.
From the Disney perspective, this is all a bit grim. After all, we can all be whatever we want if only we believe. Can’t we? Far from offering a happy tale of dreams that come true, though, Andersen’s story reads like a bleak cautionary tale about struggling against your own natural limits.
Disney animations have a way of crowding out earlier and more ambiguous fairy tales, and it’s a safe bet the founders of Mermaids hadn’t read Andersen’s story when they named their organisation. In their search for a positive depiction of youth gender transition, it seems unlikely they had in mind constant physical pain, the loss of one’s authentic voice and a lifetime of being passed over as a sexual partner.
So even as the ancient history of mermaids has mixed feelings about the beauty and peril of femininity, the modern mermaid reboot is just as ambivalent about what’s real and what’s artificial — and just how far artifice can help us realise our desires. Perhaps it’s fitting that even as we’ve filled our real-life oceans with plastic, we should make such a concerted effort to give the archetypal oceanic feminine a plastic-toy — or plastic-surgery — makeover.
But despite this embrace of the mermaid as a poster-girl for a consumer approach to identity, the mermaid as symbol isn’t so easily sanitised, or persuaded of her own happy ending. Some mermaids decide they don’t like the land after all, even if they’re no longer quite at home in the sea either. Some find new voices, and use them.
For despite all the toys, t-shirts and upbeat Disney stories, darker, older currents still roll beneath the safe and sparkly modern mermaid. These currents invite us to wonder: maybe some dreams bring no relief, even when they come true. Maybe some kinds of restlessness can’t be cured, only navigated.
The airhorn blows. A cry goes up. The field, scattered across streets from Brixton Water Lane to Poets’ Corner, converges toward the sound. The hounds are in full cry. A triumphant ululating from the lead riders, thin tracksuits flapping as they pedal toward Mayall Road. The quarry has been sighted. On bikes, skateboards, scooters or just on foot, the field streams after the leaders.
Overhead on grey terraced rooftops, a scatter of parkour scouts scampers across the sloping tiles, whooping and gesturing. Below them, the quarry flashes in and out of the cover of back-garden bike storage, patio furniture and shrubbery. A few scouts are down from the rooftops now, vaulting fences between gardens. Close pursuit. A delivery van honks. Residents peer out of windows as the hunt streams up the Saturday morning street.
Moments later it’s flushed out: a glimpse of red-brown, the hunt stampeding after it down the tarmac. Then it’s cornered in a newsagent doorway, the hounds swarming. The inevitable end. A Brixton schoolboy, eyes shining with the joy of exertion and bloodshed, is gifted the brush. He holds it aloft in one blood-smeared hand, russet against Herne Hill’s leaden sky.
When the ban was repealed, there were demonstrations throughout the English countryside. It was grossly unjust, the Telegraph howled, yet another sign of government bias toward the cities, that foxhunting was now legal in urban areas but not the countryside. The Johnson government replied serenely that foxes were a predominantly urban pest in 21st-century Britain. Also, as county lines operations had spread city-style drug-dealing throughout rural England, it was only fair in return to encourage outdoor rural pursuits to flourish in the city.
Horrified Guardian editorials inveighed against the education in brutality that would now be coming to London’s already violent youth. But the columnists fell silent when the season started, and knife crime abruptly dropped. United against the mangy pests that raided bins, terrorised domestic cats and occasionally mauled a baby, a critical mass of Londoners embraced the hunt.
Hunts formed along postcode lines, and initially when a hunt crossed multiple postcodes there were stabbings. But the gangs’ youthful energy, physical fitness and fondness for casual violence catapulted them to the heart of London’s great pest control project. Finding themselves suddenly lionised instead of stopped-and-searched, a newfound sense of civic participation put a spring and swagger in their step, and inter-gang rivalry waned.
There was a minor furore shortly after the Repeal Bill passed when, having voted against the Bill, Dr Rosena Allin-Khan MP (Labour) was photographed at the Boxing Day Tooting meet. Polly Toynbee accused her of ‘cheap populism’, while one snarky Spectator columnist noted how clean her Nikes remained even after a gruelling back-garden chase.
The issue split the Labour Party down the middle. On one side stood those who saw the benefits in terms of public health, pest control, crime reduction, race relations, and young males having a healthy outlet for their aggression. On the other stood those appalled by the cruelty meted out to the fox. Innocent animals should not be hunted for fun, they protested. The repeal was emblematic of a culture that had turned its back on progress and was disintegrating into barbarism.
Their opponents replied that the riotous pursuit and bloody death of the odd manky fox was a small price to pay for a reduction in youth knife crime, and that objectors were white middle-class snobs who wanted to keep London’s multicultural youth in a state of dependence and misery. Would they rather see machete-wielding gangs pursuing foxes or teenagers? The statistics showed it was a straight swap.
The antis retorted that this revolting weaponisation of tragic deaths among troubled urban youth was the first move in a base and bloodthirsty effort to take modern Britain back to the Dark Ages. The next step in the Tories’ grim plan would be tagging further vulnerable groups for torture and sacrifice. Ken Livingstone popped up from somewhere to remind us who else murdered vulnerable groups in order to create a sense of belonging forged in bloodshed.
Jolyon Maugham became, unexpectedly, an anti-hunt sensation, when after months of silence he prioritised his Lib Dem sympathies over past association with urban pest control and wrote a heartfelt op-ed for The New European, explaining why he should have called the RSPCA on that hungover New Year’s Day. Floral kimonos became, briefly and surreally, a symbol for militant veganism.
But with Labour now a rump party of urban liberals, and city hunting wildly popular, the electoral calculus was inexorable. Pollsters nodded sagely when Allin-Khan’s popularity rocketed. The #KillerKhan tweetstorm never got off the starting blocks.
As the wind picks up on Dulwich Road, hunt followers are still milling, elated. The crowd passes hip flasks, relives highlights. Young people mix across culture, ethnicity and caste. Paleo-and-Crossfit machos swap hunt stories with Asian wideboys. Shaven-headed teenagers in tracksuits laugh uproariously with a knot of tweed-wearing neo-trads, the men extravagantly moustachio’d. Locals sidle uncertainly past the panting hounds.
The parkour crew are all here now. The hunt’s athletic elite. Stripped to the waist, defined even in dull autumn daylight, they draw admiring glances but talk mainly to each other. A chill rain begins to spatter. It’s still early; the fitness hardcore moves on toward Parkrun. Knots of people disperse in pursuit of brunch, showers or the visceral pleasures of a post-hunt shag.
In an apartment window above the vintage furniture shop, someone spots a sign hand-stencilled on a sheet. FOR FOX SAKE BAN THE HUNT. Scattered jeers. No one performs compassion for status points these days: that generation is sliding into middle-aged irrelevance. Vandalising monuments is so last year. All of bleeding-edge young London is here, at its most energised and diverse, thumbing its nose at public displays of empathy.
Veganism is tired. Bloodsport is a human instinct. Better to hunt foxes than each other.
Originally published by The Fence
Who would have guessed that a weekend hobby for outdoorsy nerds could spawn an era-defining political metaphor?
LARP, or live action role-playing, is an offshoot of the fantasy roleplaying subculture. It involves dressing up in costume and acting out a fantasy-fiction game narrative in real time and space, sometimes over several days. A witty friend once characterised the experience as something akin to ‘cross-country pantomime’.
Thanks to lockdown, no one’s LARPing this year — at least not in the cross-country pantomime sense. But the word ‘LARP’ has escaped into the wild: far from being the preserve of fantasy fans, I’ve noticed it appearing with increasing frequency in political discourse.
When riot police finally pushed activists out of the Capitol Hill Autonomous Zone following the murder of one joyriding teenager and serious wounding of another by CHAZ ‘security’, resistance to the advancing riot shields was so paltry it prompted contemptuous accusations of ‘revolutionary larping’. Weird Christian Twitter (it’s a thing) hosts arguments where people are accused of larping more traditionalist theologies than they truly espouse. Still further out on the fringes, the QAnon conspiracy rabbit hole (don’t go there) is fiercely defended by its True Believers against accusations that it is, in fact, a bunch of people LARPing.
Around the time my friends were discovering LARP, I got into LARP’s Very Online cousin, Alternate Reality Gaming (ARGs). An artefact of the age before Facebook and Twitter colonised most of the internet, ARGs are a hybrid of online treasure hunt, mystery story, and live-action immersive theatre. The first mass-participation ARG was a promotional stunt for the 2001 film AI, and featured a series of fiendish clues for participants to crack and follow, which unlocked further elements of story including live-action segments.
For a moment in the mid-Noughties, ARGs looked like the future of storytelling. The idea of internet communities over-writing stable systems of meaning with playful new narratives that danced back and forth between the web and real world felt refreshing and subversive. With hindsight, though, the phenomenon was just a more-than-usually-creative moment in a widespread unmooring of reality that’s been under way for decades.
It’s not all the fault of the internet. In 1955, the philosopher J L Austin developed a theory of ‘performative’ language: that is, language that does something to reality in the act of being spoken. ‘I pronounce you man and wife’ is an example of performative speech — words that effect change through the act of being spoken.
Then, in 1993, the queer theorist Judith Butler borrowed the concept of ‘performative’ language wholesale and applied it to sex and gender, arguing that the identities ‘man’ and ‘woman’ — along with the bodies and biologies associated with those identities — are performative. In taking these roles on, Butler claimed, we make them real.
While these ideas pre-date mass adoption of the internet, the notion that we participate in creating our own realities has been wildly accelerated by social media. Online, it’s easy to get the impression that we can reinvent ourselves entirely, independent of our bodies or other dull ‘meatspace’ constraints. Unsurprisingly, Butler’s conception of sex and gender as performance has long since escaped the petri dish of academia and, like the concept of LARPing, is evolving rapidly in the wild.
Strikingly, the word ‘performative’ has also mutated. Today, it isn’t used as Butler used it, to mean “a performance that shapes reality”, but in the opposite sense: an insincere performance for social kudos. So, for example, celebrity endorsements of social justice orthodoxies are often accused of being ‘performative’. It means much the same as ‘larping’, but with an added payload of cynicism. So where ‘LARPing’ means “playacting at something you wish you were”, ‘performative’ means “playacting at something you don’t really believe”.
Meanwhile, the LARP is no longer confined to cheery camping trips accessorised with pretend armour. Back in the noughties, online communities refactoring reality to fit a fantasy storyline felt like a fun game, but as I stare into the sucking void of the QAnon conspiracy, that perspective now seems hopelessly naïve. It’s not a game today: it’s how we do politics.
Liberal commentators spend a great deal of energy trying to explain why this is bad. Countless writers ‘fact-check’ Trump’s bloviations, seemingly unaware that from the perspective of reality-as-ARG, the fact that Trump is lying doesn’t matter. Nor does it really matter whether QAnon is real or not. Reality is, to a great extent, beside the point.
Laurie Penny got closer to the truth in this 2018 piece, where she characterises the very notion of a ‘marketplace of ideas’ as being a kind of LARP: “a Classical fever-dream of a society where pedigreed intellectuals freely exchange ideas in front of a respectful audience”. The reality, she argues, is that this ‘marketplace of ideas’ is less free, rational exchange than dick-swinging theatre.
Those who like to imagine this pessimistic perspective is new, wholly the fault of the Orange Man (or perhaps Facebook), should recall the words of an unnamed aide to George W Bush, quoted in 2004 on the relationship between facts, reality and the military invasion of Iraq:
The aide said that guys like me were “in what we call the reality-based community,” which he defined as people who “believe that solutions emerge from your judicious study of discernible reality.” I nodded and murmured something about enlightenment principles and empiricism. He cut me off. “That’s not the way the world really works any more,” he continued. “We’re an empire now, and when we act, we create our own reality. And while you’re studying that reality — judiciously, as you will — we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out. We’re history’s actors . . . and you, all of you, will be left to just study what we do.”
Ron Suskind, New York Times
Though his approach was less overtly hubristic, Tony Blair’s embrace of spin reflected a similar belief in his own ability to manipulate public narratives. ‘Political communications’ has only grown in significance since those days, and taken a self-referential turn. Today it’s as common for commentators to criticise a politician for performing badly at a presser — for poor-quality larping, or bad theatre in Penny’s formulation — as for saying things that are immoral or factually wrong.
Donald Trump is distinct from George W Bush not so much in disdaining facts as in lacking the religious conviction Bush deployed to fill in the gaps left behind by that disregard. But both, in different ways, embodied or embody the idea that what you believe is what is. If you LARP hard enough, this view says, your larp will come true.
Boris Johnson’s administration has something of the same cavalier attitude to the relationship between facts and rhetoric. To date, the handling of coronavirus has routinely over-promised and under-delivered, while seeming indifferent to the disorienting effect on public life of a string of announcements only very loosely tethered to everyday experience.
It’s not a coincidence that this larpification of politics has evolved in tandem with a public fixation on ‘evidence-based policy’. The political polarity of absolute LARP — blatant political lying — and absolute insistence on evidence are two sides of the same loss of faith in a common understanding of reality.
If you’re not convinced, consider SAGE, the government’s scientific advisory committee. Then consider ‘Independent SAGE’, a kind of counter-SAGE comprising scientists every bit as eminent as those on SAGE. This august body produces its own carefully evidence-based reports, which are then used as a foundation from which to disagree with whatever positions the Tories choose to adopt from official SAGE.
Who do we believe? That’s politics. If the Brexit debate hadn’t already killed your faith in ‘the evidence’, the competing claims of SAGE and counter-SAGE should be the death-blow. There is no dispassionate foundation of facts we can deploy to take the politics out of political decisions. The original LARPers might have a fun weekend, then pack up and go back to their normal lives; but in its political sense, there’s no outside to the game. It’s larping all the way down.
Some parts of our culture are coping better with this shift than others. Among the worst performers, to my eye, are mainstream liberals of both left and right. Many have responded to the larpification of everything by concluding that in losing objectivity we’ve lost reality. Some then imagine they can make reality whatever they want it to be by sheer force of will (the Trump/Bush approach). Others suggest we can fix things by supplying enough facts to prove whatever we already believed (the SAGE/counter-SAGE approach). Others, such as Laurie Penny, try to refuse to play.
But we haven’t lost reality, just the fixed vantage point we pretended we had from which to evaluate it. What we have instead is a kind of reality-shaping free-for-all, and there’s no opting out.
As most of us flounder, disoriented, we’re starting to see subcultures adapting. The old story about the Inuit having 50 words for snow is (appropriately) itself probably fake news. But much as a snow-dwelling people might be expected to develop specialist terminology for different types of frozen precipitation, we should understand the emergence of words like ‘larp’ and ‘performative’ as analogous. We’re developing a specialist vocabulary for types of unreality.
We’re also having to find new ways to talk about the reality that, inconveniently, refuses to go away completely. The grim story of the Iraq invasion and its destructive and bloody aftermath gave the lie to Bush’s messianic faith in his capacity to create a new reality to order. Humans still can’t really change sex. And no amount of fiddling the statistics can bring back those people who’ve already died of coronavirus.
The political future turns on our being able to get used to parsing our new Babel for what’s actually happening, and what actually matters. We have to get used to doing this without trying to eliminate the element of LARP (spoiler: can’t be done) or pretending we can abolish reality (ditto).
But there’s no putting the genie back in the bottle. If the ground is moving under all our feet now, the way forward is learning how to dance.
As activists campaign to cancel Britain’s best-loved children’s author for stating her belief that biological sex exists, transgender rights has become one of today’s most toxic cultural flashpoints. Even the most cautious approach to the topic feels like walking on eggshells.
My own journey on the subject has taken me some distance from my 00s social life, which for a while revolved around the (then embryonic) London female-to-male trans ‘scene’. There, I explored my own gender identity and presentation, while evangelising queer theory to all and sundry. But as time went on I felt a growing disquiet at the direction the movement was taking.
Back then, transgender identities felt like a progressive challenge to restrictive sex roles. But as such identities have become more mainstream, and normative sex roles ever less relevant to how we work and live our lives in the modern world, it has increasingly felt to me like the cutting edge of the movement is morphing into something less liberatory than hubristic: a radical assertion of the primacy of mind over matter.
Humans are a sexually dimorphic species. Sex roles (commonly referred to as ‘gender’) can’t be wholly separated from biological sex. In asserting that sex roles have no relationship to sex, but are in fact free-floating identities, the trans movement in effect declares that sex is unimportant. Some even question its objective existence. Thus severed from any material grounding, the social roles ‘man’ or ‘woman’ become costumes or attitudes, no more objectively definable than (say) ‘goth’.
This can feel liberating to those who find the social norms associated with the sexes uncomfortable. But for females in particular, biological sex is far from a trivial detail.
I still remember the first time my brother defeated me in an arm wrestle, when I was 10 and he was 12. I was furious that something as unasked-for as a biological difference between us had now permanently rendered us unequal in a playfight, where previously we’d been evenly matched.
On a more serious note, crime and violence are sexed: males commit some 96% of all murders, 99% of all sex crimes and 100% of all rapes (because you need a penis to rape). Meanwhile females are less physically strong, and our anatomy and reproductive capacity makes us vulnerable to assault, rape and the many risks associated with pregnancy.
Inner identity does nothing to mitigate the threats women face due to being female. It’s physiological femaleness that marks girls out for FGM; physiological femaleness that gets women kidnapped for bride trafficking into China; physiological femaleness that got 14-year-old Rose Kalemba raped at knifepoint to create a porn video monetised by PornHub for years while she was still at school.
Even if we pretend it isn’t there, or treat it as subordinate to our identities, the sexed nature of our bodies continues to exist and to shape women’s interactions. So a movement that asserts the absolute primacy of mind over matter, identity over biological sex, is bound to feel jarring to women – because experience tells us it’s not true. And claiming that it is disadvantages us at a fundamental level: if we’re all just free-floating minds, women no longer have any basis for talking about the ways our bodies make us vulnerable.
This of course doesn’t mean that we’re nothing but the sum of our biology. Humans are thinking, feeling, social and visionary creatures as much as we are evolved animals. All these aspects of us exist in tension.
Acknowledging the role of our bodies in shaping who and what we are as humans isn’t to reduce humans to the status of mere flesh. Nor is it a claim, as some activists suggest, that male and female humans alike should stuff our personalities into narrow masculine or feminine boxes to match our physiology.
Rather, it’s a recognition that the human condition is a mess of competing urges, aims and longings, and sometimes we can’t have what we want. Sometimes the world outside us refuses to be reshaped according to our innermost desires. That this is true is more distressing for some than others, but it’s still true.
For that subset of people who experience an inner identity that feels at odds with their physiology, that tension between psychic and embodied reality is painfully acute. We should meet this obvious distress with compassion and courtesy, not stigma or discrimination. We should as a society shield transgender people from unjust treatment. But that doesn’t mean we must accept the idea that bodies don’t matter.
I joined Palladium’s Wolf Tivy to talk about my theory that what gets called ‘postmodern’ today is really the last stand of high modernism, and the interpersonal implications of genuinely decentering meaning and subjectivity without collapsing into nihilism. Have a listen here.
While we live, we all present different facets of ourselves to different people. Whether in our friendships, work, family or at different times in our lives, we encounter others. All remember us slightly differently, according to their perspective.
While we live, our physical presence holds that multiplicity together. After we die, though, the memories begin to come apart. When my step-grandfather married my grandmother, he already had two children with his first wife. But his first wife had left him and moved to a different country; instead, he became stepfather to my mother and aunts.
He was a big character: an aristocrat of the Greatest Generation, the subject of several films about his war exploits, a well-loved farmer, and patriarch to two families. At his funeral, the many facets of his life were already coming apart. Each version of his memory was fiercely defended by the mourner to whom it belonged. Long-standing quarrels, no longer held in check by his living presence, began trickling back into the open. It was not an easy day.
Today, we are all mourners at the funeral of a character on a scale that dwarfs even my roaring, hectoring, pedantic, affectionate, and irascible step-grandfather. We are gathered to mourn teleology itself—the belief that life has objective meaning and direction. What we call the culture war is the aggregate of those quarrels now breaking out between the gathered mourners over their divergent memories of the deceased.
Were we progressing toward universal peace, justice, and equality? Was it the resurrection and the life of the world to come? Perhaps it was the end of history in universal liberal democracy? We cannot agree.
The death of teleology represents a collective cultural trauma that accounts for, among other things, the increasingly unhinged debates around social justice within elite universities, and the reactive phenomenon of the aggressively transgressive online far-right.
But it doesn’t have to be like this. Post-structuralism killed teleology, but it did so in error, by taking a wrong turn; it is this wrong turn that has left us so traumatized.
What is commonly referred to as postmodernism is not in fact post-modern but rather represents a last-ditch attempt by modernism to resist the implications of the post-structuralist mindset whose inevitability is now indicated by fields as diverse as physics, ecology, and psychotherapy.
Deconstruction is not the end: reconstruction is possible, indeed essential.
To situate myself a little in this story: I belong to a generation that is marginal, facing two directions, in several ways that are relevant to my argument. Born in 1979, I sit at the tail end of Generation X. I am old enough to remember the days before the internet, but young enough to be more or less a digital native. I got my first cell phone and email address as an undergraduate at Oxford. I researched my undergrad essays sitting in actual libraries reading physical books, but wrote them on a word processor. I can remember life before social media.
I also received, prior to undergraduate life, a recognizably classical education. This was, in the old-fashioned way, designed to deliver a whistle-stop tour of the march of civilizations from Ancient Egypt via the classical era to Western Christendom, with at least a vague grasp of the cultural and historical highlights of each.
The overall impression delivered was of an evolution of societies, consciousnesses, and cultures over a vast sweep of time and different human epochs that nonetheless seemed to have at least some narrative continuity and directionality. Everything else we learned seemed at least to an extent framed by that sense of situatedness within a larger narrative of human cultural evolution, whose direction was a mystery but did at least seem to be headed somewhere.
Then, in my first year as an English Literature undergraduate, I encountered critical theory—and the entire organizing principle for my understanding of reality fell apart.
To summarize: Saussure proposed that instead of treating language as transparent, its meaning rising off the page without any need for elucidation, we should split the linguistic sign into ‘signifier’ and ‘signified.’ That is, what a word means is separable from the sound or mark that means it. We can thus, he argued, institute a new discipline of ‘semiotics’: the study of signs—a study that reaches far beyond language and was immediately influential in the social sciences.
This insight was developed by Jacques Derrida, whose simple but devastating observation was that if this is the case, we cannot define any given ‘signified’ except with reference to further signs, which then in turn themselves require definition with reference to further signs. It’s turtles all the way down. We have no means, through language, of arriving at any kind of truth that we are able to experience directly. Furthermore, the concerted effort by millennia of culture to obscure the fact that it’s turtles all the way down is a cunning attempt to shore up entrenched interests, and to conceal the operations of power.
In this view, recourses to authority are null and void. There is no solid foundation, no God, no truth, no authority. Only power, and a form of consensus reality arrived at through the glacial accretion of a billion tiny operations of power that have, in sum, decreed that the world should be thus and not thus.
For me, the shift from a sense of the world as having some stable narrative trajectory to this perspective, in which meanings were not only networked but fundamentally without foundation, was deeply disturbing. It landed like a psychotic experience. Overnight, the hallowed architecture of Oxford University went from seeming like a benign repository of traditions within which I could find my place, to a hostile incursion into my consciousness of something phallic, domineering, and authoritarian. I remember describing to a friend how, as a woman and sexual minority, I suddenly experienced the ‘dreaming spires’ as ‘barbed penises straining to penetrate the sky.’
I wish I could say it passed, but it did not. What did happen, though, after I left, was that I found an accommodation with the loss of teleology and objectivity from my frame of reference. I did this by theorizing that if to posit anything at all is an act of power, then it was one I was also entitled to attempt. All cognition, meaning-making, interpretation, and perception are conceptually laden, socially mediated actions. It is impossible to ground even perception in anything but action and thus power. But so be it. We live in a society and participate in the flow of power all the time. I developed the idea of ‘temporary certainties,’ or in other words, the idea that even if meanings are not stable, many of them are stable enough for me to act as if they were solid in the pre-Derridean sense. I did not have to deconstruct every minuscule interaction for the operations of power it encoded.
In an effort to evade the monstrous pervasiveness of systems of domination and submission, I experimented with radically non-hierarchical forms of living, power exchange sexualities, non-binary gender presentation. I tried my own operations of power: I changed my name to Sebastian, to see what it felt like, then settled for a while on Sebastian Mary. I co-founded a startup with friends, in which we tried to avoid having a management hierarchy.
My accommodation kind of worked, for a while. But it did not last. It is all very well to theorize about non-hierarchical forms of organization, but in order to get stuff done you need a chain of accountability. And the worst sort of hierarchies have a habit of emerging, too, especially in social situations where they are intentionally obscured or deprecated. Communes, collaborative projects, and the like all find their leaders and followers, or their tyrants and victims. My increasing bitterness as I learned this, in the course of trying to get somewhere with the startup, made me so obnoxious as a co-worker that eventually I was expelled from the project which was, by then, failing anyway.
With that rupture, I lost my social circle, my best friend, and my entire carefully reassembled working theory for how to navigate the rubble of broken teleologies that was my adult life in the ‘00s. Concurrently, the Great Crash of 2008 destroyed the equally teleological fantasy of global liberal-democratic hegemony under international capitalism that had powered the Iraq invasion along with the triumphalism of the Blair years.
In the wreckage, though, something wonderful happened. Two wonderful things, actually. First, I met the man who I would eventually marry, and by degrees let go of the belief that in order to sustain my integrity as a person I had to reject any form of stable loving relationship to an Other in favor of multiple, overlapping, unstable platonic, sexual, or ambiguous friendships. Second, I decided I needed to learn how to do something more useful than floating around London curating experimental art events and screwing up entrepreneurship, and went back to school to train as a psychotherapist.
In the course of that study, I learned where postmodernism took its wrong turn. Implicit in the post-structuralist theories taught to every young humanities student at university is the idea that because meanings have no singular objectively correct grounding, they are therefore of no value. Also implicit is the idea that because of this, no satisfying, authentic or truthful encounter with the Other is ever possible—only an endless recursive hall of mirrors composed either of our own anguished reflections or the invasive pressure against our psyches of another’s desire.
In studying psychotherapy, though, I came to realize that while the same post-structuralist decentering of the self took place in psychoanalytic theory between Freud and his contemporary descendants, therapists had—because they have to—rejected the idea that we can never encounter the other. While much contemporary analytic theory acknowledges the need to excavate and make space for the operations of overdetermined systems such as race, class, or sex, it does not automatically follow from the presence of those things that intersubjective contact and meaningful connection cannot take place.
Just as post-structuralism decentered the observer, so intersubjective psychoanalysis radically decenters the analyst. But an intersubjective understanding of the relational space as co-created by client and therapist does not preclude the possibility of therapeutic work taking place. And this in turn speaks powerfully to a claim that however muddled, muddied and overdetermined our encounters with the other may be, yet they still contain the potential to be not just benign but real, true, and transformative.
I suppose I could deconstruct that claim in turn. But I have experienced its truth both as client and also, in the course of my work, as therapist. Through intersubjective encounters in the consulting room, I have been transformed, and have transformed in turn. From this vantage point, the claim of post-structuralism to render meaningless all semiotic systems, and reveal as brute operations of power all encounters with the other, seems not just mistaken but (in the Kleinian sense) paranoid-schizoid. It is the tantrum of a child who, on realizing they cannot have exactly what they want, refuses to have even the next best thing and dismisses everything and everyone as evil.
The alternative to this paranoid-schizoid repudiation of meaning is not to reject meaning as dead or hopelessly suborned by power, but to accept that we are enmeshed, shaped and in turn helping to shape networks of meaning as part of a dynamic dialogue. We are nodes in the social and semiotic system. As such, even the act of contemplating those systems of meaning will have some tiny effect on them. When Derrida said ‘il n’y a pas de hors-texte’—”there is no outside-text,” though commonly mistranslated as “there is nothing outside the text”—I took it to mean meaning itself was hopelessly corrupted, and objectivity a bust. Today, I see it more as a radical decentering of my selfhood that opens up new, vibrant possibilities of connectedness.
If we read ‘text’ in the biosemiotic sense as fractal, multi-dimensional, and interconnected systems of signification, both of human culture and the natural world (inasmuch as those things can even be separated), then indeed there is nothing outside the text. But that does not mean the text is wholly illegible, or that it does not exist—simply that in reading, we affect what it says, and in return it changes us. We are unavoidably caught up in perspectival context, without truly objective ground to stand on. But objectivity was always an implicit abdication and obscuration of power and the necessity of choice. It was the idea that we could calculate what to do from objective factors that we didn’t have to take responsibility for. We do have to take responsibility, but that can mean a proactive positive acceptance. We can step up to the challenge of power and perspective, rather than reject it out of guilt and trauma.
Seen thus, a living post-structuralism is a philosophy not of radical alienation but radical interconnection. It is not the death of stable meaning, but the moment a system we thought rigid, immovable, and observable from the outside stirred and opened its eyes to return our gaze. It is also increasingly supported by contemporary studies in—for example—ecology and theoretical physics. If even the hardest of hard sciences now advances a theory of reality that embraces radical uncertainty and the implication of the observer in what is observed, then surely the humanities can do so as well without giving up on meaning altogether?
The great insight of postmodernism is that meaning is unstable, and mediated in infinite complexity by systems of power in which we are decentered but implicated. But the response to this insight from the humanities has been a furious rearguard action by the ideology of fixed meanings that postmodernism itself displaced. Enlightenment rationalism is to postmodernism as Newtonian physics is to general relativity, and it is in the ‘social justice’ ideologies now increasingly hegemonic in elite institutions that Enlightenment rationalism is seeking to make its last stand against the new philosophy of radical interconnection.
If postmodernism claimed that all meanings are unstable, socially constructed, and held in place by operations of power, the defining characteristic of the anti-postmodernism that masquerades as contemporary postmodern thought is its determination to apply that analysis to everything except its own categories and hierarchies. In effect, this system of thought seeks to recoup semiotic stability by replacing the old ‘bad’ hierarchies of Western, patriarchal, heterosexual, etc. dominance with new ‘good’ ones.
All activities, goes the claim, are tainted by the toxic operations of overdetermined systems of oppressive social meaning which speak through us and over us regardless of what little agency we might imagine ourselves to have. So in the political framework of anti-postmodernism, fixed immutable characteristics such as race assign their bearers a position on a rigid hierarchy of ‘marginalization’ which in turn influences their status within the system. The legitimacy of the new, fixed hierarchies of marginalization-as-status rests, we are told, in how they correct for, deconstruct, and overcome previously imposed power inequalities. The chief form of political action is a wholesale effort to dismantle these former inequalities, wherever they may be found.
But in practice, the demand that all historically imposed power relations be deconstructed unwinds the legitimacy of any possible social relationship or institution. All meanings necessitate the exclusion of what-is-not-meant. Making absolute inclusion a central political demand is thus in effect a call for the abolition of meaning. We are never told what form the good life might take, should this project of semiocide ever be completed. But one thing is clear: it can have no social or intersubjective dimension, for that would imply shared meanings, and with shared meanings the operations of power—exclusion, definition, the imposition of significations not wholly self-chosen—inescapably return, as do hierarchies. In this sense, the push for semiocide in the name of social justice is a project whose ultimate aim is an individuation so total it precludes any form of encounter with the Other, except in a multidirectional contest for domination that none can be permitted to win.
From other vantage points within the culture war, the reaction to this doctrine is often mockery, for the doctrine’s self-absorption, incoherence or preoccupation with language and ‘luxury beliefs.’ This is mistaken. Its adherents are motivated by compassionate idealism, but have been misled by a destructive falsehood and are in many cases deeply unhappy. The decentering of the Enlightenment subject brings with it an invitation to a more fluid experience of selfhood as radically inseparable from and in a process of co-creation with all of reality, and yes, with the power structures of the society in which we live. But the contemporary critical theory I am calling anti-postmodernism shows young people this vision of beauty, only to dismiss it as a pack of tendentious, self-interested lies.
It is no wonder today’s young people fling themselves miserably against the bars of whatever structures of meaning are still standing in an effort to knock them down—or perhaps to prop themselves up. Whether it is the SJWs, the frog memers, or the ‘failson’ ironists, they can smell the fresh breeze of meaning, less linear than the rationalists would like but nonetheless real, and yet they have been told they cannot have it, because it is not there, or else comprises only violence and hostility. So, they fight over the broken rubble of the Enlightenment, or with each other, or their ancestors, and starve in the midst of a banquet.
To recap, then: what gets called ‘postmodernism’ today is not postmodernism but the last spasm of the worldview displaced by postmodernism, that saw meanings as fixed, knowable and amenable to human mastery. This anti-postmodernism diverts young people from the astonishing richness of a systems-based, decentered engagement with the world’s semiotic complexity by seeking the only remaining form of mastery it can imagine: a defensive assault on meaning itself.
Instead of embracing the fluidity of systems of meaning, and each subject’s situatedness within that system, young people are taught that the only legitimate foundation for political action—or indeed any kind of social participation—is atomized selfhood, constructed from within and defended with narcissistic brittleness. They are taught to see themselves as solely responsible for discovering, curating, optimizing and presenting this supposedly ‘authentic’ self as their central marketable asset. But they also learn that it is continually under assault by hostile forces of oppressive social meaning whose aim is to keep them—or others like them, or someone anyway—marginalized, abject and on the back foot.
Within this system, it follows that the central locus of political activism must be to disrupt these oppressive forces that marginalize unfavored groups, so as to advance the project of collective liberation to ‘be our authentic selves.’ This is not just a political project but an existential one, for along with marginalizing unfavored groups these forces impose unlooked-for and oppressively overdetermined social meanings on each of us, undermining each young person’s quest for authentic selfhood. Individuals caught up in this worldview genuinely believe they are agitating not just for the liberation of the oppressed but for their very existence.
The fixation of today’s elite graduates on ‘validation’ of ‘identities’ may seem frivolous to older generations. But within a worldview that frames all forms of social meaning as oppressive by definition, the very gaze of the Other is an unacceptable attack on the pristine territory of the self. If we reject the genuinely postmodern ethic of radical semiotic interconnection, and our interwovenness with structures of meaning in society and the natural world, then the movement of these structures in, on and within our individual identities comes to be experienced as violence.
This perspective exists in tormented symbiosis with an Other it can neither tolerate, nor yet wholly dispense with. For the paradox is that the invasive gaze of the Other, laden with unwanted and oppressive shared meanings, is simultaneously the source of suffering and salvation. The gaze of the Other is experienced as a hostile and violent invasion, forever imposing unlooked-for social meanings that constrain the liberty of each sacred self. But it is also the only source of the ‘validation’ that will reassure each individual that their self-creation project is real, true and accepted.
The solution, within this worldview, is an ever more desperate effort, again paranoid-schizoid in the Kleinian sense, to control the thoughts of the Other. We see this in politicized campaigns to control speech in the service of identities. But as any psychotherapist (or parent) will tell you, trying to control the inner life of another is a project that in normal circumstances seems achievable (or indeed desirable) only to young children or the mentally disturbed. That it should become a central political desideratum for a generation of elite young people does not bode well for the future health of public life.
When I started my undergraduate degree 20 years ago, critical theory was one epistemology among several, which we learned about as it were ‘from the outside’ rather than as a framework for understanding other philosophies. Though it affected me severely, in ways I have already described, most of my contemporaries simply learned about the ideas and were largely unaffected. Today, though, this epistemology has eaten and digested the humanities and begun to nibble on science and even mathematics. As a result, for today’s young people, it is increasingly difficult to find a vantage point outside its political ontology from which to evaluate its operations.
We should not be surprised, then, that mental health issues have skyrocketed in elite college-age populations. They are being taught to believe, as a foundational framework for understanding the world, that acceptance in the gaze of the Other is key to validating a selfhood they alone are responsible for creating, curating and optimizing. But they are also being taught that all shared meanings—in other words, anything conferred by the gaze of the Other—represent a hostile act of violence. How is any young adult meant to navigate this catch-22?
It is a mistake to dismiss this as narcissistic—or, at least, to ignore the suffering of those trapped in this bind. To be ‘defined’ by something other than our own desire is in this system to be injured, parts of our authentic self mauled or amputated, whether by social meanings we did not choose or the givens of our embodied existence. This is a phenomenally cruel thing to teach young people, as it leaves them feeling perpetually oppressed by the givens of existence itself.
This analysis also sheds light on the crisis of elite purpose and leadership Natalia Dashan described in her Palladium piece last year. If shared meanings are not only unavailable but actively hostile, how is any young person meant to formulate a legitimate rationale for stepping up? No wonder so many elite graduates dismiss even the possibility of public service in favor either of pecuniary self-interest in finance or tech, or else joining the ranks of activist-bureaucrats seeking to advance the destruction of shared meanings in the name of total inclusion.
But as societies around the globe struggle to get to grips with coronavirus, we no longer have the luxury of sitting about like Shakespeare’s Richard II, mourning a broken model of meaning as the world disintegrates around us. Facing the deaths perhaps of loved ones, and certainly of everything we thought of as ‘normal’ until a few weeks ago, destroying what is left of our structures of social meaning in the name of liberation politics or frog-meme irony is an indulgence we cannot afford. The project of reconstruction is urgent. This project is both an inner and an outer one: reconstruction of an inner life capable of navigating social meanings without experiencing them as violence, and also of our willingness to participate in the external, political analogue of those social meanings, namely institutions, political structures and—yes—hierarchies.
This is not to say that we should shrug at unjust systems of domination. The ‘social justice’ excavation of ‘implicit bias’ is not wholly without merit. It is on all of us to make sincere efforts to meet the Other to the best of our abilities as we find it, and not simply reduce the world out there to our preconceptions. But this effort cannot be so all-encompassing as to destroy what systems of shared meaning we have left. Nor can we afford to see it grind common endeavor to a standstill.
No one knows yet what the world will look like as we emerge from the political and economic convulsions engendered by this global pandemic. One thing is clear, though: the ethic of radically individualist atomization implicit in ‘social justice’ campaigns for the destruction of all shared meaning is woefully inadequate to the challenges we now face. Through its lethal spread and infectiousness, coronavirus has demonstrated vividly how our fates remain bound to one another in infinitely complex ways, however loudly we may assert our right to self-authorship. Faced with the persistence of our social, biological, semiotic, economic, and ecological interconnectedness, we would do well to embrace and make a virtue of it, to salvage those shared meanings that remain to us, and begin the process of building new ones that will sustain us into the future.
This article was originally published at Palladium magazine.
Today’s hottest property: young fogeys. Blue Labour hailed Boris Johnson’s landslide election victory as a rebellion by the country’s ‘culturally conservative’ silent majority. A new conservative magazine seems to appear every week. We have even seen a youth movement for the revival of socially conservative values popping up in that bastion of modern double liberalism, the Conservative Party.
What do they all want? At the more wonkish end of the debate, the argument is broadly that the political push throughout the twentieth century for ever greater social and economic freedom has brought many benefits, but that these have been unevenly distributed and are now reaching the point of diminishing returns.
The pursuit of ever greater freedom and individualism, this strand of thought argues, has delivered rising wealth while hollowing out working-class communities; liberated some women while forcing others to work a double shift and abandon the young and old in substandard care, and provided an infinitude of consumer choice but at the cost of mounting ecological damage. Under the sign of radical individualism, the new communitarians argue, we are all becoming more solitary and self-absorbed. Even charitable giving seems to be in unstoppable decline.
But what, in practice, are the new social conservatives seeking to conserve? Calls for a revival of cultural conservatism, many in the name of Christian values, seem often on closer examination oddly insubstantial. In 2017, UKIP’s leader-for-that-week Stephen Crowther said that the UK is a Christian country, “and we intend to stay that way.” But for Crowther, being a Christian country does not seem to impose any obligation to actually be Christian:
including Christian in our list [of principles] does not imply any requirement for individual faith, but it reflects the Judeo-Christian classical and enlightenment origins on which our laws, our social systems and our cultural norms have been built over two millennia.
Elsewhere in Europe, Hungarian Prime Minister Viktor Orbán describes his brand of authoritarian, identity-oriented politics as ‘Christian democracy’. Only a minority of Hungarians go to church every week – 56% of the country identifies as Catholic, though only 12% attend church regularly – but the identifier ‘Christian’ has nonetheless become central to Orbán’s politics.
Much as Crowther did, the Orbán-supporting Bishop of Szeged, László Kiss-Rigó, bridges this gap with a vague, cultural definition of what actually constitutes a ‘Christian’: “In Europe, even an atheist is a Christian”, he said. It turns out that being ‘Christian’ is less about prayer or doctrine than ‘values’: “We are very happy that there are a few politicians like Orbán and Trump who really represent those values which we Christians believe to be important.”
What exactly are these values, then? Attendees at anti-Islam Pegida rallies in Germany carry crosses and sing carols. Italian right-winger Matteo Salvini punctuates anti-immigration rhetoric by brandishing a rosary, drawing criticism from the very Catholic faith whose symbols he invokes. Try to pin down any actual values this form of Christianity might require of its adherents, and matters are much less clear.
Even those whose stated desire is to defend the place of faith in public and political life seem keen that the faith itself stop short of imposing actual obligations. To take a more moderate example of the new cultural conservatism, the Social Democratic Party took a broadly post-liberal, culturally conservative stance in its 2018 relaunch. The New Declaration made an energetic defence of our right to hold even illiberal religious views openly in public life:
Citizens holding a traditional, patriotic or religious outlook are often bullied and marginalised, stifling the open debate upon which a free and democratic society depends.
Then, about a year later, the SDP lost its only donor over a bitter intra-party dispute about whether or not it should be party policy to ban halal slaughter – a position markedly at odds with the party’s previous defence of religious pluralism. And when the Church of England recently reiterated its long-held position on sex and marriage, prominent SDP member Patrick O’Flynn took to the pages of the Daily Express to mock ‘the otherworldliness of these Men of God’. Instead of insisting on ‘out of touch’ doctrine, O’Flynn suggested, in order to attract more young people to weekly worship the Church should adjust its doctrines on sex and marriage to reflect their values.
In this view of faith, theological positions do not reflect any kind of truth-claim but should be emergent properties of the aggregate ethical positions held by the members of that church. Less ‘Christian democracy’ than ‘democratic Christianity’: whatever the congregants believe becomes the doctrine of the church.
From a religious perspective this makes no sense. To the believer, doctrine is handed down from God Himself. The thought of God’s word being subject to plebiscite is absurd, if not outright blasphemous.
This debate reveals the missing piece in today’s would-be conservative revival. Where do our values come from? What is the proper source of political authority? Progressives gesture at natural rights or an imagined future utopia, but for anyone who remains unconvinced that we are all on a journey somewhere wonderful, some other authority is required.
Edmund Burke suggested the answer lay in a blend of deference to tradition and God’s grand design, tempered by carefully constrained democratic institutions; his Savoyard contemporary, Joseph de Maistre, argued that the only proper form of authority lay in God’s will, delivered via the Pope and an absolute monarch.
The history of modernity has unfolded in the tensions between these competing understandings of political authority. ‘The will of God’, the will of ‘the People’, and the grand designs of various utopias have variously been used to justify all manner of enterprises, with outcomes from the magnificent to the horrific. But our present political difficulties may be in part down to a growing popular discomfort with accepting the legitimacy of any of the above.
Since the election of Donald Trump and the vote to leave the EU, there has been a low but persistent rumble from our moral betters that democracy should maybe have its wings clipped a little, to stop stupid proles making bad decisions. A degree of wing-clipping has in fact long since taken place: John Gray has discussed recently in these pages the way the language and legal mechanism of ‘rights’ is used to shift entire areas of public life from democratic debate to the dry realm of unelected lawyers and judges. But if authority does not reside in the will of the people, nor does it reside with God: it is difficult to imagine a mainstream British politician claiming moral authority on the basis of divine will without being roundly pilloried.
Progress and human rights, then? Every young person who passes through a modern university is taught in no uncertain terms that totalising metanarratives are suspect. At best, they are power moves. Whenever you find one you should ask cui bono? In the case of universal human rights, the answer is probably: lawyers.
This leaves would-be conservatives in a bind. If (with a few honourable exceptions still holding out for direct Vatican rule) political authority rests neither in tradition (too restrictive on personal liberty), nor democracy (probably rigged), nor God (don’t tell ME what to do!), nor even in the lawyers, then what is left? Politics professor Matt McManus argues that the result is a postmodernism of the right as well as of the left: a series of nested calls for a return to authority, tradition and culture that all, on closer inspection, turn out to be largely delivery mechanisms for adversarial but hollow identity politics.
McManus suggests that, having come unmoored from its roots in the past, the divine, or the popular will, this postmodern conservatism has warped a Burkean belief in tradition into a kind of moral cosplay whose main purpose is less seeking the good life than making a noisy defence of whichever identities its sworn enemies attack. As the postmodern liberal-left demonises heterosexual white males, so postmodern conservatism sets out to defend them; and so on.
Seen in this light, the problem with Orbán and other borrowers of Christian clothing is not that they do not believe their own words. Inasmuch as they can mean anything, they genuinely identify as Christians. It is more that when all sources of authority are suspect, the only legitimate recourse is to the self: to identity, and identification.
And the problem with identification is that it remains separate from whatever it identifies as. Just like the modern dating marketplace, where commitment is radically undermined by the ease of swiping right, modern cultural conservatism is radically undermined by the fear that without a reliable foundation of authority, and with more identity-choice options only a click away, we are never fully the thing we claim as our identity.
Without a sense of confidence in the roots of its political legitimacy, conservative values dissolve from concrete obligations into consumer accessories. This in turn is why Orbánist ‘Christian democracy’ and many of its populist cousins find their most compelling realisation not in religious doctrine or observance, but in defining themselves against their outgroup. If “even an atheist is a Christian”, then either no one is a Christian, or everyone is. The only way of defining what a Christian is, is in terms of what it is not: foreigners.
But if this is so, then in a postmodern environment, shorn of recourse to authority, cultural conservatism is a waste of energy. It cannot define what it wants. All is insubstantial; there is no exit from the Matrix, nothing left to conserve.
Does it follow from this that those who long for place, limits, love, family, faith and meaning should just sit in the rubble and watch it all burn? I do not think so. But when there is nothing solid to go back to, anyone attracted to what is left of the ideology that used to be called ‘conservative’ needs to find a new name for their yearning. ‘Constructionists’, perhaps. There is a lot of building to do.