Why won’t our artists aim higher?

London ravaged by disease. Social and sexual mores collapsing. Shifting political alliances and a wobbling constitution. A Babel of competing voices vying to dominate new media channels, driving public discourse to fever pitch. It’s not the first time we’ve been here.

Today our artists embrace (and sometimes accelerate) the vibe. Sculptors are more interested in subverting statuary than glorifying anything; painters warn of an oncoming apocalypse in two-storey murals; and most music is about getting laid. But back at the dawn of the modern world, when politics, culture, mores and faith were as much in flux as they are today, the 18th century’s artists took a more aspirational approach.

The cultural sphere they depicted was every bit as harmonious as the world that produced it was volatile. But while today we still listen to the measured strains of Handel, and marvel at the elegant proportions of a building by Inigo Jones, the poets of the same era are ignored. Of these, the most criminally underrated is also, perhaps, the one whose work offers the most intriguing clues for the modern world: Alexander Pope.

Pope was born the same year as modern Britain: 1688, when a group of English statesmen deposed James II as King of England, in favour of his son-in-law William of Orange. The reasons for James’ deposition were complicated, but included his Roman Catholicism as well as his insistence on the king’s divine right to abolish Parliament and govern centrally via decree.

Unenthusiastic about absolute monarchy, and nervous of future kings trying it on again, Parliament slapped new constraints on royal power — and the upshot was the constitutional monarchy we’ve lived with ever since.

As the old order liquefied at the end of the 17th century, and the fight began in earnest for power at the beginning of the 18th, aristocrats and a new class of emerging industrialists poured in to fill the vacuum left behind by an absolute ruler. These politicos increasingly split between “Tory” defenders of James II, and “Whig” proponents of Protestantism, in a political configuration that gradually took the form that would become our modern adversarial Parliament.

This binary antagonism, every bit as values-driven and visceral as the Leavers and Remainers of today, drove a febrile “us and them” political discourse. And in a forerunner of today’s clickbait-for-profit content machine, the flames were fanned by advances in printing technology that made the written word suddenly cheap and plentiful. Presses sprang up like mushrooms, and publishers grew rich selling the scandals, libels and “fake news” of the day.

Modern politicos blame social media for a decline in public civility. But compared to the grotesque caricatures, insulting posters, inflammatory street speakers and assassination plots against senior Tories that characterised politics in the early 18th century, what gets painted today as declining standards of politeness appears more like a return to form.

New governing elites, having displaced an absolute monarch less than a generation before, were sharply aware of how fragile public consent was for their newfangled constitutional monarchy — and how much potential hostile presses had to shatter that consent. In a move that foreshadows modern drives worldwide to regulate social media, the law was deployed to suppress dissent: the 1606 Star Chamber case De Libellis Famosis had ruled that accusations against the monarch or government could constitute seditious libel even if they were true.

Pope was in many ways an outsider, a condition that today we associate with a subversive mindset. Like the deposed James II, he was Catholic, and also a Tory in a hegemonically Whig era. But he was as preoccupied with order and stability as the Star Chamber, and — albeit in a different way — every bit as critical as they were of the newly democratic world of letters.

Rather than the law, though, Pope’s battleground was literature, where he emerged as a fierce defender of high culture and classical tradition against the pandemonium of “Grub Street”. First published in 1728, The Dunciad pillories Grub Street’s hacks, in ironically high style, as a throng of “Dunces” under the Queen of Dullness herself.

It’s perhaps the most barbed and sparkling feature-length piece of literary shade ever thrown, by turns cultivated and scabrous. Where Twitter today might just call someone a shit writer, Pope depicts one rival as powered by the spatterings of Jove’s own chamberpot:

Renew’d by ordure’s sympathetic force,
As oil’d with magic juices for the course,
Vig’rous he rises; from th’effluvia strong;
Imbibes new life, and scours and stinks along; (Dunciad II, 103-6)

Without a working knowledge of Pope’s political and literary world, getting The Dunciad’s jokes is a bit like someone from the year 2320 trying to follow the jokes on Have I Got News For You. But it’s hard not to see in it an echo of our access-to-all digital “publishing” environment, and the impact it’s had on the contemporary discourse:

‘Twas chatt’ring, grinning, mouthing, jabb’ring all,
And Noise, and Norton, Brangling, and Breval,
Dennis and Dissonance; and captious Art,
And Snip-snap short, and Interruption smart. (Dunciad II, 231-34)

Pope’s blend of wit, erudition and waspishness made him a sharp satirist of contemporary chaos, but his happier visions were of tradition and harmony. In Windsor-Forest (1713), London was envisioned as a gilded, ordered place and the rightful heir of antiquity. Faced with its glory, the Muses would quit singing of ancient Rome, and praise England’s capital instead:

Behold! Augusta’s glitt’ring Spires increase,
And Temples rise, the beauteous Works of Peace. (Windsor-Forest, 377-8)

“Augusta”, a Roman name for London, gives Pope and his contemporaries the name by which we know them today: the Augustans. And yet London in Pope’s day was not a vision of order and beauty at all, but famous for slums, licentiousness, corruption and STDs.

The print boom extended to a flourishing trade in porn, with smutty publications bought not just for private consumption but to read aloud in pubs and coffee houses. And prefiguring Frank Ski by some centuries, there really were whores in all kinds of houses: Covent Garden was a byword for the sex trade, from the low-class “flash-mollishers” and theatre-visiting “spells” to brothel-operating “bawds” and “Covent Garden Nuns”. Prominent prostitutes, such as Sally Salisbury (1692-1724), became celebrities: Salisbury’s noted clients included Viscount Bolingbroke and even (according to rumour) the future George II.

Against this gossipy, salacious and politicised backdrop, urban living conditions in the city were filthy and disease-ridden: in the London of the 1700s, more people died each year than were baptised. The century was characterised by near-continuous military engagement. So on the face of it, nothing makes sense about Pope’s depiction, in the 1733 Essay on Man, of all the cosmos as “the chain of Love/Combining all below and all above”, in which “Whatever IS, is RIGHT”.

This seems especially strange today, in the light of our modern preference for art that’s “representative” of demographics or otherwise reflective of “the real world”. But Pope’s fixation on order, hierarchy and beauty makes sense, because he feared that the alternative to an idealised order would be infinitely worse:

Let Earth unbalanc’d from her orbit fly,
Planets and Suns run lawless thro’ the sky,
Let ruling Angels from their spheres be hurl’d,
Being on being wreck’d, and world on world,
Heav’n’s whole foundations to their centre nod,
And Nature tremble to the throne of God:
All this dread ORDER break – for whom? For thee?
Vile worm! Oh Madness, Pride, Impiety! (Essay on Man, Ep. I, 251-8)

Modern tastes run more to deconstructing than glorifying canonical art or the social hierarchies it idealises. Today we’re all about writing doctorates on marginalia, humanising a stammering monarch, or revealing the sexual licence beneath the aristocratic facade. But from Pope’s perspective, it was order that needed defending, as the only real defence against tyranny:

What could be free, when lawless Beasts obey’d
And ev’n the Elements a Tyrant sway’d? (Windsor-Forest, 51-2)

Read against the corruption, volatility and rampant, clap-infested shagging of Georgian high society, the restrained vituperation, classical learning and formal orderliness of Pope’s writing could be seen as a paradox. Or, perhaps, a state of denial. But what if it was more a set of aspirations that succeeded — just not straight away?

The ensuing century, dominated by Victoria and Albert, is perhaps Peak Order for modern Britain. If Boswell’s diaries, in the latter half of the 18th century, record 19 separate instances of gonorrhoea, Victoria’s ascent to the British throne in 1837 was characterised by a society-wide backlash against the excesses of the preceding era.

Whether methodically colouring the globe in red, or imposing strict codes of sexual conduct, public-spiritedness and emotional reserve at home, the Victorians reacted against the perceived licentiousness of the Georgian era — by delivering the kind of order that Alexander Pope both depicted in his writing and also, in his own political era, never saw realised.

In the time since Peak Order we’ve all become somewhat more free-and-easy again. But we should be wary of viewing this either as evidence of moral progress, or (depending on your outlook) of a decline that’s likely to continue indefinitely. Our age has its digital Grub Street, its own pandemic, its unstable political settlement, and its patronage politics. So perhaps it may yet produce its own Alexander Pope, and with it a new poetics of order — for a future none of us will live long enough to see.

Originally published at UnHerd

Our humanity depends on the things we don’t sell

Earlier this year, mining company Rio Tinto dynamited a 46,000-year-old Aboriginal sacred site in the Pilbara region of Australia, in pursuit of the iron ore deposits that lay beneath the sacred caves. The decision triggered outrage from Aboriginal communities and the wider world alike. Pressure from investors concerned about the resulting PR disaster eventually forced the CEO to resign.

But that’s not much of a victory for those to whom the now-destroyed site was sacred. As a place of pilgrimage, continuously inhabited since before the last Ice Age, its religious significance had accumulated over millennia of repeated visits, inhabitation and ritual. The holiness of Juukan Gorge lay in the unimaginably long-term accretion of memories, social patterns, and shared cultural maps by countless generations of the Puutu Kunti Kurrama and Pinikura peoples.

Strip mining, the method of resource extraction used to reach much of Pilbara’s iron ore, was the subject of a blistering 1962 Atlantic essay by Harry Caudill. Titled ‘Rape of the Appalachians,’ it describes a process as violent as the analogy suggests, in which entire mountaintops are removed in search of coal deposits. But when you consider the role played by commerce, it’s more accurate to describe the process as prostitution. 

It’s not unusual for those looking to destigmatize prostitution to argue that selling sexual access to one’s own body should be morally acceptable, precisely because it’s no worse than coal mining. So here we have two sides of a disagreement, both of whom see commonalities between prostitution and mining, even as they disagree over whether the action itself is good or bad.

How would we characterize what prostitution and mining have in common? Resource extraction, perhaps. Dynamiting Appalachian mountaintops has obvious tradeoffs, but on the upside you get to extract coal from the exposed rock, which you can then use to generate electricity. We accept the environmental destruction, deterioration in air quality, and changed landscape contours (or at least mostly choose to overlook them), because the alternative—no electricity—appears worse. 

Selling access to female bodies is also a form of resource extraction. The product may be subtler—orgasm, the illusion of intimacy, darker types of wish-fulfilment or, in the case of commercial surrogacy, a human baby—and the tradeoffs less visibly destructive than a landscape reshaped. But the dynamic is similar. In each case, a woman rents access to something that we consider to belong to each individual alone—her body—and earns money in return. The American Civil Liberties Union, which has supported the decriminalization of prostitution since 1975, recently argued for de-stigmatizing “those who choose to make a living by self-governing their own bodies.” Earning money independently is good. Self-government over our own resources is good. So on what basis can we criticize people who choose to sell access to their own bodies?

In his 1954 essay ‘The Question Concerning Technology,’ Martin Heidegger argued that when we organize life under the rubric of technology, the world ceases to have a presence in its own right and is ordered instead as ‘standing-reserve’—that is, as resources to be instrumentalized. Coal and iron ore, the products of technology themselves, and even human sexual desire then come to be seen as part of the standing-reserve. It becomes increasingly difficult to see reasons why there should exist any limits on extracting such resources.

Today, it feels as though we’ve always been engaged in this inexorable onward march. From a more mainstream perspective, what Heidegger is describing is simply the process we now call economic development. It is the transition from pre-industrial societies—characterized by primitive and localised forms of exchange, low workforce mobility, and in many cases by extreme poverty—to longer and more complex supply chains, technological innovation, more trade, more stuff, more wealth, and more personal freedom. 

But as the Austro-Hungarian economist Karl Polanyi argued in The Great Transformation, for much of human history trade occupied a much less central place in human relations than it does today: “man’s economy, as a rule, is submerged in his social relationships.” Polanyi showed how in Britain, economic development and the emergent market society were driven by the Enclosure Acts between 1750 and 1860. Prior to enclosure, much of Britain comprised subsistence-farming peasants living on land held and worked in common.

Enclosures, justified by a need for greater farming efficiency, stripped the peasantry of any right to common land in favor of a private property model. Pre-enclosure, the necessities of life might have been bare, but many of those necessities existed outside the realm of ownership and trade. A peasant might spend his or her whole life in a tied cottage, with a right to common land, working to produce food but with very little need to buy or sell anything. Enclosure reordered whole swathes of human life from the shared, social realm to that of standing-reserve: that is, the realm of private property and transactional exchange.

Post-enclosures, what had previously been held in common—whether land or labor—was now privatized as standing-reserve for exploitation by free individuals. In the process, millions of human lives were arguably made much freer. The working poor were liberated from the feudal ties often implied by subsistence farming, free to move if they pleased, and free to sell their own labor for money. 

But this development was never simply the voluntary spread of a new, enlightened way of making everyone better-off. Like mining, it came with tradeoffs: peasant resistance to the Enclosure Acts suggests that for those people, at least, something was lost. And if enclosure opened up domestic markets in goods such as housing and food, it did not rely on the consent of those British peasants forcibly displaced from subsistence lifestyles into waged factory work.

The violence involved in opening up colonial markets likewise belied the benign invisible hand. In February 1897, for example, not long after the completion of the enclosures in Britain itself, British imperial officials responded to the Oba of Benin’s refusal to open up trade in palm oil and rubber from his thriving city-state on the Niger Delta. Their answer was the Punitive Expedition, in which 5,000 British troops armed with machine guns razed Benin, massacring its inhabitants, flattening its temples, and looting the bronzes that inscribed its most treasured cultural memories. A month after the Punitive Expedition, a golf course had been laid over the city’s site, with the ninth hole where the most sacred tree had stood.

Most histories of the present characterize the story of economic development as an upward arc of human progress, one that has liberated millions from indentured labour into greater agency as free individuals in market society. And there’s something in this story of freedom; I wouldn’t swap my life today for that of a medieval subsistence peasant. But, like the extraction of Appalachian coal, nothing comes without tradeoffs. And while it’s easy enough to describe historical events in our transition from a largely relational society to a largely transactional one, the cost of moving to a market society is more difficult to count.

It’s perhaps easier to find a way into this blind spot via a more recent large-scale displacement of humans from a relational to a market existence. The migration of women from the domestic sphere to the workplace began in earnest in the 20th century, and it’s perhaps not a coincidence that it gathered pace around the time the economic gains available via overseas colonial expansion began to falter. I’ve never been a subsistence peasant or Aboriginal nomad, but for a few years I did step a small distance outside the market society as a full-time mother. And what I learned there about how, and why, this form of work is invisible today helps to illuminate the tradeoffs demanded by the market society. It also offers clues as to how we might yet stand for things crucial to humans but indefensible within a transactional worldview, such as ecosystems, sacred places, or even a view of dating that isn’t a sexual marketplace.

For something to be treated as standing-reserve, it must be possible to own it. Our social norms demand that we claim ownership of a resource before exploiting it. Selling my labor in the marketplace presumes that I can dispose of my time as I see fit, that no one else has a claim on my time or my body—in short, that I’m a free individual.

But to be a mother is quintessentially to experience not entirely belonging to yourself. It begins in pregnancy, with the realization that what you eat or drink affects your unborn child; it continues with breastfeeding, as you make the food that nourishes your child with your own body; it goes on far beyond infancy, in the need your children have to be physically close to you. When you see how powerfully your small child craves your presence, it’s very difficult to sustain the illusion of belonging only to yourself.

To the extent that something belongs to others as well as to ourselves—such as common land in 18th century Britain—it will resist being privatized for use as standing-reserve. So caring for my child can’t easily be viewed as a transaction, because it’s a relationship in which we aren’t exactly individuals. That is, we don’t belong only to ourselves, but to each other as well. And when you don’t belong solely to yourself, work can be understood not as a transaction—my labor exchanged for your money—but as relational. In other words, it is less oriented toward resource extraction and exchange than sustaining interdependent patterns of life. 

This in turn helps explain why the politics of our market society has such a blind spot where motherhood is concerned: the market society’s notion of liberation into the standing-reserve is deeply at odds with the work of caring. Sustaining interdependency isn’t about fleeting transactional logic. It’s about maintaining a valuable relationship. I don’t care for my child or my partner because I have a utilitarian goal in mind, but because we belong to each other and that makes caring for them a necessity for my existence too. 

Despite being in a sense repetitive and goal-less, caring is also pregnant with meaning. As the pioneering biosemioticist Wendy Wheeler puts it in Information and Meaning, repetition and pattern are central to communication throughout the organic and even the inorganic world. Organisms and natural systems don’t just respond to one-off signals, but rather exist in emergent, interdependent dialogue with the signals sent by other organisms and environmental factors around them—what Jakob von Uexküll calls an organism’s Umwelt. Thus, information in the natural world does not exist in some abstract sense, but only in the context of how it’s received within larger feedback loops. From the smallest microbiota to complex human civilisations, meanings are fundamentally relational, contextual, and pattern-based. 

Seen this way, it’s easier to understand why non-transactional, relational spheres of life, particularly family, remain Americans’ most potent sources of meaning. For individuals, meaning is to be found less in peak experiences, one-offs, the exceptional or the abstract; it hides in the repetitive, the everyday, and the relational. At a collective level, meaning coils through those pattern-languages transmitted via tradition, whether in vernacular architecture, folk music or oral histories. It lies thick in sacred places: humans have long used pattern, repetition, and the expected as the core of ritual religious and spiritual practices.

The philosopher Adam Robbert connects meaning-making with askēsis, a Greek term that refers to the act of practice and discipline as itself a form of extended cognition, one that enables the expansion of meaning-making beyond the rational sphere by bringing together attention and repetition. We can understand motherhood as a kind of relational askēsis, whose core is the attentive, attuned pattern-work of sustaining a child’s Umwelt while they are too young to do it themselves. This is a central reason why many women are willing to sacrifice social status and earning power to work part-time or stay at home with young children: it’s as satisfyingly rich in meaning-as-pattern as it is starved of social status and pecuniary reward.

But mothering’s central concern with pattern, sameness, and contextual meaning, as opposed to information, devalues it in the order of standing-reserve, even as it delivers untold riches on its own terms. Information theory, a core science underpinning much of our technology, explicitly excludes the realm of pattern and sameness as ‘redundancy,’ preferring to focus on the unexpected. Our contemporary culture is quintessentially one of information theory: we celebrate the new, the innovative, the individual who doesn’t follow the rules. I can’t think of many movies where the hero defies calls to go his own way and instead saves the world by embracing convention.

And yet meaning, as Wheeler emphasizes, “is made up of pattern, repetition, the expected.” Information theory is thus blind to it, as she further points out: “What information engineers count as redundancy, living organisms in their systems count as meaning.” In this worldview, the tradeoff between motherhood and the workplace is a brutal one. No matter how meaningful life with a baby seems in its relational context, we have no vocabulary for understanding that, save as redundancy. It’s no surprise to discover that market society frames caring for children as a punishment: “the motherhood penalty.” 

The transactional world has little facility for repetition, pattern, or the expected; this is ‘redundancy’ to be dismissed in pursuit of the special, the distinct, the signal. This blindness to meaning-as-pattern, visible in the devaluation of motherhood and trust relationships, is similarly evident in contemporary architecture’s indifference to the vernacular pattern-languages of local built environments, which encode ways of life specific to different places. You can see it again in the treatment of folk music as second-class and unoriginal, the dismissal of religious practice as dogma, or the indifference to accumulated sacredness that allowed the destruction of Juukan Gorge.

Within the worldview that reads motherhood as a punishment, ecologies of meaning accumulated via everyday pattern, human relationship, or religious ritual are at best yet-to-be-monetized resources. If they resist this transformation, they are obstacles to be ignored or dynamited. Bringing these pieces together, it’s now easier to see what’s lost under the rubric of information theory and standing-reserve. To see the world in terms of standing-reserve means seeing it as transactions rather than relationships, and information rather than meaning: as Heidegger puts it, “shattered,” and confined to a “circuit of orderability.” 

This shattered world is the same one the market society mindset calls ‘open’: openness to new forms, after all, means weak adherence to existing ones. To borrow Oscar Wilde’s famous phrase, then, seeing the price of everything by definition means seeing the value of nothing. Reframing the world in transactional terms, as ‘open’ resources that can be instrumentalized, necessitates the destruction of its meanings. Strip-mining self-evidently degrades the environment being mined. After demutualization, it took less than two decades for Britain’s building societies to go from embedded, localized community micro-lenders to casino-banking basket cases. And people who sell sexual access to their own bodies find it difficult to form and maintain intimate partner relationships.

Likewise, treating human gestation as a service in commercial surrogacy interrupts the biologically-based symbiosis between mother and child that makes such relationships resistant to marketization. Instead, surrogacy contracts treat the baby as separate from its mother, a product that can be commissioned. Humans are thus shattered and reordered as objects, as in the case of a disabled child rejected both by her commissioning ‘parents’ and by her Ukrainian gestational mother, as though she were a faulty smartphone.

Here we begin to see more clearly who pays when we replace meaning with information and relationship with transaction: anyone in need of care, and anyone leading an ordinary life. The winners in the information world are those whose lives are oriented toward peak experiences, agency, variety, surprise, and control. To the extent that you find fulfilment in pattern, repetition, and the quotidian, a technological and economic order blind to meaning-as-pattern and hyperfocused on the unexpected will be, by definition, unable to see you. 

But we’re running out of relational resources to convert and consume. Much as, on current trends, many key natural resources will be exhausted within a few decades, there are signs that the relational commons underpinning ordinary human life in our civilization is now so shattered that society’s capacity to function is increasingly compromised. Certainly where I live in Britain, the weak institutional response to COVID-19 has revealed a nation in which social solidarity may be present on a local level, but is increasingly, acrimoniously, absent at scale.

Pursuing resilience in this context means seeking out the relational, and looking to strengthen it: that means standing up for the interests of women, babies, the everyday, the natural world—and the value of norms, custom, and religious faith. From this, it follows that defending women and the environment means not embracing but resisting the logic of transaction. In that case, communities with some religious basis for sustaining relational resources as a sacred domain will prove more resilient than the ‘liberatory’ vision of market society and standing-reserve—precisely because they reject the appetitive logic of transaction. 

From a transactional point of view, this is at best a romanticization of some imaginary lost Eden, and at worst a manifesto for ending innovation and demanding a return to pre-industrial society. But a defense of ordinariness, pattern and repetition does not imply turning back the clock, or levelling all humans to identical cellular automata. Nor is it a case against extraordinary people: the natural world, after all, has megafauna as well as microbiota.

Making the case for meaning as well as information is not to claim that we should revert to Tudor times, all be the same, or all spend our lives raising children. But it’s to defend pattern, repetition, and ordinariness as valuable in their own right, whether as the medium for future rituals and sacred places to emerge, as the domain of social life, or simply as bulwarks against the voracity of a transactional worldview that would commodify even our deepest social instincts. It’s to argue for our radical interdependence with our Umwelt. And it’s to affirm that in order for a society to thrive, sacred things must not just be defended as exempt from standing-reserve, or moved to a museum like the looted Benin bronzes, but continually and actively re-consecrated. 

Originally published at Palladium

The world according to LARP

Who would have guessed that a weekend hobby for outdoorsy nerds could spawn an era-defining political metaphor?

LARP, or live action role-playing, is an offshoot of the fantasy roleplaying subculture. It involves dressing up in costume and acting out a fantasy-fiction game narrative in real time and space, sometimes over several days. A witty friend once characterised the experience as something akin to ‘cross-country pantomime’.

Thanks to lockdown, no one’s LARPing this year — at least not in the cross-country pantomime sense. But the word ‘LARP’ has escaped into the wild: far from being the preserve of fantasy fans, I’ve noticed it appearing with increasing frequency in political discourse.

When riot police finally pushed activists out of the Capitol Hill Autonomous Zone following the murder of one joyriding teenager and the serious wounding of another by CHAZ ‘security’, resistance to the advancing riot shields was so paltry it prompted contemptuous accusations of ‘revolutionary larping’. Weird Christian Twitter (it’s a thing) hosts arguments where people are accused of larping more traditionalist theologies than they truly espouse. Still further out on the fringes, the QAnon conspiracy rabbit hole (don’t go there) is fiercely defended by its True Believers against accusations that it is, in fact, a bunch of people LARPing.

Around the time my friends were discovering LARP, I got into LARP’s Very Online cousin, Alternate Reality Gaming (ARGs). An artefact of the age before Facebook and Twitter colonised most of the internet, ARGs are a hybrid of online treasure hunt, mystery story, and live-action immersive theatre. The first mass-participation ARG was a promotional stunt for the 2001 film AI, and featured a series of fiendish clues for participants to crack and follow, which unlocked further elements of story including live-action segments.

For a moment in the mid-Noughties, ARGs looked like the future of storytelling. The idea of internet communities over-writing stable systems of meaning with playful new narratives that danced back and forth between the web and real world felt refreshing and subversive. With hindsight, though, the phenomenon was just a more-than-usually-creative moment in a widespread unmooring of reality that’s been under way for decades.

It’s not all the fault of the internet. In 1955, the philosopher J L Austin developed a theory of ‘performative’ language: that is, language that does something to reality in the act of being spoken. ‘I pronounce you man and wife’ is an example of performative speech — words that effect change through the act of being spoken.

Then, in 1993, the queer theorist Judith Butler borrowed the concept of ‘performative’ language wholesale and applied it to sex and gender, arguing that the identities ‘man’ and ‘woman’ — along with the bodies and biologies associated with those identities — are performative. In taking these roles on, Butler claimed, we make them real.

While these ideas pre-date mass adoption of the internet, the notion that we participate in creating our own realities has been wildly accelerated by social media. Online, it’s easy to get the impression that we can reinvent ourselves entirely, independent of our bodies or other dull ‘meatspace’ constraints. Unsurprisingly, Butler’s conception of sex and gender as performance has long since escaped the petri dish of academia and, like the concept of LARPing, is evolving rapidly in the wild.

Strikingly, the word ‘performative’ has also mutated. Today, it isn’t used in Butler’s sense, to mean “a performance that shapes reality”, but in the opposite one: an insincere performance for social kudos. So, for example, celebrity endorsements of social justice orthodoxies are often accused of being ‘performative’. It means much the same as ‘larping’, but with an added payload of cynicism. So where ‘LARPing’ means “playacting at something you wish you were”, ‘performative’ means “playacting at something you don’t really believe”.

Meanwhile, the LARP is no longer confined to cheery camping trips accessorised with pretend armour. Back in the noughties, online communities refactoring reality to fit a fantasy storyline felt like a fun game, but as I stare into the sucking void of the QAnon conspiracy, that perspective now seems hopelessly naïve. It’s not a game today: it’s how we do politics.

Liberal commentators spend a great deal of energy trying to explain why this is bad. Countless writers ‘fact-check’ Trump’s bloviations, seemingly unaware that from the perspective of reality-as-ARG, the fact that Trump is lying doesn’t matter. Nor does it really matter whether QAnon is real or not. Reality is, to a great extent, beside the point.

Laurie Penny got closer to the truth in this 2018 piece, where she characterises the very notion of a ‘marketplace of ideas’ as being a kind of LARP: “a Classical fever-dream of a society where pedigreed intellectuals freely exchange ideas in front of a respectful audience”. The reality, she argues, is that this ‘marketplace of ideas’ is less free, rational exchange than dick-swinging theatre.

Those who like to imagine this pessimistic perspective is new, wholly the fault of the Orange Man (or perhaps Facebook), should recall the words of an unnamed aide to George W Bush, quoted in 2004 on the relationship between facts, reality and the military invasion of Iraq:

The aide said that guys like me were “in what we call the reality-based community,” which he defined as people who “believe that solutions emerge from your judicious study of discernible reality.” I nodded and murmured something about enlightenment principles and empiricism. He cut me off. “That’s not the way the world really works any more,” he continued. “We’re an empire now, and when we act, we create our own reality. And while you’re studying that reality — judiciously, as you will — we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out. We’re history’s actors . . . and you, all of you, will be left to just study what we do.”

Ron Suskind, New York Times

Though his approach was less overtly hubristic, Tony Blair’s embrace of spin reflected a similar belief in his own ability to manipulate public narratives. ‘Political communications’ has only grown in significance since those days, and taken a self-referential turn. Today it’s as common for commentators to criticise a politician for performing badly at a presser — for poor-quality larping, or bad theatre in Penny’s formulation — as for saying things that are immoral or factually wrong.

Donald Trump is distinct from George W Bush not so much in disdaining facts as in lacking the religious conviction Bush deployed to fill the gaps left behind by that disregard. But both, in different ways, embodied the idea that what you believe is what is. If you LARP hard enough, this view says, your larp will come true.

Boris Johnson’s administration has something of the same cavalier attitude to the relationship between facts and rhetoric. To date, the handling of coronavirus has routinely over-promised and under-delivered, while seeming indifferent to the disorienting effect on public life of a string of announcements only very loosely tethered to everyday experience.

It’s not a coincidence that this larpification of politics has evolved in tandem with a public fixation on ‘evidence-based policy’. The political polarity of absolute LARP — blatant political lying — and absolute insistence on evidence are two sides of the same loss of faith in a common understanding of reality.

If you’re not convinced, consider SAGE, the government’s scientific advisory committee. Then consider ‘Independent SAGE’, a kind of counter-SAGE comprising scientists every bit as eminent as those on SAGE. This august body produces its own carefully evidence-based reports, which are then used as a foundation from which to disagree with whatever positions the Tories choose to adopt from official SAGE.

Who do we believe? That’s politics. If the Brexit debate hadn’t already killed your faith in ‘the evidence’, the competing claims of SAGE and counter-SAGE should be the death-blow. There is no dispassionate foundation of facts we can deploy to take the politics out of political decisions. The original LARPers might have a fun weekend, then pack up and go back to their normal lives; but in its political sense, there’s no outside to the game. It’s larping all the way down.

Some parts of our culture are coping better with this shift than others. Among the worst performers, to my eye, are mainstream liberals of both left and right. Many have responded to the larpification of everything by concluding that in losing objectivity we’ve lost reality. Some then imagine they can make reality whatever they want it to be by sheer force of will (the Trump/Bush approach). Others suggest we can fix things by supplying enough facts to prove whatever we already believed (the SAGE/counter-SAGE approach). Others, such as Laurie Penny, try to refuse to play.

But we haven’t lost reality, just the fixed vantage point we pretended we had from which to evaluate it. What we have instead is a kind of reality-shaping free-for-all, and there’s no opting out.

As most of us flounder, disoriented, we’re starting to see subcultures adapting. The old story about the Inuit having 50 words for snow is (appropriately) itself probably fake news. But much as a snow-dwelling people might be expected to develop specialist terminology for different types of frozen precipitation, we should understand the emergence of words like ‘larp’ and ‘performative’ as analogous. We’re developing a specialist vocabulary for types of unreality.

We’re also having to find new ways to talk about the reality that, inconveniently, refuses to go away completely. The grim story of the Iraq invasion and its destructive and bloody aftermath gave the lie to Bush’s messianic faith in his capacity to create a new reality to order. Humans still can’t really change sex. And no amount of fiddling the statistics can bring back those people who’ve already died of coronavirus.

The political future turns on our being able to get used to parsing our new Babel for what’s actually happening, and what actually matters. We have to get used to doing this without trying to eliminate the element of LARP (spoiler: can’t be done) or pretending we can abolish reality (ditto).

But there’s no putting the genie back in the bottle. If the ground is moving under all our feet now, the way forward is learning how to dance.

This article was originally published at UnHerd

The Irreligious Right

Today’s hottest property: young fogeys. Blue Labour hailed Boris Johnson’s landslide election victory as a rebellion by the country’s ‘culturally conservative’ silent majority. A new conservative magazine seems to appear every week. We have even seen a youth movement for the revival of socially conservative values popping up in that bastion of modern double liberalism, the Conservative Party.

What do they all want? At the more wonkish end of the debate, the argument is broadly that the political push throughout the twentieth century for ever greater social and economic freedom has brought many benefits, but that these have been unevenly distributed and are now reaching the point of diminishing returns.

The pursuit of ever greater freedom and individualism, this strand of thought argues, has delivered rising wealth while hollowing out working-class communities; liberated some women, while forcing others to work a double shift and abandon the young and old to substandard care; and provided an infinitude of consumer choice, but at the cost of mounting ecological damage. Under the sign of radical individualism, the new communitarians argue, we are all becoming more solitary and self-absorbed. Even charitable giving seems to be in unstoppable decline.

But what, in practice, are the new social conservatives seeking to conserve? Calls for a revival of cultural conservatism, many in the name of Christian values, seem often on closer examination oddly insubstantial. In 2017, UKIP’s leader-for-that-week Stephen Crowther said that the UK is a Christian country, “and we intend to stay that way.” But for Crowther, being a Christian country does not seem to impose any obligation to actually be Christian: 

including Christian in our list [of principles] does not imply any requirement for individual faith, but it reflects the Judeo-Christian classical and enlightenment origins on which our laws, our social systems and our cultural norms have been built over two millennia.

Elsewhere in Europe, Hungarian Prime Minister Viktor Orbán describes his brand of authoritarian, identity-oriented politics as ‘Christian democracy’. Only a minority of Hungarians go to church every week – 56% of the country identifies as Catholic, though only 12% attend church regularly – but the identifier ‘Christian’ has nonetheless become central to Orbán’s politics.

Much as Crowther did, the Orbán-supporting Bishop of Szeged, László Kiss-Rigó, bridges this gap with a vague, cultural definition of what actually constitutes a ‘Christian’: “In Europe, even an atheist is a Christian”, he said. It turns out that being ‘Christian’ is less about prayer or doctrine than ‘values’: “We are very happy that there are a few politicians like Orbán and Trump who really represent those values which we Christians believe to be important.”

What exactly are these values, then? Attendees at anti-Islam Pegida rallies in Germany carry crosses and sing carols. Italian right-winger Matteo Salvini punctuates anti-immigration rhetoric by brandishing a rosary, drawing criticism from the very Catholic faith whose symbols he invokes. Try to pin down any actual values this form of Christianity might require of its adherents, and matters are much less clear.

Even those whose stated desire is to defend the place of faith in public and political life seem keen that the faith itself stop short of imposing actual obligations. To take a more moderate example of the new cultural conservatism, the Social Democratic Party took a broadly post-liberal, culturally conservative stance in its 2018 relaunch. The New Declaration made an energetic defence of our right to hold even illiberal religious views openly in public life:

Citizens holding a traditional, patriotic or religious outlook are often bullied and marginalised, stifling the open debate upon which a free and democratic society depends. 

Then, about a year later, the SDP lost its only donor over a bitter intra-party dispute about whether or not it should be party policy to ban halal slaughter – a position markedly at odds with the party’s previous defence of religious pluralism. And when the Church of England recently reiterated its long-held position on sex and marriage, prominent SDP member Patrick O’Flynn took to the pages of the Daily Express to mock ‘the otherworldliness of these Men of God’. Instead of insisting on ‘out of touch’ doctrine, O’Flynn suggested, the Church should adjust its doctrines on sex and marriage to reflect the values of the young people it hopes to attract to weekly worship.

In this view of faith, theological positions do not reflect any kind of truth-claim but should be emergent properties of the aggregate ethical positions held by the members of that church. Less ‘Christian democracy’ than ‘democratic Christianity’: whatever the congregants believe becomes the doctrine of the church.

From a religious perspective this makes no sense. To the believer, doctrine is handed down from God Himself. The thought of God’s word being subject to plebiscite is absurd, if not outright blasphemous.

This debate reveals the missing piece in today’s would-be conservative revival. Where do our values come from? What is the proper source of political authority? Progressives gesture at natural rights or an imagined future utopia, but for anyone who remains unconvinced that we are all on a journey somewhere wonderful, some other authority is required.

Edmund Burke suggested the answer lay in a blend of deference to tradition and God’s grand design, tempered by carefully constrained democratic institutions; his Savoyard contemporary, Joseph de Maistre, argued that the only proper form of authority lay in God’s will, delivered via the Pope and an absolute monarch.

The history of modernity has unfolded in the tensions between these competing understandings of political authority. ‘The will of God’, the will of ‘the People’, and the grand designs of various utopias have variously been used to justify all manner of enterprises, with outcomes from the magnificent to the horrific. But our present political difficulties may be in part down to a growing popular discomfort with accepting the legitimacy of any of the above.

Since the election of Donald Trump and the vote to leave the EU, there has been a low but persistent rumble from our moral betters that democracy should maybe have its wings clipped a little, to stop stupid proles making bad decisions. A degree of wing-clipping has in fact long since taken place: John Gray has discussed recently in these pages the way the language and legal mechanism of ‘rights’ is used to shift entire areas of public life from democratic debate to the dry realm of unelected lawyers and judges. But if authority does not reside in the will of the people, nor does it reside with God: it is difficult to imagine a mainstream British politician claiming moral authority on the basis of divine will without being roundly pilloried.

Progress and human rights, then? Every young person who passes through a modern university is taught in no uncertain terms that totalising metanarratives are suspect. At best, they are power moves. Whenever you find one you should ask cui bono? In the case of universal human rights, the answer is probably: lawyers.

This leaves would-be conservatives in a bind. If (with a few honourable exceptions still holding out for direct Vatican rule) political authority rests not in tradition (too restrictive on personal liberty), or democracy (probably rigged), or even God (don’t tell ME what to do!), or the lawyers, then what is left? Politics professor Matt McManus argues that the result is a postmodernism of the right as well as of the left: a series of nested calls for a return to authority, tradition and culture that all, on closer inspection, turn out to be largely delivery mechanisms for adversarial but hollow identity politics.

Having come unmoored from its roots either in the past, the divine, or the popular will, McManus suggests that this postmodern conservatism has warped a Burkean belief in tradition into a kind of moral cosplay whose main purpose is less seeking the good life than making a noisy defence of whichever identities its sworn enemies attack. As the postmodern liberal-left demonises heterosexual white males, so postmodern conservatism sets out to defend them; and so on.

Seen in this light, the problem with Orbán and other borrowers of Christian clothing is not that they do not believe their own words. Inasmuch as they can mean anything, they genuinely identify as Christians. It is more that when all sources of authority are suspect, the only legitimate recourse is to the self: to identity, and identification.

And the problem with identification is that it remains separate from whatever it identifies as. Just like the modern dating marketplace, where commitment is radically undermined by the ease of swiping right, modern cultural conservatism is radically undermined by the fear that without a reliable foundation of authority, and with more identity-choice options only a click away, we are never fully the thing we claim as our identity.

Without a sense of confidence in the roots of its political legitimacy, conservative values dissolve from concrete obligations to consumer accessories. This in turn is why Orbánist ‘Christian democracy’ and many of its populist cousins find their most compelling realisation not in religious doctrine or observance, but in defining themselves against their outgroup. If “even an atheist is a Christian” then either no one is a Christian, or everyone is. The only way of defining what a Christian is, is in terms of what it is not: foreigners.

But if this is so, then in a postmodern environment, shorn of recourse to authority, cultural conservatism is a waste of energy. It cannot define what it wants. All is insubstantial; there is no exit from the Matrix, nothing left to conserve.

Does it follow from this that those who long for place, limits, love, family, faith and meaning should just sit in the rubble and watch it all burn? I do not think so. But when there is nothing solid to go back to, anyone attracted to what is left of the ideology that used to be called ‘conservative’ needs to find a new name for their yearning. ‘Constructionists’, perhaps. There is a lot of building to do.

This article first appeared at UnHerd

Turning royalty into royalties impoverishes us all

What if we could create a marketplace for relationships, so that – just as we can rent our homes on Airbnb – we had an app that allowed us to sell at the market rate dinner with our husbands or bedtime with the kids?

Marriage is a legally recognised agreement after all, one that has been shown to confer many benefits for health and wellbeing. Why should I not be able to rent my place as wife and mother in my particular family to others who wish to enjoy some of those benefits?

Ryan Bourne of the Cato Institute recently argued that the technology exists to enable us to trade citizenship rights. Calling the right of British nationals to work in the UK’s high-wage economy “an effective property right we own but can’t currently trade”, he suggests we could ease immigration pressures by implementing an Airbnb-style secondary market in working rights.

If we frame citizenship, or marriage, as something owned by an individual, it is simply a set of bureaucratic permissions. Like the right to live in a house, surely this could be traded in a marketplace? And if the technology exists to create a citizenship market, surely we could do the same for marriage? I could sublet my wifedom and nip off for a weekend on the tiles with the proceeds. Why not?

The problem is obvious — my husband and daughter would, not unreasonably, object. She would no more want her bedtime story read by a stranger than my husband would want to share a bed with that stranger.

My marriage is not a good I own but a relationship, created by mutual consent. In a marriage, I give up some of my autonomy, privacy and private property rights by declaring my commitment to the relationship. What I gain is of immeasurable value: a sphere of belonging, the foundation of my existence as a social creature.

Likewise, citizenship implies relations of belonging, both of me to a community but also a community to me. It also implies commitments on behalf of the community of which I am a citizen. And in exchange it requires commitments of me, as a citizen: to uphold the law, to behave according to its customs and so on. As the late Roger Scruton put it in a 2017 speech:

The citizen participates in government and does not just submit to it. Although citizens recognise natural law as a moral limit, they accept that they make laws for themselves. They are not just subjects: they appoint the sovereign power and are in a sense parts of that sovereign power, bound to it by a quasi-contract which is also an existential tie. The arrangement is not necessarily democratic, but is rather founded on a relation of mutual accountability.

Roger Scruton

Just as my husband and daughter have a stake in who is entitled to be called “wife” or “Mummy” in our particular context, so other citizens of a nation have a stake in who is entitled to the rights conferred by citizenship.

In this light we can better understand the revulsion that greeted the actions of the Duke and Duchess of Sussex in trademarking “Sussex Royal” for personal commercial gain. Royalty, after all, does not exist in a vacuum. It is not an intrinsic property of a person, like blue eyes or long legs, but something conferred both by the monarchy and also by the subjects of that monarchy.

As Charles I discovered in 1649, ultimately no king can govern save by the consent of his subjects. Royalty is not a private property, but a relationship. The popular disgust and anger engendered by the Sussexes’ move to transfer their stock of royalty from the relational public sphere to that of private property is in truth anger at their privatising something which does not belong to them but to the whole nation.

In The Question Concerning Technology, writes Josh Pauling, Heidegger argues that technology uncouples humans from what is real, paving the way for a mindset that treats everything as “standing-reserve”, or in other words “resources to be consumed”. For Heidegger, seeing the world thus is dangerous because it flattens all other perspectives:

Commodifying nature and humanity leads us to discard other understandings of being-in-the-world and the practices, beliefs and ideas that accompany them: all aspects of reality are incorporated into the ordering of standing-reserve.

Josh Pauling

My husband’s goodwill would rapidly wear thin were I to Airbnb my role in our family. Similarly, Bourne’s citizenship marketplace fails to consider how the general population would react to seeing fellow citizens renting their right to work to non-citizens and swanning about spending the unearned proceeds. And the goodwill enjoyed by the Duke and Duchess of Sussex while discharging their royal duties has already evaporated, now it transpires they wish to enjoy the privileges of their elevated station without embracing its obligations.

Treated as objects to be exploited, relational meanings wither and die. Treated as dynamic relationships, they are infinitely renewable. In this sense, they are more akin to ecologies in the natural world. In Expecting the Earth, Wendy Wheeler argues that in fact ecologies are systems of meaning: whether at the level of DNA or megafauna, she says, living things deal not in information but in meanings that change dynamically depending on context.

Why does any of this matter? “Modernity is a surprisingly simple deal,” writes Yuval Noah Harari in Homo Deus. “The entire contract can be summarised in a single phrase: humans agree to give up meaning in exchange for power.” The impressive achievements of modernity might make the loss of meaning seem, to some, a fair exchange.

But if Wheeler is right, meaning is more than an optional seasoning on the mechanistic business of living. In Man’s Search for Meaning, Viktor Frankl observes of his time in Nazi concentration camps that those who felt they had a goal or purpose were also those most likely to survive.

Indeed, the growing phenomenon of “deaths of despair” is driven, some argue, by deterioration in community bonds, good-quality jobs, dignity and social connection — in a word, the relational goods that confer meaning and purpose on life. As Frankl observed, humans need meaning as much as we need air, food and water: “Woe to him who saw no more sense in his life, no aim, no purpose, and therefore no point in carrying on. He was soon lost.”

An order of commerce that treats relational ecologies as objects that can be exploited will exhaust those objects. That is, in the course of its commercial activities it actively destroys one of the basic preconditions for human flourishing: meaning.

The Estonian thinker Ivar Puura has called the destruction of meaning “semiocide”. As concern mounts about the effects of pollution and emissions on the earth, campaigners have called for new laws to criminalise the destruction of ecologies, which they call “ecocide”. Perhaps we should take semiocide more seriously as well.

This piece was originally published at Unherd

Remainers are the ones longing for empire

In his valedictory speech as outgoing European Council President, Donald Tusk described Brexit as a delusion driven by the foolish nostalgia of those Brits still “longing for the empire”. His words prompted the usual harrumphing, but the truth is he has it precisely backwards. It is not Brexiters who are chasing an imperialist high, but those devoted to the European Union.

Since its founding, the EU has self-mythologised as a project of peace, whose principal aim is to prevent a repeat of the two World Wars of 1914 and 1939. The basis for this argument tends to be the notion that the World Wars were caused by an excess of “nationalism”, with the aggressive and expansionist German identity promoted by the Nazis held up as the primary exhibit, and that by diluting the power of Europe’s nation states, nationalism would also be attenuated.

Lately, despite its convoluted and multivariate origins, the First World War has also been recruited by European leaders as a cautionary tale against nationalism. But its origin can just as reasonably be described as a multi-sided jockeying for power between rival empires.

And as Yoram Hazony has argued in The Virtue of Nationalism, Hitler was less a nationalist than an imperialist, who sought to expand German-controlled territory and as such was resisted by the rival empires of Britain, the United States and other allies. That is to say, the two World Wars were arguably more driven by the competing interests of imperial players than an excess of national identification as such.

In the course of the horrific bloodshed between 1914 and 1945, these imperial powers lost their empires, or began the irreversible process of losing them. The British Empire was at its greatest extent, not to mention its most crisis-ridden, after the end of the First World War, and by the end of the Second was exhausted to the point where it no longer had either the will or the resources to sustain its imperial reach.

The international world order that replaced the Old World empires from 1945 until relatively recently was, in effect, an empire of American-influenced rules underpinned by American military and economic dominance. And in this new age of Pax Americana, international conventions established the right of nations to self-determination. It was no longer the done thing to invade countries halfway round the world for the purpose of grabbing resources, extending geopolitical influence and/or “civilising” the natives.

With no one overseas to colonise, what happened to the old ruling bureaucracies of the formerly imperial nations of Europe? What now for those educated with imperial dreams and a global vision, trained from a young age to run international business and political institutions, dreaming of rule across vast territories and hundreds of millions of benighted souls in need of guidance?

The solution they came up with was to colonise one another. To console themselves for the loss of the riches and ready supply of servants in their overseas colonies, the washed-up post-imperial nations of Europe agreed to pool their reach, influence and unwashed natives into a kind of ersatz empire.

It did not greatly matter whether the natives in question liked the idea or not, as the pooling was undertaken largely without public discussion and in practice (to begin with at least) made little difference to their everyday lives. Rather, the extension of ‘reach’ and ‘influence’ was largely a bureaucratic one, harmonising rules on the kind of trade and manufacturing standards which most ordinary people care very little about.

The result provided an imperial buzz for a cadre of civil servants, who got to dictate standards on the minutiae of countless areas of commerce for hundreds of millions of people rather than mere tens (and enjoy the perks of a colossal corporate lobbying industry in the process).

Even better, they could do all this without any of the demonstrable dangers of the kind of overheated jingoism that came with the style of imperialism that ended in bloodshed with the two world wars. A kind of diet imperialism, if you like: all the fun of civilising the heathens, with none of the guilt.

Their diet empire now constituted, the post-imperial civil servants of each EU member state could enjoy something of the lavish transnational lifestyle, money-no-object pageantry and grand entertaining they missed out on by the unfortunate fact of having been born too late for a career enjoying absolute power in the colonies while feathering their own nests. Indeed, the strange disappearance of a 2014 report on corruption within EU institutions suggests the diet imperialism of Europe offers ample opportunities of the nest-feathering variety.

Those in the administrative class who missed out on the opportunities for self-enrichment in the prewar empires can enjoy instead the huge and relatively unaccountable sums of money that flow around the European Union’s various budgets.

Indeed, even when misbehaviour tips over into outright criminal activity it can sometimes go unpunished, as was the case with IMF head Christine Lagarde, who received a criminal conviction in 2016 for negligence over inappropriate payouts while in the French Government but was nonetheless installed this year as head of the European Central Bank.

The administrative empire also delivers a servant class, at a scale appropriate to the post-imperial nostalgia it serves to alleviate. The debate around the Brexit referendum was full of dire warnings about the looming loss of staff to (among other things) wipe bottoms, look after children, pick fruit and make lattes.

These laments strongly hint at the preoccupations of a colonial class reluctant in the extreme to let go of a rich supply of subaltern masses whose services were rendered affordable by the expansion of the labour market through freedom of movement.

It is not just the servants. The prospect of losing the European extension to their shrunken, empire-less British geopolitical self-image cuts to the heart of our modern governing class. As one would expect, then, those lamenting Britain’s post-Brexit loss of “standing” or evolution into a “laughing stock” (who cares?) are not the supposedly imperialist and thin-skinned Brexiters but those who wish to remain. Because in their view the only available modern source of the suitably elevated pomp, influence and imperial “standing” to which they feel entitled is our membership of the EU.

Paradoxically, in the act of accusing Brexiters of the imperial nostalgia of which they themselves are guilty, the Remain Europhiles have hit on a term which is more accurate than they realise for their Brexiter foes: Little Englanders. As has been pointed out elsewhere, the original Little Englanders were anti-imperialist, and wanted the borders of the United Kingdom to stop at the edges of the British Isles.

The epithet tends to be used against Brexiters to imply jingoistic and probably racist imperial aspirations, but this is the opposite of what it meant when first used. And taken in its original sense, calling Brexiters Little Englanders is entirely accurate: they would like the borders of the nation to which they belong to be at the edge of the British Isles, not along the edge of Turkey or Russia.

Should they get their way, this will present the United Kingdom with the prospect of life as an independent nation of modest size. We can then look forward to a future going about our business much reduced from the giddy, extractive and racist highs of the early twentieth century but hopefully more stable, more content with ourselves and, importantly, perhaps even finally at ease with the loss of British imperial reach.

For the imperialist nostalgists of Remain, though, unable to reconcile themselves to the notion of the United Kingdom as anything but a world power, this possibility is anathema. The argument tends to be that unless we join a large power bloc we will be ground to dust between them. Gideon Rachman argued recently in the FT that “the EU needs to become a power project”: future geopolitics, he says, will be a contest between four or five large blocs, including China and the US, and the individual nations of Europe cannot hold a candle to these behemoths.

But must this necessarily be so? Rachman’s future is just a projection, and many projections – such as Fukuyama’s famous one about the “end of history” – have been proved wrong by subsequent events. Admittedly, a multipolar future seems likely. But any age of competing superpowers has also contained smaller nations that managed to avoid absorption into a larger empire by one means or another. Why should Little England not be one of them?

The only thing holding us back from a post-Brexit and doubly post-imperial future, at ease with our reduction in stature and ready for a new chapter in our national history, is the imperial nostalgia of the Europhiles.

This post was originally published at Unherd

The Somewheres are beginning to organise

Yesterday I attended the SDP’s party conference. The rump left behind when the rest of the party merged with the Liberals to become the Liberal Democrats has enjoyed something of a revival in the last year under William Clouston, who has led the charge to reinvent its social-democratic platform along distinctly post-liberal lines. The party is a minnow compared to the big hitters of conference season, but the conference was important. Here’s why.

With very few exceptions, the party’s leadership do not live in London. Its strongest support base is in Yorkshire, notably around Leeds where the conference was held. Clouston himself lives in a village in the North-East. In his closing remarks, he apologised to delegates for the fact that the next meeting will be in London. Where most of the big parties now talk about the need to take note of the perspective of people outside the capital, within the SDP the reverse is the case.

The party leans centre-left on economic issues and centre-right on social and cultural ones. Broadly speaking, it stands for family, community, nation and a robust welfare state, and bears some similarities to ‘Blue Labour’, Maurice Glasman’s project to bring issues such as family and patriotism back into Labour politics. But whereas Glasman’s project was to a significant degree driven by metropolitan intellectuals, the SDP is not driven by London voices or perspectives. This is also perhaps why the SDP has to date had little media cut-through, despite numerous polls suggesting widespread support for a combination of redistributive economic policy and small-c socially conservative values.

Movements that articulate concerns or perspectives widespread in the UK population outside major cities have in recent years often been traduced in the media as ‘populist’ or even ‘far right’. But while several speakers at the conference inveighed against identity politics and ‘political correctness’, the SDP is not reactionary. The first motion to carry was one to amend the party’s policy of banning non-stun slaughter to one of regulating it, both in the interests of religious tolerance and to avoid far-right dogwhistles. Clouston himself referred in his speech to a ‘decent populism’ that seeks to return the common concerns of those outside major cities and the liberal consensus to mainstream political discourse.

The watchwords were ‘community’ and ‘solidarity’. A key theme emerging from the speakers was: what are the proper limits to individual freedom? Where is it more important to consider the needs of a group? Who pays the price for ‘double liberalism’, and how can we mitigate those costs?

For some considerable time, politics has been something done by Anywheres (to borrow David Goodhart’s term) and done to Somewheres. Efforts to rebalance this have tended to be treated as monstrous aberrations that must be contained, whether with disparaging media coverage or more government funding for some client-state scheme or other.

But looking around on Saturday, my sense is this may change. The Somewheres are beginning to organise.

On halal, kosher, religious tolerance and having it both ways

Yesterday I live-tweeted the SDP conference in Leeds. It was a great day with many interesting speakers, but easily the most controversial discussion – and the one that has generated the most reaction in my Twitter mentions since – was the motion to amend SDP policy on non-stun slaughter. Previously, party policy was to ban these methods of slaughter; at the conference a motion was decisively carried to amend this to provisions on strict standards, ensuring supply does not outstrip demand (eg non-stun slaughter for export) and proper labelling.

I gather that debate around the subject prior to conference was heated; I know at least one person who left the party over it. I spoke in favour of the motion, despite being personally uncomfortable with such methods of slaughter, on the grounds that an explicitly communitarian party needs to be willing to recognise that religious practice is immensely important to some groups, and to create space for such practices even if we find them personally unappealing.

But once you start making explicit provision for communitarian considerations, the tension between faith and other ethical frameworks is immediately apparent.

The subsequent discussion – and its links into ‘preserve our culture’ groups such as For Britain and Britain First – put me in mind of two brouhahas a little while ago where politicians tried to articulate a position weighing private faith against public mainstream morality. In April 2017, then Lib Dem leader Tim Farron refused to deny that his personal faith held homosexuality to be a sin. In September the same year, Jacob Rees-Mogg made statements on abortion and homosexuality, consistent with Catholic social teaching, that saw him excoriated as ‘a bigot’ and ‘wildly out of step with public opinion’.

Commentators at the time lined up to defend Farron and Rees-Mogg. There was the usual hum from offstage (ie Twitter) about the right to express views in keeping with traditional Christianity without facing punishment from an illiberal liberal elite.

So I find it interesting that when it comes to a religious practice from Islam and Judaism – slaughtering animals by slitting their throats, without stunning them first – some of the voices raised most loudly against the iniquity of ‘You can’t say that’ culture as it bears on Christians today should be perfectly content to support policies that actively militate against the practice of those other faiths. If we are to defend Rees-Mogg and Farron on grounds of religious tolerance, should we not also consider defending halal and kosher slaughter on the same grounds? After all, the core argument of tolerance is not that one tolerates only what one likes or feels indifferent to, but that tolerance is extended to things one actively dislikes.

It feels to me as though there are two things going on here.

Firstly, the Britain First types who wish to support religious exemptions for Christians but not for Jews or Muslims are not, for the most part, themselves Christians. Rather, they are secular inheritors of the Christian tradition who wish to preserve the structure of that tradition for the benefits it has for some time provided – a fairly stable, prosperous, harmonious society with congenial values – without taking on the obligations of the faith itself. To put it in a less fashionable way, they wish to be redeemed without themselves taking up the cross. For that, in a nutshell, is the position of those who argue against ‘illiberal liberalism’ but do so from a perspective that rejects the necessity of faith – any faith, perhaps, or Christianity in particular – in creating the society to which they wish to belong.

We might term it ‘religious utilitarianism’: a worldview that recognises the utility of faith in delivering certain social goods, but takes no position on the truth or otherwise of the tenets of any faith in particular. Liberal relativism is a kind of equal-opportunities religious utilitarianism, which wishes to make space for any and all faiths to provide those goods in a pluralistic way, while the Britain First / Batten-era UKIP version wishes to privilege Christian religious utilitarianism over the more relativistic liberal sort. That is, Britain First types want to keep only the outward forms of Christianity, but do not wish anyone else to replace those forms with a more deeply felt faith of their own.

But if we are to argue for religious tolerance, and for Christianity to play an active rather than a purely decorative role in our society, then – the logic dictates – we must either be explicit about repressing other faiths in support of that goal, or else extend the same courtesy to those faiths. The alternative – hiding our hostility to other faiths behind a selectively applied appeal for religious tolerance only as it pertains to ‘our’ deviations from the liberal consensus – is simply not good enough.

Weekend long read pick: the real problem at Yale is not free speech

If you’re looking for something long-form this weekend, and are tired of culture war takes on student ‘wokeness’, this lucid piece by Natalia Dashan in Palladium may even give you some measure of compassion for the lost children of America’s super-elite.

A class-inflected personal account of the author’s experience at Yale, the piece argues that the Great Awokening is less a free-speech issue than a byproduct of a loss of moral purpose in America’s upper class. Her view is that America’s young elite has so far lost the desire to rule that for the most part it now prefers to give away its power, whether via careers that effectively render its members middle class, or by throwing itself into ‘social justice’ activities whose purpose is less social justice than social bonding, or what she calls ‘coordination by ideology’.

Wokeness, she suggests, is really a convoluted and guilt-ridden form of class signalling that serves both to police the boundaries of an elite in-group and to deflect any genuine responsibility for leadership that membership of a franker, more self-confident elite might entail. Because it is not rooted in any clear objectives or shared political interests, the psychodrama of wokeness also relentlessly devours itself, creating a negative elite feedback loop in the process:

It doesn’t matter that the ideology is abusive to its own constituents and allies, or that it doesn’t really even serve its formal beneficiaries. All that matters is this: for everyone who gets purged for a slight infraction, there are dozens who learn from this example never to stand up to the ideology, dozens who learn that they can attack with impunity if they use the ideology to do it, and dozens who are vaguely convinced by its rhetoric to be supportive of the next purge. So, on it goes.

She asks: who benefits? In her view, those who wish to duck responsibility, to obscure their class status, or to build power bases in the chaos it creates. The price of this evasion of leadership is no less than ‘the standards of reality itself’, alongside a cumulative decay of institutions whose purpose would once have been to channel the idealism and noblesse oblige of a young elite into public service.

And this matters, because what is now well-established at Yale will trickle down not just across America but across the world:

And what’s happening at Yale reflects a crisis in America’s broader governing class. Unable to effectively respond to the challenges facing them, they instead try to bail out of their own class. The result is an ideology which acts as an escape raft, allowing some of the most privileged young people in the country to present themselves as devoid of power. Institutions like Yale, once meant to direct people in how to use their position for the greater good, are systematically undermined—a vicious cycle which ultimately erodes the country as a whole.
Segments of this class engage in risk-averse managerialism, while others take advantage of the glut to disrupt things and expand personal power. The broader population becomes caught up in these conflicts as these actors attempt to build power bases and mobilize against each other. And like Yale, it seems a safe bet that things will continue and even accelerate until some new vision and stable, non-ideological set of coordination mechanisms are able to establish hegemony and become a new ground for real cooperation.

As to what that ‘new vision’ might look like, the author has less to offer. But the piece is a persuasive first-hand analysis by someone in a position – by virtue of her background – to reflect critically not just on the content but also on the social form of the contemporary US campus wars.

This piece was first published at Unherd

Transform the Lords to save us from Faragism

(This article was originally published on Reaction.life.)

Michael Gove famously said during the EU referendum campaign: “People have had enough of experts”. His words, though much derided, reflect a popular sense that our politics has moved away from democratically accountable government towards one driven largely by supranational institutions and treaties, and populated by appointed ‘experts’ to whom we must defer without any means of influencing their decisions.

To this transnational class of epistocrats has been added, at the domestic level, a parallel species of quangocrat touted as ‘independent’ and similarly unresponsive to electoral pressure. Resentment toward this ecosystem of insiders has been growing for years, if not decades. In our country, Farage and his Brexit Party have now made it their mission to burn this whole edifice down.

This may be politically resonant, but is it wise? One persuasive argument for remaining in the EU is that the complexity and interdependence of modern nation states cannot be mastered at speed by elected non-specialists: the effective management of the modern world requires a grasp of often highly technical matters that takes years to acquire, and some policy areas need serious expertise as well as a degree of insulation from MPs who believe, Boris-like, that any issue can be adequately grasped with a few hours of cramming and a bon mot or two.

Some areas of government are too abstruse to make it into the general political discourse – the scandal of hygiene standards in manufacturing, say, or rules governing the import of consumer goods – while remaining immensely important overall. The failure of UK MPs to get to grips with the detail of pretty much all such areas since the EU referendum has been painfully obvious.

This is the core of the pro-EU view that it is better to agree this stuff together with the rest of the club, then leave the system in the hands of experienced professional civil servants while we get on with our daily lives. It’s an argument that has some merit, especially when compared to the blundering attempts of our MPs to cram technical subjects in a few hours in order to make decisions that will affect the lives of millions.

In this view, public resentment of experts is self-evidently foolish and destructive, and should simply be ignored. But this view is only half right. The public as a whole welcomes expertise, serious statesmanship and long-term thinking in public life, and is unhappy not with experts but with their lack of accountability. No-one really disputes that if we do ever leave the EU we will need our institutional memory, and our experts, more than ever. A Faragist destruction of our governing institutions would cause a loss of this institutional memory that we can ill afford, given its already etiolated state after decades of outsourcing policy to Brussels.

So, given that we need them, how can we make our experts more accountable, and prevent populism from throwing experience, expertise, long-term thinking and other important babies out with the ‘metropolitan elite’ bathwater? My proposal is that this should be the role of the House of Lords.

Whatever its faults, the hereditary House of Lords did supply some long-term thinking in our public life. But since Blair’s reforms it has become both an extension of party politics and a form of reward for good behaviour in the ecology of ‘experts’ that populates public life. Both these developments are to the detriment of democratic accountability and of long-term thinking.

We should abolish the system of appointed peers that so typifies the ‘insiders’ club’ feeling of modern politics, and instead invite experts to run for election to the Lords. Election would be on a long cycle (let’s say ten years), with a recall mechanism in extremis and specific responsibility for taking the long view on key policy areas where expertise is needed and party politics a source of harm.

Areas of policy that might benefit from being managed in this way include (in no particular order) healthcare, education, consumer standards and international trade. Education and healthcare in particular suffer from being treated by all sides as a political football. They are subjected to interminable ‘reforms’ by MPs thinking in electoral cycles rather than the long term, and desperate for impact with no regard for the millions whose daily jobs are turned upside down by the latest eye-catching initiative. And international trade and product standards are (as the Brexit negotiations have amply demonstrated) too technical for the brief to be grasped on a short timescale by elected non-experts.

Under this system, rather than having (for example) an education secretary in situ for a year or two, fiddling with policy for the sake of looking busy, we could have subject experts with hands-on experience, such as Katharine Birbalsingh or Amanda Spielman, standing for the Lords on a ten-year education ticket: long enough to see the results of any decisions taken, and to be held accountable for them. We could see a Lords candidate for child-centred ‘skills’ education debate a candidate keen on knowledge-and-discipline-first, with the electorate able to make the decision. Alongside this critical function of managing areas of policy for the long term, our elected expert Lords could continue their present role of scrutinising legislation.

This transformation would at a stroke rid us of our increasingly unpopular ‘crony’ Lords, create more space for long-term thinking in key policy areas, and make the experts we need more democratically accountable. It would move some areas of policymaking away from short-term party politics and more toward a blend of long-termism and direct democracy. In doing so it could balance the need for experts in modern government with the equally pressing need to respond to a general public sense of democratic deficit, and thus maybe yet save us all from Faragism.