Why won’t our artists aim higher?

London ravaged by disease. Social and sexual mores collapsing. Shifting political alliances and a wobbling constitution. A Babel of competing voices vying to dominate new media channels, driving public discourse to fever pitch. It’s not the first time we’ve been here.

Today our artists embrace (and sometimes accelerate) the vibe. Sculptors are more interested in subverting statuary than glorifying anything; painters warn of an oncoming apocalypse in two-storey murals; and most music is about getting laid. But back at the dawn of the modern world, when politics, culture, mores and faith were as much in flux as they are today, the 18th century’s artists took a more aspirational approach.

The cultural sphere they depicted was every bit as harmonious as the world that produced it was volatile. But while today we still listen to the measured strains of Handel, and marvel at the elegant proportions of a building by Inigo Jones, the poets of the same era are ignored. Of these, the most criminally underrated is also, perhaps, the one whose work offers the most intriguing clues for the modern world: Alexander Pope.

Pope was born the same year as modern Britain: 1688, when a group of English statesmen deposed James II as King of England, in favour of his son-in-law William of Orange. The reasons for James’ deposition were complicated, but included his Roman Catholicism as well as his insistence on the king’s divine right to abolish Parliament and govern centrally via decree.

Unenthusiastic about absolute monarchy, and nervous of future kings trying it on again, Parliament slapped new constraints on royal power — and the upshot was the constitutional monarchy we’ve lived with ever since.

As the old order liquefied at the end of the 17th century, and the fight began in earnest for power at the beginning of the 18th, aristocrats and a new class of emerging industrialists poured in to fill the vacuum left behind by an absolute ruler. These politicos increasingly split between “Tory” defenders of James II, and “Whig” proponents of Protestantism, in a political configuration that gradually took the form that would become our modern adversarial Parliament.

This binary antagonism, every bit as values-driven and visceral as the Leavers and Remainers of today, drove a febrile “us and them” political discourse. And in a forerunner of today’s clickbait-for-profit content machine, the flames were fanned by advances in printing technology that made the written word suddenly cheap and plentiful. Presses sprang up like mushrooms, and publishers grew rich selling the scandals, libels and “fake news” of the day.

Modern politicos blame social media for a decline in public civility. But compared to the grotesque caricatures, insulting posters, inflammatory street speakers and assassination plots against senior Tories that characterised politics in the early 18th century, what gets painted today as declining standards of politeness appears more like a return to form.

New governing elites, having displaced an absolute monarch less than a generation before, were sharply aware of how fragile public consent was for their newfangled constitutional monarchy — and how much potential hostile presses had to shatter that consent. In a move that foreshadows modern drives worldwide to regulate social media, new laws pushed to suppress dissent, building on the 1606 Star Chamber case De Libellis Famosis, which had ruled that accusations against the monarch or government could constitute seditious libel even if they were true.

Pope was in many ways an outsider, a condition that today we associate with a subversive mindset. Like the deposed James II, he was Catholic, and also a Tory in a hegemonically Whig era. But he was as preoccupied with order and stability as the Star Chamber, and — albeit in a different way — every bit as critical as they were of the newly democratic world of letters.

Rather than the law, though, Pope’s battleground was literature, where he emerged as a fierce defender of high culture and classical tradition against the pandemonium of “Grub Street”. First published in 1728, The Dunciad pillories Grub Street’s hacks, in ironically high style, as a throng of “Dunces” under the Queen of Dullness herself.

It’s perhaps the most barbed and sparkling feature-length piece of literary shade ever thrown, by turns cultivated and scabrous. Where Twitter today might just call someone a shit writer, Pope depicts one rival as powered by the spatterings of Jove’s own chamberpot:

Renew’d by ordure’s sympathetic force,
As oil’d with magic juices for the course,
Vig’rous he rises; from th’effluvia strong;
Imbibes new life, and scours and stinks along; (Dunciad II, 103-6)

Without a working knowledge of Pope’s political and literary world, getting The Dunciad’s jokes is a bit like someone from the year 2320 trying to follow the jokes on Have I Got News For You. But it’s hard not to see an echo in it of our access-to-all digital “publishing” environment, and the impact it’s had on the contemporary discourse:

‘Twas chatt’ring, grinning, mouthing, jabb’ring all,
And Noise, and Norton, Brangling, and Breval,
Dennis and Dissonance; and captious Art,
And Snip-snap short, and Interruption smart. (Dunciad II, 231-4)

Pope’s blend of wit, erudition and waspishness made him a sharp satirist of contemporary chaos, but his happier visions were of tradition and harmony. In Windsor-Forest (1713), London was envisioned as a gilded, ordered place, and the rightful heir of antiquity. Faced with its glory, the Muses would quit singing of ancient Rome, and praise England’s capital instead:

Behold! Augusta’s glitt’ring Spires increase,
And Temples rise, the beauteous Works of Peace. (Windsor-Forest, 377-8)

“Augusta”, a Roman name for London, gives Pope and his contemporaries the name by which we know them today: the Augustans. And yet London in Pope’s day was not a vision of order and beauty at all, but famous for slums, licentiousness, corruption and STDs.

The print boom extended to a flourishing trade in porn, with smutty publications bought not just for private consumption but to read aloud in pubs and coffee houses. And prefiguring Frank Ski by some centuries, there really were whores in all kinds of houses: Covent Garden was a byword for the sex trade, from the low-class “flash-mollishers” and theatre-visiting “spells” to brothel-operating “bawds” and “Covent Garden Nuns”. Prominent prostitutes, such as Sally Salisbury (1692-1724), became celebrities: Salisbury’s noted clients included Viscount Bolingbroke, and even (according to rumour) the future George II.

Against this gossipy, salacious and politicised backdrop, living conditions in the city were filthy and disease-ridden: in every year of the 1700s, more people died in London than were baptised. The century was characterised by near-continuous military engagement. So on the face of it, nothing makes sense about Pope’s depiction, in the 1733 Essay on Man, of all the cosmos as “the chain of Love/Combining all below and all above”, in which “Whatever IS, is RIGHT”.

This seems especially strange today, in the light of our modern preference for art that’s “representative” of demographics or otherwise reflective of “the real world”. But Pope’s fixation on order, hierarchy and beauty makes sense, because he feared that the alternative to an idealised order would be infinitely worse:

Let Earth unbalanc’d from her orbit fly,
Planets and Suns run lawless thro’ the sky,
Let ruling Angels from their spheres be hurl’d,
Being on being wreck’d, and world on world,
Heav’n’s whole foundations to their centre nod,
And Nature tremble to the throne of God:
All this dread ORDER break – for whom? For thee?
Vile worm! Oh Madness, Pride, Impiety! (Essay on Man, Ep. I, 251-8)

Modern tastes run more to deconstructing than glorifying canonical art or the social hierarchies it idealises. Today we’re all about writing doctorates on marginalia, humanising a stammering monarch, or revealing the sexual licence beneath the aristocratic facade. But from Pope’s perspective, it was order that needed defending, as the only real defence against tyranny:

What could be free, when lawless Beasts obey’d
And ev’n the Elements a Tyrant sway’d? (Windsor-Forest, 51-2)

Read against the corruption, volatility and rampant, clap-infested shagging of Georgian high society, the restrained vituperation, classical learning and formal orderliness of Pope’s writing could be seen as a paradox. Or, perhaps, a state of denial. But what if it was more a set of aspirations that succeeded — just not straight away?

The ensuing century, dominated by Victoria and Albert, is perhaps Peak Order for modern Britain. If Boswell’s diaries, in the latter half of the 18th century, record 19 separate bouts of gonorrhoea, Victoria’s ascent to the British throne in 1837 brought a society-wide backlash against the excesses of the preceding era.

Whether methodically colouring the globe in red, or imposing strict codes of sexual conduct, public-spiritedness and emotional reserve at home, the Victorians reacted against the perceived licentiousness of the Georgian era — by delivering the kind of order that Alexander Pope both depicted in his writing and also, in his own political era, never saw realised.

In the time since Peak Order we’ve all become somewhat more free-and-easy again. But we should be wary of viewing this either as evidence of moral progress or (depending on your outlook) as a decline that’s likely to continue indefinitely. Our age has its digital Grub Street, its own pandemic, its unstable political settlement, and its patronage politics. So it may yet produce its own Alexander Pope, and with him a new poetics of order — for a future none of us will live long enough to see.

Originally published at UnHerd

Our humanity depends on the things we don’t sell

Earlier this year, mining company Rio Tinto dynamited a 46,000-year-old Aboriginal sacred site in the Pilbara region of Australia, in pursuit of the iron ore deposits that lay beneath the sacred caves. The decision triggered outrage from Aboriginal communities and the wider world alike. Pressure from investors concerned about the resulting PR disaster eventually forced the CEO to resign.

But that’s not much of a victory for those to whom the now-destroyed site was sacred. As a place of pilgrimage, continuously inhabited since before the last Ice Age, its religious significance had accumulated over millennia of repeated visits, inhabitation and ritual. The holiness of Juukan Gorge lay in the unimaginably long-term accretion of memories, social patterns, and shared cultural maps by countless generations of the Puutu Kunti Kurrama and Pinikura peoples.

Strip mining, the method of resource extraction used to reach much of the Pilbara’s iron ore, was the subject of a blistering 1962 Atlantic essay by Harry Caudill. Titled ‘Rape of the Appalachians,’ it describes a process as violent as the analogy suggests, in which entire mountaintops are removed in search of coal deposits. But when you consider the role played by commerce, it’s more accurate to describe the process as prostitution.

It’s not unusual for those looking to destigmatize prostitution to argue that selling sexual access to one’s own body should be morally acceptable, precisely because it’s no worse than coal mining. So here we have two sides of a disagreement, both of whom see commonalities between prostitution and mining, even as they disagree over whether the action itself is good or bad.

How would we characterize what prostitution and mining have in common? Resource extraction, perhaps. Dynamiting Appalachian mountaintops has obvious tradeoffs, but on the upside you get to extract coal from the exposed rock, which you can then use to generate electricity. We accept the environmental destruction, deterioration in air quality, and changed landscape contours (or at least mostly choose to overlook them), because the alternative—no electricity—appears worse. 

Selling access to female bodies is also a form of resource extraction. The product may be subtler—orgasm, the illusion of intimacy, darker types of wish-fulfilment or, in the case of commercial surrogacy, a human baby—and the tradeoffs less visibly destructive than a landscape reshaped. But the dynamic is similar. In each case, a woman rents access to something that we consider to belong to each individual alone—her body—and earns money in return. The American Civil Liberties Union, which has supported the decriminalization of prostitution since 1975, recently argued for de-stigmatizing “those who choose to make a living based by self-governing their own bodies.” Earning money independently is good. Self-government over our own resources is good. So on what basis can we criticize people who choose to sell access to their own bodies? 

In his 1954 essay ‘The Question Concerning Technology,’ Martin Heidegger argued that when we organize life under the rubric of technology, the world ceases to have a presence in its own right and is ordered instead as ‘standing-reserve’—that is, as resources to be instrumentalized. Coal and iron ore, the products of technology themselves, and even human sexual desire then come to be seen as part of the standing-reserve. It becomes increasingly difficult to see reasons why there should exist any limits on extracting such resources.

Today, it feels as though we’ve always been engaged in this inexorable onward march. From a more mainstream perspective, what Heidegger is describing is simply the process we now call economic development. It is the transition from pre-industrial societies—characterized by primitive and localised forms of exchange, low workforce mobility, and in many cases by extreme poverty—to longer and more complex supply chains, technological innovation, more trade, more stuff, more wealth, and more personal freedom. 

But as the Austro-Hungarian economist Karl Polanyi argued in The Great Transformation, for much of human history trade occupied a much less central place in human relations than it does today: “man’s economy, as a rule, is submerged in his social relationships.” Polanyi showed how, in Britain, economic development and the emergent market society were driven by the Enclosure Acts between 1750 and 1860. Prior to enclosure, much of Britain comprised subsistence-farming peasants living on land held and worked in common.

Enclosures, justified by a need for greater farming efficiency, stripped the peasantry of any right to common land in favor of a private property model. Pre-enclosure, the necessities of life might have been bare, but many of those necessities existed outside the realm of ownership and trade. A peasant might spend his or her whole life in a tied cottage, with a right to common land, working to produce food but with very little need to buy or sell anything. Enclosure reordered whole swathes of human life from the shared, social realm to that of standing-reserve: that is, the realm of private property and transactional exchange.

Post-enclosures, what had previously been held in common—whether land or labor—was now privatized as standing-reserve for exploitation by free individuals. In the process, millions of human lives were arguably made much freer. The working poor were liberated from the feudal ties often implied by subsistence farming, free to move if they pleased, and free to sell their own labor for money. 

But this development was never simply the voluntary spread of a new, enlightened way of making everyone better-off. Like mining, it came with tradeoffs: peasant resistance to the Enclosure Acts suggests that for those people, at least, something was lost. And if enclosure opened up domestic markets in goods such as housing and food, it did not rely on the consent of those British peasants forcibly displaced from subsistence lifestyles into waged factory work.

The violence involved in opening up colonial markets likewise gave the lie to the benign invisible hand. In February 1897, for example, not long after the completion of the enclosures in Britain itself, British imperial officials responded to the Oba of Benin’s refusal to open up trade in palm oil and rubber from his thriving city-state on the Niger Delta. Their answer was the Punitive Expedition, in which 5,000 British troops armed with machine guns razed Benin, massacring its inhabitants, flattening its temples, and looting the bronzes that inscribed its most treasured cultural memories. A month after the Punitive Expedition, a golf course had been laid over the city’s site, with the ninth hole where the most sacred tree had stood.

Most histories of the present characterize the story of economic development as an upward one of human progress that has liberated millions from indentured labor into greater agency as free individuals in market society. And there’s something in this story of freedom; I wouldn’t swap my life today for that of a medieval subsistence peasant. But, like the extraction of Appalachian coal, nothing comes without tradeoffs. And while it’s easy enough to describe historical events in our transition from a largely relational society to a largely transactional one, the cost of moving to a market society is more difficult to count.

It’s perhaps easier to find a way into this blind spot via a more recent large-scale displacement of humans from a relational to a market existence. The migration of women from the domestic sphere to the workplace began in earnest in the 20th century, and it’s perhaps not a coincidence that it gathered pace around the time the economic gains available via overseas colonial expansion began to falter. I’ve never been a subsistence peasant or Aboriginal nomad, but for a few years I did step a small distance outside the market society as a full-time mother. And what I learned there about how, and why, this form of work is invisible today helps to illuminate the tradeoffs demanded by the market society. It also offers clues as to how we might yet stand for things crucial to humans but indefensible within a transactional worldview, such as ecosystems, sacred places, or even a view of dating that isn’t a sexual marketplace.

For something to be treated as standing-reserve, it must be possible to own it. Our social norms demand that we claim ownership of a resource before exploiting it. Selling my labor in the marketplace presumes that I can dispose of my time as I see fit, that no one else has a claim on my time or my body—in short, that I’m a free individual.

But to be a mother is quintessentially to experience not entirely belonging to yourself. It begins in pregnancy, with the realization that what you eat or drink affects your unborn child; it continues with breastfeeding, as you make the food that nourishes your child with your own body; it goes on far beyond infancy, in the need your children have to be physically close to you. When you see how powerfully your small child craves your presence, it’s very difficult to sustain the illusion of belonging only to yourself.

To the extent that something belongs to others as well as to ourselves—such as common land in 18th century Britain—it will resist being privatized for use as standing-reserve. So caring for my child can’t easily be viewed as a transaction, because it’s a relationship in which we aren’t exactly individuals. That is, we don’t belong only to ourselves, but to each other as well. And when you don’t belong solely to yourself, work can be understood not as a transaction—my labor exchanged for your money—but as relational. In other words, it is less oriented toward resource extraction and exchange than sustaining interdependent patterns of life. 

This in turn helps explain why the politics of our market society has such a blind spot where motherhood is concerned: the market society’s notion of liberation into the standing-reserve is deeply at odds with the work of caring. Sustaining interdependency isn’t about fleeting transactional logic. It’s about maintaining a valuable relationship. I don’t care for my child or my partner because I have a utilitarian goal in mind, but because we belong to each other and that makes caring for them a necessity for my existence too. 

Despite being in a sense repetitive and goal-less, caring is also pregnant with meaning. As the pioneering biosemioticist Wendy Wheeler puts it in Information and Meaning, repetition and pattern are central to communication throughout the organic and even the inorganic world. Organisms and natural systems don’t just respond to one-off signals, but rather exist in emergent, interdependent dialogue with the signals sent by other organisms and environmental factors around them—what Jakob von Uexküll calls an organism’s Umwelt. Thus, information in the natural world does not exist in some abstract sense, but only in the context of how it’s received within larger feedback loops. From the smallest microbiota to complex human civilisations, meanings are fundamentally relational, contextual, and pattern-based. 

Seen this way, it’s easier to understand why non-transactional, relational spheres of life, particularly family, remain Americans’ most potent sources of meaning. For individuals, meaning is to be found less in peak experiences, one-offs, the exceptional or abstract; it hides in the repetitive, the everyday, and the relational. At a collective level, meaning coils through those pattern-languages transmitted via tradition, whether in vernacular architecture, folk music or oral histories. It lies thick in sacred places: humans have long used pattern, repetition, and the expected as the core of ritual religious and spiritual practices.

The philosopher Adam Robbert connects meaning-making with askēsis, a Greek term that refers to the act of practice and discipline as itself a form of extended cognition, one that enables the expansion of meaning-making beyond the rational sphere by bringing together attention and repetition. We can understand motherhood as a kind of relational askēsis, whose core is the attentive, attuned pattern-work of sustaining a child’s Umwelt while they are too young to do it themselves. This is a central reason why many women are willing to sacrifice social status and earning power to work part-time or stay at home with young children: it’s as satisfyingly rich in meaning-as-pattern as it is starved of social status and pecuniary reward.

But mothering’s central concern with pattern, sameness, and contextual meaning, as opposed to information, devalues it in the order of standing-reserve, even as it delivers untold riches on its own terms. Information theory, a core science underpinning much of our technology, explicitly excludes the realm of pattern and sameness as ‘redundancy,’ preferring to focus on the unexpected. Our contemporary culture is quintessentially one of information theory: we celebrate the new, the innovative, the individual who doesn’t follow the rules. I can’t think of many movies where the hero defies calls to go his own way and instead saves the world by embracing convention.
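Shannon’s formalism makes the asymmetry concrete: the information content of an event is the negative log of its probability, so the expected carries almost nothing and the surprising carries almost everything. A minimal sketch (illustrative Python, not drawn from any of the sources discussed here):

```python
import math

def surprisal_bits(probability: float) -> float:
    """Shannon information content: the rarer the event, the more bits it carries."""
    return -math.log2(probability)

# A patterned, near-certain signal is almost worthless in Shannon's terms...
print(surprisal_bits(0.999))  # ~0.0014 bits: pure 'redundancy'
# ...while the improbable and unexpected is what registers as information.
print(surprisal_bits(0.001))  # ~9.97 bits
```

By this arithmetic, everything a living system would read as meaningful pattern rounds to zero.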

And yet meaning, as Wheeler emphasizes, “is made up of pattern, repetition, the expected.” Information theory is thus blind to it, as she further points out: “What information engineers count as redundancy, living organisms in their systems count as meaning.” In this worldview, the tradeoff between motherhood and the workplace is a brutal one. No matter how meaningful life with a baby seems in its relational context, we have no vocabulary for understanding that, save as redundancy. It’s no surprise to discover that market society frames caring for children as a punishment: “the motherhood penalty.” 

The transactional world has little facility for repetition, pattern, or the expected; this is ‘redundancy’ to be dismissed in pursuit of the special, the distinct, the signal. This blindness to meaning-as-pattern, visible in the devaluation of motherhood and trust relationships, is similarly evident in contemporary architecture’s indifference to those vernacular pattern-languages in local built environments that encode ways of life specific to different places. You can see it again in the treatment of folk music as second-class and unoriginal, the dismissal of religious practice as dogma, or the indifference to accumulated sacredness that allowed the destruction of Juukan Gorge.

Within the worldview that reads motherhood as a punishment, ecologies of meaning accumulated via everyday pattern, human relationship, or religious ritual are at best yet-to-be-monetized resources. If they resist this transformation, they are obstacles to be ignored or dynamited. Bringing these pieces together, it’s now easier to see what’s lost under the rubric of information theory and standing-reserve. To see the world in terms of standing-reserve means seeing it as transactions rather than relationships, and information rather than meaning: as Heidegger puts it, “shattered,” and confined to a “circuit of orderability.” 

This shattered world is the same one the market society mindset calls ‘open’: openness to new forms, after all, means weak adherence to existing ones. To borrow Oscar Wilde’s famous phrase, then, seeing the price of everything by definition means seeing the value of nothing. Reframing the world in transactional terms, as ‘open’ resources that can be instrumentalized, necessitates the destruction of its meanings. Strip-mining self-evidently degrades the environment being mined. After demutualization, it took less than two decades for Britain’s building societies to go from embedded, localized community micro-lenders to casino-banking basket cases. And people who sell sexual access to their own bodies find it difficult to form and maintain intimate partner relationships.

Likewise, treating human gestation as a service in commercial surrogacy interrupts the biologically-based symbiosis between mother and child that makes such relationships resistant to marketization. Instead, surrogacy contracts treat the baby as separate from its mother, a product that can be commissioned. Humans are thus shattered and reordered as objects, as in this case of a disabled child rejected both by her commissioning ‘parents’ and by her Ukrainian gestational mother, as though she were a faulty smartphone.

Here we begin to see more clearly who pays when we replace meaning with information and relationship with transaction: anyone in need of care, and anyone leading an ordinary life. The winners in the information world are those whose lives are oriented toward peak experiences, agency, variety, surprise, and control. To the extent that you find fulfilment in pattern, repetition, and the quotidian, a technological and economic order blind to meaning-as-pattern and hyperfocused on the unexpected will be, by definition, unable to see you. 

But we’re running out of relational resources to convert and consume. Much as, on current trends, many key natural resources will be exhausted within a few decades, there are signs that the relational commons underpinning ordinary human life in our civilization is now so shattered that the capacity of society to function is increasingly compromised. Certainly where I live in Britain, the weak institutional response to COVID-19 has revealed a nation in which social solidarity may be present on a local level, but is increasingly, acrimoniously, absent at scale.

Pursuing resilience in this context means seeking out the relational, and looking to strengthen it: that means standing up for the interests of women, babies, the everyday, the natural world—and the value of norms, custom, and religious faith. From this, it follows that defending women and the environment means not embracing but resisting the logic of transaction. In that case, communities with some religious basis for sustaining relational resources as a sacred domain will prove more resilient than the ‘liberatory’ vision of market society and standing-reserve—precisely because they reject the appetitive logic of transaction. 

From a transactional point of view, this is at best a romanticization of some imaginary lost Eden, and likely a manifesto for ending innovation and demanding a return to pre-industrial society. But a defense of ordinariness, pattern and repetition does not imply turning back the clock, or leveling all humans to identical cellular automata. Nor is it a case against extraordinary people: the natural world, after all, has megafauna as well as microbiota.

Making the case for meaning as well as information is not to claim that we should revert to Tudor times, all be the same, or all spend our lives raising children. But it’s to defend pattern, repetition, and ordinariness as valuable in their own right, whether as the medium for future rituals and sacred places to emerge, as the domain of social life, or simply as bulwarks against the voracity of a transactional worldview that would commodify even our deepest social instincts. It’s to argue for our radical interdependence with our Umwelt. And it’s to affirm that in order for a society to thrive, sacred things must not just be defended as exempt from standing-reserve, or moved to a museum like the looted Benin bronzes, but continually and actively re-consecrated. 

Originally published at Palladium

The world according to LARP

Who would have guessed that a weekend hobby for outdoorsy nerds could spawn an era-defining political metaphor?

LARP, or live action role-playing, is an offshoot of the fantasy roleplaying subculture. It involves dressing up in costume and acting out a fantasy-fiction game narrative in real time and space, sometimes over several days. A witty friend once characterised the experience as something akin to ‘cross-country pantomime’.

Thanks to lockdown, no one’s LARPing this year — at least not in the cross-country pantomime sense. But the word ‘LARP’ has escaped into the wild: far from being the preserve of fantasy fans, I’ve noticed it appearing with increasing frequency in political discourse.

When riot police finally pushed activists out of the Capitol Hill Autonomous Zone following the murder of one joyriding teenager and serious wounding of another by CHAZ ‘security’, resistance to the advancing riot shields was so paltry it prompted contemptuous accusations of ‘revolutionary larping’. Weird Christian Twitter (it’s a thing) hosts arguments where people are accused of larping more traditionalist theologies than they truly espouse. Still further out on the fringes, the QAnon conspiracy rabbit hole (don’t go there) is fiercely defended by its True Believers against accusations that it is, in fact, a bunch of people LARPing.

Around the time my friends were discovering LARP, I got into LARP’s Very Online cousin: alternate reality games (ARGs). An artefact of the age before Facebook and Twitter colonised most of the internet, ARGs are a hybrid of online treasure hunt, mystery story, and live-action immersive theatre. The first mass-participation ARG was a promotional stunt for the 2001 film AI, and featured a series of fiendish clues for participants to crack and follow, which unlocked further elements of story, including live-action segments.

For a moment in the mid-Noughties, ARGs looked like the future of storytelling. The idea of internet communities over-writing stable systems of meaning with playful new narratives that danced back and forth between the web and real world felt refreshing and subversive. With hindsight, though, the phenomenon was just a more-than-usually-creative moment in a widespread unmooring of reality that’s been under way for decades.

It’s not all the fault of the internet. In 1955, the philosopher J L Austin developed a theory of ‘performative’ language: that is, language that does something to reality in the act of being spoken. ‘I pronounce you man and wife’ is the classic example — words that effect change simply by being said.

Then, in 1993, the queer theorist Judith Butler borrowed the concept of ‘performative’ language wholesale and applied it to sex and gender, arguing that the identities ‘man’ and ‘woman’ — along with the bodies and biologies associated with those identities — are performative. In taking these roles on, Butler claimed, we make them real.

While these ideas pre-date mass adoption of the internet, the notion that we participate in creating our own realities has been wildly accelerated by social media. Online, it’s easy to get the impression that we can reinvent ourselves entirely, independent of our bodies or other dull ‘meatspace’ constraints. Unsurprisingly, Butler’s conception of sex and gender as performance has long since escaped the petri dish of academia and, like the concept of LARPing, is evolving rapidly in the wild.

Strikingly, the word ‘performative’ has also mutated. Today, it isn’t used in Butler’s sense, to mean “a performance that shapes reality”, but the opposite: an insincere performance for social kudos. So, for example, celebrity endorsements of social justice orthodoxies are often accused of being ‘performative’. It means much the same as ‘larping’, but with an added payload of cynicism. So where ‘LARPing’ means “playacting at something you wish you were”, ‘performative’ means “playacting at something you don’t really believe”.

Meanwhile, the LARP is no longer confined to cheery camping trips accessorised with pretend armour. Back in the noughties, online communities refactoring reality to fit a fantasy storyline felt like a fun game, but as I stare into the sucking void of the QAnon conspiracy, that perspective now seems hopelessly naïve. It’s not a game today: it’s how we do politics.

Liberal commentators spend a great deal of energy trying to explain why this is bad. Countless writers ‘fact-check’ Trump’s bloviations, seemingly unaware that from the perspective of reality-as-ARG, the fact that Trump is lying doesn’t matter. Nor does it really matter whether QAnon is real or not. Reality is, to a great extent, beside the point.

Laurie Penny got closer to the truth in this 2018 piece, where she characterises the very notion of a ‘marketplace of ideas’ as being a kind of LARP: “a Classical fever-dream of a society where pedigreed intellectuals freely exchange ideas in front of a respectful audience”. The reality, she argues, is that this ‘marketplace of ideas’ is less free, rational exchange than dick-swinging theatre.

Those who like to imagine this pessimistic perspective is new, wholly the fault of the Orange Man (or perhaps Facebook), should recall the words of an unnamed aide to George W Bush, quoted in 2004 on the relationship between facts, reality and the military invasion of Iraq:

The aide said that guys like me were “in what we call the reality-based community,” which he defined as people who “believe that solutions emerge from your judicious study of discernible reality.” I nodded and murmured something about enlightenment principles and empiricism. He cut me off. “That’s not the way the world really works any more,” he continued. “We’re an empire now, and when we act, we create our own reality. And while you’re studying that reality — judiciously, as you will — we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out. We’re history’s actors . . . and you, all of you, will be left to just study what we do.”

Ron Suskind, New York Times

Though his approach was less overtly hubristic, Tony Blair’s embrace of spin reflected a similar belief in his own ability to manipulate public narratives. ‘Political communications’ has only grown in significance since those days, and taken a self-referential turn. Today it’s as common for commentators to criticise a politician for performing badly at a presser — for poor-quality larping, or bad theatre in Penny’s formulation — as for saying things that are immoral or factually wrong.

Donald Trump is distinct from George W Bush not so much in disdaining facts as in lacking the religious conviction Bush deployed to fill the gaps that disregard left behind. But both, in different ways, embody the idea that what you believe is what is. If you LARP hard enough, this view says, your LARP will come true.

Boris Johnson’s administration has something of the same cavalier attitude to the relationship between facts and rhetoric. To date, its handling of coronavirus has routinely over-promised and under-delivered, while seeming indifferent to the disorienting effect on public life of a string of announcements only very loosely tethered to everyday experience.

It’s not a coincidence that this larpification of politics has evolved in tandem with a public fixation on ‘evidence-based policy’. The political polarity of absolute LARP — blatant political lying — and absolute insistence on evidence are two sides of the same loss of faith in a common understanding of reality.

If you’re not convinced, consider SAGE, the government’s scientific advisory committee. Then consider ‘Independent SAGE’, a kind of counter-SAGE comprising scientists every bit as eminent as those on SAGE. This august body produces its own carefully evidence-based reports, which are then used as a foundation from which to disagree with whatever positions the Tories choose to adopt from official SAGE.

Who do we believe? That’s politics. If the Brexit debate hadn’t already killed your faith in ‘the evidence’, the competing claims of SAGE and counter-SAGE should be the death-blow. There is no dispassionate foundation of facts we can deploy to take the politics out of political decisions. The original LARPers might have a fun weekend, then pack up and go back to their normal lives; but in its political sense, there’s no outside to the game. It’s larping all the way down.

Some parts of our culture are coping better with this shift than others. Among the worst performers, to my eye, are mainstream liberals of both left and right. Many have responded to the larpification of everything by concluding that in losing objectivity we’ve lost reality. Some then imagine they can make reality whatever they want it to be by sheer force of will (the Trump/Bush approach). Others suggest we can fix things by supplying enough facts to prove whatever we already believed (the SAGE/counter-SAGE approach). Others, such as Laurie Penny, try to refuse to play.

But we haven’t lost reality, just the fixed vantage point we pretended we had from which to evaluate it. What we have instead is a kind of reality-shaping free-for-all, and there’s no opting out.

As most of us flounder, disoriented, we’re starting to see subcultures adapting. The old story about the Inuit having 50 words for snow is (appropriately) itself probably fake news. But much as a snow-dwelling people might be expected to develop specialist terminology for different types of frozen precipitation, we should understand the emergence of words like ‘larp’ and ‘performative’ as analogous. We’re developing a specialist vocabulary for types of unreality.

We’re also having to find new ways to talk about the reality that, inconveniently, refuses to go away completely. The grim story of the Iraq invasion and its destructive and bloody aftermath gave the lie to Bush’s messianic faith in his capacity to create a new reality to order. Humans still can’t really change sex. And no amount of fiddling the statistics can bring back those people who’ve already died of coronavirus.

The political future turns on our being able to get used to parsing our new Babel for what’s actually happening, and what actually matters. We have to get used to doing this without trying to eliminate the element of LARP (spoiler: can’t be done) or pretending we can abolish reality (ditto).

But there’s no putting the genie back in the bottle. If the ground is moving under all our feet now, the way forward is learning how to dance.

This article was originally published at UnHerd

How To Find Meaning When Everything Is Power

While we live, we all present different facets of ourselves to different people. Whether in our friendships, work or family, or at different times in our lives, we encounter others, and all of them remember us slightly differently, according to their perspective.

While we live, our physical presence holds that multiplicity together. After we die, though, the memories begin to come apart. When my step-grandfather married my grandmother, he already had two children with his first wife. But that wife had already left him and moved to a different country; instead, he became stepfather to my mother and aunts.

He was a big character: an aristocrat of the Greatest Generation, the subject of several films about his war exploits, well-loved farmer, and patriarch to two families. At his funeral, the many facets of his life were already coming apart. Each version of his memory was fiercely defended by the mourner to whom it belonged. Long-standing quarrels, no longer held in check by his living presence, began trickling back into the open. It was not an easy day.

Today, we are all mourners at the funeral of a character on a scale that dwarfs even my roaring, hectoring, pedantic, affectionate, and irascible step-grandfather. We are gathered to mourn teleology itself—the belief that life has objective meaning and direction. What we call the culture war is the aggregate of those quarrels now breaking out between the gathered mourners over their divergent memories of the deceased.

Were we progressing toward universal peace, justice, and equality? Was it the resurrection and the life of the world to come? Perhaps it was the end of history in universal liberal democracy? We cannot agree.

The death of teleology represents a collective cultural trauma that accounts for, among other things, the increasingly unhinged debates around social justice within elite universities, and the reactive phenomenon of the aggressively transgressive online far-right.

But it doesn’t have to be like this. Post-structuralism killed teleology, but it did so in error, by taking a wrong turn; it is this wrong turn that has left us so traumatized.

What is commonly referred to as postmodernism is not in fact post-modern but rather represents a last-ditch attempt by modernism to resist the implications of the post-structuralist mindset whose inevitability is now indicated by fields as diverse as physics, ecology, and psychotherapy.

Deconstruction is not the end: reconstruction is possible, indeed essential.

To situate myself a little in this story: I belong to a generation that is marginal, facing two directions, in several ways that are relevant to my argument. Born in 1979, I sit at the tail end of Generation X. I am old enough to remember the days before the internet, but young enough to be more or less a digital native. I got my first cell phone and email address as an undergraduate at Oxford. I researched my undergrad essays sitting in actual libraries reading physical books, but wrote them on a word processor. I can remember life before social media.

I also received, prior to undergraduate life, a recognizably classical education. This was, in the old-fashioned way, designed to deliver a whistle-stop tour of the march of civilizations from Ancient Egypt via the classical era to Western Christendom, with at least a vague grasp of the cultural and historical highlights of each.

The overall impression delivered was of an evolution of societies, consciousnesses, and cultures over a vast sweep of time and different human epochs that nonetheless seemed to have at least some narrative continuity and directionality. Everything else we learned seemed at least to an extent framed by that sense of situatedness within a larger narrative of human cultural evolution, whose direction was a mystery but did at least seem to be headed somewhere.

Then, in my first year as an English Literature undergraduate, I encountered critical theory—and the entire organizing principle for my understanding of reality fell apart.

To summarize: Saussure proposed that instead of treating language as transparent, its meaning rising off the page without any need for elucidation, we should split the linguistic sign into ‘signifier’ and ‘signified.’ That is, what a word means is separable from the word that means it. We can thus, he argued, institute a new discipline of ‘semiotics’: the study of signs—a study that reaches far beyond language and was immediately influential in the social sciences.

This insight was developed by Jacques Derrida, whose simple but devastating observation was that if this is the case, we cannot define any given ‘signified’ except with reference to further signs, which in turn themselves require definition with reference to further signs. It’s turtles all the way down. We have no means, through language, of arriving at any kind of truth that we are able to experience directly. Furthermore, the concerted effort by millennia of culture to obscure the fact that it’s turtles all the way down is in fact a cunning bid to shore up entrenched interests, and to conceal the operations of power.

In this view, recourses to authority are null and void. There is no solid foundation, no God, no truth, no authority. Only power, and a form of consensus reality arrived at through the glacial accretion of a billion tiny operations of power that have, in sum, decreed that the world should be thus and not thus.

For me, the shift from a sense of the world as having some stable narrative trajectory to this perspective, in which meanings were not only networked but fundamentally without foundation, was deeply disturbing. It landed like a psychotic experience. Overnight, the hallowed architecture of Oxford University went from seeming like a benign repository of traditions within which I could find my place, to a hostile incursion into my consciousness of something phallic, domineering, and authoritarian. I remember describing to a friend how, as a woman and sexual minority, I suddenly experienced the ‘dreaming spires’ as ‘barbed penises straining to penetrate the sky.’

I wish I could say it passed, but it did not. What did happen, though, after I left, was that I found an accommodation with the loss of teleology and objectivity from my frame of reference. I did this by theorizing that if to posit anything at all is an act of power, then it was one I was also entitled to attempt. All cognition, meaning-making, interpretation, and perception are conceptually laden, socially mediated actions. It is impossible to ground even perception in anything but action, and thus power. But so be it. We live in a society and participate in the flow of power all the time. I developed the idea of ‘temporary certainties’: even if meanings are not stable, many of them are stable enough for me to act as if they were solid in the pre-Derridean sense. I did not have to deconstruct every minuscule interaction for the operations of power it encoded.

In an effort to evade the monstrous pervasiveness of systems of domination and submission, I experimented with radically non-hierarchical forms of living, power exchange sexualities, non-binary gender presentation. I tried my own operations of power: I changed my name to Sebastian, to see what it felt like, then settled for a while on Sebastian Mary. I co-founded a startup with friends, in which we tried to avoid having a management hierarchy.

My accommodation kind of worked, for a while. But it did not last. It is all very well to theorize about non-hierarchical forms of organization, but in order to get stuff done you need a chain of accountability. And the worst sort of hierarchies have a habit of emerging, too, especially in social situations where they are intentionally obscured or deprecated. Communes, collaborative projects, and the like all find their leaders and followers, or their tyrants and victims. My increasing bitterness as I learned this, in the course of trying to get somewhere with the startup, made me so obnoxious as a co-worker that eventually I was expelled from the project, which was, by then, failing anyway.

With that rupture, I lost my social circle, my best friend, and my entire carefully reassembled working theory for how to navigate the rubble of broken teleologies that was my adult life in the ‘00s. Concurrently, the Great Crash of 2008 destroyed the equally teleological fantasy of global liberal-democratic hegemony under international capitalism that had powered the Iraq invasion along with the triumphalism of the Blair years.

In the wreckage, though, something wonderful happened. Two wonderful things, actually. First, I met the man who I would eventually marry, and by degrees let go of the belief that in order to sustain my integrity as a person I had to reject any form of stable loving relationship to an Other in favor of multiple, overlapping, unstable platonic, sexual, or ambiguous friendships. Second, I decided I needed to learn how to do something more useful than floating around London curating experimental art events and screwing up entrepreneurship, and went back to school to train as a psychotherapist.

In the course of that study, I learned where postmodernism took its wrong turn. Implicit in the post-structuralist theories taught to every young humanities student at university is the idea that because meanings have no singular objectively correct grounding, they are therefore of no value. Also implicit is the idea that because of this, no satisfying, authentic or truthful encounter with the Other is ever possible—only an endless recursive hall of mirrors composed either of our own anguished reflections or the invasive pressure against our psyches of another’s desire.

In studying psychotherapy, though, I came to realize that while the same post-structuralist decentering of the self took place in psychoanalytic theory between Freud and his contemporary descendants, therapists had—because they have to—rejected the idea that we can never encounter the other. While much contemporary analytic theory acknowledges the need to excavate and make space for the operations of overdetermined systems such as race, class, or sex, it does not automatically follow from the presence of those things that intersubjective contact and meaningful connection cannot take place.

Just as post-structuralism decentered the observer, intersubjective psychoanalysis radically decenters the analyst. But an intersubjective understanding of the relational space as co-created by client and therapist does not preclude the possibility of therapeutic work taking place. And this in turn speaks powerfully to a claim that however muddled, muddied and overdetermined our encounters with the other may be, yet they still contain the potential to be not just benign but real, true, and transformative.

I suppose I could deconstruct that claim in turn. But I have experienced its truth both as client and also, in the course of my work, as therapist. Through intersubjective encounters in the consulting room, I have been transformed, and have transformed in turn. From this vantage point, the claim of post-structuralism to render meaningless all semiotic systems, and reveal as brute operations of power all encounters with the other, seems not just mistaken but (in the Kleinian sense) paranoid-schizoid. It is the tantrum of a child who, on realizing they cannot have exactly what they want, refuses to have even the next best thing and dismisses everything and everyone as evil.

The alternative to this paranoid-schizoid repudiation of meaning is not to reject meaning as dead or hopelessly suborned by power, but to accept that we are enmeshed in, shaped by, and in turn helping to shape networks of meaning as part of a dynamic dialogue. We are nodes in the social and semiotic system. As such, even the act of contemplating those systems of meaning will have some tiny effect on them. When Derrida said ‘Il n’y a pas de hors-texte’—”there is no outside-text,” though commonly mistranslated as “there is nothing outside the text”—I took it to mean meaning itself was hopelessly corrupted, and objectivity a bust. Today, I see it more as a radical decentering of my selfhood that opens up new, vibrant possibilities of connectedness.

If we read ‘text’ in the biosemiotic sense as fractal, multi-dimensional, and interconnected systems of signification, both of human culture and the natural world (inasmuch as those things can even be separated), then indeed there is nothing outside the text. But that does not mean the text is wholly illegible, or that it does not exist—simply that in reading, we affect what it says, and in return it changes us. We are unavoidably caught up in perspectival context, without truly objective ground to stand on. But objectivity was always an implicit abdication and obscuration of power and the necessity of choice. It was the idea that we could calculate what to do from objective factors that we didn’t have to take responsibility for. We do have to take responsibility, but that can mean a proactive positive acceptance. We can step up to the challenge of power and perspective, rather than reject it out of guilt and trauma.

Seen thus, a living post-structuralism is a philosophy not of radical alienation but radical interconnection. It is not the death of stable meaning, but the moment a system we thought rigid, immovable, and observable from the outside stirred and opened its eyes to return our gaze. It is also increasingly supported by contemporary studies in—for example—ecology and theoretical physics. If even the hardest of hard sciences now advances a theory of reality that embraces radical uncertainty and the implication of the observer in what is observed, then surely the humanities can do so as well without giving up on meaning altogether?

The great insight of postmodernism is that meaning is unstable, and mediated in infinite complexity by systems of power in which we are decentered but implicated. But the response to this insight from the humanities has been a furious rearguard action by the ideology of fixed meanings that postmodernism itself displaced. Enlightenment rationalism is to postmodernism as Newtonian physics is to general relativity, and it is in the ‘social justice’ ideologies now increasingly hegemonic in elite institutions that Enlightenment rationalism is seeking to make its last stand against the new philosophy of radical interconnection.

If postmodernism claimed that all meanings are unstable, socially constructed, and held in place by operations of power, the defining characteristic of the anti-postmodernism that masquerades as contemporary postmodern thought is its determination to apply that analysis to everything except its own categories and hierarchies. In effect, this system of thought seeks to recoup semiotic stability by replacing the old ‘bad’ hierarchies of Western, patriarchal, heterosexual, etc. dominance with new ‘good’ ones.

All activities, goes the claim, are tainted by the toxic operations of overdetermined systems of oppressive social meaning which speak through us and over us regardless of what little agency we might imagine ourselves to have. So in the political framework of anti-postmodernism, fixed immutable characteristics such as race assign their bearers a position on a rigid hierarchy of ‘marginalization’ which in turn influences their status within the system. The legitimacy of the new, fixed hierarchies of marginalization-as-status rests, we are told, in how they correct for, deconstruct, and overcome previously imposed power inequalities. The chief form of political action is a wholesale effort to dismantle these former inequalities, wherever they may be found.

But in practice, the demand that all historically imposed power relations be deconstructed unwinds the legitimacy of any possible social relationship or institution. All meanings necessitate the exclusion of what-is-not-meant. Making absolute inclusion a central political demand is thus in effect a call for the abolition of meaning. We are never told what form the good life might take, should this project of semiocide ever be completed. But one thing is clear: it can have no social or intersubjective dimension, for that would imply shared meanings, and with shared meanings the operations of power—exclusion, definition, the imposition of significations not wholly self-chosen—inescapably return, as do hierarchies. In this sense, the push for semiocide in the name of social justice is a project whose ultimate aim is an individuation so total it precludes any form of encounter with the Other, except in a multidirectional contest for domination that none can be permitted to win.

From other vantage points within the culture war, the reaction to this doctrine is often mockery of its self-absorption, its incoherence, or its preoccupation with language and ‘luxury beliefs.’ This is mistaken. Its adherents are motivated by compassionate idealism, but they have been misled by a destructive falsehood and are in many cases deeply unhappy. The decentering of the Enlightenment subject brings with it an invitation to a more fluid experience of selfhood as radically inseparable from and in a process of co-creation with all of reality, and yes, with the power structures of the society in which we live. But the contemporary critical theory I am calling anti-postmodernism shows young people this vision of beauty, only to dismiss it as a pack of tendentious, self-interested lies.

It is no wonder today’s young people fling themselves miserably against the bars of whatever structures of meaning are still standing in an effort to knock them down—or perhaps to prop themselves up. Whether it is the SJWs, the frog memers, or the ‘failson’ ironists, they can smell the fresh breeze of meaning, less linear than the rationalists would like but nonetheless real, and yet they have been told they cannot have it, because it is not there, or else comprises only violence and hostility. So, they fight over the broken rubble of the Enlightenment, or with each other, or their ancestors, and starve in the midst of a banquet.

To recap, then: what gets called ‘postmodernism’ today is not postmodernism but the last spasm of the worldview displaced by postmodernism, that saw meanings as fixed, knowable and amenable to human mastery. This anti-postmodernism diverts young people from the astonishing richness of a systems-based, decentered engagement with the world’s semiotic complexity by seeking the only remaining form of mastery it can imagine: a defensive assault on meaning itself.

Instead of embracing the fluidity of systems of meaning, and each subject’s situatedness within that system, young people are taught that the only legitimate foundation for political action—or indeed any kind of social participation—is atomized selfhood, constructed from within and defended with narcissistic brittleness. They are taught to see themselves as solely responsible for discovering, curating, optimizing and presenting this supposedly ‘authentic’ self as their central marketable asset. But they also learn that it is continually under assault by hostile forces of oppressive social meaning whose aim is to keep them—or others like them, or someone anyway—marginalized, abject and on the back foot.

Within this system, it follows that the central locus of political activism must be to disrupt these oppressive forces that marginalize unfavored groups, so as to advance the project of collective liberation to ‘be our authentic selves.’ This is not just a political project but an existential one, for along with marginalizing unfavored groups these forces impose unlooked-for and oppressively overdetermined social meanings on each of us, undermining each young person’s quest for authentic selfhood. Individuals caught up in this worldview genuinely believe they are agitating not just for the liberation of the oppressed but for their very existence.

The fixation of today’s elite graduates on ‘validation’ of ‘identities’ may seem frivolous to older generations. But within a worldview that frames all forms of social meaning as oppressive by definition, the very gaze of the Other is an unacceptable attack on the pristine territory of the self. If we reject the genuinely postmodern ethic of radical semiotic interconnection, and our interwovenness with structures of meaning in society and the natural world, then the movement of these structures in, on and within our individual identities comes to be experienced as violence.

This perspective exists in tormented symbiosis with an Other it can neither tolerate, nor yet wholly dispense with. For the paradox is that the invasive gaze of the Other, laden with unwanted and oppressive shared meanings, is simultaneously the source of suffering and salvation. The gaze of the Other is experienced as a hostile and violent invasion, forever imposing unlooked-for social meanings that constrain the liberty of each sacred self. But it is also the only source of the ‘validation’ that will reassure each individual that their self-creation project is real, true and accepted.

The solution, within this worldview, is an ever more desperate effort (again paranoid-schizoid, in the Kleinian sense) to control the thoughts of the Other. We see this in politicized campaigns to control speech in the service of identities. But as any psychotherapist (or parent) will tell you, trying to control the inner life of another is a project that in normal circumstances seems achievable (or indeed desirable) only to young children or the mentally disturbed. That it should become a central political desideratum for a generation of elite young people does not bode well for the future health of public life.

When I started my undergraduate degree 20 years ago, critical theory was one epistemology among several, which we learned about as it were ‘from the outside’ rather than as a framework for understanding other philosophies. Though it affected me severely, in ways I have already described, most of my contemporaries simply learned about the ideas and were largely unaffected. Today, though, this epistemology has eaten and digested the humanities and begun to nibble on science and even mathematics. As a result, for today’s young people, it is increasingly difficult to find a vantage point outside its political ontology from which to evaluate its operations.

We should not be surprised, then, that mental health issues have skyrocketed in elite college-age populations. They are being taught to believe, as a foundational framework for understanding the world, that acceptance in the gaze of the Other is key to validating a selfhood they alone are responsible for creating, curating and optimizing. But they are also being taught that all shared meanings—in other words, anything conferred by the gaze of the Other—represent a hostile act of violence. How is any young adult meant to navigate this catch-22?

It is a mistake to dismiss this as narcissistic—or, at least, to ignore the suffering of those trapped in this bind. To be ‘defined’ by something other than our own desire is in this system to be injured, parts of our authentic self mauled or amputated, whether by social meanings we did not choose or the givens of our embodied existence. This is a phenomenally cruel thing to teach young people, as it leaves them feeling perpetually oppressed by the givens of existence itself.

This analysis also sheds light on the crisis of elite purpose and leadership Natalia Dashan described in her Palladium piece last year. If shared meanings are not only unavailable but actively hostile, how is any young person meant to formulate a legitimate rationale for stepping up? No wonder so many elite graduates dismiss even the possibility of public service in favor either of pecuniary self-interest in finance or tech, or else joining the ranks of activist-bureaucrats seeking to advance the destruction of shared meanings in the name of total inclusion.

But as societies around the globe struggle to get to grips with coronavirus, we no longer have the luxury of sitting about like Shakespeare’s Richard II, mourning a broken model of meaning as the world disintegrates around us. Facing the deaths perhaps of loved ones, and certainly of everything we thought of as ‘normal’ until a few weeks ago, destroying what is left of our structures of social meaning in the name of liberation politics or frog-meme irony is an indulgence we cannot afford. The project of reconstruction is urgent. This project is both an inner and an outer one: reconstruction of an inner life capable of navigating social meanings without experiencing them as violence, and also of our willingness to participate in the external, political analogue of those social meanings, namely institutions, political structures and—yes—hierarchies.

This is not to say that we should shrug at unjust systems of domination. The ‘social justice’ excavation of ‘implicit bias’ is not wholly without merit. It is on all of us to make sincere efforts to meet the Other to the best of our abilities as we find it, and not simply reduce the world out there to our preconceptions. But this effort cannot be so all-encompassing as to destroy what systems of shared meaning we have left. Nor can we afford to see it grind common endeavor to a standstill.

No one knows yet what the world will look like as we emerge from the political and economic convulsions engendered by this global pandemic. One thing is clear, though: the ethic of radically individualist atomization implicit in ‘social justice’ campaigns for the destruction of all shared meaning is woefully inadequate to the challenges we now face. Through its lethal spread and infectiousness, coronavirus has demonstrated vividly how our fates remain bound to one another in infinitely complex ways, however loudly we may assert our right to self-authorship. Faced with the persistence of our social, biological, semiotic, economic, and ecological interconnectedness, we would do well to embrace and make a virtue of it, to salvage those shared meanings that remain to us, and begin the process of building new ones that will sustain us into the future.

This article was originally published at Palladium magazine.

The Irreligious Right

Today’s hottest property: young fogeys. Blue Labour hailed Boris Johnson’s landslide election victory as a rebellion by the country’s ‘culturally conservative’ silent majority. A new conservative magazine seems to appear every week. We have even seen a youth movement for the revival of socially conservative values popping up in that bastion of modern double liberalism, the Conservative Party.

What do they all want? At the more wonkish end of the debate, the argument is broadly that the political push throughout the twentieth century for ever greater social and economic freedom has brought many benefits, but that these have been unevenly distributed and are now reaching the point of diminishing returns.

The pursuit of ever greater freedom and individualism, this strand of thought argues, has delivered rising wealth while hollowing out working-class communities; liberated some women while forcing others to work a double shift and abandon the young and old in substandard care; and provided an infinitude of consumer choice, but at the cost of mounting ecological damage. Under the sign of radical individualism, the new communitarians argue, we are all becoming more solitary and self-absorbed. Even charitable giving seems to be in unstoppable decline.

But what, in practice, are the new social conservatives seeking to conserve? Calls for a revival of cultural conservatism, many in the name of Christian values, seem often on closer examination oddly insubstantial. In 2017, UKIP’s leader-for-that-week Stephen Crowther said that the UK is a Christian country, “and we intend to stay that way.” But for Crowther, being a Christian country does not seem to impose any obligation to actually be Christian: 

including Christian in our list [of principles] does not imply any requirement for individual faith, but it reflects the Judeo-Christian classical and enlightenment origins on which our laws, our social systems and our cultural norms have been built over two millennia.

Elsewhere in Europe, Hungarian Prime Minister Viktor Orbán describes his brand of authoritarian, identity-oriented politics as ‘Christian democracy’. Only a minority of Hungarians go to church every week – 56% of the country identifies as Catholic, though only 12% attend church regularly – but the identifier ‘Christian’ has nonetheless become central to Orbán’s politics.

Much as Crowther did, the Orbán-supporting Bishop of Szeged, László Kiss-Rigó, bridges this gap with a vague, cultural definition of what actually constitutes a ‘Christian’: “In Europe, even an atheist is a Christian”, he said. It turns out that being ‘Christian’ is less about prayer or doctrine than ‘values’: “We are very happy that there are a few politicians like Orbán and Trump who really represent those values which we Christians believe to be important.”

What exactly are these values, then? Attendees at anti-Islam Pegida rallies in Germany carry crosses and sing carols. Italian right-winger Matteo Salvini punctuates anti-immigration rhetoric by brandishing a rosary, drawing criticism from the very Catholic faith whose symbols he invokes. Try to pin down any actual values this form of Christianity might require of its adherents, and matters are much less clear.

Even those whose stated desire is to defend the place of faith in public and political life seem keen that the faith itself stop short of imposing actual obligations. To take a more moderate example of the new cultural conservatism, the Social Democratic Party took a broadly post-liberal, culturally conservative stance in its 2018 relaunch. The New Declaration made an energetic defence of our right to hold even illiberal religious views openly in public life:

Citizens holding a traditional, patriotic or religious outlook are often bullied and marginalised, stifling the open debate upon which a free and democratic society depends. 

Then, about a year later, the SDP lost its only donor over a bitter intra-party dispute about whether or not it should be party policy to ban halal slaughter – a position markedly at odds with the party’s previous defence of religious pluralism. And when the Church of England recently reiterated its long-held position on sex and marriage, prominent SDP member Patrick O’Flynn took to the pages of the Daily Express to mock ‘the otherworldliness of these Men of God’. Instead of insisting on ‘out of touch’ doctrine, O’Flynn suggested, the Church should adjust its doctrines on sex and marriage to reflect young people’s values, the better to attract them to weekly worship.

In this view of faith, theological positions do not reflect any kind of truth-claim but should be emergent properties of the aggregate ethical positions held by the members of that church. Less ‘Christian democracy’ than ‘democratic Christianity’: whatever the congregants believe becomes the doctrine of the church.

From a religious perspective this makes no sense. To the believer, doctrine is handed down from God Himself. The thought of God’s word being subject to plebiscite is absurd, if not outright blasphemous.

This debate reveals the missing piece in today’s would-be conservative revival. Where do our values come from? What is the proper source of political authority? Progressives gesture at natural rights or an imagined future utopia, but for anyone who remains unconvinced that we are all on a journey somewhere wonderful, some other authority is required.

Edmund Burke suggested the answer lay in a blend of deference to tradition and God’s grand design, tempered by carefully constrained democratic institutions; his Savoyard contemporary, Joseph de Maistre, argued that the only proper form of authority lay in God’s will, delivered via the Pope and an absolute monarch.

The history of modernity has unfolded in the tensions between these competing understandings of political authority. ‘The will of God’, the will of ‘the People’, and the grand designs of various utopias have variously been used to justify all manner of enterprises, with outcomes from the magnificent to the horrific. But our present political difficulties may be in part down to a growing popular discomfort with accepting the legitimacy of any of the above.

Since the election of Donald Trump and the vote to leave the EU, there has been a low but persistent rumble from our moral betters that democracy should maybe have its wings clipped a little, to stop stupid proles making bad decisions. A degree of wing-clipping has in fact long since taken place: John Gray has discussed recently in these pages the way the language and legal mechanism of ‘rights’ is used to shift entire areas of public life from democratic debate to the dry realm of unelected lawyers and judges. But if authority does not reside in the will of the people, nor does it reside with God: it is difficult to imagine a mainstream British politician claiming moral authority on the basis of divine will without being roundly pilloried.

Progress and human rights, then? Every young person who passes through a modern university is taught in no uncertain terms that totalising metanarratives are suspect. At best, they are power moves. Whenever you find one, you should ask: cui bono? In the case of universal human rights, the answer is probably: lawyers.

This leaves would-be conservatives in a bind. If (with a few honourable exceptions still holding out for direct Vatican rule) political authority rests not in tradition (too restrictive on personal liberty) or democracy (probably rigged) or God (don’t tell ME what to do!) or even in the lawyers, then what is left? Politics professor Matt McManus argues that the result is a postmodernism of the right as well as of the left: a series of nested calls for a return to authority, tradition and culture that all, on closer inspection, turn out to be largely delivery mechanisms for adversarial but hollow identity politics.

Having come unmoored from its roots either in the past, the divine, or the popular will, McManus suggests that this postmodern conservatism has warped a Burkean belief in tradition into a kind of moral cosplay whose main purpose is less seeking the good life than making a noisy defence of whichever identities its sworn enemies attack. As the postmodern liberal-left demonises heterosexual white males, so postmodern conservatism sets out to defend them; and so on.

Seen in this light, the problem with Orbán and other borrowers of Christian clothing is not that they do not believe their own words. Inasmuch as they can mean anything, they genuinely identify as Christians. It is more that when all sources of authority are suspect, the only legitimate recourse is to the self: to identity, and identification.

And the problem with identification is that it remains separate from whatever it identifies as. Just like the modern dating marketplace, where commitment is radically undermined by the ease of swiping right, modern cultural conservatism is radically undermined by the fear that without a reliable foundation of authority, and with more identity-choice options only a click away, we are never fully the thing we claim as our identity.

Without a sense of confidence in the roots of its political legitimacy, conservative values dissolve from concrete obligations to consumer accessories. This in turn is why Orbánist ‘Christian democracy’ and many of its populist cousins find their most compelling realisation not in religious doctrine or observance, but in defining themselves against their outgroup. If “even an atheist is a Christian” then either no one is a Christian, or everyone is. The only way of defining what a Christian is, is in terms of what it is not: foreigners.

But if this is so, then in a postmodern environment, shorn of recourse to authority, cultural conservatism is a waste of energy. It cannot define what it wants. All is insubstantial; there is no exit from the Matrix, nothing left to conserve.

Does it follow from this that those who long for place, limits, love, family, faith and meaning should just sit in the rubble and watch it all burn? I do not think so. But when there is nothing solid to go back to, anyone attracted to what is left of the ideology that used to be called ‘conservative’ needs to find a new name for their yearning. ‘Constructionists’, perhaps. There is a lot of building to do.

This article first appeared at Unherd.

On the censoring of seriousness for children

Our local church runs a monthly service aimed at children, with crafts and without Holy Communion. The team that organises the Friends and Family services is lovely and works very hard to come up with activities and an appealing programme for younger worshippers, and the service is popular with families, many of whom I don’t see at regular services. My daughter (3) loves it.

It’s on the first Sunday of every month, so the first Sunday of Advent coincided with the Friends and Family service. My daughter enjoyed decorating the Christmas tree, making little Christmas crafts and other activities. But one thing puzzled and still puzzles me.

This is one of the songs we were invited to sing: ‘Hee haw, hee haw, doesn’t anybody care? There’s a baby in my dinner and it’s just not fair.’ It’s supposed to be a funny song, from the donkey’s point of view, about the Holy Family in the stable and Jesus in the crib. What I don’t understand is why this should be considered more suitable for children than (say) Away In A Manger.

The former depends, for any kind of impact, on a level of familiarity with the Christmas story that allows you to see it’s a funny retelling and to get the joke. That already makes it more suitable for adults. The latter paints the Christmas scene in simple language and follows it with a prayer that connects the picture with the greater story of the faith it celebrates. The tune is easy to learn and join in with. Why choose the former, with its ironic posture and ugly, difficult tune, over the latter, with its plain language and unforced attitude of devotion?

I’ve wondered for some time what it is about our culture that makes us reluctant to allow children to be serious. Children are naturally reverent: if the adults around them treat something as sacred, even very young children will follow suit without much prompting. This should come as no surprise – the whole world is full of mystery and wonder to a 3-year-old. It is we who so often fail to see this, not the children.

So why do we feel uncomfortable allowing children to experience seriousness? Sacredness? Reverence? How and why have we convinced ourselves that children will become bored or fractious unless even profoundly serious central pillars of our culture, such as the Christmas story, are rendered funny and frivolous?

The only explanation I can come up with is that it reflects an embarrassment among adults, even those who are still observant Christians, about standing quietly in the presence of the sacred. What we teach our children, consciously or unconsciously, is the most unforgiving measure of what we ourselves hold important. But it seems we shift uncomfortably at the thought of a preschool child experiencing the full force of the Christmas story in all its solemnity. Instead we find ourselves couching it in awkward irony, wholly unnecessary for the children but a salve to our own withered sense of the divine.

If it has become generally uncomfortable for us to see reverence in a young child, during Advent, then the Christian faith really is in trouble.

On halal, kosher, religious tolerance and having it both ways

Yesterday I live-tweeted the SDP conference in Leeds. It was a great day with many interesting speakers, but easily the most controversial discussion – and the one that has generated the most reaction in my Twitter mentions since – was the motion to amend SDP policy on non-stun slaughter. Previously, party policy was to ban these methods of slaughter; at the conference, a motion was decisively carried to replace the ban with provisions for strict standards, a requirement that supply not outstrip demand (eg by non-stun slaughter for export), and proper labelling.

I gather that debate around the subject prior to conference was heated. I know at least one person who left the party over the subject. I spoke in favour of the motion despite being personally uncomfortable with such methods of slaughter, on the grounds that an explicitly communitarian party needs to be willing to demonstrate a recognition that religious practice is immensely important to some groups, and to create space for such practices even if we find them personally unappealing.

But once you start making explicit provision for communitarian considerations, the tension between faith and other ethical frameworks is immediately apparent.

The subsequent discussion – and its links into ‘preserve our culture’ groups such as For Britain and Britain First – put me in mind of two brouhahas a little while ago where politicians tried to articulate a position weighing private faith against public mainstream morality. In April 2017, then Lib Dem leader Tim Farron refused to deny that his personal faith held homosexuality to be a sin. In September the same year, Jacob Rees-Mogg made statements on abortion and homosexuality, consistent with Catholic social teaching, that saw him excoriated as ‘a bigot’ and ‘wildly out of step with public opinion’.

Commentators at the time lined up to defend Farron and Rees-Mogg. There was the usual hum from offstage (ie Twitter) about the right to express views in keeping with traditional Christianity without facing punishment from an illiberal liberal elite.

So I find it interesting to see that when it comes to a religious practice from Islam and Judaism – slaughtering animals by slitting their throats, without stunning them first – some of the voices raised most loudly in agreement about the iniquity of ‘You can’t say that’ culture as it bears on Christians today should be perfectly content to support policies that actively militate against the practice of those other faiths. If we are to defend Rees-Mogg and Farron on grounds of religious tolerance, should we not also consider defending halal and kosher slaughter on the same grounds? After all, the core argument of tolerance is not that one tolerates only things that one likes or feels indifferent to, but that it is extended to things one actively dislikes.

It feels to me as though there are two things going on here.

Firstly, the Britain First types who wish to support religious exemptions for Christians but not for Jews or Muslims are not, themselves, Christians for the most part. Rather, they are secular inheritors of the Christian tradition who wish to preserve the structure of that tradition for the benefits it has for some time provided – a fairly stable, prosperous, harmonious society with congenial values – without taking on the obligations of the faith itself. To put it in less fashionable terms, they wish to be redeemed without themselves taking up the cross. For that, in a nutshell, is the argument made by those who argue against ‘illiberal liberalism’ but do so from a perspective that rejects the necessity of faith – any faith, perhaps, or Christianity in particular – in creating the society to which they wish to belong.

We might term it ‘religious utilitarianism’ – a worldview that recognises the utility of faith in delivering certain social goods but takes no position on the veracity or otherwise of the tenets of any faith in particular. Liberal relativism is a kind of equal-opportunities religious utilitarianism that wishes to make space for any and all faiths to provide those goods in a pluralistic way, while the Britain First / Batten-era UKIP version of the same wishes to privilege Christian religious utilitarianism over the more relativistic liberal sort. That is, Britain First types want to keep only the outward forms of Christianity but do not wish anyone else to replace those forms with a more deeply-felt faith of their own.

But if we are to argue for religious tolerance, and for Christianity to play an active rather than a purely decorative role in our society then – the logic dictates – we must either be explicit about repressing other faiths in support of that goal, or else extend the same courtesy to other faiths. The alternative – hiding our hostility to other faiths behind a selectively-applied appeal for religious tolerance only as it pertains to ‘our’ deviations from the liberal consensus – is simply not good enough.

On marriage, tattoos, time and despair

Young people don’t get married. Young people are covered in tattoos. Now that I’m middle-aged, this is the kind of thing it would be tempting to see as evidence that the world is going to the dogs, that we’re facing some sort of terrible moral decline and that the solution is for everyone to buck up and improve their attitude.

I think we are indeed facing a growing cultural crisis, but I’m increasingly of the view that telling young people to buck up wholly misses the point, and that what we are seeing isn’t a deterioration of attitude but an emanation of something more like despair. Two things I’ve read recently prompted this line of thought.

This rather wonderful article from the Institute for Family Studies is worth a read in its own right for a wealth of beautifully phrased observations on marriage. But one paragraph, on the decline of marriage among the young and/or less wealthy, pulled me right up short:

I think the problem that the less wealthy are having [in regards to marriage] is this kind of achievement attitude that we have about marriage—that I can’t get married because I don’t have a stable job; I can’t get married because one of the partners is not employed, and I don’t want to be on the hook for them or a drag on them. I think that the American government, for all that it loves marriage, does not support families very well. The minimum wage here is a joke; people would have to work 25/8 on that to support a family. There’s so little family leave. It’s brutal, especially at the lower end of the wage spectrum. If you don’t work in a knowledge industry, if you’re sort of an hourly employee, it’s incredibly hard to have a family and have children. Johns Hopkins sociologist Andrew Cherlin writes a lot about how the working classes have abandoned marriage partly because it’s an achievement and partly because getting married suggests a plan for the future; it’s an optimistic thing to do. And I think that often people find that they just don’t have enough hope in the future to be able to make that statement…

That is to say, maybe it is not the deliquescing effect of corrupting liberal values that is causing this breakdown in willingness to commit long-term among the young and/or poor. Maybe these demographics are not getting married because they don’t have enough hope for the future to make long-term decisions seem like a good use of energy and resources. Let that sink in. How utterly screwed are we as a society if we’re so incapable of solidarity across generations that anyone young, or less wealthy, is sinking into a kind of future-free despair?

On a similar note, consider tattoos. A recent study reports that

according to numerous measures, those with tattoos, especially visible ones, are more short-sighted and impulsive than the non-tattooed. Almost nothing mitigates these results, neither the motive for the tattoo, the time contemplated before getting tattooed nor the time elapsed since the last tattoo. Even the expressed intention to get a(nother) tattoo predicts increased short-sightedness and helps establish the direction of causality between tattoos and short-sightedness.

Conservatives such as Dalrymple write about tattoos as cultural degradation, with the clear implication that what they evidence is a collective moral decline. But if this study is correct, that is only half right: rather, it points to a rise in short-termism. That could be read as moral decline of a sort. After all, an inability to plan for the future is a serious inhibitor of anyone’s ability to think and act socially, or to exercise the deferred gratification we associate with civilised achievements of all kinds. But could it not also be read as a failure of optimism?

It’s a thought that lands like a ton of bricks in the middle of any temptation I might feel to wag a moralising finger at someone just starting out now on adult life. Maybe each of these tattooed, unmarried, commitment-shy young people is less a weak-chinned scion of all that is good, pissing his or her cultural inheritance up the wall on frivolities, than a despairing soul fallen out the other end of a cultural moment and stuck in their own personal Weimar Republic with no meaningful event horizon and no desire to do anything but dance, drink, fuck and draw on themselves with Biro. If this is the case, then older generations truly have a duty to try and help in some way. What ‘help’ looks like in that context I am less sure, but it is surely on anyone over 35 or so to consider where hope resides, and what duty we have to ensure it is not, like home ownership or a stable job, simply something that people used to have before we all gave up and danced ourselves to a childless, tattooed death.

Can societies survive without blasphemy laws?

So today I was mulling gloomily over the way hate crime laws seem to have seamlessly taken over the function of blasphemy laws in the UK. I decided to look up when blasphemy was abolished as an offence in the country, thinking it might be sometime in the 1970s. Wrong – blasphemy was abolished as an offence in 2008. The acts governing hate crime (the Crime and Disorder Act and the Criminal Justice Act) were added to the statute book in 1998 and 2003 respectively.

The CPS’ own website states that

The police and the CPS have agreed the following definition for identifying and flagging hate crimes: “Any criminal offence which is perceived by the victim or any other person, to be motivated by hostility or prejudice, based on a person’s disability or perceived disability; race or perceived race; or religion or perceived religion; or sexual orientation or perceived sexual orientation or transgender identity or perceived transgender identity.”

These laws have been used in recent times for such diverse purposes as fining a man who taught his girlfriend’s dog to make a Nazi salute and arresting a woman for calling a transgender woman a man.

The common feature of both the blasphemy laws of yore and the hate crime laws of today is that both prohibit speech considered harmful to society’s morals. That society’s morals are no longer situated in a common belief system (such as Christianity) but an atomised, individualistic inner space (as expressed by the definition of hate crime as anything which is perceived by an individual as being such) is neither here nor there. Certain tenets cannot be challenged lest doing so harms the fabric of society.

It’s also neither here nor there that some of those moral tenets are unprovable or unfalsifiable in any objective sense: the Resurrection of Christ, say, or the existence of some magical inner ‘gender identity’. Indeed the more outlandish a protected belief the better, because the function of blasphemy laws is to compel moral obedience, and what better sign of moral obedience than to see people dutifully repeating something that is in no sense objectively true (such as that men can become women) on pain of being punished if they don’t comply?

My argument here isn’t that we should abolish hate crime laws as we did their predecessors, the laws of blasphemy. I don’t want to rant, Spiked-style, about the threat from blasphemy and hate crime laws to free speech so much as I want to ask: have we ever really had free speech? It seems no sooner did we get rid of one set of rules about what you can’t say than we replaced them with another. There were, perhaps, a couple of decades when blasphemy was effectively defunct, the statute remaining in existence, before hate crime came to be. But the collapse of controls on speech for religious reasons is nigh-simultaneous with the rise of controls on speech for social justice/equality reasons. The Human Rights Act 1998 forced blasphemy law to be restrained by the right to free speech; the same year, the Crime and Disorder Act made hateful behaviour toward a victim based on membership (or presumed membership) in a racial or religious group an aggravating factor in sentencing. (Insert chin-stroking emoji here.)

This leads me to suspect that human societies cannot, in fact, survive very long without laws of some kind governing speech. I’d love to see a counter-example. But I’ll be astonished if anyone can point me to a state that has abolished religious blasphemy without replacing it with controls on speech for other reasons, whether (under supposedly atheistic Communism) to forbid speaking against the Dictator, or (under supposedly individualistic, pluralistic liberalism) to forbid speaking against individuals’ notional right to self-define without reference to the collective.

Much as every human represses some aspects of their personality in order to function, every society does so too; it is a foolish or short-lived society that makes no effort to clamp down on behaviours or opinions that pose a threat to what that society considers the good or virtuous life. If that’s the case, is there even any value in trying to fight what feels like a rising tide of authoritarian busybodying keen to tell me what I can and can’t say? Or should I just pile in and make my bid to be on the team who’s in charge of deciding what should or shouldn’t be banned?

Right now, the two groups jostling most energetically for that position in the UK are the proponents of ‘intersectionality’ and the radical Islamists. If Nassim Taleb is correct, and social mores are disproportionately set by tiny ideological minorities purely based on the strength of their conviction, then whether we end up punishing those who assert that men cannot become women or those who draw cartoons of Mohammed will be a straight fight between which of those groups is more determined to blow shit up if they don’t get their way.

I don’t really like the way this argument is going. If I’m right, then social mores in a few decades will bear little resemblance to those of today. And whether they’re structured with reference to authoritarian liberalism or radical Islam, I don’t think I will particularly like their shape. But there’s nothing I can do about it – the moral majority in the country is firmly post-Christian and, as I’ve argued elsewhere, a society that can’t be arsed to defend its moral traditions is guaranteed to see them supplanted by ideologies with more committed adherents. And indeed, the kind of Christianity that did once upon a time get out of bed to defend its moral tenets by any means necessary would probably, in practice, be as repugnant to me as either of the likely moral futures toward which our society is heading.


On Reconstruction: surviving the trauma of postmodernism

I’ve been mentally composing a version of this essay for a long time. I thought perhaps its relevance might have passed, but the explosion in the last few years of postmodern identity politics into the mainstream convinces me that, far from being something that happened briefly to one not very happy undergraduate in the early 00s, the mental distress I experienced as a result of exposure to ‘critical theory’ has expanded to encompass much of contemporary discourse. I don’t claim to have a solution to that, but I want to share how I survived.

I went to a moderately eccentric school by ordinary standards, but for the purposes of this essay we can treat my schooling as a classical education, inasmuch as we learned about great civilisations that came before ours, and this knowledge was treated as important and still relevant to us and to the world and culture of today. Built into the form of the curriculum was a tacit teleology that implied (whether or not it was ever stated) an evolutionary relation of each civilisation to the one that preceded it. It was a narrative that led to where we are now, and the civilisation we currently inhabit.

Imagine my surprise, then, when as an English Literature undergraduate at Oxford in around 2000, I discovered postmodernist thought, and its many schools of critical theory.

By ‘critical theory’ I mean the body of thought emanating initially mostly from France, with Saussure and Derrida, then expanding out to include such figures as Paul de Man, Slavoj Žižek and Judith Butler. Many more names have joined that list since, and taken together I believe it is referred to as ‘cultural studies’ today, or, in the words of the Sokal Squared hoaxsters, ‘grievance studies’. Back in 2002 at Oxford, critical theory was a looming presence at the edge of the arts but seemed most pertinent to the study of literature; it has subsequently, I gather, swallowed most of the humanities and is laying siege to the sciences as I write.

But I digress. The central insight of this discipline was the destabilising one, and that I think has not changed. To summarise: Saussure proposed that instead of treating language as transparent, its meaning rising off the page without any need for elucidation, we should split the linguistic sign into ‘signifier’ and ‘signified’. That is, what a word means is separable from the word that means it. We can thus, he argued, institute a new discipline of ‘semiotics’: the study of signs – a study that reaches far beyond language and was immediately influential in the social sciences.

This insight was developed by Jacques Derrida, whose simple but devastating observation was that if this is the case, we cannot define any given ‘signified’ except with reference to further signs, which then in turn themselves require definition with reference to further signs. It’s turtles all the way down. We have no means, through language, of arriving at any kind of truth that we are able to experience directly. Furthermore, the concerted effort by centuries of culture since the Enlightenment to obscure the fact that it’s turtles all the way down is in fact a cunning attempt to shore up vested interests, and to conceal the operations of power. Recourses to authority are null and void. There is no solid foundation, no God, no truth, no authority. Only power, and a form of consensus reality arrived at through the glacial accretion of a billion tiny operations of power that have, in sum, decreed that the world should be thus and not thus.

I freely admit that I was a bit loopy anyway when I reached that point on my reading list, for unrelated personal reasons. But this insight hit me like a freight train. I spent most of one Trinity term feeling as though I was in the midst of a psychotic experience. Instead of seeing the ‘dreaming spires’ around me as the accumulation of centuries of carefully-tended tradition, a representation in architecture of the ideal of the university as a custodian of the best that has been thought and said to date, I saw each building as a kind of nightmarish extrusion into physical space of power structures that were indifferent if not hostile to me as a sexual minority and a woman. I felt suffocated: stifled by a kind of blaring architectural triumphalism that declared at every corner, with every church tower, every statue of every great man, ‘YOU CANNOT CHANGE ANY OF THIS, WE WILL ALWAYS WIN’. And stifled again by the nihilistic twist postmodernism places on this reading of the world, and culture, in that it assures us that there is nowhere to stand outside the push and pull of power. There is nothing outside the text. So in trying to challenge these operations of power, we will probably just end up re-inscribing them.

By now you’re probably thinking ‘wow, she sounds nuts’. Well yes, I was a bit at that point, as I said before. But I describe this experience in detail because 1) it was so distressing and 2) the state I spent the next few years in following my fall from the Eden of pre-post-modernism[1] sounded, in my inner monologue, so similar to what I read of the toxic ‘social justice’ debate that rolls around our social media some 15 or so years on that I feel they must be related. What if today’s SJWs are in fact acting out a traumatic state of mind engendered by exposure to ‘cultural studies’ at university? If that is the case, then there may be someone out there who will find some comfort in my story of how I recovered from that experience to the point where I was able to make any decisions at all.

Because make no mistake, the Fall engendered by internalising the idea that ‘there is nothing outside the text’ is a horrible place to be. Consider that iconic scene in the Matrix where Neo wakes from the dream he believed to be normal life, in a slime-covered capsule, to discover that he and the rest of the human species are in fact mindless peons farmed by forces beyond their power to change. Then bin the rest of the Matrix franchise, shoot Morpheus and the rest of the resistance, end the film with Neo back in his pod as a human generator, just without his connection to the Matrix. Eyes staring helplessly into the machine-farm abyss. That’s a bit how it feels.

Forget political radicalism. There’s nothing left, this worldview says, but a continuous action of ‘disruption’ from within the system. There is no way to change the world for the better because what even is the better anyway? All you have left available to you is a kind of carping from the sidelines. Calling out particularly brazen efforts by the collective voice of consensus reality to perpetuate itself in its current form and to silence potentially disruptive voices. Maybe trying to widen the range of voices permitted to contribute to the operations of power. Maybe you can see now how this could be a mindset conducive to (for example) the contemporary popularity of ‘call-out culture’ and the quixotic obsession of public discourse with ensuring the identity categories of figures in public life and Hollywood films precisely replicate their demographic proportions in the population at large.

No truth, no authority, no meaning, no means of striving for the good without producing more of the same. Just power. For the longest time I couldn’t find a way out of the dragging nihilism engendered by this worldview. Eventually though it occurred to me that I just didn’t have to be absolutist about it. I just had to be a bit more pragmatic. So what if we can never be wholly certain that what we mean to say to someone else is exactly what they hear, because every definition we use in theory needs to be defined in its turn, and so on ad infinitum? If I ask my friend to pass the salt, and he passes the salt, I really don’t need to waste energy mulling over the social forces underlying the particular rituals around eating and table manners that obtain in my current cultural context. I thank my friend and add some salt to my dinner.

This is a tiny example but I decided to try and apply this principle to life in general. If I needed to get on with something, instead of getting bogged down, within every social context and every situation, with the subterranean operations of power, patriarchy, compulsory heterosexuality etc etc etc, I’d try and bracket all that stuff and act as if things were still as stable as they were before the Fall. I coined the term ‘temporary certainties’ for this state of mind. It took a bit of mental effort (and you probably still think I sound mad) but far less mental effort than inwardly deconstructing every utterance, object and situation I found myself in for signs of Western-colonialist cisheteropatriarchal blahdeblah.

Gradually, the psychosis waned. Now, 10 or so years on from arriving at this solution, it’s still working for me. The world can never be as solid-seeming as it was before my Fall. Truth still seems a bit relative depending on where one is standing. But the important insight is that many categories, many tropes, objects and structures, are stable enough to treat them ‘as if’ they were solid in the pre-post-modernist sense. You don’t need to waste time deconstructing everything; indeed, trying to do so is a fast track to a sense of perpetual victimisation and bitter, impotent rage. And trying to build any kind of transformative politics on a foundation of perpetual victimisation and bitter, impotent rage is not going to turn out as a net force for good, however radically you relativise the notion of ‘good’.

This doesn’t have to mean buying in wholesale to things as they are and becoming a cheerleader for keeping things unchanged. But to anyone currently struggling to focus in a world that seems hostile and composed entirely of operations of power, I say: pick your battles. Much of the world is still good (for a temporarily certain value of good), and many people are kind and well-meaning. Creating new interpersonal dynamics around the anxious effort to avoid the accidental replication in ordinary speech of sociocultural dynamics you find oppressive (aka ‘microaggressions’) may not, in the end, make for a more functional society. It’s possible to treat as a temporary certainty the hypothesis that in asking ‘Where are you from?’ someone is not in fact unconsciously othering you by virtue of your apparent ethnic difference, but simply – from maybe a naïve position in a social background that does not include many ethnic minorities – seeking to know more about you, in order to befriend you.

The beauty of a temporary certainty is that, choosing such a vantage point, we can say of any given cultural phenomenon (the institution of marriage, say) ‘we are where we are’. We are no longer stuck with the dilemma of either pretending to buy into something as an absolute that we see as contingent and culturally constructed, or else setting ourselves pointlessly in opposition to it, protesting that as it is culturally constructed we should make all efforts to disrupt or transform it into some form that might appear more ‘just’. Instead, we can accept that despite this phenomenon being, strictly speaking, contingent, it remains stable enough that we can and should find a pragmatic relation to it. (In my case, that was to get married. One of the best decisions I ever made.)

You may object that my argument here amounts to a strategy for recouping something for cultural conservatism from the rubble of the post-modernist project. I beg to differ. Rather, what I’m advocating here is more along the lines of a plea to those who see themselves as political radicals to think deeply about what really matters and to focus on that. As it stands, ‘social justice’ social media suggests that thanks to the post-Fall malaise I postulate as infecting most of our young people, radical politics is collapsing into a kind of nihilistic shit-slinging, incapable of going beyond critiquing the contingency of what it seeks to change in order to advocate for anything better.

[1] I don’t mean modernism, hence the clumsy construction. I mean something more like ‘the popular twentieth-century Enlightenment-ish consensus about truth, reason and meaning’.