How To Find Meaning When Everything Is Power

While we live, we all present different facets of ourselves to different people. Whether in our friendships, work, or family, or at different times in our lives, we encounter others, and each remembers us slightly differently, according to their perspective.

While we live, our physical presence holds that multiplicity together. After we die, though, the memories begin to come apart. When my step-grandfather married my grandmother, he already had two children from his first marriage. But his first wife had already left him and moved to a different country, and so he became stepfather to my mother and aunts instead.

He was a big character: an aristocrat of the Greatest Generation, the subject of several films about his war exploits, a well-loved farmer, and patriarch to two families. At his funeral, the many facets of his life were already coming apart. Each version of his memory was fiercely defended by the mourner to whom it belonged. Long-standing quarrels, no longer held in check by his living presence, began trickling back into the open. It was not an easy day.

Today, we are all mourners at the funeral of a character on a scale that dwarfs even my roaring, hectoring, pedantic, affectionate, and irascible step-grandfather. We are gathered to mourn teleology itself—the belief that life has objective meaning and direction. What we call the culture war is the aggregate of those quarrels now breaking out between the gathered mourners over their divergent memories of the deceased.

Were we progressing toward universal peace, justice, and equality? Was it the resurrection and the life of the world to come? Perhaps it was the end of history in universal liberal democracy? We cannot agree.

The death of teleology represents a collective cultural trauma that accounts for, among other things, the increasingly unhinged debates around social justice within elite universities, and the reactive phenomenon of the aggressively transgressive online far-right.

But it doesn’t have to be like this. Post-structuralism killed teleology, but it did so in error, by taking a wrong turn; it is this wrong turn that has left us so traumatized.

What is commonly referred to as postmodernism is not in fact post-modern but rather represents a last-ditch attempt by modernism to resist the implications of the post-structuralist mindset whose inevitability is now indicated by fields as diverse as physics, ecology, and psychotherapy.

Deconstruction is not the end: reconstruction is possible, indeed essential.

To situate myself a little in this story: I belong to a generation that is marginal, facing two directions, in several ways that are relevant to my argument. Born in 1979, I sit at the tail end of Generation X. I am old enough to remember the days before the internet, but young enough to be more or less a digital native. I got my first cell phone and email address as an undergraduate at Oxford. I researched my undergrad essays sitting in actual libraries reading physical books, but wrote them on a word processor. I can remember life before social media.

I also received, prior to undergraduate life, a recognizably classical education. This was, in the old-fashioned way, designed to deliver a whistle-stop tour of the march of civilizations from Ancient Egypt via the classical era to Western Christendom, with at least a vague grasp of the cultural and historical highlights of each.

The overall impression delivered was of an evolution of societies, consciousnesses, and cultures over a vast sweep of time and different human epochs that nonetheless seemed to have at least some narrative continuity and directionality. Everything else we learned seemed at least to an extent framed by that sense of situatedness within a larger narrative of human cultural evolution, whose direction was a mystery but did at least seem to be headed somewhere.

Then, in my first year as an English Literature undergraduate, I encountered critical theory—and the entire organizing principle for my understanding of reality fell apart.

To summarize: Saussure proposed that instead of treating language as transparent, its meaning rising off the page without any need for elucidation, we should split the linguistic sign into ‘signifier’ and ‘signified.’ That is, what a word means is separable from the word that means it. We can thus, he argued, institute a new discipline of ‘semiotics’: the study of signs—a study that reaches far beyond language and was immediately influential in the social sciences.

This insight was developed by Jacques Derrida, whose simple but devastating observation was that if this is the case, we cannot define any given ‘signified’ except with reference to further signs, which then in turn themselves require definition with reference to further signs. It’s turtles all the way down. We have no means, through language, of arriving at any kind of truth that we are able to experience directly. Furthermore, the concerted effort by millennia of culture to obscure the fact that it’s turtles all the way down is in fact a cunning one, designed to shore up entrenched interests and to conceal the operations of power.

In this view, recourses to authority are null and void. There is no solid foundation, no God, no truth, no authority. Only power, and a form of consensus reality arrived at through the glacial accretion of a billion tiny operations of power that have, in sum, decreed that the world should be thus and not thus.

For me, the shift from a sense of the world as having some stable narrative trajectory to this perspective, in which meanings were not only networked but fundamentally without foundation, was deeply disturbing. It landed like a psychotic experience. Overnight, the hallowed architecture of Oxford University went from seeming like a benign repository of traditions within which I could find my place, to a hostile incursion into my consciousness of something phallic, domineering, and authoritarian. I remember describing to a friend how, as a woman and sexual minority, I suddenly experienced the ‘dreaming spires’ as ‘barbed penises straining to penetrate the sky.’

I wish I could say it passed, but it did not. What did happen, though, after I left, was that I found an accommodation with the loss of teleology and objectivity from my frame of reference. I did this by theorizing that if to posit anything at all is an act of power, then it was one I was also entitled to attempt. All cognition, meaning-making, interpretation, and perception are conceptually laden, socially mediated acts. It is impossible to ground even perception in anything but action and thus power. But so be it. We live in a society and participate in the flow of power all the time. I developed the idea of ‘temporary certainties’: even if meanings are not stable, many of them are stable enough for me to act as if they were solid in the pre-Derridean sense. I did not have to deconstruct every minuscule interaction for the operations of power it encoded.

In an effort to evade the monstrous pervasiveness of systems of domination and submission, I experimented with radically non-hierarchical forms of living, power exchange sexualities, non-binary gender presentation. I tried my own operations of power: I changed my name to Sebastian, to see what it felt like, then settled for a while on Sebastian Mary. I co-founded a startup with friends, in which we tried to avoid having a management hierarchy.

My accommodation kind of worked, for a while. But it did not last. It is all very well to theorize about non-hierarchical forms of organization, but in order to get stuff done you need a chain of accountability. And the worst sort of hierarchies have a habit of emerging, too, especially in social situations where they are intentionally obscured or deprecated. Communes, collaborative projects, and the like all find their leaders and followers, or their tyrants and victims. My increasing bitterness as I learned this, in the course of trying to get somewhere with the startup, made me so obnoxious as a co-worker that eventually I was expelled from the project, which was, by then, failing anyway.

With that rupture, I lost my social circle, my best friend, and my entire carefully reassembled working theory for how to navigate the rubble of broken teleologies that was my adult life in the ‘00s. Concurrently, the Great Crash of 2008 destroyed the equally teleological fantasy of global liberal-democratic hegemony under international capitalism that had powered the Iraq invasion along with the triumphalism of the Blair years.

In the wreckage, though, something wonderful happened. Two wonderful things, actually. First, I met the man whom I would eventually marry, and by degrees let go of the belief that in order to sustain my integrity as a person I had to reject any form of stable loving relationship to an Other in favor of multiple, overlapping, unstable platonic, sexual, or ambiguous friendships. Second, I decided I needed to learn how to do something more useful than floating around London curating experimental art events and screwing up entrepreneurship, and went back to school to train as a psychotherapist.

In the course of that study, I learned where postmodernism took its wrong turn. Implicit in the post-structuralist theories taught to every young humanities student at university is the idea that because meanings have no singular objectively correct grounding, they are therefore of no value. Also implicit is the idea that because of this, no satisfying, authentic or truthful encounter with the Other is ever possible—only an endless recursive hall of mirrors composed either of our own anguished reflections or the invasive pressure against our psyches of another’s desire.

In studying psychotherapy, though, I came to realize that while the same post-structuralist decentering of the self took place in psychoanalytic theory between Freud and his contemporary descendants, therapists had—because they have to—rejected the idea that we can never encounter the other. While much contemporary analytic theory acknowledges the need to excavate and make space for the operations of overdetermined systems such as race, class, or sex, it does not automatically follow from the presence of those things that intersubjective contact and meaningful connection cannot take place.

Just as post-structuralism decentered the observer, intersubjective psychoanalysis radically decenters the analyst. But an intersubjective understanding of the relational space as co-created by client and therapist does not preclude the possibility of therapeutic work taking place. And this in turn speaks powerfully to a claim that however muddled, muddied, and overdetermined our encounters with the other may be, they still contain the potential to be not just benign but real, true, and transformative.

I suppose I could deconstruct that claim in turn. But I have experienced its truth both as client and also, in the course of my work, as therapist. Through intersubjective encounters in the consulting room, I have been transformed, and have transformed in turn. From this vantage point, the claim of post-structuralism to render meaningless all semiotic systems, and reveal as brute operations of power all encounters with the other, seems not just mistaken but (in the Kleinian sense) paranoid-schizoid. It is the tantrum of a child who, on realizing they cannot have exactly what they want, refuses to have even the next best thing and dismisses everything and everyone as evil.

The alternative to this paranoid-schizoid repudiation of meaning is not to reject meaning as dead or hopelessly suborned by power, but to accept that we are enmeshed, shaped and in turn helping to shape networks of meaning as part of a dynamic dialogue. We are nodes in the social and semiotic system. As such, even the act of contemplating those systems of meaning will have some tiny effect on them. When Derrida said ‘Il n’y a pas d’hors-texte’—”there is no outside-text,” though commonly mistranslated as “there is nothing outside the text”—I took it to mean meaning itself was hopelessly corrupted, and objectivity a bust. Today, I see it more as a radical decentering of my selfhood that opens up new, vibrant possibilities of connectedness.

If we read ‘text’ in the biosemiotic sense as fractal, multi-dimensional, and interconnected systems of signification, both of human culture and the natural world (inasmuch as those things can even be separated), then indeed there is nothing outside the text. But that does not mean the text is wholly illegible, or that it does not exist—simply that in reading, we affect what it says, and in return it changes us. We are unavoidably caught up in perspectival context, without truly objective ground to stand on. But objectivity was always an implicit abdication and obscuration of power and the necessity of choice. It was the idea that we could calculate what to do from objective factors that we didn’t have to take responsibility for. We do have to take responsibility, but that can mean a proactive positive acceptance. We can step up to the challenge of power and perspective, rather than reject it out of guilt and trauma.

Seen thus, a living post-structuralism is a philosophy not of radical alienation but radical interconnection. It is not the death of stable meaning, but the moment a system we thought rigid, immovable, and observable from the outside stirred and opened its eyes to return our gaze. It is also increasingly supported by contemporary studies in—for example—ecology and theoretical physics. If even the hardest of hard sciences now advances a theory of reality that embraces radical uncertainty and the implication of the observer in what is observed, then surely the humanities can do so as well without giving up on meaning altogether?

The great insight of postmodernism is that meaning is unstable, and mediated in infinite complexity by systems of power in which we are decentered but implicated. But the response to this insight from the humanities has been a furious rearguard action by the ideology of fixed meanings that postmodernism itself displaced. Enlightenment rationalism is to postmodernism as Newtonian physics is to general relativity, and it is in the ‘social justice’ ideologies now increasingly hegemonic in elite institutions that Enlightenment rationalism is seeking to make its last stand against the new philosophy of radical interconnection.

If postmodernism claimed that all meanings are unstable, socially constructed, and held in place by operations of power, the defining characteristic of the anti-postmodernism that masquerades as contemporary postmodern thought is its determination to apply that analysis to everything except its own categories and hierarchies. In effect, this system of thought seeks to recoup semiotic stability by replacing the old ‘bad’ hierarchies of Western, patriarchal, heterosexual, etc. dominance with new ‘good’ ones.

All activities, goes the claim, are tainted by the toxic operations of overdetermined systems of oppressive social meaning which speak through us and over us regardless of what little agency we might imagine ourselves to have. So in the political framework of anti-postmodernism, fixed immutable characteristics such as race assign their bearers a position on a rigid hierarchy of ‘marginalization’ which in turn influences their status within the system. The legitimacy of the new, fixed hierarchies of marginalization-as-status rests, we are told, in how they correct for, deconstruct, and overcome previously imposed power inequalities. The chief form of political action is a wholesale effort to dismantle these former inequalities, wherever they may be found.

But in practice, the demand that all historically imposed power relations be deconstructed unwinds the legitimacy of any possible social relationship or institution. All meanings necessitate the exclusion of what-is-not-meant. Making absolute inclusion a central political demand is thus in effect a call for the abolition of meaning. We are never told what form the good life might take, should this project of semiocide ever be completed. But one thing is clear: it can have no social or intersubjective dimension, for that would imply shared meanings, and with shared meanings the operations of power—exclusion, definition, the imposition of significations not wholly self-chosen—inescapably return, as do hierarchies. In this sense, the push for semiocide in the name of social justice is a project whose ultimate aim is an individuation so total it precludes any form of encounter with the Other, except in a multidirectional contest for domination that none can be permitted to win.

From other vantage points within the culture war, the reaction to this doctrine is often mockery, for the doctrine’s self-absorption, incoherence or preoccupation with language and ‘luxury beliefs.’ This is mistaken. Its adherents are motivated by compassionate idealism, but have been misled by a destructive falsehood and are in many cases deeply unhappy. The decentering of the Enlightenment subject brings with it an invitation to a more fluid experience of selfhood as radically inseparable from and in a process of co-creation with all of reality, and yes, with the power structures of the society in which we live. But the contemporary critical theory I am calling anti-postmodernism shows young people this vision of beauty, only to dismiss it as a pack of tendentious, self-interested lies.

It is no wonder today’s young people fling themselves miserably against the bars of whatever structures of meaning are still standing in an effort to knock them down—or perhaps to prop themselves up. Whether it is the SJWs, the frog memers, or the ‘failson’ ironists, they can smell the fresh breeze of meaning, less linear than the rationalists would like but nonetheless real, and yet they have been told they cannot have it, because it is not there, or else comprises only violence and hostility. So, they fight over the broken rubble of the Enlightenment, or with each other, or their ancestors, and starve in the midst of a banquet.

To recap, then: what gets called ‘postmodernism’ today is not postmodernism but the last spasm of the worldview displaced by postmodernism, which saw meanings as fixed, knowable, and amenable to human mastery. This anti-postmodernism diverts young people from the astonishing richness of a systems-based, decentered engagement with the world’s semiotic complexity by seeking the only remaining form of mastery it can imagine: a defensive assault on meaning itself.

Instead of embracing the fluidity of systems of meaning, and each subject’s situatedness within that system, young people are taught that the only legitimate foundation for political action—or indeed any kind of social participation—is atomized selfhood, constructed from within and defended with narcissistic brittleness. They are taught to see themselves as solely responsible for discovering, curating, optimizing and presenting this supposedly ‘authentic’ self as their central marketable asset. But they also learn that it is continually under assault by hostile forces of oppressive social meaning whose aim is to keep them—or others like them, or someone anyway—marginalized, abject and on the back foot.

Within this system, it follows that the central locus of political activism must be to disrupt these oppressive forces that marginalize unfavored groups, so as to advance the project of collective liberation to ‘be our authentic selves.’ This is not just a political project but an existential one, for along with marginalizing unfavored groups these forces impose unlooked-for and oppressively overdetermined social meanings on each of us, undermining each young person’s quest for authentic selfhood. Individuals caught up in this worldview genuinely believe they are agitating not just for the liberation of the oppressed but for their very existence.

The fixation of today’s elite graduates on ‘validation’ of ‘identities’ may seem frivolous to older generations. But within a worldview that frames all forms of social meaning as oppressive by definition, the very gaze of the Other is an unacceptable attack on the pristine territory of the self. If we reject the genuinely postmodern ethic of radical semiotic interconnection, and our interwovenness with structures of meaning in society and the natural world, then the movement of these structures in, on and within our individual identities comes to be experienced as violence.

This perspective exists in tormented symbiosis with an Other it can neither tolerate, nor yet wholly dispense with. For the paradox is that the invasive gaze of the Other, laden with unwanted and oppressive shared meanings, is simultaneously the source of suffering and salvation. The gaze of the Other is experienced as a hostile and violent invasion, forever imposing unlooked-for social meanings that constrain the liberty of each sacred self. But it is also the only source of the ‘validation’ that will reassure each individual that their self-creation project is real, true and accepted.

The solution, within this worldview, is an ever more desperate effort (again paranoid-schizoid in the Kleinian sense) to control the thoughts of the Other. We see this in politicized campaigns to control speech in the service of identities. But as any psychotherapist (or parent) will tell you, trying to control the inner life of another is a project that in normal circumstances seems achievable (or indeed desirable) only to young children or the mentally disturbed. That it should become a central political desideratum for a generation of elite young people does not bode well for the future health of public life.

When I started my undergraduate degree 20 years ago, critical theory was one epistemology among several, which we learned about as it were ‘from the outside’ rather than as a framework for understanding other philosophies. Though it affected me severely, in ways I have already described, most of my contemporaries simply learned about the ideas and were largely unaffected. Today, though, this epistemology has eaten and digested the humanities and begun to nibble on science and even mathematics. As a result, for today’s young people, it is increasingly difficult to find a vantage point outside its political ontology from which to evaluate its operations.

We should not be surprised, then, that mental health issues have skyrocketed in elite college-age populations. They are being taught to believe, as a foundational framework for understanding the world, that acceptance in the gaze of the Other is key to validating a selfhood they alone are responsible for creating, curating and optimizing. But they are also being taught that all shared meanings—in other words, anything conferred by the gaze of the Other—represent a hostile act of violence. How is any young adult meant to navigate this catch-22?

It is a mistake to dismiss this as narcissistic—or, at least, to ignore the suffering of those trapped in this bind. To be ‘defined’ by something other than our own desire is in this system to be injured, parts of our authentic self mauled or amputated, whether by social meanings we did not choose or the givens of our embodied existence. This is a phenomenally cruel thing to teach young people, as it leaves them feeling perpetually oppressed by the givens of existence itself.

This analysis also sheds light on the crisis of elite purpose and leadership Natalia Dashan described in her Palladium piece last year. If shared meanings are not only unavailable but actively hostile, how is any young person meant to formulate a legitimate rationale for stepping up? No wonder so many elite graduates dismiss even the possibility of public service in favor either of pecuniary self-interest in finance or tech, or else joining the ranks of activist-bureaucrats seeking to advance the destruction of shared meanings in the name of total inclusion.

But as societies around the globe struggle to get to grips with coronavirus, we no longer have the luxury of sitting about like Shakespeare’s Richard II, mourning a broken model of meaning as the world disintegrates around us. Facing the deaths perhaps of loved ones, and certainly of everything we thought of as ‘normal’ until a few weeks ago, destroying what is left of our structures of social meaning in the name of liberation politics or frog-meme irony is an indulgence we cannot afford. The project of reconstruction is urgent. This project is both an inner and an outer one: reconstruction of an inner life capable of navigating social meanings without experiencing them as violence, and also of our willingness to participate in the external, political analogue of those social meanings, namely institutions, political structures and—yes—hierarchies.

This is not to say that we should shrug at unjust systems of domination. The ‘social justice’ excavation of ‘implicit bias’ is not wholly without merit. It is on all of us to make sincere efforts to meet the Other to the best of our abilities as we find it, and not simply reduce the world out there to our preconceptions. But this effort cannot be so all-encompassing as to destroy what systems of shared meaning we have left. Nor can we afford to see it grind common endeavor to a standstill.

No one knows yet what the world will look like as we emerge from the political and economic convulsions engendered by this global pandemic. One thing is clear, though: the ethic of radically individualist atomization implicit in ‘social justice’ campaigns for the destruction of all shared meaning is woefully inadequate to the challenges we now face. Through its lethal spread and infectiousness, coronavirus has demonstrated vividly how our fates remain bound to one another in infinitely complex ways, however loudly we may assert our right to self-authorship. Faced with the persistence of our social, biological, semiotic, economic, and ecological interconnectedness, we would do well to embrace and make a virtue of it, to salvage those shared meanings that remain to us, and begin the process of building new ones that will sustain us into the future.

This article was originally published at Palladium magazine.

The Irreligious Right

Today’s hottest property: young fogeys. Blue Labour hailed Boris Johnson’s landslide election victory as a rebellion by the country’s ‘culturally conservative’ silent majority. A new conservative magazine seems to appear every week. We have even seen a youth movement for the revival of socially conservative values popping up in that bastion of modern double liberalism, the Conservative Party.

What do they all want? At the more wonkish end of the debate, the argument is broadly that the political push throughout the twentieth century for ever greater social and economic freedom has brought many benefits, but that these have been unevenly distributed and are now reaching the point of diminishing returns.

The pursuit of ever greater freedom and individualism, this strand of thought argues, has delivered rising wealth while hollowing out working-class communities; liberated some women while forcing others to work a double shift and abandon the young and old in substandard care; and provided an infinitude of consumer choice, but at the cost of mounting ecological damage. Under the sign of radical individualism, the new communitarians argue, we are all becoming more solitary and self-absorbed. Even charitable giving seems to be in unstoppable decline.

But what, in practice, are the new social conservatives seeking to conserve? Calls for a revival of cultural conservatism, many in the name of Christian values, seem often on closer examination oddly insubstantial. In 2017, UKIP’s leader-for-that-week Stephen Crowther said that the UK is a Christian country, “and we intend to stay that way.” But for Crowther, being a Christian country does not seem to impose any obligation to actually be Christian: 

including Christian in our list [of principles] does not imply any requirement for individual faith, but it reflects the Judeo-Christian classical and enlightenment origins on which our laws, our social systems and our cultural norms have been built over two millennia.

Elsewhere in Europe, Hungarian Prime Minister Viktor Orbán describes his brand of authoritarian, identity-oriented politics as ‘Christian democracy’. Only a minority of Hungarians go to church every week – 56% of the country identifies as Catholic, though only 12% attend church regularly – but the identifier ‘Christian’ has nonetheless become central to Orbán’s politics.

Much as Crowther did, the Orbán-supporting Bishop of Szeged, László Kiss-Rigó, bridges this gap with a vague, cultural definition of what actually constitutes a ‘Christian’: “In Europe, even an atheist is a Christian”, he said. It turns out that being ‘Christian’ is less about prayer or doctrine than ‘values’: “We are very happy that there are a few politicians like Orbán and Trump who really represent those values which we Christians believe to be important.”

What exactly are these values, then? Attendees at anti-Islam Pegida rallies in Germany carry crosses and sing carols. Italian right-winger Matteo Salvini punctuates anti-immigration rhetoric by brandishing a rosary, drawing criticism from the very Catholic faith whose symbols he invokes. Try to pin down any actual values this form of Christianity might require of its adherents, and matters are much less clear.

Even those whose stated desire is to defend the place of faith in public and political life seem keen that the faith itself stop short of imposing actual obligations. To take a more moderate example of the new cultural conservatism, the Social Democratic Party took a broadly post-liberal, culturally conservative stance in its 2018 relaunch. The New Declaration made an energetic defence of our right to hold even illiberal religious views openly in public life:

Citizens holding a traditional, patriotic or religious outlook are often bullied and marginalised, stifling the open debate upon which a free and democratic society depends. 

Then, about a year later, the SDP lost its only donor over a bitter intra-party dispute about whether or not it should be party policy to ban halal slaughter – a position markedly at odds with the party’s previous defence of religious pluralism. And when the Church of England recently reiterated its long-held position on sex and marriage, prominent SDP member Patrick O’Flynn took to the pages of the Daily Express to mock ‘the otherworldliness of these Men of God’. Instead of insisting on ‘out of touch’ doctrine, O’Flynn suggested, the Church should adjust its doctrines on sex and marriage to reflect the values of young people, the better to attract them to weekly worship.

In this view of faith, theological positions do not reflect any kind of truth-claim but should be emergent properties of the aggregate ethical positions held by the members of that church. Less ‘Christian democracy’ than ‘democratic Christianity’: whatever the congregants believe becomes the doctrine of the church.

From a religious perspective this makes no sense. To the believer, doctrine is handed down from God Himself. The thought of God’s word being subject to plebiscite is absurd, if not outright blasphemous.

This debate reveals the missing piece in today’s would-be conservative revival. Where do our values come from? What is the proper source of political authority? Progressives gesture at natural rights or an imagined future utopia, but for anyone who remains unconvinced that we are all on a journey somewhere wonderful, some other authority is required.

Edmund Burke suggested the answer lay in a blend of deference to tradition and God’s grand design, tempered by carefully constrained democratic institutions; his Savoyard contemporary, Joseph de Maistre, argued that the only proper form of authority lay in God’s will, delivered via the Pope and an absolute monarch.

The history of modernity has unfolded in the tensions between these competing understandings of political authority. ‘The will of God’, the will of ‘the People’, and the grand designs of various utopias have variously been used to justify all manner of enterprises, with outcomes from the magnificent to the horrific. But our present political difficulties may be in part down to a growing popular discomfort with accepting the legitimacy of any of the above.

Since the election of Donald Trump and the vote to leave the EU, there has been a low but persistent rumble from our moral betters that democracy should maybe have its wings clipped a little, to stop stupid proles making bad decisions. A degree of wing-clipping has in fact long since taken place: John Gray has discussed recently in these pages the way the language and legal mechanism of ‘rights’ is used to shift entire areas of public life from democratic debate to the dry realm of unelected lawyers and judges. But if authority does not reside in the will of the people, nor does it reside with God: it is difficult to imagine a mainstream British politician claiming moral authority on the basis of divine will without being roundly pilloried.

Progress and human rights, then? Every young person who passes through a modern university is taught in no uncertain terms that totalising metanarratives are suspect. At best, they are power moves. Whenever you find one you should ask cui bono? In the case of universal human rights, the answer is probably: lawyers.

This leaves would-be conservatives in a bind. If (with a few honourable exceptions still holding out for direct Vatican rule) political authority rests not in tradition (too restrictive on personal liberty) or democracy (probably rigged) or even God (don’t tell ME what to do!) or even in the lawyers, then what is left? Politics professor Matt McManus argues that the result is a postmodernism of the right as well as of the left: a series of nested calls for a return to authority, tradition and culture that all, on closer inspection, turn out to be largely delivery mechanisms for adversarial but hollow identity politics.

Having come unmoored from its roots either in the past, the divine, or the popular will, McManus suggests that this postmodern conservatism has warped a Burkean belief in tradition into a kind of moral cosplay whose main purpose is less seeking the good life than making a noisy defence of whichever identities its sworn enemies attack. As the postmodern liberal-left demonises heterosexual white males, so postmodern conservatism sets out to defend them; and so on.

Seen in this light, the problem with Orbán and other borrowers of Christian clothing is not that they do not believe their own words. Inasmuch as they can mean anything, they genuinely identify as Christians. It is more that when all sources of authority are suspect, the only legitimate recourse is to the self: to identity, and identification.

And the problem with identification is that it remains separate from whatever it identifies as. Just like the modern dating marketplace, where commitment is radically undermined by the ease of swiping right, modern cultural conservatism is radically undermined by the fear that without a reliable foundation of authority, and with more identity-choice options only a click away, we are never fully the thing we claim as our identity.

Without a sense of confidence in the roots of its political legitimacy, conservative values dissolve from concrete obligations to consumer accessories. This in turn is why Orbánist ‘Christian democracy’ and many of its populist cousins find their most compelling realisation not in religious doctrine or observance, but in defining themselves against their outgroup. If “even an atheist is a Christian” then either no one is a Christian, or everyone is. The only way of defining what a Christian is, is in terms of what it is not: foreigners.

But if this is so, then in a postmodern environment, shorn of recourse to authority, cultural conservatism is a waste of energy. It cannot define what it wants. All is insubstantial; there is no exit from the Matrix, nothing left to conserve.

Does it follow from this that those who long for place, limits, love, family, faith and meaning should just sit in the rubble and watch it all burn? I do not think so. But when there is nothing solid to go back to, anyone attracted to what is left of the ideology that used to be called ‘conservative’ needs to find a new name for their yearning. ‘Constructionists’, perhaps. There is a lot of building to do.

This article first appeared at Unherd.

On the censoring of seriousness for children

Our local church runs a monthly service aimed at children, with crafts and without Holy Communion. The team that organises the Friends and Family services is lovely and works very hard to come up with activities and an appealing programme for younger worshippers, and the service is popular with families, many of whom I don’t see at regular services. My daughter (3) loves it.

It’s on the first Sunday of every month, so the first Sunday of Advent coincided with the Friends and Family service. My daughter enjoyed decorating the Christmas tree, making little Christmas crafts and other activities. But one thing puzzled and still puzzles me.

This is one of the songs we were invited to sing. ‘Hee haw, hee haw, doesn’t anybody care? There’s a baby in my dinner and it’s just not fair.’ It’s supposed to be a funny song, from the donkey’s point of view, about the Holy Family in the stable and Jesus in the crib. What I don’t understand is why this should be considered more suitable for children than (say) Away In A Manger.

The former depends, for any kind of impact, on a level of familiarity with the Christmas story that allows you to see it’s a funny retelling and to get the joke. That already makes it more suitable for adults. The latter paints the Christmas scene in simple language and follows it with a prayer that connects the picture with the greater story of the faith it celebrates. The tune is easy to learn and join in with. Why choose the former, with its ironic posture and ugly, difficult tune, over the latter with its plain language and unforced attitude of devotion?

I’ve wondered for some time what it is about our culture that makes us reluctant to allow children to be serious. Children are naturally reverent: if the adults around them treat something as sacred, even very young children will follow suit without much prompting. This should come as no surprise – the whole world is full of mystery and wonder to a 3-year-old. It is we who so often fail to see this, not the children.

So why do we feel uncomfortable allowing children to experience seriousness? Sacredness? Reverence? How and why have we convinced ourselves that children will become bored or fractious unless even profoundly serious central pillars of our culture, such as the Christmas story, are rendered funny and frivolous?

The only explanation I can come up with is that it reflects an embarrassment among adults, even those who are still observant Christians, about standing quietly in the presence of the sacred. What we teach our children, consciously or unconsciously, is the most unforgiving measure of what we ourselves hold important. But it seems we shift uncomfortably at the thought of a preschool child experiencing the full force of the Christmas story in all its solemnity. Instead we find ourselves couching it in awkward irony, wholly unnecessary for the children but a salve to our own withered sense of the divine.

If it has become generally uncomfortable for us to see reverence in a young child, during Advent, then the Christian faith really is in trouble.

On halal, kosher, religious tolerance and having it both ways

Yesterday I live tweeted the SDP conference in Leeds. It was a great day with many interesting speakers, but easily the most controversial discussion – and the one that has generated the most reaction in my Twitter mentions since – was the motion to amend SDP policy on non-stun slaughter. Previously, policy was to ban these methods of slaughter, but at the conference a motion was decisively carried to replace this with provisions for strict standards, proper labelling, and ensuring that supply does not outstrip demand (eg non-stun slaughter for export).

I gather that debate around the subject prior to conference was heated. I know at least one person who left the party over the subject. I spoke in favour of the motion despite being personally uncomfortable with such methods of slaughter on the grounds that an explicitly communitarian party needs to be willing to demonstrate a recognition that religious practice is immensely important to some groups, and to create space for such practices even if we find them personally unappealing.

But once you start making explicit provision for communitarian considerations, the tension between faith and other ethical frameworks is immediately apparent.

The subsequent discussion – and its links into ‘preserve our culture’ groups such as For Britain and Britain First – put me in mind of two brouhahas a little while ago where politicians tried to articulate a position weighing private faith against public mainstream morality. In April 2017, then Lib Dem leader Tim Farron refused to deny that his personal faith held homosexuality to be a sin. In September the same year, Jacob Rees-Mogg made statements on abortion and homosexuality, consistent with Catholic social teaching, that saw him excoriated as ‘a bigot’ and ‘wildly out of step with public opinion’.

Commentators at the time lined up to defend Farron and Rees-Mogg. There was the usual hum from offstage (ie Twitter) about the right to express views in keeping with traditional Christianity without facing punishment from an illiberal liberal elite.

So I find it interesting to see that when it comes to a religious practice from Islam and Judaism – slaughtering animals by slitting their throats, without stunning them first – some of the voices raised most loudly in agreement about the iniquity of ‘You can’t say that’ culture as it bears on Christians today should be perfectly content to support policies that actively militate against the practice of those other faiths. If we are to defend Rees-Mogg and Farron on grounds of religious tolerance, should we not also consider defending halal and kosher slaughter on the same grounds? After all, the core argument of tolerance is not that one tolerates only things that one likes or feels indifferent to but that it is extended to things one actively dislikes.

It feels to me as though there are two things going on here.

Firstly, the Britain First types who wish to support religious exemptions for Christians but not for Jews or Muslims are not, themselves, Christians for the most part. Rather, they are secular inheritors of the Christian tradition who wish to preserve the structure of that tradition for the benefits it has for some time provided – a fairly stable, prosperous, harmonious society with congenial values – without taking on the obligations of the faith itself. To put it less fashionably, they wish to be redeemed without themselves taking up the cross. For that, in a nutshell, is the argument made by those who argue against ‘illiberal liberalism’ but do so from a perspective that rejects the necessity of faith – any faith, perhaps, or Christianity in particular – in creating the society to which they wish to belong.

We might term it ‘religious utilitarianism’ – a worldview that recognises the utility of faith in delivering certain social goods but takes no position on the veracity or otherwise of the tenets of any faith in particular. Liberal relativism is a kind of equal-opportunities religious utilitarianism, one that wishes to make space for any and all faiths to provide those goods in a pluralistic way, while the Britain First / Batten-era UKIP version of the same wishes to privilege Christian religious utilitarianism over the more relativistic liberal sort. That is, Britain First types want to keep only the outward forms of Christianity but do not wish anyone else to replace those forms with a more deeply-felt faith of their own.

But if we are to argue for religious tolerance, and for Christianity to play an active rather than a purely decorative role in our society then – the logic dictates – we must either be explicit about repressing other faiths in support of that goal, or else extend the same courtesy to other faiths. The alternative – hiding our hostility to other faiths behind a selectively-applied appeal for religious tolerance only as it pertains to ‘our’ deviations from the liberal consensus – is simply not good enough.

On marriage, tattoos, time and despair

Young people don’t get married. Young people are covered in tattoos. Now that I’m middle-aged, this is the kind of thing it would be tempting to see as evidence that the world is going to the dogs, that we’re facing some sort of terrible moral decline and that the solution is for everyone to buck up and improve their attitude.

I think we are indeed facing a growing cultural crisis, but I’m increasingly of the view that telling young people to buck up wholly misses the point, and that what we are seeing isn’t a deterioration of attitude but an emanation of something more like despair. Two things I’ve read recently prompted this line of thought.

This rather wonderful article from the Institute for Family Studies is worth a read in its own right for a wealth of beautifully phrased observations on marriage. But one paragraph, on the decline of marriage among the young and/or less wealthy, pulled me right up short:

I think the problem that the less wealthy are having [in regards to marriage] is this kind of achievement attitude that we have about marriage—that I can’t get married because I don’t have a stable job; I can’t get married because one of the partners is not employed, and I don’t want to be on the hook for them or a drag on them. I think that the American government, for all that it loves marriage, does not support families very well. The minimum wage here is a joke; people would have to work 25/8 on that to support a family. There’s so little family leave. It’s brutal, especially at the lower end of the wage spectrum. If you don’t work in a knowledge industry, if you’re sort of an hourly employee, it’s incredibly hard to have a family and have children. Johns Hopkins sociologist Andrew Cherlin writes a lot about how the working classes have abandoned marriage partly because it’s an achievement and partly because getting married suggests a plan for the future; it’s an optimistic thing to do. And I think that often people find that they just don’t have enough hope in the future to be able to make that statement…

That is to say, maybe it is not the deliquescing effect of corrupting liberal values that is causing this breakdown in willingness to commit long-term among the young and/or poor. Maybe these demographics are not getting married because they don’t have enough hope for the future to make long-term decisions seem like a good use of energy and resources. Let that sink in. How utterly screwed are we as a society if we’re so incapable of solidarity across generations that anyone young or less wealthy is sinking into a kind of future-free despair?

On a similar note, consider tattoos. A recent study reports that

according to numerous measures, those with tattoos, especially visible ones, are more short-sighted and impulsive than the non-tattooed. Almost nothing mitigates these results, neither the motive for the tattoo, the time contemplated before getting tattooed nor the time elapsed since the last tattoo. Even the expressed intention to get a(nother) tattoo predicts increased short-sightedness and helps establish the direction of causality between tattoos and short-sightedness.

Conservatives such as Dalrymple write about tattoos as cultural degradation, with the clear implication that what they evidence is a collective moral decline. But if this study is correct, that is only half right: rather, it points to a rise in short-termism. That could be read as moral decline of a sort. After all, an inability to plan for the future is a serious inhibitor of anyone’s ability to think and act socially, or to exercise the capacity to defer gratification that we associate with civilised achievements of all kinds. But could it not also be read as a failure of optimism?

It’s a thought that lands like a ton of bricks in the middle of any temptation I might feel to wag a moralising finger at someone just starting out now on adult life. Maybe each of these tattooed, unmarried, commitment-shy young people is less a weak-chinned scion of all that is good, pissing his or her cultural inheritance up the wall on frivolities, than a despairing soul fallen out of the other end of a cultural moment and stuck in their own personal Weimar Republic with no meaningful event horizon and no desire to do anything but dance, drink, fuck and draw on themselves with Biro. If this is the case, then older generations truly have a duty to try and help in some way. What ‘help’ looks like in that context I am less sure, but it is surely on anyone over 35 or so to consider where hope resides, and what duty we have to ensure it is not, like home ownership or a stable job, simply something that people used to have before we all gave up and danced ourselves to a childless, tattooed death.

Can societies survive without blasphemy laws?

So today I was mulling gloomily over the way hate crime laws seem to have seamlessly taken over the function of blasphemy laws in the UK. I decided to look up when blasphemy was abolished as an offence in the country, thinking it might be sometime in the 1970s. Wrong – blasphemy was abolished as an offence in 2008. The acts governing hate crime (the Crime and Disorder Act and the Criminal Justice Act) were added to the statute book in 1998 and 2003 respectively.

The CPS’ own website states that

The police and the CPS have agreed the following definition for identifying and flagging hate crimes: “Any criminal offence which is perceived by the victim or any other person, to be motivated by hostility or prejudice, based on a person’s disability or perceived disability; race or perceived race; or religion or perceived religion; or sexual orientation or perceived sexual orientation or transgender identity or perceived transgender identity.”

These laws have been used in recent times for such diverse purposes as fining a man who taught his girlfriend’s dog to make a Nazi salute and arresting a woman for calling a transgender woman a man.

The common feature of both the blasphemy laws of yore and the hate crime laws of today is that both prohibit speech considered harmful to society’s morals. That society’s morals are no longer situated in a common belief system (such as Christianity) but in an atomised, individualistic inner space (as expressed by the definition of hate crime as anything which is perceived by an individual as being such) is neither here nor there. Certain tenets cannot be challenged lest doing so harm the fabric of society.

It’s also neither here nor there that some of those moral tenets are unprovable or unfalsifiable in any objective sense: the Resurrection of Christ, say, or the existence of some magical inner ‘gender identity’. Indeed the more outlandish a protected belief the better, because the function of blasphemy laws is to compel moral obedience, and what better sign of moral obedience than to see people dutifully repeating something that is in no sense objectively true (such as that men can become women) on pain of being punished if they don’t comply?

My argument here isn’t that we should abolish hate crime laws as we did their predecessors, the laws of blasphemy. I don’t want to rant, Spiked-style, about the threat from blasphemy and hate crime laws to free speech so much as I want to ask: have we ever really had free speech? It seems no sooner did we get rid of one set of rules about what you can’t say than we replaced them with another. There were, perhaps, a couple of decades when blasphemy was effectively defunct despite the statute remaining in existence and before hate crime came to be. But the collapse of controls on speech for religious reasons is nigh-simultaneous with the rise of controls on speech for social justice/equality reasons. The Human Rights Act 1998 forced blasphemy law to be restrained by the right to free speech; the same year, the Crime and Disorder Act made hateful behaviour toward a victim based on membership (or presumed membership) in a racial or religious group an aggravating factor in sentencing. (Insert chin-stroking emoji here.)

This leads me to suspect that human societies cannot, in fact, survive very long without laws of some kind governing speech. I’d love to see a counter-example. But I’ll be astonished if anyone can point me to a state that has abolished religious blasphemy without replacing it with controls on speech for other reasons, whether (under supposedly atheistic Communism) to forbid speaking against the Dictator, or (under supposedly individualistic, pluralistic liberalism) to forbid speaking against individuals’ notional right to self-define without reference to the collective.

Much as every human represses some aspects of their personality in order to function, every society does so too; it is a foolish or short-lived society that makes no effort to clamp down on behaviours or opinions that pose a threat to what that society considers the good or virtuous life. If that’s the case, is there even any value in trying to fight what feels like a rising tide of authoritarian busybodying keen to tell me what I can and can’t say? Or should I just pile in and make my bid to be on the team who’s in charge of deciding what should or shouldn’t be banned?

Right now, the two groups jostling most energetically for that position in the UK are the proponents of ‘intersectionality’ and the radical Islamists. If Nassim Taleb is correct, and social mores are disproportionately set by tiny ideological minorities purely based on the strength of their conviction, then whether we end up punishing those who assert that men cannot become women or those who draw cartoons of Mohammed will be a straight fight between which of those groups is more determined to blow shit up if they don’t get their way.

I don’t really like the way this argument is going. If I’m right, then social mores in a few decades will bear little resemblance to those of today. And whether they’re structured with reference to authoritarian liberalism or radical Islam, I don’t think I will particularly like their shape. But there’s nothing I can do about it – the moral majority in the country is firmly post-Christian and, as I’ve argued elsewhere, a society that can’t be arsed to defend its moral traditions is guaranteed to see them supplanted by ideologies with more committed adherents. And indeed, the kind of Christianity that did once upon a time get out of bed to defend its moral tenets by any means necessary would probably, in practice, be as repugnant to me as either of the likely moral futures toward which our society is heading.

 

On Reconstruction: surviving the trauma of postmodernism

I’ve been mentally composing a version of this essay for a long time. I thought perhaps its relevance might have passed, but the explosion of postmodern identity politics into the mainstream over the last few years convinces me that, far from being something that happened briefly to one not very happy undergraduate in the early 00s, the mental distress I experienced as a result of exposure to ‘critical theory’ has expanded to encompass much of contemporary discourse. I don’t claim to have a solution to that, but I want to share how I survived.

I went to a moderately eccentric school by ordinary standards, but for the purposes of this essay we can treat it as a classical education, inasmuch as we learned about great civilisations that came before ours and this knowledge was treated as important and still relevant to us and to the world and culture of today. Built into the form of the curriculum was a tacit teleology, which implied (whether or not it was ever stated) an evolutionary relation of each civilisation to the one that preceded it. It was a narrative that led to where we are now, and the civilisation we currently inhabit.

Imagine my surprise, then, when as an English Literature undergraduate at Oxford in around 2000, I discovered postmodernist thought, and its many schools of critical theory.

By ‘critical theory’ I mean the body of thought emanating initially mostly from France, with Saussure and Derrida, then expanding out to include such figures as Paul de Man, Slavoj Žižek and Judith Butler. Many more names have joined that list since, and taken together I believe the field is referred to as ‘cultural studies’ today, or, in the words of the Sokal Squared hoaxsters, ‘grievance studies’. Back in 2002 at Oxford, critical theory was a looming presence at the edge of the arts but seemed most pertinent to the study of literature; it has subsequently, I gather, swallowed most of the humanities and is laying siege to the sciences as I write.

But I digress. The central insight of this discipline was a destabilising one, and I think that has not changed. To summarise: Saussure proposed that instead of treating language as transparent, its meaning rising off the page without any need for elucidation, we should split the linguistic sign into ‘signifier’ and ‘signified’. That is, what a word means (the signified) is separable from the word that means it (the signifier). We can thus, he argued, institute a new discipline of ‘semiotics’: the study of signs – a study that reaches far beyond language and was immediately influential in the social sciences.

This insight was developed by Jacques Derrida, whose simple but devastating observation was that if this is the case, we cannot define any given ‘signified’ except with reference to further signs, which then in turn themselves require definition with reference to further signs. It’s turtles all the way down. We have no means, through language, of arriving at any kind of truth that we are able to experience directly. Furthermore, the concerted effort by centuries of culture since the Enlightenment to obscure the fact that it’s turtles all the way down is in fact a cunning attempt to shore up vested interests, and to conceal the operations of power. Recourses to authority are null and void. There is no solid foundation, no God, no truth, no authority. Only power, and a form of consensus reality arrived at through the glacial accretion of a billion tiny operations of power that have, in sum, decreed that the world should be thus and not thus.

I freely admit that I was a bit loopy anyway when I reached that point on my reading list, for unrelated personal reasons. But this insight hit me like a freight train. I spent most of one Trinity term feeling as though I was in the midst of a psychotic experience. Instead of seeing the ‘dreaming spires’ around me as the accumulation of centuries of carefully tended tradition, a representation in architecture of the ideal of the university as a custodian of the best that has been thought and said to date, I saw each building as a kind of nightmarish extrusion into physical space of power structures that were indifferent, if not hostile, to me as a sexual minority and a woman. I felt suffocated: stifled by a kind of blaring architectural triumphalism that declared at every corner, with every church tower, every statue of every great man, ‘YOU CANNOT CHANGE ANY OF THIS, WE WILL ALWAYS WIN’. And stifled again by the nihilistic twist postmodernism places on this reading of the world and of culture, in that it assures us that there is nowhere to stand outside the push and pull of power. There is nothing outside the text. So in trying to challenge these operations of power, we will probably just end up re-inscribing them.

By now you’re probably thinking ‘wow, she sounds nuts’. Well yes, I was a bit at that point, as I said before. But I describe this experience in detail because 1) it was so distressing and 2) the state I spent the next few years in following my fall from the Eden of pre-post-modernism[1] sounded, in my inner monologue, so similar to the toxic ‘social justice’ debate that rolls around our social media some 15 or so years on that I feel the two must be related. What if today’s SJWs are in fact acting out a traumatic state of mind engendered by exposure to ‘cultural studies’ at university? If that is the case, then there may be someone out there who will find some comfort in my story of how I recovered from that experience to the point where I was able to make any decisions at all.

Because make no mistake, the Fall engendered by internalising the idea that ‘there is nothing outside the text’ is a horrible place to be. Consider that iconic scene in the Matrix where Neo wakes from the dream he believed to be normal life, in a slime-covered capsule, to discover that he and the rest of the human species are in fact mindless peons farmed by forces beyond their power to change. Then bin the rest of the Matrix franchise, shoot Morpheus and the rest of the resistance, end the film with Neo back in his pod as a human generator, just without his connection to the Matrix. Eyes staring helplessly into the machine-farm abyss. That’s a bit how it feels.

Forget political radicalism. There’s nothing left, this worldview says, but a continuous action of ‘disruption’ from within the system. There is no way to change the world for the better, because what even is ‘the better’ anyway? All you have left available to you is a kind of carping from the sidelines: calling out particularly brazen efforts by the collective voice of consensus reality to perpetuate itself in its current form and to silence potentially disruptive voices; maybe trying to widen the range of voices permitted to contribute to the operations of power. Maybe you can see now how this could be a mindset conducive to (for example) the contemporary popularity of ‘call-out culture’ and the quixotic obsession of public discourse with ensuring that the identity categories of figures in public life and Hollywood films precisely replicate their demographic proportions in the population at large.

No truth, no authority, no meaning, no means of striving for the good without producing more of the same. Just power. For the longest time I couldn’t find a way out of the dragging nihilism engendered by this worldview. Eventually, though, it occurred to me that I just didn’t have to be absolutist about it. I just had to be a bit more pragmatic. So what if we can never be wholly certain that what we mean to say to someone else is exactly what they hear, because every definition we use in theory needs to be defined in its turn, and so on ad infinitum? If I ask my friend to pass the salt, and he passes the salt, I really don’t need to waste energy mulling over the social forces underlying the particular rituals around eating and table manners that obtain in my current cultural context. I thank my friend and add some salt to my dinner.

This is a tiny example but I decided to try and apply this principle to life in general. If I needed to get on with something, instead of getting bogged down, within every social context and every situation, with the subterranean operations of power, patriarchy, compulsory heterosexuality etc etc etc, I’d try and bracket all that stuff and act as if things were still as stable as they were before the Fall. I coined the term ‘temporary certainties’ for this state of mind. It took a bit of mental effort (and you probably still think I sound mad) but far less mental effort than inwardly deconstructing every utterance, object and situation I found myself in for signs of Western-colonialist cisheteropatriarchal blahdeblah.

Gradually, the psychosis waned. Now, 10 or so years on from arriving at this solution, it’s still working for me. The world can never be as solid-seeming as it was before my Fall. Truth still seems a bit relative depending on where one is standing. But the important insight is that many categories, tropes, objects and structures are stable enough to be treated ‘as if’ they were solid in the pre-post-modernist sense. You don’t need to waste time deconstructing everything; indeed, trying to do so is a fast track to a sense of perpetual victimisation and bitter, impotent rage. And trying to build any kind of transformative politics on a foundation of perpetual victimisation and bitter, impotent rage is not going to turn out as a net force for good, however radically you relativise the notion of ‘good’.

This doesn’t have to mean buying in wholesale to things as they are and becoming a cheerleader for keeping things unchanged. But to anyone currently struggling to focus in a world that seems hostile and composed entirely of operations of power, I say: pick your battles. Much of the world is still good (for a temporarily certain value of good), and many people are kind and well-meaning. Creating new interpersonal dynamics around the anxious effort to avoid the accidental replication in ordinary speech of sociocultural dynamics you find oppressive (aka ‘microaggressions’) may not, in the end, make for a more functional society. It’s possible to treat as a temporary certainty the hypothesis that in asking ‘Where are you from?’ someone is not in fact unconsciously othering you by virtue of your apparent ethnic difference, but simply – perhaps from a naïve position in a social background that does not include many ethnic minorities – seeking to know more about you, in order to befriend you.

The beauty of a temporary certainty is that, in choosing such a vantage point, we can say of any given cultural phenomenon (the institution of marriage, say) ‘we are where we are’. We are no longer stuck with the false choice of either pretending to buy into something as an absolute that we see as contingent and culturally constructed, or else setting ourselves pointlessly in opposition to it, protesting that since it is culturally constructed we should make all efforts to disrupt or transform it into some form that might appear more ‘just’. Instead, we can accept that despite this phenomenon being, strictly speaking, contingent, it remains stable enough that we can and should find a pragmatic relation to it. (In my case, that was to get married. One of the best decisions I ever made.)

You may object that my argument here amounts to a strategy for recouping something for cultural conservatism from the rubble of the post-modernist project. I beg to differ. Rather, what I’m advocating here is more along the lines of a plea to those who see themselves as political radicals to think deeply about what really matters and to focus on that. As it stands, ‘social justice’ social media suggests that, thanks to the post-Fall malaise I postulate as infecting most of our young people, radical politics is collapsing into a kind of nihilistic shit-slinging, incapable of going beyond critiquing the contingency of what it seeks to change to advocate for anything better.

[1] I don’t mean modernism, hence the clumsy construction. I mean something more like ‘the popular twentieth-century Enlightenment-ish consensus about truth, reason and meaning’.

Reading today: Camille Paglia on sex crime

The horrors and atrocities of history have been edited out of primary and secondary education except where they can be blamed on racism, sexism, and imperialism — toxins embedded in oppressive outside structures that must be smashed and remade. But the real problem resides in human nature, which religion as well as great art sees as eternally torn by a war between the forces of darkness and light.

Liberalism lacks a profound sense of evil.

http://time.com/3444749/camille-paglia-the-modern-campus-cannot-comprehend-evil/

Forget policy: to survive, conservatism must fight for Western civilisation

It is clear that the left is enjoying something of a moment, not just in the UK but across most of the West. It has reduced universities to censorious leftist monocultures, is busy imposing its ever more deranged zombie religion of political correctness in public debate and is so effusively full of confidence in its command of the cultural moment that ‘Acid Corbynism’ has caused quite a stir at this year’s Labour Party conference (fringe). Meanwhile the right-leaning press is full of gloomy arguments discussing the Tories’ oncoming demographic Armageddon and crisis of political confidence.

Mulling this over, it strikes me as strange that conservatives should feel thus on the back foot, when there is so much to preserve, so much to care for and pass on to the next generation. The whole of Western civilisation, in fact. Why, then, are conservatives so embarrassed about wishing to conserve?

The doctrine of postmodernism, which advances a wedge of dilettante erudition ahead of its jackhammer of angry philistinism, has used its assault on the concept of canon to leave the best part of three decades’ worth of Western university graduates with barely a piecemeal grasp of their cultural heritage. Even this is filtered for them by their tutors through a lens of guilty identity politics that reduces everything it touches, no matter how sublime or beautiful, to an ugly scrum for power under ‘cisheteropatriarchy’.

The result is three decades of graduates who simply do not see anything worth conserving. Where conservatism sees our culture as a collective endeavour worth contributing to and continuing, a flame that we all help to carry, the graduates of postmodernism see it as a monolithic engine of marginalisation: a pervasive, miasmic, indestructible force for perpetuating in-groups and injustice, to which the only legitimate reaction is resistance and subversion, and the amplification of voices deemed marginalised. It is in this fundamental perception that much of the ‘snowflake’ stereotype resides, for today’s university students naturally wish to align themselves with the marginalised rather than with their imaginary plutocratic oppressors. This leads in turn to the strange phenomenon of Ivy League students, arguably some of the most privileged young people on the planet, throwing public tantrums when their pain and oppression is not validated.

But I digress. My argument is that conservatism’s crisis of confidence lies in the fact that even conservatives have been infected with postmodernism’s anxiety about whether Western civilisation really is worth saving. How could it be otherwise, when we study at the same universities, participate (to an extent) in the same public discourse, and live and work with those who would take a hammer to our past? And if it isn’t worth saving, what are conservatives but a bunch of intransigent junk-hoarders? Or perhaps conservatives just really dig the cisheteropatriarchy? Perhaps they just get off on shitting on marginalised groups and exploiting the poor?

You can see where the current leftist narrative about conservatism originates, and perhaps you begin to see why conservatives struggle to articulate a counter-narrative. Because a counter-narrative to this nihilistic, pomo 21st-century mutation of leftism would require saying: I reject your basic premise. Western civilisation is a remarkable collective achievement of some five thousand years and deserves our humble appreciation and positive contribution, not this childish window-smashing. Everything I believe in stems from this premise, while you seem to believe progress can only come about when we tear it all down: the statues, the literature, the music, the architecture, the very notion of high culture itself. And as long as conservatives have even the shadow of a fear that the pomo nihilists might have a point, there is nothing to defend. Nothing to conserve. And if that is true, conservatism really does degrade merely to cheerleading for free-market capitalism or else to embittered white nationalism, frothing on Twitter about Islam.

There is something worth conserving. We must say it. Own it. What is Acid Corbynism to the Parthenon, to Rilke, to the sweep of English literature from Beowulf to The Waste Land? To Beethoven’s Ninth? Chartres cathedral? We must fight for our heritage, speak proudly of it, put effort into knowing and sharing it. Don’t let it be destroyed by petty, envious philistinism disguised as radical egalitarianism. In embracing and loving our cultural heritage, and arguing without shame for its continuation, we anchor conservatism in something greater than market capitalism or nativism: in the astonishing sweep of many thousands of years of cultural achievement. A flame worth helping to carry onward.

notes: it’s not enough to mither on about tolerance in the face of terrorism

What are British values?

Freedom, tolerance, gender equality?

Compared to the power of a theocratic Game of Thrones drama, it’s laughably weak

But the Euro elites’ response to each Islamist atrocity is the same – no passion or pride for country because that’s just what the enemy wants.

Obsessive clinging to a bloodless ideal of what Europe is, underpinned by a generalised fear of nationalism and pervasive guilt about our past deeds and present wealth; everyone wants to come here, but we ourselves are forbidden to be proud of it.

Americans recite the oath of allegiance, salute the flag, hang flags everywhere. In Britain the same level of patriotism would be seen as incitement to racism, a foible of the working classes to be tolerated with a shudder.

No wonder radicalism is able to flourish here: the intelligentsia of the country of Shakespeare, Austen and Wordsworth, Watson and Crick, Darwin, Sir Christopher Wren and indeed Sir Norman Foster is ashamed of its past and culture.