On Parkfield School and Tory individualism

Conservative Muslim parents and LGBT activists continue in open conflict over the teaching of gay and trans rights in Birmingham schools. Conservative leadership candidate Esther McVey fanned the flames today by coming down on the side of the protesting parents:

Elsewhere, teenage Tory activist Soutiam Goodarzi, herself of Muslim origin, expressed outrage at McVey’s alignment with the forces of religious conservatism on this most uncomfortable clash of minority rights:

Though it’s tempting to laugh and point at the contortions and cognitive dissonance the left must endure in order to be on the same side as both groups in this clash of rights, it is the conservative predicament which is more acute, in part because it is not out in the open like conservative Muslim homophobia.

McVey here expresses the common conservative viewpoint that holds moral instruction to be the preserve of private families, not of the state. In this worldview, it is simply not the place of government to meddle in the mores parents convey to their children, and in fact schools should concentrate on teaching subjects such as history, science and maths rather than making pronouncements on what is socially acceptable.

Goodarzi expresses the equally common conservative view that religious minorities – especially Muslim ones – should not be permitted to effect a reverse takeover of the public square simply through a mixture of intransigence and leveraged victim politics. To put it another way, Muslims should not be permitted, by virtue of the specially favoured place they hold in the system of diversity (Cobley) to force sweeping changes to what is commonly taught, said or deemed acceptable.

McVey’s stance would leave families – including religious conservative ones – in sole charge of the moral instruction of the young. But Goodarzi’s stance cannot afford to, lest the moral instruction of the young be subject to infiltration and takeover by values alien to a functioning free society.

Goodarzi’s position is more akin to classical liberalism than conservatism proper. In this context, conservative religion – whether Christian or Muslim or something else – is self-evidently an obstacle on the way to individual freedom and self-realisation. Allied to a free-market position that seeks to reduce, remove (or at least disguise) the role of the state in the operation of markets, this is a type of ‘conservatism’ (perhaps more properly called progressive free-market liberalism) typified by George Osborne. Morality, inasmuch as it is discussed at all, is in a sense negative, consisting mainly of strictures designed to maximise individual freedom and self-fulfilment – such as injunctions to eschew homophobic bullying. These, though, may be enforced by the state as it is assumed to be in the best interests of the good society that individual freedom be allowed to flourish as fully as possible.

McVey’s position is a version of this stance, modified by the proviso that some forms of shared morality are desirable. These, however, should be transmitted not by the state – whose role should be limited to activities such as keeping the peace and maintaining roads – but by individual families.

The trouble with both these as models for society, though, is that they both depend for their existence on something they also work to undermine: that is, public mores. Moral instruction is, in a sense, both public and private: it concerns our private behaviour, but it also bears on society as a whole. If the moral instruction of children is nonexistent or badly done, those children are less likely to make a positive contribution to society as adults. It is everyone’s business how families educate their children. Our radically individualist society may not like this, but it’s true.

To illustrate.

Some choices parents make impact literally no-one but the parents and child in question. Cosleeping with babies and young children is a good example. It makes zero difference to anyone outside the family whether my toddler sleeps in my bed or her own. Who cares? Potty training, on the other hand, is a different matter. I will annoy no-one outside the family if I wave my hands in a liberal fashion and say airily that my child will sleep in her own bed ‘when she’s ready’. But if I declare that my child ‘refuses to wear a nappy’ and will learn to pee and poo in a potty ‘when she’s ready’ I will quickly incur widespread dislike, hefty dry cleaning bills and a sudden lack of playdate invitations.

Moral instruction is more like potty training than co-sleeping, and this is where McVey’s position falls down. You can say ‘families know best’ when it comes to moral instruction, but would you say that of a parent who was teaching a toddler that it was fine to take a shit on the pavement? Morals are about how we live together as a society; we can’t pretend that they can be atomised to the family level and still work as morals. You have to be confident that all or most families are on the same page about where it’s acceptable to take a crap before you say breezily ‘families know best’. Otherwise you’re just ducking the issue.

But Goodarzi’s conservative-flavoured liberalism doesn’t have much to offer either on the subject of which moral precepts should be adhered to by everyone – except inasmuch as they are enforced by the state. It’s simply assumed that individuals will somehow naturally come to the conclusion that we use the potty. How they get there, it is implied, is not a matter for politics. And if they don’t, we pass a law saying they have to. Anything intermediate is an incursion onto individual liberty.

But the truth is that both these viewpoints take a set of shared moral references so profoundly for granted they are able to pretend they don’t exist. Everyone just knows we don’t shit on the floor; that’s why (McVey) we can trust families to convey that and don’t need to teach it at school or else (Goodarzi) all we need to do is stamp out regressive viewpoints that might limit our freedom to come naturally to the right conclusion about where we take a crap. But that set of shared values is precisely the target of Goodarzi’s individualism. It is the regressive swamp of benighted reactionary muck from which individual freedom is painstakingly extricated. And once this broad framing of our moral past and present is in place, we can’t really trust families to convey the right stuff either.

Goodarzi’s position is more honest than McVey’s, in that it acknowledges more or less explicitly that if we’re accepting radical individualism as a basic social good, then the state needs to step in as coercive arbiter of some moral matters, in order to prevent wholesale anarchy (and shitty pavements). In the terms of my metaphor, Goodarzi’s position suggests that everyone can do as they like but allows for some kind of authority which is empowered to ensure people teach their toddlers to crap in the potty. It at least has a stance on some moral matters, and accepts the need to enforce them.

McVey’s ‘families know best’, on the other hand, avoids making any moral pronouncements about the social good and simply implies that ‘families’ will come up with the right answers about moral instruction on their own. It assumes a shared value set that might once, in a monocultural society, have existed, but which in our post-religious, post-imperial, multicultural, radically-individualist Britain simply cannot be taken for granted. If ‘best’ is taken to mean ‘fitting most harmoniously and beneficially into society as a whole’, it is not at all obvious any more that families do know best. But McVey cannot define ‘best’, any more than Goodarzi can, because both have accepted the basic liberal-individualist premise that even in matters that explicitly concern society as a whole rather than us as individuals or even as families, no-one has any right to tell anyone else what to do.

Left unmodified, these two stances point at two possible futures. Goodarzi’s future is one in which we are all free individuals, and the only agent with a right to tell us what to do is the state, which exists as a kind of medium in which radically unencumbered individuals interact and which intervenes only to maximise individual freedom. McVey’s future is one in which shared values still exist, but not at the level of the nation state – only at the level of individual families or ‘communities’. These ‘communities’ are, in a fashion similar to Goodarzi’s future, the subjects of a total state which exists as the sole arbiter of clashing freedoms and community ‘rights’. In this future, moral values are outsourced to religious, ethnic and sexual minorities and (to a lesser extent) individual families, administrated by an explicitly amoral state whose remit is to hold and defuse tensions between moral standpoints or in extremis to rule in favour of one or another position in an irreconcilable clash.

In neither of these futures is there much to conserve, which leaves conservatism in something of a bind. Its modern proponents have, in different ways, accepted the broad premise that the pursuit of individualism and markets is the highest public good. This in turn means individual freedom should at every turn be prioritised over a shared cultural and moral framework, which is depicted as the dark force of the past and the enemy of progress. After some 50 years of this process, we are left with not a great deal except individuals (or, as McVey would have it, individuals and families). Even those pockets of reactionaries who protest are like US Marines stuck in the jungle still fighting the Korean War: it’s over, the pieces are being swept up, we are where we are. Conservatives now face a difficult choice between agreeing that, absent shared mores, the state needs to take a role as moral arbiter, and watching as a national community disintegrates into ever more balkanised ‘communities’, whose moral frameworks compete and, as at Parkfield School, clash irreconcilably. Or (and this is so difficult to imagine in practice as to be very unlikely) conservatives could consider whether there are shared values worth fighting for as a society, rather than legislating as a government or clutching to our bosoms as individuals and atomised families.

Transform the Lords to save us from Faragism

(This article was originally published on Reaction.life.)

Michael Gove famously said during the EU referendum campaign: “People have had enough of experts”. His words, though much-derided, reflect a popular sense that our politics has moved away from democratically-accountable government towards one driven largely by supranational institutions and treaties, and populated by appointed ‘experts’ to whom we must defer without any means of influencing their decisions.

To this transnational class of epistocrats has been added, at the domestic level, a parallel species of quangocrat touted as ‘independent’ and similarly unresponsive to electoral pressure. Resentment toward this ecosystem of insiders has been growing for years, if not decades. In our country, Farage and his Brexit Party have now made it their mission to burn this whole edifice down.

This may be politically resonant, but is it wise? One persuasive argument for remaining in the EU is that the complexity and interdependence of modern nation states cannot be mastered at speed by elected non-specialists: that the effective management of the modern world needs a grasp of often highly technical matters that takes years to acquire, and that some policy areas need serious expertise as well as a degree of insulation from MPs who believe, Boris-like, that any issue can be adequately grasped with a few hours of cramming and a bon mot or two.

Some areas of government are too abstruse to make it into the general political discourse – the scandal of hygiene standards in manufacturing, say, or rules governing the import of consumer goods – while remaining immensely important overall. The failure of UK MPs to get to grips with the detail of pretty much all such areas since the EU referendum has been painfully obvious.

This is the core of the pro-EU view that it is better to agree this stuff together with the rest of the club, then leave the system in the hands of experienced professional civil servants while we get on with our daily lives. It’s an argument that has some merit, especially when compared to the blundering attempts of our MPs to cram technical subjects in a few hours in order to make decisions that will affect the lives of millions.

In this view, public resentment of experts is self-evidently foolish and destructive and should simply be ignored. But this view is only half right. The public as a whole welcomes expertise, serious statesmanship and long-term thinking in public life and is unhappy not with experts but with their lack of accountability. No-one really disputes that if we do ever leave the EU we will need our institutional memory, and our experts, more than ever. A Faragist destruction of our governing institutions would cause a loss of this institutional memory that we can ill afford, given its already etiolated state after decades of outsourcing policy to Brussels. So, given that we need them, how can we make our experts more accountable, and prevent populism from throwing experience, expertise, long-term thinking and other important babies out with the ‘metropolitan elite’ bathwater? My proposal is that this should be the role of the House of Lords.

Whatever its faults, the hereditary House of Lords did supply some long-term thinking in our public life. But since Blair’s reforms it has become both an extension of party politics and a form of reward for good behaviour in the ecology of ‘experts’ that populates public life. Both these developments are to the detriment of democratic accountability and of long-term thinking.

We should abolish the system of appointed peers that so typifies the ‘insiders’ club’ feeling of modern politics and instead invite experts to run for election to the Lords. This would be on a long electoral cycle (let’s say ten years), with a recall mechanism in extremis and specific responsibility for taking the long view on key policy areas where expertise is needed and party politics a source of harm.

Areas of policy that might benefit from being managed in this way include (in no particular order) healthcare, education, consumer standards and international trade. Education and healthcare in particular suffer from being treated by all sides as a political football. They are subjected to interminable ‘reforms’ by MPs thinking in electoral cycles rather than the long term, and desperate for impact with no regard for the millions whose daily jobs are turned upside down by the latest eye-catching initiative. And international trade and product standards are (as the Brexit negotiations have amply demonstrated) too technical for the brief to be grasped on a short timescale by elected non-experts.

Under this system, rather than having (for example) an education secretary in situ for a year or two, fiddling with policy for the sake of looking busy, we could have subject experts with hands-on experience, such as Katherine Birbalsingh or Amanda Spielman, standing for the Lords on a ten-year education ticket, long enough to see the results of any decisions taken and be held accountable for them. We could see a Lords education candidate for child-centred ‘skills’ education debate a Lords candidate keen on knowledge-and-discipline-first, with the electorate able to make the decision. Alongside this critical function of managing areas of policy for the long term, our elected expert Lords could then continue their role scrutinising legislation, as at present.

This transformation would at a stroke rid us of our increasingly unpopular ‘crony’ Lords, create more space for long-term thinking in key policy areas, and make the experts we need more democratically accountable. It would move some areas of policymaking away from short-term party politics and more toward a blend of long-termism and direct democracy. In doing so it could balance the need for experts in modern government with the equally pressing need to respond to a general public sense of democratic deficit, and thus maybe yet save us all from Faragism.

House price fetishism: the Tory paradox in a nutshell

Ever since Thatcher introduced Right to Buy, and then Blair super-heated the housing market with a combination of cheap loans and mass immigration, home ownership has become ever more of a sticky wicket for the Tories. On the one hand, Tory voting has historically been associated with home ownership: people with something to lose are typically more conservative. On the other hand though, in order to sustain the pleasantly rising house prices that keep the core Tory base contented (and the cheap money flowing, as people remortgage to pay for extensions, kids’ university fees or whatever) it becomes ever harder for younger generations to join the home-owning ranks of the putatively Tory.

Mulling this over, it struck me that there’s a second, more profound way that the late twentieth-century transformation of homes into part loan collateral, part asset class, part status symbol has left conservatism with a dilemma. A couple of years ago I wrote a piece about the way Brexit was functioning as a proxy war within the Tory Party over which the party valued more: free market dogma or social conservatism. I think my analysis still holds, and indeed that the only thing that has changed is that social conservatives are now losing, and leaving the Tory Party in droves. The housing issue, it seems to me, encapsulates the nature of this conflict in a nutshell.

Here’s why: if you see your house purchase primarily as an asset class, you’re not buying with the intent to settle and make a home there. You’ll do the place up, sell it on and move. No need to get to know the neighbours, form networks, get involved in community activities. Probably best if your kids don’t put down too many local roots or it’ll be a wrench for them to leave their friends. Homes-as-asset-class is the quintessential Anywhere (Goodhart) mindset, which treats a place as a set of resources to be consumed, developed, improved, but which are ultimately just that: resources. Not networks, not reciprocal obligations, not really a home. Conversely, if you buy somewhere as a Somewhere, with the intent to put down roots and make a home there – to be there for the rest of your life or at least the foreseeable future – you can’t really treat your home as an asset class, because it’s about the least liquid asset imaginable. OK, if house prices rise you’ll benefit a bit in theory, because maybe you can take out a loan against the imagined gain in value of your house, but again, that’s only really meaningful if you’re planning to sell.

Now, I’m being a bit reductive, but to return to the Conservatives: your Anywheres are all for free-market liberalism, and your Somewheres are all for social conservatism. For many years, the two managed to coexist well enough within the same party, united – perhaps – by a broad consensus (for different reasons) that taxation and public spending should be restrained. But if the issue of European Union membership has been the most visible evidence of that truce collapsing, the breakdown both predates the referendum and is more profound than ‘banging on about Europe’ would suggest.

We’ve reached a point now where the demands of the free market are becoming ever more inimical to the needs of the kind of settled community that nurtures and values social conservatism. Someone who holds the free-market worldview understands a house primarily as an investment, and invests him- or herself in the local community in proportion to that understanding – ie lightly, if at all. This is profoundly at odds with the kind of worldview that places value on continuity, community, a sense of place and tradition. Thus while both these groups may place a value on home ownership, it is for radically different reasons: and these two strands of conservatism are increasingly at odds.

Fundamentally, the Conservative Party has acted for some decades as though free market ideology were compatible with a belief in patriotism, conservative social values and a healthy civic society. It is becoming increasingly apparent that this is no longer the case. The difference between a Somewhere who wishes to buy a house as a home, to live in and care for within the context of a rooted and socially-engaged local existence, and an Anywhere who wishes to buy a house as an investment, with the aim of moving on once it is financially viable, is emblematic of a profound sociocultural conflict – a difference in outlook, and hence in spending behaviour, political assumptions and fundamental approach to life – and it encapsulates this irreducible fracture. It is increasingly apparent that the Conservative Party cannot serve both. It is also increasingly apparent that, if one group has to go, it will not be the Anywheres. So the question is: who will speak for lower middle class Somewheres, when – as is now inevitable – they begin to flex their political muscles somewhere other than the Tory Party?

Can societies survive without blasphemy laws?

So today I was mulling gloomily over the way hate crime laws seem to have seamlessly taken over the function of blasphemy laws in the UK. I decided to look up when blasphemy was abolished as an offence in this country, thinking it might have been sometime in the 1970s. Wrong – blasphemy was abolished as an offence in 2008. The acts governing hate crime (the Crime and Disorder Act and the Criminal Justice Act) were added to the statute book in 1998 and 2003 respectively.

The CPS’ own website states that

The police and the CPS have agreed the following definition for identifying and flagging hate crimes: “Any criminal offence which is perceived by the victim or any other person, to be motivated by hostility or prejudice, based on a person’s disability or perceived disability; race or perceived race; or religion or perceived religion; or sexual orientation or perceived sexual orientation or transgender identity or perceived transgender identity.”

These laws have been used in recent times for such diverse purposes as fining a man who taught his girlfriend’s dog to make a Nazi salute and arresting a woman for calling a transgender woman a man.

The common feature of both the blasphemy laws of yore and the hate crime laws of today is that both prohibit speech considered harmful to society’s morals. That society’s morals are no longer situated in a common belief system (such as Christianity) but in an atomised, individualistic inner space (as expressed by the definition of hate crime as anything which is perceived by an individual as being such) is neither here nor there. Certain tenets cannot be challenged, lest doing so harm the fabric of society.

It’s also neither here nor there that some of those moral tenets are unprovable or unfalsifiable in any objective sense: the Resurrection of Christ, say, or the existence of some magical inner ‘gender identity’. Indeed the more outlandish a protected belief the better, because the function of blasphemy laws is to compel moral obedience, and what better sign of moral obedience than to see people dutifully repeating something that is in no sense objectively true (such as that men can become women) on pain of being punished if they don’t comply?

My argument here isn’t that we should abolish hate crime laws as we did their predecessors, the laws of blasphemy. I don’t want to rant, Spiked-style, about the threat from blasphemy and hate crime laws to free speech so much as I want to ask: have we ever really had free speech? It seems no sooner did we get rid of one set of rules about what you can’t say than we replaced them with another. There was, perhaps, a window of a couple of decades when blasphemy law was effectively defunct, despite the statute remaining in existence, and before hate crime law came to be. But the collapse of controls on speech for religious reasons is nigh-simultaneous with the rise of controls on speech for social justice/equality reasons. The Human Rights Act 1998 forced blasphemy law to be restrained by the right to free speech; the same year, the Crime and Disorder Act made hateful behaviour toward a victim based on membership (or presumed membership) of a racial or religious group an aggravating factor in sentencing. (Insert chin-stroking emoji here.)

This leads me to suspect that human societies cannot, in fact, survive very long without laws of some kind governing speech. I’d love to see a counter-example. But I’ll be astonished if anyone can point me to a state that has abolished religious blasphemy without replacing it with controls on speech for other reasons, whether (under supposedly atheistic Communism) to forbid speaking against the Dictator, or (under supposedly individualistic, pluralistic liberalism) to forbid speaking against individuals’ notional right to self-define without reference to the collective.

Much as every human represses some aspects of their personality in order to function, every society does so too; it is a foolish or short-lived society that makes no effort to clamp down on behaviours or opinions that pose a threat to what that society considers the good or virtuous life. If that’s the case, is there even any value in trying to fight what feels like a rising tide of authoritarian busybodying keen to tell me what I can and can’t say? Or should I just pile in and make my bid to be on the team who’s in charge of deciding what should or shouldn’t be banned?

Right now, the two groups jostling most energetically for that position in the UK are the proponents of ‘intersectionality’ and the radical Islamists. If Nassim Taleb is correct, and social mores are disproportionately set by tiny ideological minorities purely based on the strength of their conviction, then whether we end up punishing those who assert that men cannot become women or those who draw cartoons of Mohammed will be a straight fight between which of those groups is more determined to blow shit up if they don’t get their way.

I don’t really like the way this argument is going. If I’m right, then social mores in a few decades will bear little resemblance to those of today. And whether they’re structured with reference to authoritarian liberalism or radical Islam, I don’t think I will particularly like their shape. But there’s nothing I can do about it – the moral majority in the country is firmly post-Christian and, as I’ve argued elsewhere, a society that can’t be arsed to defend its moral traditions is guaranteed to see them supplanted by ideologies with more committed adherents. And indeed, the kind of Christianity that did once upon a time get out of bed to defend its moral tenets by any means necessary would probably, in practice, be as repugnant to me as either of the likely moral futures toward which our society is heading.

 

Who cares? A response to Giles Fraser

Giles Fraser writes in Unherd today about how the worldview that extols ‘social mobility’ and ‘free movement of people’ also cuts off at the knees the ability of families to care for their youngest and oldest, encouraging everyone instead to see themselves as free, wage-earning individuals, and arse wiping as the responsibility of each individual or, failing that, the state. He asks: whose responsibility should it be to care for those who can’t wipe their own?

First, let me answer the question. Children have a responsibility to look after their parents. Even better, care should be embedded within the context of the wider family and community. It is the daughter of the elderly gentleman that should be wiping his bottom. This sort of thing is not something to subcontract.
Ideally, then, people should live close to their parents and also have some time availability to care for them. But instead, many have cast off their care to the state or to carers who may have themselves left their own families in another country to come and care for those that we won’t.

Now this is all very well, and it cuts to the heart of the question that more than any other makes me want to wrangle with feminism: the question of who cares. Not as in who gives a stuff, but who wipes the arses of those who can’t wipe their own? Somehow, the implicit answer still always seems to be ‘it should be women’. Fraser’s reference to ‘the daughter of the elderly gentleman’ comes from an anecdote, but I think it goes beyond that: Fraser genuinely thinks that daughters should be wiping their parents’ arses.

I agree with Giles’ assessment insofar as it’s plain to me that the liberal vision of society has some shortcomings on this front. When the vision of the good life says that each of us is (or should be) an individual free of social expectations and obligations, when freedom is seen as a liberation from social obligations (such as arse wiping) that may be boring or unpleasant, then we are left with no happy answers to the question ‘who cares?’.

A feminism that holds this vision of autonomy above all else must necessarily skirt around motherhood and the elderly, because to focus on motherhood and the elderly would be to raise the question of who is wiping those arses, now that we’re all emancipated from domestic drudgery. For wealthier emancipated women, the answer today in practice is: less wealthy women. But who wipes the arses of children and elderly parents for those women who are paid to wipe arses by women who don’t want to wipe arses?

Clearly the blind spot in this discussion is the 50% of the population that Giles also omits to mention in his discussion of our lost bonds of reciprocal caring: men. The emancipation of (some) women from caring obligations has not been swiftly followed by a stampede of men keen to pick up the shortfall. Men are not clamouring to stay home and look after elderly, incontinent parents. Rather, the assumption seems to be that liberation, liberal-style, means that no-one need wipe arses now unless they’re being paid for it. It’s paid carers all the way down, getting cheaper and more uncaring the further down the economic scale you go, until finally you’re back at women doing it unpaid.

Lost in all of this pass-the-hot-potato attitude to arse wiping is the notion that, far from being an infra dig imposition on free individuals, reciprocal caring actually matters – indeed is the glue that binds any functioning society together. So as I wrestle with the question of how we square the evidently painful loss in our society of a valuable set of reciprocal, mutual caretaking obligations with my wish, as a woman, to have at least SOME hours of activity outside domestic drudgery, the only conclusion I can come to is that we – all of us, not just women – need to revalorise caring. That goes for those feminists who seem not to want to talk about arse wiping, preferring to focus on workplace sexual mores or female representation in elite career positions, or traducing anyone who asks about caring as a fifth columnist for those (presumably closet Nazi) reactionaries who would see us return to the rigidly defined sex-based social roles of days gone by:

E A G E R (@ElephantEager): “God hates you Giles.”

Sarah Ditum (@sarahditum): “kinder küche kirche”

It also goes for Fraser, and everyone like him who wishes we could be more communitarian but seems to assume without a moment’s reflection that he can sign women back up for the role they held 50 years ago, without any kind of discussion about maybe improving on those working conditions or asking men to step up as well. It also goes for the rest of us, every time we take a decision that increases our autonomy at the expense of our ability to care. Regardless of our sex.

Motherhood blew up my feminism 1: the shadows of equality

I hope to write a few pieces on this theme. I’m not an academic feminist, or even an academic. But I’ve always considered myself strongly in favour of women’s rights, and have found myself rethinking a great deal of what I thought that meant since becoming a mother. 

Before I had a baby, I was a pretty standard second-wave feminist, with some third-wave gender woo thrown in: people couldn’t really change sex, but their identification mattered more; and equal representation in business and public life was the key thing for women.

Having a baby blew a lot of that out of the water for me. I never really understood the permanent sense of being spread too thin that is so weakly implied by the hackneyed phrase ‘balancing work and family life’. I’d always imagined I’d go briskly back to work and that childcare would be an easy solution. Then, when my daughter did arrive, it turned out that there wasn’t really anything I wanted to do so much out in the world that it overrode the visceral desire to stay close to her and prioritise caring for her. I was working freelance when I got pregnant, so was never really on the ‘return to work’ conveyor belt, and just haven’t felt at all inspired to claw my way back onto it. So, nearly 20 months later, I’m still not back at work. For the most part I’m happy with that choice, and I’m fortunate that my husband earns enough for me to have had the choice to care for our daughter beyond the initial 12 months’ maternity leave permitted in UK law, without needing to return to work part- or full-time.

Two aspects of this have blown up much of what I took as axiomatic in feminism prior to this. One, that women simply and unambiguously have more choices now than they did 50 years ago, and two, that the earnings gap between men and women was certainly down to prejudice.

On the matter of choice and motherhood, then. I’ve found myself saying often to people in recent months: ‘I’m lucky to have the choice to stay home with my daughter’. Thinking on this, it struck me that within the phrase ‘balancing work and family life’ is a whole story about the shadow side of second-wave feminism, specifically of women’s push into the workplace on equal terms with men. Very understandably, educated women burned to free themselves from the stifling expectation that they would be wives and mothers, and no more. My mother’s generation were still expected to quit work when they got married. But as that changed – as more women stayed in work, negotiated maternity pay and so on – the arrival of women in the workplace contributed, along with de-industrialisation, the rise of the knowledge economy and a host of other things, to a gradual adjustment of the economy over the second half of the twentieth century, via inflation, salary levels, housing costs and the like. And the consequence of this adjustment is that where it was once possible for even modest earners to keep a family on one salary, now – because women work as well – it takes both incomes to keep body and soul together.

That’s all very well for families where both parents have careers as rewarding as raising one’s children, but for parents (and yes, especially for mothers) who have jobs rather than careers, it forces many back into work even where they might well prefer to stay home with children. After all, in most cases people actually like spending time with their children. It’s not all snot and shitwork. So while it’s certainly true that on paper I have a lot more choice than my mother did, that’s partly because my husband earns more than average. I’m not sure the feminist revolution has really increased choice for working- and lower-middle-class women at all, so much as it has widened the scope of life for women in the cognitive elite.

Though I don’t have the figures to prove it, my hunch as well is that assortative mating – the tendency of people of similar social and economic status to marry and reproduce with one another – has created a multiplier on the gap between richer and poorer families. Put more simply, lawyers, doctors and bankers tend to marry one another and combine their six-figure salaries, which in turn means that aggregate family incomes are far more widely distributed than would be the case in a society where most families had a single earning parent. Thus top-end house prices skyrocket; independent school fees rise; and people who would have had a traditionally middle-class lifestyle 30 years ago, such as journalists and academics, find themselves priced out of their native cultural territory.

Now this isn’t necessarily a bad thing, unless you are overly concerned about income disparities. (Though the people most concerned about possible increases in the gap between rich and poor are usually also strongly supportive of the right of both men and women to work on equal terms, so it is maybe unsurprising that this unintended consequence is not often discussed.) I note it because I like to follow thoughts where logic dictates they must go; not because I think things should go back to the way they were in 1965. But I’ve found myself concluding that the entry of women to the workplace has probably contributed to the gradual widening of the gap between rich and poor; and that it has forced substantial numbers of mothers – who do not particularly love their jobs – to spend more time working and less time with their children than they would otherwise choose.

Of course it wasn’t that long ago that most women had very little choice but to stay home, whether they wanted to be there with children or not. And now it seems to me the pendulum has swung so far in the other direction that it is limiting women’s choices in the opposite way.

So in summary I’m no longer sure that the sex pay gap is unambiguously the result of a downward pressure exerted by reactionary forces of chauvinism on an otherwise driven and capable female workforce. Nor am I convinced that women precisely have more choice than before, say, the 1960s. Those choices have changed, but so have the social and economic pressures, and I’m not convinced that has been wholly for the better. To be clear, this isn’t to say that feminism is redundant. Quite the opposite. But I do think we could do with looking a little more clear-sightedly at where we are, at what that looks like for women who have jobs rather than careers, and at what exactly we think the work of motherhood is worth today.

 

On Reconstruction: surviving the trauma of postmodernism

I’ve been mentally composing a version of this essay for a long time. I thought perhaps its relevance might have passed but the explosion in the last few years of postmodern identity politics into the mainstream convinces me that far from being something that happened briefly to one not very happy undergraduate in the early 00s, the mental distress I experienced as a result of exposure to ‘critical theory’ has expanded to encompass much of contemporary discourse. I don’t claim to have a solution to that, but I want to share how I survived.

I went to a moderately eccentric school by ordinary standards, but for the purposes of this essay we can treat it as a classical education, inasmuch as we learned about great civilisations that came before ours and this knowledge was treated as important and still relevant to us and the world and culture today. Built into the form of the curriculum was a tacit teleology, that implied (whether or not it was ever stated) an evolutionary relation of each civilisation to the one that preceded it. It was a narrative that led to where we are now, and the civilisation we currently inhabit.

Imagine my surprise, then, when as an English Literature undergraduate at Oxford in around 2000, I discovered postmodernist thought, and its many schools of critical theory.

By ‘critical theory’ I mean the body of thought emanating initially mostly from France, with Saussure and Derrida, then expanding out to include such figures as Paul de Man, Slavoj Žižek and Judith Butler. Many more names have joined that list since, and taken together I believe it is referred to as ‘cultural studies’ today, or, in the words of the Sokal Squared hoaxers, ‘grievance studies’. Back in 2002 at Oxford, critical theory was a looming presence at the edge of the arts but seemed most pertinent to the study of literature; it has subsequently, I gather, swallowed most of the humanities and is laying siege to the sciences as I write.

But I digress. The central insight of this discipline was the destabilising one, and that I think has not changed. To summarise: Saussure proposed that instead of treating language as transparent, its meaning rising off the page without any need for elucidation, we should split the linguistic sign into ‘signifier’ and ‘signified’. That is, what a word means is separable from the word that means it. We can thus, he argued, institute a new discipline of ‘semiotics’: the study of signs – a study that reaches far beyond language and was immediately influential in the social sciences.

This insight was developed by Jacques Derrida, whose simple but devastating observation was that if this is the case, we cannot define any given ‘signified’ except with reference to further signs, which then in turn themselves require definition with reference to further signs. It’s turtles all the way down. We have no means, through language, of arriving at any kind of truth that we are able to experience directly. Furthermore, the concerted effort by centuries of culture since the Enlightenment to obscure the fact that it’s turtles all the way down is in fact a cunning effort to shore up vested interests, and to conceal the operations of power. Recourse to authority is null and void. There is no solid foundation, no God, no truth, no authority. Only power, and a form of consensus reality arrived at through the glacial accretion of a billion tiny operations of power that have, in sum, decreed that the world should be thus and not thus.

I freely admit that I was a bit loopy anyway when I reached that point on my reading list, for unrelated personal reasons. But this insight hit me like a freight train. I spent most of one Trinity term feeling as though I was in the midst of a psychotic experience. Instead of seeing the ‘dreaming spires’ around me as the accumulation of centuries of carefully-tended tradition, a representation in architecture of the ideal of the university as a custodian of the best that has been thought and said to date, I saw each building as a kind of nightmarish extrusion into physical space of power structures that were indifferent if not hostile to me as a sexual minority and a woman. I felt suffocated: stifled by a kind of blaring architectural triumphalism that declared at every corner, with every church tower, every statue of every great man, ‘YOU CANNOT CHANGE ANY OF THIS, WE WILL ALWAYS WIN’. And stifled again by the nihilistic twist postmodernism places on this reading of the world, and culture, in that it assures us that there is nowhere to stand outside the push and pull of power. There is nothing outside the text. So in trying to challenge these operations of power, we will probably just end up re-inscribing them.

By now you’re probably thinking ‘wow, she sounds nuts’. Well yes, I was a bit at that point, as I said before. But I describe this experience in detail because 1) it was so distressing and 2) the state I spent the next few years in following my fall from the Eden of pre-post-modernism[1] sounded, in my inner monologue, so similar to what I read of the toxic ‘social justice’ debate that rolls around our social media some 15 or so years on that I feel they must be related. What if today’s SJWs are in fact acting out a traumatic state of mind engendered by exposure to ‘cultural studies’ at university? If that is the case, then there may be someone out there who will find some comfort in my story of how I recovered from that experience to the point where I was able to make any decisions at all.

Because make no mistake, the Fall engendered by internalising the idea that ‘there is nothing outside the text’ is a horrible place to be. Consider that iconic scene in the Matrix where Neo wakes from the dream he believed to be normal life, in a slime-covered capsule, to discover that he and the rest of the human species are in fact mindless peons farmed by forces beyond their power to change. Then bin the rest of the Matrix franchise, shoot Morpheus and the rest of the resistance, end the film with Neo back in his pod as a human generator, just without his connection to the Matrix. Eyes staring helplessly into the machine-farm abyss. That’s a bit how it feels.

Forget political radicalism. There’s nothing left, this worldview says, but a continuous action of ‘disruption’ from within the system. There is no way to change the world for the better because what even is the better anyway? All you have left available to you is a kind of carping from the sidelines. Calling out particularly brazen efforts by the collective voice of consensus reality to perpetuate itself in its current form and to silence potentially disruptive voices. Maybe trying to widen the range of voices permitted to contribute to the operations of power. Maybe you can see now how this could be a mindset conducive to (for example) the contemporary popularity of ‘call-out culture’ and the quixotic obsession of public discourse with ensuring the identity categories of figures in public life and Hollywood films precisely replicate their demographic proportions in the population at large.

No truth, no authority, no meaning, no means of striving for the good without producing more of the same. Just power. For the longest time I couldn’t find a way out of the dragging nihilism engendered by this worldview. Eventually though it occurred to me that I just didn’t have to be absolutist about it. I just had to be a bit more pragmatic. So what if we can never be wholly certain that what we mean to say to someone else is exactly what they hear, because every definition we use in theory needs to be defined in its turn, and so on ad infinitum? If I ask my friend to pass the salt, and he passes the salt, I really don’t need to waste energy mulling over the social forces underlying the particular rituals around eating and table manners that obtain in my current cultural context. I thank my friend and add some salt to my dinner.

This is a tiny example but I decided to try and apply this principle to life in general. If I needed to get on with something, instead of getting bogged down, within every social context and every situation, with the subterranean operations of power, patriarchy, compulsory heterosexuality etc etc etc, I’d try and bracket all that stuff and act as if things were still as stable as they were before the Fall. I coined the term ‘temporary certainties’ for this state of mind. It took a bit of mental effort (and you probably still think I sound mad) but far less mental effort than inwardly deconstructing every utterance, object and situation I found myself in for signs of Western-colonialist cisheteropatriarchal blahdeblah.

Gradually, the psychosis waned. Now, 10 or so years on from arriving at this solution, it’s still working for me. The world can never be as solid-seeming as it was before my Fall. Truth still seems a bit relative depending on where one is standing. But the important insight is that many categories, many tropes, objects and structures, are stable enough to treat them ‘as if’ they were pre-post-modernist type solid. You don’t need to waste time deconstructing everything; indeed, trying to do so is a fast track to a sense of perpetual victimisation and bitter, impotent rage. And trying to build any kind of transformative politics on a foundation of perpetual victimisation and bitter, impotent rage is not going to turn out as a net force for good, however radically you relativise the notion of ‘good’.

This doesn’t have to mean buying in wholesale to things as they are and becoming a cheerleader for keeping things unchanged. But to anyone currently struggling to focus in a world that seems hostile and composed entirely of operations of power, I say: pick your battles. Much of the world is still good (for a temporarily certain value of good), and many people are kind and well-meaning. Creating new interpersonal dynamics around the anxious effort to avoid the accidental replication in ordinary speech of sociocultural dynamics you find oppressive (aka ‘microaggressions’) may not, in the end, make for a more functional society. It’s possible to treat as a temporary certainty the hypothesis that in asking ‘Where are you from?’ someone is not in fact unconsciously othering you by virtue of your apparent ethnic difference, but simply – from maybe a naïve position in a social background that does not include many ethnic minorities – seeking to know more about you, in order to befriend you.

The beauty of a temporary certainty is that, choosing such a vantage point, we can say of any given cultural phenomenon (the institution of marriage, say) ‘we are where we are’. We are no longer stuck with the false choice of either pretending to buy into something as an absolute that we see as contingent and culturally constructed, or else setting ourselves pointlessly in opposition to it, protesting that as it is culturally constructed we should make all efforts to disrupt or transform it into some form that might appear more ‘just’. Instead, we can accept that despite this phenomenon being, strictly speaking, contingent, it remains stable enough that we can and should find a pragmatic relation to it. (In my case, that was to get married. One of the best decisions I ever made.)

You may object that my argument here amounts to a strategy for recouping something for cultural conservatism from the rubble of the post-modernist project. I beg to differ. Rather, what I’m advocating here is more along the lines of a plea to those who see themselves as political radicals to think deeply about what really matters and to focus on that. As it stands, ‘social justice’ social media suggests that thanks to the post-Fall malaise I postulate as infecting most of our young people, radical politics is collapsing into a kind of nihilistic shit-slinging incapable of going beyond critiquing the contingency of what it seeks to change in order to advocate for anything better.

[1] I don’t mean modernism, hence the clumsy construction. I mean something more like ‘the popular twentieth-century Enlightenment-ish consensus about truth, reason and meaning’.