This post was prompted by the quite extraordinarily patronising Better Together advert featuring a woman who effectively tells the electorate that she doesn't understand any of the issues in the debate and is therefore going to vote "No." There has been uproar in the Scottish Labour party that Danny Alexander let this advert through, but it got me thinking about broader questions of the nature of political debate.
I am just old enough to remember a time when politics was about impassioned ideologues addressing mass meetings to rail against the iniquities of capitalism or the unworkability of socialism. Politics was, almost by definition, about big ideas and grand concepts. You voted for an MP because he or she (mostly he back then, if the truth be told) shared your ideals- spoke to your deepest convictions about what is right and proper and true.
It was actually one of the most impassioned ideologues of them all who changed all that. Margaret Thatcher had the grand vision (I use the term loosely) and the overarching principles (even more loosely this time) and went straight for the big changes, rather than tinkering round the edges. Yet probably this disconcerting sound recording tells you all you need to know about her big ideas. They were in the end not grand at all but utterly mundane- even petty. Her political vision was founded on self interest. Her "no such thing as society" line was precisely about that- reducing the grandness of the post-war social project to a simple question of what is best for you and your family in the immediate future.
And in political terms this has been her legacy even more than the subsequent drift to the right or the madness unleashed when she and Reagan converted the financial markets into vast casinos. It is almost axiomatic now that politicians need to stay away from big questions of principle. They bore and confuse the electorate, received wisdom now seems to say. Interpret everything in terms of how it will affect Essex man, or Worcester woman, or one of an increasing congregation of 'ordinary people.' People will become politically engaged once they see the relevance of what you are talking about to their own lives, and not before.
Of course this hasn't really worked at all. Levels of political engagement have plummeted to a quite embarrassing degree. Turnout in the police and crime commissioner elections is so low that a candidate has a significant advantage if he/she has a big family. So long as their family turns out to vote of course. Basically, no one really believes that political engagement is going to make any direct difference to any of the specifics of their own lives, so they don't bother voting. And who can blame them? A prospective MP may base their entire campaign on protecting a local hospital, despite being in a party that is committed to reducing the number of local hospitals.
And whether they recognise that the reality of government is much more complex than the simplicity of electoral campaigns, or whether they just think politicians are a load of lying twats, the electorate have learned pretty well that political engagement and petty self-interest are more or less incompatible. Occasionally a single issue will arise that politicians can jump on, whipping up petty self-interest into a simulacrum of principled politics, and that is what has led to the ludicrous rise in UKIP support, but I guarantee that the turnout at the next election will still be under 50%.
Yet in Scotland, it seems, something very odd indeed has started happening. Alex Salmond has suggested that turnout in the referendum may reach 80%, and whilst he has his own reasons for saying that I have certainly been made aware of a quite extraordinary degree of engagement in the debate amongst ordinary voters. And what has triggered this hugely uncharacteristic level of interest in an issue that is all about politics and the constitution? Well not principally narrow self-interest it seems, which is where the Better Together campaign is suddenly beginning to flounder.
For those who haven't been following the debate, the Better Together camp have pretty much been telling the Scottish electorate that if they vote for independence they will lose the pound sterling, will be kicked out of the EU, will be saddled with debt and will no longer be supported by the rest of the UK. So far better not risk any of that- better just follow the example of the woman in the advert. Don't worry your head about the big issues, just vote No.
Only it seems that the Scottish electorate WANT to engage with the big political issues: about nuclear disarmament, and social welfare and reducing inequality, and finding alternatives to unbridled free-market philosophies. It's like something out of the 70s for God's sake!
Of course I am not pretending that Scotland is some sort of Utopia. If they do vote for independence then there is every chance that once the euphoria dies down the political classes will descend into a series of very unseemly cat-fights. And Scots, deprived of a Tory-voting England to rebel against, might start drifting rightwards themselves. And quite possibly some demagogic Scottish Thatcher will at some point whip up all their basest instincts.
But none of that is the point of this post. The point is that it turns out that the way to get people genuinely and passionately politically engaged is not to remind them of their own narrow self-interest. It is the very opposite. It is about raising the really big questions: what sort of country do we want to be? what are our core principles? what unites and what divides us? And raising them with a genuine sense that if people really take these questions seriously then there is a chance that we could actually do something about answering them. Collectively not individually. As a nation, not as an assemblage of self-interested individuals.
Sunday, 31 August 2014
Thursday, 28 August 2014
The novel and the short story compared
I find myself contemplating developing my writing in two different directions: the novel and the short story. I have applications pending for both the Word Factory's short story writing apprenticeship and the York Festival of Writing novel writers' workshop. Which leads me to wonder to what extent the two forms are compatible with each other. What in the end is the difference between writing novels and writing short stories?
The short story is memorably described on the Word Factory site as "an espresso shot" and in many ways this is the perfect analogy. Small and intense, an espresso packs all the power and flavour of a normal coffee into one small but satisfying cup. The effect can be fulfilling, exhilarating or even terrifying, giving rise in the susceptible to palpitations and a racing heartbeat. The aroma can be almost unbelievably complex- absorbed in an instant yet filling the senses for some time afterwards.
The term espresso covers a large variety of forms, from the half-cup favoured in Northern France to the bare teaspoon you get in Naples: from the satisfying completeness of a near-novella to the pithy force of the best flash fiction. There is also a place in the market for a huge variety of producers, from the globally successful high street chains to the obsessive ex-backpacker who imports beans green from a Peruvian village and roasts them by hand in the back of the shop: from the internationally renowned superstars of short fiction to the passionate amateur whose writing "reinvents the form" to such an extent as to be virtually unreadable. Here, mind you, the analogy fails somewhat, for whilst I would rather drink heated cat-piss than a Starbucks espresso I regard Alice Munro as not only one of the most successful but also one of the best short story writers around.
However all in all, the espresso remains the perfect metaphor for a short story. So what then of the novel? To equate a novel with a cup of filter coffee would be entirely unfair. A cup of filter coffee is, to coffee aficionados, no more than an espresso that has not been subjected to sufficient pressure to extract the flavour from the beans and has then been adulterated with half a pint of scalding water. Perhaps a closer analogy would be a bottle of wine- intoxicating, often complex in flavour and lasting a lot longer than a single espresso.
Doesn't quite catch it though, does it? For a start, a bottle of wine is all one thing, the first glass tasting identical to the last. And whilst it is possible to consume a bottle of wine alone, I have found that this is frowned on by the general public, particularly on the Tube in the morning rush hour. Most importantly though, however good it is, a bottle of wine is fundamentally insubstantial. All you are left with the following day is a headache and a lingering sense of regret.
So another analogy then. Perhaps we need to broaden the range to incorporate food as well as drink. A full dinner is both more satisfying and more complex than a bottle of wine. Does that make it more like a novel? Well not really. Again, a large meal consumed alone speaks of nothing more than gluttony and leaves one feeling bloated and almost as regretful as a bottle of wine would. Furthermore, unless one is involved in a Mediterranean country wedding even the largest meal is not something one consumes over several days.
No, in the end the only true food-and-drink-based analogy I could find for the novel to set against the short-story-as-espresso was a rather odd one: a fruit tree. Fruit trees bear their bounty over many days, or even weeks. You can disregard them for a time, then go back and pick a few more fruits, or lie underneath and gorge yourself until time and the world disappears and there is only you and the tree. The fruits can be soft and luscious, like the pears on the tree in my garden: easy to pick and with soft and yielding flesh, or they can be as challenging and difficult as sloes. The blackthorn tree makes its fruits virtually inaccessible through an impenetrable tangle of dark, thorny branches and the sloes when picked are almost unpalatably bitter, setting the teeth on edge with their jarring force. Yet steeped in plenty of gin and left to marinate for a few weeks, their flavour is as subtle and complex as any you can find.
And in fact the two analogies speak to some extent of their mode of production too. Creating the perfect espresso takes experience, fine judgment and the ability to focus one's attention entirely on that one moment- that one cup. It is about juggling the interconnected factors of pressure, temperature and volume with the subtle blend of aromas in the coffee beans until what emerges into the cup is the pure and intense essence of the thing.
Growing a fruit tree is a much longer-scale thing. It involves intense effort, a considerable amount of patience and the willingness to prune ruthlessly when the tree is out of shape or fails to produce fruit. The tree grower is never entirely in control of the finished thing: can never fully know what fruits it will produce for those who visit it. All that he or she can do is to raise it in the best possible conditions, tend it carefully over a long period of time, then walk away and leave it for others to discover.
So there you go, I think. If a short story is a shot of espresso then a novel is a fruit tree. Obvious when you come to think about it.
Wednesday, 27 August 2014
When is political correctness not political correctness?
The now universally pejorative term "politically correct" has reared its head again in the discussion of the appalling Rotherham child abuse scandal. The term is nowhere used in the report by Professor Alexis Jay that exposes the full extent of the tragedy but that has not stopped commentators from declaring that political correctness was what was to blame for the abuse being allowed to go unchecked for so long.
The narrative is clear: almost all of the abusers were of Pakistani heritage, this fact being repeatedly reported by abuse victims, and the police, social services, (Labour) councillors and other public officials consistently refused to acknowledge that fact for fear of appearing racist. So abusers were left unchallenged and the girls' pleas for help ignored, purely as a result of stultifying political correctness.
This is an appalling charge, and the danger is of course that it appears to call into question the belief system that underpins any positive interpretation of the term "political correctness": that it is unacceptable in today's society to make judgments purely on the basis of factors such as race. If political correctness can lead to the turning of a blind eye while 1400 children are being abused then we are better off without it, aren't we?
The term is an interesting one, coined apparently in the mid-20th century by serious-minded communists and socialists to describe the acceptability or otherwise of any thought or utterance within their particular political belief system. However it quickly proved a term and a way of thinking that was easy to parody and ridicule. Hardly surprising, given the etymology. "Correct", from the Latin corrigere (to set to a rule), implies a certain rigidity and absolutism. Politics on the other hand is the art of the possible. The connotations of the two words are poles apart.
So political correctness came to sum up a certain blind rigidity of thought amongst those over-influenced by some particular political ideology. And slowly its meaning began to spread, until it now encompasses anyone who puts any sort of ideology (political or otherwise) above what the critical observer regards as common sense.
So was political correctness in these terms what lay behind the repeated failure by various Rotherham officials to act on allegations of child abuse? The Sun's leader column asks when left wing politicians and the police are going to place child safety over political correctness, and bizarre as it may seem, the consensus seems to be that it was an unwillingness to appear racist that prevented the police from intervening earlier.
Professor Jay's report certainly identifies "collective failures of political and officer leadership" that she describes as "blatant." She talks of reports into allegations of abuse being suppressed and ignored and of no action being taken. She states that "Some at a senior level in the Police and children's social care continued to think the extent of the problem, as described by youth workers, was exaggerated" and says that the Council leader's 2013 apology "should have been made years earlier, and the issue given the political leadership it needed."
Yet on the specific issue of the perpetrators' Pakistani heritage and its importance in how the situation was dealt with her criticisms do not perhaps make the case against "political correctness" quite as strongly as commentators appear to suggest. In her executive summary the only comment on this subject is this: "Several staff described their nervousness about identifying the ethnic origins of perpetrators for fear of being thought racist; others remembered clear direction from their managers not to do so."
This is a serious point, but it is not her first comment on the issue of the ethnicity of the abusers. Her primary criticism is that "throughout the entire period, councillors did not engage directly with the Pakistani-heritage community to discuss how best they could jointly address the issue." In what sense could an unwillingness to discuss crucially important issues with members of a non-white ethnic group be regarded as political correctness?
The large majority of Professor Jay's criticisms though are on another issue entirely, and this is what really calls into question the interpretation of her report as a condemnation of political correctness. She says that "the scale and seriousness of the problem was underplayed by senior managers"; that "Police gave no priority to CSE [Child Sexual Exploitation], regarding many child victims with contempt and failing to act on their abuse as a crime"; and that "Some at a senior level in the Police and children's social care continued to think the extent of the problem, as described by youth workers, was exaggerated, and seemed intent on reducing the official numbers of children categorised as CSE."
This speaks of a systemic failure at senior level to respond seriously to concerns that were being relayed to them by social workers who "appeared to be overwhelmed by the numbers involved" and were "acutely understaffed and over stretched, struggling to cope with demand." There was clearly a fear that there was a can of worms here which, once opened, would lead to even more overwhelming pressures, and "Some councillors ... hoped [the problem] would go away."
The key issue though was the nature of the children who were being exploited. Professor Jay's report contains this damningly concise summary: "The majority of children whose files we read had multiple reported missing episodes. Addiction and mental health emerged as common themes in the files. Almost 50% of children who were sexually exploited or at risk had misused alcohol or other substances ... and two thirds had emotional health difficulties. There were issues of parental addiction in 20% of cases and parental mental health issues in over a third of cases."
Put this picture of the children involved against the "contempt" of the police and other agencies and the view at senior level that the problem was "exaggerated" and you get to the real nub of the issue. The girls were not believed and their stories not taken seriously because they were just not the kind of children whose concerns one ever takes seriously. They were runaways, delinquents and troublemakers and the daughters of drug addicts. One can almost hear the contempt in the voices of "those at senior level" when describing such people.
There is an interesting twist here. Many of Rotherham's councillors are of Pakistani heritage and, one imagines, successfully middle class too. Perhaps more than most these would be the people who would look down their noses at low-class trash like the girls described here. They may not actively have believed that such people deserve what they get, but from the security of their stable, affluent and well-regarded family lives they would be less than inclined to believe everything such girls said.
So if anything it was a failure in political correctness rather than its over-enthusiastic implementation that let these girls down. The victims of this abuse were ignored and their testimony was not believed because, primarily, of their social class. And that, far more than the unwillingness of some staff to confront the issue of race, was the enormous failing here.
Friday, 15 August 2014
An unthinkable solution to the Higher Education quandary
In my last post I bemoaned the increasing numbers of students on 'vocational' Higher Education courses. As of course have many others. The problem is that any sort of alternative seems to be fraught with difficulties. As I argued there, I don't believe that increasing the number of technical training courses is the answer, and simply eliminating any degree course that did not meet some high-minded ideal about the pursuit of knowledge and understanding would be a retrograde and elitist step that would take us back to the 1970s.
So what can be done about it? First, it is important to recognise why it is that this increasingly utilitarian approach to higher education has taken hold. The central and unquestioned aim of any nation today is economic growth, and higher education is seen first and foremost as an engine of growth. Whether at a national or an individual level the aim (we are told) has to be to increase both production and consumption, to maximise economic activity. If we fail to do that we are failing to keep our place, whether as individuals or as a nation. If economic activity declines, or even fails to grow, then we are doomed.
This level of economic activity is even described (interchangeably) as our standard of living, and it is fair to say that until fairly recently in the rich West, and today in poorer countries, that is a reasonable connection to make. If increased economic activity means moving from a subsistence economy without clean water or adequate healthcare to one with these facilities then level of economic activity = standard of living.
The thing is though that in the rich West we are long past that point. Now the connection between level of economic activity and standard of living is pretty much defunct. For a start, increasing economic activity seems to go alongside increasing economic inequality, and inequality is bad for everyone's standard of living, even the richest. Secondly, a large proportion of the population of Western countries are at the stage where an increase in their personal economic activity will be likely to decrease rather than increase their standard of living. Once you have everything material you need to lead a comfortable life, relentless pursuit of the newest electronic devices and the means to pay for them leads not to improved standard of living but to affluenza.
In broader terms, national economic growth goes hand in hand these days with incomprehensibly vast gambles on the financial markets. Some ludicrously high percentage of the world's economic activity is actually in the form of abstruse and vastly complex financial transactions with no actual goods changing hands but literally trillions of dollars wafting to and fro on the electronic breeze. We have already seen the catastrophic damage this sort of thing can cause, and I don't believe that anyone believes that we will not face another global financial meltdown at some stage. Where economic activity is pretty much abstract anyway there is really nothing to control its growth.
So, whilst the pursuit of economic growth was (and is still for most countries) an essential phase in reaching acceptable standards of hygiene, nutrition, housing and healthcare, can it possibly remain as a realistic aspiration for those countries which have already exceeded the level of economic activity necessary to achieve those goals? Is there not a danger of something like the notorious potlatch of Native American tribes, where vast amounts of valuable goods are simply thrown away in order for the relentless machinery of economic growth to keep turning? Should the richer nations not be focussing on actual standards of living- including contentment, social cohesion and stability- rather than simply on economic growth? That would involve a massive shift in direction of course, and individual aspirations would have somehow to be decoupled from the relentless acquisition of more and more increasingly irrelevant affluence, but perhaps soon we will be forced into that change of direction. I honestly cannot see our current obsession with economic growth as sustainable.
So what has all this got to do with higher education?
Well the question should be, I believe, how higher education can contribute to raising standards of living, rather than levels of economic activity. And as a passionate educator I absolutely believe it can. Everybody can and should be able to benefit from the unique opportunities higher education provides actually to learn to think- to explore and question and imagine and create. This is what we, as a country and a world, need to invest in, if we are to see a genuine improvement in living standards across the globe. And invest we must, because you can't, in effect, ask a young person to shell out up to £9,000 a year just to be taught how to think, with no clear prospect of a job at the end of it. Yet if as a society we make it possible for those young people to take that time, then we have a chance of ending up not with David Brents but with the next generation of filmmakers and artists, social entrepreneurs and creative thinkers. And those are the people we are really going to need.
This is a Utopian vision of course. It would involve substantially raising taxes, so that today's middle managers might have to make do with a Ford Focus and an Asus tablet rather than an Audi TT and an iPad. It would mean the UK losing its international ranking on the GDP growth tables. It would mean a fundamental rethink, so that the goal for young people was not to be a millionaire and live in a big house behind electric gates, but to be contented and involved and creative.
And none of that is going to happen, is it? So shall we just carry on saddling generation after generation of young people with unaffordable student debts so that they can get on the first rung of that much-vaunted ladder towards increased prosperity, increased affluence, increased isolation and (if I am not being too dramatic) the ultimate death of their souls?
Thinking the unthinkable about Higher Education
It is an inevitable fact that in a lot of areas of life those who make policy are of a different generation from those it principally affects. This is particularly true of education, an area where the experience of the policy-makers' generation (my generation) and that which it chiefly affects (today's young) is very different indeed.
I have written in a post some time back about how privileged my generation was, and nowhere more so than in the expectations placed on those who went to university. Most of us received generous grants, there were no fees and there was no pressure on us to find a graduate-level job as soon as we left. Some courses were principally vocational of course, but even those studying medicine were, as I remember it, more focussed on where the next pint (or 15) was coming from than what their future employment prospects were going to be like.
University study was in retrospect seen largely as an opportunity for intellectual exploration offered free and without condition to (an admittedly small proportion of) the nation's youth. Certainly there were tabloid stories of students spending their grants on drink and drugs without attending a single lecture, and the occasional moral panic about student militancy and sit-ins and revolutionary fervour, but such behaviour came to be seen as an intrinsic part of Higher Education: the actual course almost secondary. And yet a university degree had genuine currency, once the possessor got their act together sufficiently to start looking for jobs, giving a significant boost to lifetime earnings.
Then, over time, politicians woke up to the fact that these indulgences were being offered only to a tiny minority of the population and a massive expansion in Higher Education began, to the point where the HE participation rate is now around 50% and suddenly we find that the whole game has changed. Now, university degrees are far less about intellectual exploration and self-discovery and far more about preparation for the world of work. It is a ruthless market out there (as today's young are constantly reminded) and no one can afford three years of self-indulgent time-wasting.
The language of utilitarianism is everywhere. Universities advertise their wares on the basis of the proportion of alumni now in graduate level employment, and most crucially the whole thing is now far from free. University education involves a massive financial commitment from the student nowadays and as with any financial commitment they expect a return on their investment. Indeed it seems that that sort of hardening up of attitudes was pretty much the point of the tuition fee rises- since various studies have shown that they will raise no money at all for the government.
Higher Education, it seems, has become a tool for economic advancement above everything else. At an individual level students are told over and over again that without a degree their lifetime employment prospects will be minimal, and nationally investment in HE is seen as crucial for the UK to compete in tomorrow's high-tech global economy (or some such clichéd formulation).
All this has happened quickly but incrementally, and with no significant national debate on the point of principle. The debate has been all about the level of tuition fees, the iniquity of student loans and the unfairness of access to the elite universities. What has seldom been asked is whether it is right for 50% of the country's population to be studying university degree courses, or whether it is right for the principal purpose of said degree courses to be improving employability.
A proportion of today's students certainly are following what one might characterise as HE-appropriate vocational courses- medicine, for instance, or engineering. Another tranche (largely the most able and/or middle class) are studying traditional degree courses. Many of these will duly proceed to graduate-level employment- the maths and science graduates as city traders and the Arts graduates as teachers or social workers, as civil servants or working in the charity sector.
But what of the rest- now the majority? Most of these will currently be on a variety of broadly business/management oriented courses at a selection of lesser-regarded universities. They may not have shelled out the full £27,000 in tuition fees but they will still be accruing a pretty substantial debt, and for what? Principally, it seems, to ensure that the next generation of David Brents have an even more secure command of management-speak bullshit with which to bore and demotivate their workforce.
I am not denigrating 'mickey-mouse degrees' here, neither am I seeking to undermine the right of the less well off or the less academically able to as extensive a period of education as I benefited from as a young man. What I am questioning is the justifiability of encouraging huge numbers of young people to go to university and saddle themselves with a mountain of student debt in order to follow supposedly vocational courses that neither afford them any significant vocational skills nor give them space and encouragement to broaden their minds more generally.
Neither, incidentally, am I joining the traditional clamour for more technical training courses. It is a seductive argument that what the less academic (whatever that means) need is to be taught a trade. However I do not personally believe that it is as simple as that. It is not just a question of disparity of esteem, and the fact that a return to technical training colleges would set in stone a whole social system based on class division of employment. The other problem is that the world is changing so rapidly that training for a particular trade might soon be seen as having been as cruelly useless as teaching Sheffield lads of a generation ago to work in the steel industry. "We will always need plumbers," the cry goes up. Well maybe, but recently British plumbers have found themselves out-competed by their Eastern European colleagues, and changes in technology are making a lot of the old plumbing skills (like soldering joints) redundant.
So what is the answer? Well it seems to me that before attempting to find an answer to this quandary, society actually has to consider some much deeper questions first. Which I think means another blog post...
Tuesday, 12 August 2014
I suppose it's time to go back to the novel again...
Over recent weeks virtually everything I have written (in this blog) has been in a sense journalism, but my real ambition is to become a novelist. Next month I am going to the Festival of Writing at York and will be attending workshops on various aspects of the novelist's craft. I will also be pitching my work to two agents in face to face meetings- my chance to hear what professionals might actually think of what I have written. And since that is only a month away, maybe I should stop writing these blog posts and get back to my novel.
The question is whether the two sorts of writing can coexist, and whether blog post writing such as this will tend to help or hinder my novel writing. Robert McCrum seems to argue here that there is no conflict, and cites a number of novelists who have been journalists, including PG Wodehouse, Graham Greene and George Orwell. However my initial surprise was that the list he compiles is so short. Writing is writing, isn't it? If you are good at one sort, why would you not be good at the other too? Why do far more writers not work in both genres?
When you look a little closer, there are actually a number of quite significant similarities between novel writing and journalism. Both types of writing depend for their effect on an understanding of character, narrative and the power of language. Both use research and/or creative imagination to a greater or lesser degree and both, crucially, are required to engage the reader quickly and maintain their interest.
Yet novelists (and others) have always seemed to look down their noses at journalism as "hack" writing, seeing novels as unquestionably the higher form: as an art rather than a trade. Stella Gibbons satirises this attitude brilliantly in the spoof dedication of Cold Comfort Farm to "Anthony Pookworthy Esq.":
The life of the journalist is poor, nasty, brutish and short. So is his style. You, who are so adept at the lovely polishing of every grave and lucent phrase, will realize the magnitude of the task which confronted me when, after spending ten years as a journalist, learning to say exactly what I meant in short sentences, that I must learn, if I was to achieve literature and favourable reviews, to write as though I were not quite sure about what I meant but was jolly well going to say something all the same in sentences as long as possible.
Her point is, of course, that if journalism is a trade it is a very demanding one. Both forms of writing may be required to engage the reader quickly, but whilst a novelist probably has a chapter or two a journalist has the amount of time it takes to get from Highbury and Islington to King's Cross on the Victoria line. A novelist may be asked to cut 20,000 words from their novel prior to publication and be given three months to do it, but a journalist will be told to get the bloody piece down to 200 words by one o'clock or it's not going in the paper.
However, though it would be pointless to debate which form of writing requires more skill it is clear that they are very different, both in intention and (therefore) in form. The purpose of a piece of journalism is (in the words of John Reith) to inform, educate and entertain, probably in that order. It seeks to engage its readers in an issue or situation in the world and encourage them to think about it in a new way. A novel would seek to place the three verbs in the reverse order, with almost all of the emphasis on "entertain." Rather than seek to engage its readers in a real world situation in a new way it encourages them to inhabit an entirely new world (even if it is one that closely resembles the real world).
What is interesting is the role of the author of a piece of journalism vis-a-vis the reader. I would argue that the reader is always very much aware of the presence of the journalist. In opinion pieces this is obvious of course. Many use the first person, whether single or plural, but even where they do not it is very clear that this is an individual's opinion being expressed. In reportage the presence of the journalist may seem less obvious, but such pieces always read (in my head at least) as an account being given by someone who has collated all the relevant information for me. The journalist in a sense stands between the reader and the events being reported. Even at its most engaging and immersive, journalism does not truly make us feel we were there: rather it makes us feel that we can imagine the journalist being there.
Novels are rather different. Gone are the days when authorial voice intruded directly into novels (as it does in Tom Jones, for instance, or the Life and Opinions of Tristram Shandy, Gentleman). Now we expect the author effectively to disappear when we read a novel, because we engage with the events and characters of a novel directly and personally. When a novel is engaging and immersive we truly do feel we are there and have no concept whatever of an author intruding in that process.
The form of such writing reflects some fundamental differences too. Journalists know that the first paragraph must essentially carry the entire import of the piece (because many readers will never make it past the first paragraph). Indeed, wherever one stops reading an article that article must be able to hold together as a complete and coherent argument up to that point. No holding back the key point until the killer last paragraph, because a proportion of readers will never even get there.
A successful novel is the precise opposite. It builds in tension, complexity and engagement with characters. Sometimes the start may seem slow or baffling or you may be unsure where it is going, but you stick with it because you are confident that the author knows what they are doing. Then slowly the novel gets its teeth into you, so you can't stop reading until the end, which duly floors you with the intensity and force of its emotional energy. Whilst a fair proportion of readers do not read every word of a newspaper article (including many who found it quite interesting, but couldn't be bothered reading it to the end) hardly anyone gets properly into a novel and then fails to finish it.
However the biggest difference for me between journalism and novel writing is an element of the process by which it is produced: namely the time it takes. Journalists write quickly- they have to. There is simply no point in producing even an opinion piece about something if people have stopped talking about it. Journalism is about deadlines and quick turnarounds and responsiveness to changing events. What is more it results almost entirely from the functioning of the conscious mind. A journalist cannot afford to go for a long walk or sit daydreaming and waiting for vague ideas to coalesce in her head. Her job is about obtaining, collating and processing information and presenting it in a form that is readily accessible to the reader. It is partly why the journalist is so upfront in the finished piece. We are aware that she worked to produce this- that this article arises from the sweat of her brow.
Some novelists write quickly (though nowhere near as quickly as journalists) but extensive periods of waiting seem utterly intrinsic to the process of getting a novel onto bookshelves. Jane Austen reputedly put a draft of a novel she had completed into a drawer, locked it, gave the key to Cassandra, and told her sister not to give the key back to her for a year. Only then would she be able to redraft the novel well. And even today every stage of producing a novel seems designed to distance the finished book from any sort of journalistic immediacy in its writing.
And novels are produced at least as much from the unconscious as the conscious mind of the author (as I have argued here for instance). The effect of this, I would argue, is to reduce still further the sense of the author's presence in the finished work, because it is the novel we engage with, not the author's efforts in producing it. In fact when we start noticing the latter too much it can kill our enjoyment and engagement entirely. A successful novel takes us directly into a world of the author's unconscious imagination and we live in it and explore it as if we were the first people there.
So what does this tell me with regard to the questions I posed in the second paragraph of this piece? Well, not much I suppose. Except perhaps that a novelist needs time away from their creation in a way that a journalist simply does not.
So maybe it's no bad thing that I haven't so much as looked at my novel in weeks.
The question is whether the two sorts of writing can coexist, and whether blog post writing such as this will tend to help or hinder my novel writing. Robert McCrum seems to argue here that there is no conflict, and cites a number of novelists who have been journalists, including PG Wodehouse, Graham Greene and George Orwell. However my initial surprise was that the list he compiles is so short. Writing is writing, isn't it? If you are good at one sort, why would you not be good at the other too? Why do far more writers not work in both genres?
When you look a little closer, there are actually a number of quite significant similarities between novel writing and journalism. Both types of writing depend for their effect on an understanding of character, narrative and the power of language. Both use research and/or creative imagination to a greater or lesser degree and both, crucially, are required to engage the reader quickly and maintain their interest.
Yet novelists (and others) have always seemed to look down their noses at journalism as "hack" writing, seeing novels as unquestionably the higher form: as an art rather than a trade. Stella Gibbons satirises this attitude brilliantly in the spoof dedication of Cold Comfort Farm to "Anthony Pookworthy Esq.":
The life of the journalist is poor, nasty, brutish and short. So is his style. You, who are so adept at the lovely polishing of every grave and lucent phrase, will realize the magnitude of the task which confronted me when, after spending ten years as a journalist, learning to say exactly what I meant in short sentences, that I must learn, if I was to achieve literature and favourable reviews, to write as though I were not quite sure about what I meant but was jolly well going to say something all the same in sentences as long as possible.
Her point is, of course, that if journalism is a trade it is a very demanding one. Both forms of writing may be required to engage the reader quickly, but whilst a novelist probably has a chapter or two a journalist has the amount of time it takes to get from Highbury and Islington to Kings Cross on the Victoria line. A novelist may be asked to cut 20,000 words from their novel prior to publication and be given three months to do it, but a journalist will be told to get the bloody piece down to 200 words by one o'clock or it's not going in the paper.
However, though it would be pointless to debate which form of writing requires more skill it is clear that they are very different, both in intention and (therefore) in form. The purpose of a piece of journalism is (in the words of John Reith) to inform, educate and entertain, probably in that order. It seeks to engage its readers in an issue or situation in the world and encourage them to think about it in a new way. A novel would seek to place the three verbs in the reverse order, with almost all of the emphasis on "entertain." Rather than seek to engage its readers in a real world situation in a new way it encourages them to inhabit an entirely new world (even if it is one that closely resembles the real world).
What is interesting is the role of the author of a piece of journalism vis-a-vis the reader. I would argue that the reader is always very much aware of the presence of the journalist. In opinion pieces this is obvious of course. Many use the first person, whether single or plural, but even where they do not it is very clear that this is an individual's opinion being expressed. In reportage the presence of the journalist may seem less obvious, but such pieces always read (in my head at least) as an account being given by someone who has collated all the relevant information for me. The journalist in a sense stands between the reader and the events being reported. Even at its most engaging and immersive, journalism does not truly make us feel we were there: rather it makes us feel that we can imagine the journalist being there.
Novels are rather different. Gone are the days when authorial voice intruded directly into novels (as it does in Tom Jones, for instance, or the Life and Opinions of Tristram Shandy, Gentleman). Now we expect the author effectively to disappear when we read a novel, because we engage with the events and characters of a novel directly and personally. When a novel is engaging and immersive we truly do feel we are there and have no concept whatever of an author intruding in that process.
The form of such writing reflects some fundamental differences too. Journalists know that the first paragraph must essentially carry the entire import of the piece (because many readers will never make it past the first paragraph). Indeed, wherever one stops reading an article that article must be able to hold together as a complete and coherent argument up to that point. No holding back the key point until the killer last paragraph, because a proportion of readers will never even get there.
A successful novel is the precise opposite. It builds in tension, complexity and engagement with characters. Sometimes the start may seem slow or baffling or you may be unsure where it is going, but you stick with it because you are confident that the author knows what they are doing. Then slowly the novel gets its teeth into you, so you can't stop reading until the end, which duly floors you with the intensity and force of its emotional energy. Whilst a fair proportion of readers do not read every word of a newspaper article (including many who found it quite interesting, but couldn't be bothered reading it to the end), hardly anyone gets properly into a novel and then fails to finish it.
However the biggest difference for me between journalism and novel writing is an element of the process by which it is produced: namely the time it takes. Journalists write quickly- they have to. There is simply no point in producing even an opinion piece about something if people have stopped talking about it. Journalism is about deadlines and quick turnarounds and responsiveness to changing events. What is more it results almost entirely from the functioning of the conscious mind. A journalist cannot afford to go for a long walk or sit daydreaming and waiting for vague ideas to coalesce in her head. Her job is about obtaining, collating and processing information and presenting it in a form that is readily accessible to the reader. It is partly why the journalist is so upfront in the finished piece. We are aware that she worked to produce this- that this article arises from the sweat of her brow.
Some novelists write quickly (though nowhere near as quickly as journalists) but extensive periods of waiting seem utterly intrinsic to the process of getting a novel onto bookshelves. Jane Austen reputedly put a draft of a novel she had completed into a drawer, locked it, gave the key to Cassandra, and told her sister not to give the key back to her for a year. Only then would she be able to redraft the novel well. And even today every stage of producing a novel seems designed to take the finished book away from any sort of journalistic immediacy of writing it.
And novels are produced at least as much from the unconscious as the conscious mind of the author (as I have argued here for instance). The effect of this, I would argue, is to reduce still further the sense of the author's presence in the finished work, because it is the novel we engage with, not the author's efforts in producing it. In fact when we start noticing the latter too much it can kill our enjoyment and engagement entirely. A successful novel takes us directly into a world of the author's unconscious imagination and we live in it and explore it as if we were the first people there.
So what does this tell me in regard to the questions I posed in the second paragraph of this piece? Well, not much I suppose. Except perhaps that a novelist needs time away from their creation in a way that a journalist simply does not.
So maybe it's no bad thing that I haven't so much as looked at my novel in weeks.
Sunday, 10 August 2014
What defines a nation state?
Many of the world's crises recently have revolved around the question of nation states and how they are defined. Ukraine's turmoil is caused in part by being a pawn in power games between Putin and the West, but in part too by the history of Ukraine as a nation state, and the way that Crimea was allegedly added to it by Khrushchev when he was drunk. The ability and right of the Palestinian territories to function as nation states is of course central to the Gaza conflict, and we appear to be witnessing in Iraq the demise of that country as a nation state at all. And in a more domestic (and far less serious) context, both the Scottish independence referendum and the EU debate centre around the changing nature of the nation state.
All of which cases lead me to ponder on how a nation state can be defined. We tend to think of the concept as being inevitable and permanent but it is in fact a relatively recent phenomenon: Germany and Italy for instance only came into existence in the late 19th Century. In reality in many cases nation states are pretty arbitrary constructs: many do not have a unifying language (look at Belgium for instance), religion (think of Iraq) or ethnicity (virtually any African nation) or even coherent shape in terms of borders. On the face of it France seems to have very regular and logical borders (mainland France is called "the Hexagon" by the French), but look at the country's actual geography:
What is more, in today's world there are often more connections between individuals and groups in different countries than those within the nation state itself. Jet travel is no respecter of national boundaries and neither is the internet. Multinational companies (as the adjective implies) ignore differences between nation states, except for reasons of minimizing tax. A high street in London is now more similar to a high street in Brisbane than it is to one in Lerwick and it is almost impossible to discover where the goods we consume have actually been produced.
So how can a nation state be defined, if not by the companies trading in it, the ethnicity or religion of its peoples or the language spoken? Even social groupings won't do the job, as social media creates and maintains social groups that transcend national boundaries, and if the neo-cons have their way government won't either, as all the erstwhile government services are outsourced to (multinational) private sector companies. In the UK we already have a significant proportion of our public services delivered by French, German or American companies, and China, apparently, is to develop our new generation of power stations.
It seems that all we are left with to define a nation state are the following: the national anthem and the flag. Both are on display (in the event of victory) at games such as the Commonwealth Games and the Olympics, and the visual image of the victorious athlete mouthing the words of their anthem in front of their nation's flag is about the strongest symbol of nationhood one gets to see.
So what of these national anthems and these flags? Most were chosen some time ago of course, so one might think that they are too out of date now to tell us anything meaningful about the nation states they symbolise, but in fact there are some surprising insights to be gained just by looking at them more closely.
Take for instance the flags and national anthems of Britain, the United States and France- three countries locked in a long history of mutual support and distrust since the American Revolution itself (it was France that donated the Statue of Liberty to the nascent United States). The flags are an interesting comparison as they all use the same three colours: red, white and blue, but the use made of these colours is very different:
The French flag is by far the simplest and boldest. To the French it speaks of clarity of thought and the Age of Reason that saw the creation of the Republic. It is the one true Tricolore, and as such the pattern for countless flags that followed, whilst remaining unique and archetypal. The problem is that to everyone else, France's flag is that one with the red, white and blue stripes, but are they horizontal or vertical? Or is that Hungary anyway? And which colour is it on the left? Who bloody cares anyway. They all look the same.
The British flag speaks of unity in diversity with its complex of overlaid crosses, and of the centrality of our great nation in the converging lines of power and influence that reach across the globe. Except that it is a bugger to draw, is probably hardly ever hung the right way up (which is the right way up? Does anyone actually know?) and shouldn't even be called the Union Jack at all.
The US flag is both powerfully simple and somehow on a different scale to the other two. It is instantly recognisable and would be the easiest to win Pictionary with, even without coloured pens, yet it has an unfeasibly large number of elements that always make it look bigger than it actually is. And its visual symbolism is on an epic scale: the red lines representing the lands and oceans and the stars the overarching skies. The earth beneath and the skies above- all are ours, the flag seems to say.
The anthems are just as different one from another, and the connotations of their words just as telling. The Marseillaise, for all its warlike evocation of revolutionary struggle is also surprisingly intimate and familial in its language. It refers to enfants (children), bras (arms) and fils (sons), and evokes a rural scene of campagnes (fields) and sillons (furrows). What is more, however defiant its tone there is an air of defeatism about the song. It is the citizens the chorus calls to arms, not the soldiery, and it calls them to put up barricades- surely a futile last-ditch attempt to resist the inexorable march of the feroces soldats who are approaching to égorger (slay) our sons and our companions.
The British national anthem is almost ludicrously overblown. One use of an adjective such as "gracious" one could maybe get away with, but in the first verse alone (the only one anyone knows) we have "gracious," "noble," "victorious" and "glorious." And what is telling is what the song in the end wishes for. Unlike the French and US anthems, the British national anthem positively invites subjugation. "rule us," or "be our Queen" would be one thing, but the anthem actually asks for the monarch to "rule over us."
So if the Marseillaise encapsulates France's parochial yet truculent defeatism and God Save the Queen manages to sum up Britain's pompous subservience, what of the Star Spangled Banner? Like the Marseillaise the US national anthem is an evocation of revolutionary struggle against a tyrannical oppressor, and if its depiction of "the rocket's red glare, [and] the bombs bursting in air" is nowadays more reminiscent of Hamas or the Taliban than of the all-conquering US Army then that is simply one of history's ironies.
In essence the Star Spangled Banner is a simple yet powerful summation of a myth that has kept the US at the top of the heap for a very long time. For a start it is a song about the nation's flag, so the two symbols work hand in hand. Secondly its imagery is actually quite surprisingly uplifting. There is a great deal about light ("the dawn's early light," "twilight," "gleaming", "gleam," "morning's first beam," "reflected," "shines," and of course the "star spangled banner" itself). There is powerful evocation of place too, with "the shore, dimly seen through the mists of the deep," "the breeze," "the towering steep" and "the stream." Together these give a strong sense of the land as the first European pioneers found it: unsullied, vast and shining. The people in the song are "brave" and "free" and this is their "land" and their "home."
Small wonder then that Americans have seen themselves as the undisputed leaders of the free world. The Star Spangled Banner is an anthem that can really only be sung with one's fist held over one's heart, unlike God Save the Queen which must be droned out in an embarrassed dirge and the Marseillaise which can only really be bellowed whilst in a state of inebriation.
So does any of this matter at all? Do these symbols prove that there is some indefinable essence to nationhood that was somehow captured by the designers of flags and the composers of anthems, and that still holds true today? Or is this some variation of nominative determinism, and a country's people learn over time to live up (or down) to that country's national symbols?
Who knows? What I would say though, to my fellow Scots is this: if you do vote for independence then think long and hard before you officially adopt Flower of Scotland as the national anthem.
Friday, 8 August 2014
It is time to reclaim the word "reform."
Now that Michael Gove has left the DfE his name seems more than ever inextricably linked with the word "reform(s)." I suspect astute news management here: by constant repetition of the phrase "Gove reforms" when referring to the unprecedentedly disruptive and reactionary changes brought about in his time as Secretary of State, Gove and his henchmen have inveigled into the public consciousness a link between what he did and the largely positive connotations of the word "reform."
I have written here, here, here and here (for example) of some of the casually destructive things Michael Gove did to education whilst in office and this is not another post bemoaning the changes he brought about. Rather I would like to focus on the word "reform" itself, to pose the question as to why so many commentators see fit to talk and write about Gove's education reforms.
The etymology of the word reform is simple- from the Latin reformare, the prefix re- meaning back and the verb formare meaning to form or shape. So in purely etymological terms the word reform would seem to imply a process of putting things back to the way they were. Not so inappropriate then, for Michael Gove's attempts to reestablish a half-remembered version of 50s schooling.
Except that words acquire most of their meaning through a combination of usage and connotation (a concept I have explored here for instance). To take usage first, consider some of the other contemporary uses of the word reform: we have Obama's healthcare reforms; repeated calls from Cameron and others for EU reform; and vague talk of political reform, usually in countries sufficiently far away that reform becomes a less threatening concept. Because actually the connotations the word reform has acquired in current usage are all to do with change, and moving forward not backwards.
The other key feature of the usage of the word reform is that it has long been defined in opposition to two other words: reaction and revolution. From the Protestant Reformation to the Reformist Movement reformers have long been defined as being the opposite of reactionaries. Reform is emphatically not about turning the clock back any more, whatever its etymology might be. These sorts of usages are also notable in their virtually socialist connotations: the process of reform has always been about defeating the reactionary forces of an oppressive higher power. Maybe not so appropriate for what Gove sought to achieve then.
Similarly, reform is now clearly established as the opposite of revolution. Revolutionaries seek to overturn (etymologically as well as by usage) whilst reformers work from the ground up, bringing in more organic, gentler changes. Indeed revolutionaries have often seen reform as inimical to their aims. Dario Fo put it well when he said, "They want a revolution, and we'll give them reforms- lots of reforms; we'll drown them in reforms."
This sort of oppositional definition of words is actually how we come to refine their meanings. What is reform? Well, it isn't reaction and it isn't revolution, so it's something in between. And as a result, since reform is by definition (or by usage anyway) not an extremist position, the connotations it has come to acquire are almost universally positive. Reform has connotations of gentleness, of looking to the future, of responding to the needs of the weak and oppressed, of high-minded idealism.
And yet we allow the word to be linked to what Michael Gove did to the English education system in his time as Secretary of State!
I propose a different word. For a start, the prefix has to be de- (from the Latin for "down from," "away from," or "out of" and implying reversal or negation). So deform maybe? Hence Gove's deformations of England's education system. Not quite, I don't think. Deformation is far too slow and organic a process. We need something more dramatic to describe what he did.
Disruption is another good word. Etymologically it seems to mean "breaking apart," which seems appropriate. However again, connotations are key, and the problem is that disruption has strong connotations of temporariness. After a disruption normal service is resumed. Pretty quickly, so long as Network Rail isn't involved. So no, disruption simply isn't the word, because the damage Gove did will take decades to undo, even supposing anyone tries.
We need another word. I know, not reform, or disrupt, but destroy. Yes, that's it. That seems a much better description of what happened. So not Gove's reforms, but Gove's destruction.
Yup. Happy with that. Carry on.
Thursday, 7 August 2014
The DfE website: another casualty of Gove's Stalinist regime
I wrote a piece yesterday about the writing out from history of the entire concept of Assessment for Learning. One of the most extraordinary things I discovered was the removal from the DfE website of any document whatever on the subject, and this led me to investigate further. If under Gove such a central document as the AfL National Strategy document could have been expunged from the government's Education website, what else has disappeared?
So I visited what used to be www.dfe.gov.uk and my first surprise was that that address no longer exists. The website is now at https://www.gov.uk/government/organisations/department-for-education, and lest one think that this is simply a matter of web redirect, look at the URL: the word "education" appears only after the third slash. So what? I hear you ask. Well, what this means for instance is that the default behaviour of the site's search box is to return results from any of the government departments. So if you put in the word "learning" and hit return without changing any of the default options you will get a large number of results dealing with subjects as diverse as driving, justice and births, marriages and deaths. The first link that returns anything remotely to do with learning in schools is the 25th. Put in a more generic term (but one that is central to the effective delivery of high quality education in schools) like "leadership and management" and fewer than one in ten of the results returned are anything to do with schools at all.
This is not the only change though. The first visual impression of the site is of something put together by one of the more budget school website providers back in the late 90s. Carry out a search for instance and the only way back to the Department for Education's section of the site is via your browser's "Back" button. Click on the Gov.Uk link at the top of the page and you get to this page which has no mention of the DfE on it anywhere!
This is extraordinary stuff. With the internet now firmly established as any teacher's (or school leader's, or anyone's) primary tool for research and planning, how can the DfE provide a website that is so appallingly badly designed and unwelcoming for its users?
And even that isn't the main issue. It is when you do finally track down the content on the site that you face the biggest surprise: there isn't any. Or hardly any, at least. The DfE website used to be crammed full of advice and guidance, case studies, examples of best practice, curriculum resources and a myriad of links to external sites of all sorts that might be of use to the web-savvy teacher or school leader. And now?
Well, the resemblance to a late 90s school website continues as regards content too. There are a number of policy documents, all pdfs and written in the sort of quasi-legalese that marks them out as the sort of thing one takes to a tribunal but otherwise leaves prominently displayed but unread on one's shelf. There are some press releases on the main page, and some of what the site calls "Collections" of guidance documents. Aha! Here we are, I thought. And found this. Guess how many of these documents have anything to do with the delivery of education to students in school? Give up? I'll tell you: thirteen, and all, it turns out, the same quasi-legalese pdfs from the policy document section.
There were only two links to external websites that I could find, one to the DfE performance tables and one to the "Get Into Teaching" website. The latter incidentally must represent an as-yet unreformed aspect of the DfE's work that Gove never noticed: it is colourful, informative, packed full of content and actually useful.
Maybe I am missing something. Maybe there is a wealth of government guidance, case studies and sharing of best practice somewhere else on the web, but if so I would be delighted if someone could point it out to me. Out of curiosity I visited the Scottish government education websites and the contrast could not be more dramatic. Even the main departmental schools site is worlds better than the English equivalent, but as well as that there are two other enormous sites: http://www.educationscotland.gov.uk/ and http://www.gtcs.org.uk/. There used to be an English equivalent of the latter of course- the General Teaching Council, or GTC, but Gove killed it off and its website is no more.
So aside from anything else he has done, Michael Gove has been responsible for the wholesale removal from the internet of anything that could reasonably be considered useful Governmental advice and resources for school teachers and leaders. It is not just the scale of his scorched-earth policy that amazes though, but its intention. This is a Stalinist elimination from the public sphere of anything ever produced by "the Blob" of those who know anything about education. In fact it is almost akin to Mao's policy of anti-intellectualism and what I find utterly extraordinary is the apparent lack of public outcry against it.
So I visited what used to be www.dfe.gov.uk and my first surprise was that that address no longer exists. The website is now at https://www.gov.uk/government/organisations/department-for-education, and lest one think that this is simply a matter of web redirect, look at the URL: the word "education" appears only after the third backslash. So what? I hear you ask. Well, what this means for instance is that the default behaviour of the site's search box is to return results from any of the government departments. So if you put in the word "learning" and hit return without changing any of the default options you will get a large number of results dealing with subjects as diverse as driving, justice and births, marriages and deaths. The first link that returns anything remotely to do with learning in schools is the 25th. Put in a more generic term (but one that is central to the effective delivery of high quality education in schools) like "leadership and management" and fewer than one in ten of the results returned are anything to do with schools at all.
This is not the only change though. The first visual impression of the site is of something put together by one of the more budget school website providers back in the late 90s. Carry out a search for instance and the only way back to the Department for Education's section of the site is via your browser's "Back" button. Click on the Gov.Uk link at the top of the page and you get to this page which has no mention of the DfE on it anywhere!
This is extraordinary stuff. With the internet now firmly established as any teacher's (or school leader's. Or anyone's) primary tool for research and planning how can the DfE provide a website that is so appallingly badly designed and unwelcoming for its users?
And even that isn't the main issue. It is when you do finally track down the content on the site that you face the biggest surprise: there isn't any. Or hardly any, at least. The DfE website used to be crammed full of advice and guidance, case studies, examples of best practice, curriculum resources and a myriad of links to external sites of all sorts that might be of use to the web-savvy teacher or school leader. And now?
Well, the resemblance to a late 90s school website continues as regards content too. There are a number of policy documents, all PDFs and written in the sort of quasi-legalese that marks them out as the sort of thing one takes to a tribunal but otherwise leaves prominently displayed but unread on one's shelf. There are some press releases on the main page, and some of what the site calls "Collections" of guidance documents. Aha! Here we are, I thought. And found this. Guess how many of these documents have anything to do with the delivery of education to students in school? Give up? I'll tell you: thirteen, and all of them, it turns out, the same quasi-legalese PDFs from the policy document section.
There were only two links to external websites that I could find, one to the DfE performance tables and one to the "Get Into Teaching" website. The latter incidentally must represent an as-yet unreformed aspect of the DfE's work that Gove never noticed: it is colourful, informative, packed full of content and actually useful.
Maybe I am missing something. Maybe there is a wealth of government guidance, case studies and sharing of best practice somewhere else on the web, but if so I would be delighted if someone could point it out to me. Out of curiosity I visited the Scottish government education websites and the contrast could not be more dramatic. Even the main departmental schools site is worlds better than the English equivalent, but as well as that there are two other enormous sites: http://www.educationscotland.gov.uk/ and http://www.gtcs.org.uk/. There used to be an English equivalent of the latter of course- the General Teaching Council, or GTC, but Gove killed it off and its website is no more.
So aside from anything else he has done, Michael Gove has been responsible for the wholesale removal from the internet of anything that could reasonably be considered useful Governmental advice and resources for school teachers and leaders. It is not just the scale of his scorched-earth policy that amazes though, but its intention. This is a Stalinist elimination from the public sphere of anything ever produced by "the Blob" of those who know anything about education. In fact it is almost akin to Mao's policy of anti-intellectualism and what I find utterly extraordinary is the apparent lack of public outcry against it.
Wednesday, 6 August 2014
What has happened to Assessment for Learning?
There are unquestionably advantages to not being employed in the education sector any more: I get to enjoy the summer holidays for one thing. As a head the summer holidays were no kind of break at all, and the looming key dates of exam results and start of term would start haunting me before the summer term had even ended.
However not being in the thick of things does mean that I can lose track of what used to be central concerns of mine. If educational issues don't make the news (and it is generally only scandals and stories of failure that do) then I rarely get to hear of them any more. Occasionally I do get curious though, and this morning, for reasons best known to myself, I put Assessment for Learning into Google and chose "News" as the search type. After the first 5 pages of search results I was forced to admit that there was simply nothing from any UK site about developments in AfL.
Yet this is the time when pedagogues do their most creative thinking. These are the precious weeks when, freed from the daily grind of Year 9 Geography, teachers plan and share ideas and look to the future. With the new academic year approaching and "Back to School" sections already opening in supermarkets (it's the beginning of August for God's sake! Why do they DO that?) there seems to be nothing to reinvigorate the nation's teaching profession in the crucial business of effective assessment for learning.
I then searched the Department for Education site, and more scarily still there was nothing there either on what I would call AfL. It is as if the phrase has vanished. As if a concept that was utterly central to my thinking for years as a teacher and head has slipped quietly away while my back was turned.
This is maybe not that surprising once I come to think about it. Despite impassioned pleas such as this to Michael Gove I am not aware of him ever having said anything on the subject of Assessment for Learning. Assessment was for him, it is clear, nothing to do with learning at all. Assessment meant testing and nothing else. Testing was there to determine who had succeeded and who failed and what was tested was principally to be knowledge, not skills, and certainly not something as woolly and amorphous as learning. AfL, it is pretty clear, was a product of the Blob, and so to be ignored and marginalised.
Just to be clear on the issue, by assessment for learning I do not just mean the old National Strategy document (which I eventually tracked down in the government's web archive) but a broader way of thinking about education that actually engages with how students learn and how teachers can support and assess that learning. Experienced educationalists know all this stuff of course, but if the Department for Education doesn't then maybe I should clear it up.
Assessment is crucial to effective teaching. There is still a weird belief that, because many teachers are critical of the SATs, the revised GCSEs and A levels and league tables in their current form, they are anti-assessment. This is utter crap. Any teacher with any experience of education in the real world knows that assessment is something without which they simply could not function, because unless you know what students know, understand and can do then how can you possibly decide what to teach and how?
For effective assessment there have to be targets against which performance can be measured. I know the language of targets is anathema to some (because of its ubiquity these days) but actually the setting of targets has always been central to the relationship between teacher and pupil. What is a conversation like, "That's excellent Jasmine. How about putting in a bit of conversation too? Remember what we learned yesterday about speech punctuation," if it is not setting targets for learning?
However there is often insufficient thought given (or no thought at all, in Mr Gove's case) as to the different types of learning targets teachers can set. Personally, I find the following diagram useful in this context:
Learning outcomes are the only part of this diagram Michael Gove thought about. They are things like exam results, though they can be smaller things too, like producing a piece of software that works, or being able to conduct a conversation in French.
Barriers to learning are obvious really. If a student isn't in school then they are unlikely to be making much progress and if they mess around in class they will impede their own and others' learning. So removing those barriers is a necessary, but certainly not sufficient, step in achieving learning outcomes.
Learning behaviour targets are what a lot of teachers spend their time and energy setting students when they have to write reports. They are things like, "Do your homework," "Always bring a pen," or "Work collaboratively with others." These are crucial behaviours of course, and without them students will be very unlikely to reach their desired learning outcomes. However they are no guarantee of getting there. A student who just doesn't understand calculus can dutifully turn up to every lesson on time and properly equipped, every week plugging away at homework they simply don't get and working collaboratively with a group of peers who also do not understand calculus, and if that is all they do they will never make any real progress.
The key to this diagram, and what assessment for learning is all about, is the box in the centre- learning gains targets. If students are to achieve meaningful learning outcomes they need to know what specific things they will have to learn in order to get there. Good teachers knew this before the term AfL was ever coined, and gave students targets like:
Embed analysis of how writers use language to create effects in your essays.
Learn the past subjunctive tense of some key verbs and practise putting them into sentences from memory.
Break a complex calculation into simpler steps, choosing and using appropriate and efficient operations and methods.
Develop your use of a wider range of media and techniques in creating work.
The best teachers also assessed their students' progress against these targets, which is why their students made such significant progress towards the broader learning outcomes targets. The government's Assessment for Learning strategy, whilst cumbersome and bureaucratic, was at least an attempt to embed and validate that approach- to force everyone to recognise that you cannot simply deliver content to students and then test whether they have absorbed it or not (whatever Michael Gove might think).
So if assessment for learning as a concept really has slipped away in the brief time since I was a head then I despair even more as to what is happening in education. And thank my lucky stars yet again that I don't work in schools any more.
Tuesday, 5 August 2014
My demand for inaction from the world's leaders
The Gaza conflict is the latest area on which the world's spotlight has fallen and the latest conflict concerning which people across the world have demanded action. Quite right too: the impact of the last few weeks' carnage on the Palestinian population, already suffering overpopulation, crippling sanctions and an utter lack of basic freedoms, is unimaginable. What is happening to them is not just unforgivable but incomprehensible. So how can we in our safe, secure neighbourhoods look on as the situation spirals downwards?
The same spotlight fell earlier on the Ukraine and Syria, before that on Libya and Egypt, and before that on Afghanistan and Iraq (it is difficult to remember now, but before the US invasion there was real and pressing concern over the treatment of women by the Taliban in Afghanistan and of Kurds and Marsh Arabs by Saddam Hussein in Iraq). The impulse to act to prevent injustice and suffering is a noble one, and surely if the world's leaders cared about anything beyond their own shores then these are the situations in which they should intervene, aren't they?
In fact this is a relatively recent perception. Not so long ago the only circumstances in which national leaders would intervene in a conflict, let alone seek to use force to influence the behaviour of another regime, were if their own geopolitical interests were directly threatened. No one expected anything else and I am not aware of anyone counselling intervention even when the Khmer Rouge were engaging in their bloody reign of terror. More recently there seemed very little international pressure for intervention in the appalling and genocidal civil war in Rwanda. But we are all internationalists now, it seems. Social media and citizen journalism have brought the world's conflicts into our living rooms and onto our handheld devices and we can no longer stand idly by.
This is a truly excellent thing and I applaud it heartily. I am firmly of the belief that the trajectory of humanity's development is upwards and this is an excellent example. When the world's people will no longer stand idly by as injustice and brutality are being inflicted on our neighbours, we know that we have taken a great step forwards.
However that is not the same as saying that what such situations need is intervention and action from the world's leaders. As I have said, we haven't got much of a history to go on to gauge the efficacy of such interventions, but it doesn't look good so far. Just list the countries in which world leaders took direct action, not primarily because they were under threat themselves but in the name of wider geopolitical interests and/or the welfare of that country's citizens: Bosnia, Sudan, Iraq, Afghanistan and Libya. Ignore for a moment the indisputable fact that this is hardly a comprehensive list of countries that have suffered from internal conflict and/or brutal regimes in the last 20 years. Simply look at the success rate for such intervention. Hmm. See my point?
Or look at more recent times. Four days ago the closest thing yet to action by world leaders in the Gaza conflict was announced with great fanfare. The US and the UN (specifically John Kerry and Ban Ki-moon), making the most of their undisputed status as world leaders, declared that they had delivered a 72-hour ceasefire that would lead to a lasting peace. A few hours later the killing started up again. I shall probably be proved wrong by the time this post is finished, but my personal belief is that today's much lower-key ceasefire brokered by Egypt has a better chance of sticking. To use my daughter's wonderful analogy the first was like a pompous and overbearing police officer intervening in a dispute between rival groups of youths; the second like a neighbour saying, "Listen lads. I've had enough of this, right. Keep it down or I'm going to tell your mums. And you KNOW that'll mean big trouble."
The thing is that world leaders, having typically reached their positions of power through an unassailable belief in their own wisdom and forcefulness, tend to overestimate massively their ability to resolve situations through intervention. In fact they are often crap at it, frequently because they simply do not take the time to understand the situations in which they intervene. Leave aside the iniquities of the US/UK invasion of Iraq: it was notable also for its incompetence. Through its policy of De-Baathification of Iraq the Coalition Provisional Authority sowed the seeds of religious conflict in that country and the recent Isis insurgency.
The EU has often been viciously attacked for its sclerotic inability or unwillingness to act decisively and quickly when the situation demands it. On economic issues it has been repeatedly derided for "kicking the can down the road" rather than taking action and on foreign policy issues many have bemoaned the absence of a strong and unifying voice.
The implied comparison is with the US of course. Now there is a world superpower that acts decisively and speaks with one voice. And just look at the results. Unencumbered freedom to act led first to the extraordinary and unsustainable financial bubble started by Reagan (and Thatcher) in the 80s. And although that almost led to the downfall of the world's financial system the response has been not less but more extreme action. Quantitative easing has been and is an incomprehensibly vast gamble, pumping literally trillions of dollars into precisely the financial institutions that almost gambled away the world's prosperity once before.
In foreign affairs terms, the US's freedom to act decisively and speak with one voice has led to a huge and virtually unquestioned campaign of extra-judicial killings in Pakistan and the Yemen, countries with which the US is not even at war. It has also led to the US wresting control of vast quantities of data away from the citizenry in its huge surveillance programme.
Perhaps EU leaders would have done all these things too, had they had the power and the unified voice behind them, but thankfully they haven't. The EU is characterised by compromise, negotiation and fudge, and what that leads to typically is inaction rather than action. And all power to the bureaucrats' elbows, I say, if it prevents EU leaders from acting as decisively (and disastrously) as US leaders have.
Of course I recognise that this argument tends perilously close to the laissez-faire insularity that actually hands control of the world to the world's despots and multinational corporations. We need governments and we need international bodies to help deliver on our growing awareness that the world is interconnected and that no man, and no country, is an island (in the metaphorical sense, anyway). However we kid ourselves if we believe that it will be decisive action by these governments or these international bodies that will resolve the world's conflicts.
So what will? Well, there is no easy answer to that, and each conflict is different. But the key, I firmly believe, is increasing awareness of and political engagement in such situations across the world amongst ordinary people. It is one thing for regimes such as Netanyahu's to "stand firm" against pressure from world leaders; it is another for the Israeli people to ignore the constant and unrelenting flood of Facebook and Twitter posts that must be slowly making them realise just how isolated they are becoming in the world over this issue.
So share those FB posts and RT those tweets (the constructive and helpful ones anyway). Recognise the limitations in the power of leaders to change anything much for the better outside their own sphere of influence and understanding but recognise too the immense power of the world's citizenry- you- to influence the beliefs and actions of your neighbours, wherever in the world they live. What world leaders need to do is to use their influence (through sanctions, UN resolutions, expressions of international disapproval) to support that of their populace. What they do not need to do is think they alone can march in and resolve an intractable situation overnight.
And to return to my daughter's analogy, I have seen at first hand which approach is more effective at calming and then resolving a potentially volatile situation between rival youths. The pompous and overbearing policeman will pretty soon have a full-on gang battle on his hands. The canny neighbour may just have a chance of getting the lads to calm it, and even perhaps to begin talking again.
Monday, 4 August 2014
Never such innocence again?
It is the centenary of the outbreak of the First World War and a lot will be said today about sacrifice and nobility and loss of innocence. The 1914-18 War stands as a powerful symbol, it seems, of a better time, when the world wasn't as messed up and confused and people knew what they believed in and were decent and true. Philip Larkin, that most curmudgeonly and un-nostalgic of poets speaks for us all it seems when he writes, in MCMXIV, of "never such innocence again."
Only that's all bollocks of course, and Larkin knew it. In fact the 1914-18 war, as programmes like Radio 4's 1914: Day by Day have made clear, was the result of tangled, messy diplomacy going horribly and pointlessly wrong. The lead-up to it makes the current international wranglings over Ukraine and even Gaza look purposeful and measured. And not all soldiers signed up in a spirit of selfless nobility. There was xenophobic jingoism too, and a pathetic naivety that saw war as something like a grand rugby game. And then there were the appalling social pressures of the white feather movement and the shameful treatment of conscientious objectors. During the war soldiers were shot for displaying PTSD and on the first day of the Somme officers ignored spotters' revelations that the barbed wire was still undamaged and ordered their soldiers to walk, not run, into the German machine gun fire. Never such innocence again.
And yet. Still some symbols have survived from the 1914-18 War that speak to us of nobility and sacrifice, and have us bow our heads in solemn remembrance of a better time. I would like to look at two of them: the two minute silence at 11 am on the 11th of November, and the battlefield war graves of Northern France and Belgium.
The two minute silence has always moved me. As a head teacher I always insisted on it and was impressed every year by the seriousness with which students took it. Communal silence is always powerful- it is the core strength of Quaker worship- and the connotations of remembrance and loss that come with the 11/11 silence speak to everyone. We forget that of course the silence is not observed in Germany, because it doesn't feel like a celebration of victory. There are no patriotic anthems or waving flags. The visual symbol is the blood-red poppy and we stand with heads bowed.
But every year one of the central thoughts in my mind as I stand in silence is that of how the silence comes to be at 11 am at all. The armistice was actually agreed in the early hours of that Monday morning, the 11th of November 1918, but at some point someone must have noticed the powerful symbolism of the date. So the decision was taken for the ceasefire to take place not immediately, but at the 11th hour. And the morning of the 11th day of the 11th month of 1918 saw fierce fighting and several thousand casualties. Part of the issue was that newly-arrived reinforcements, who had yet to see any action in the war, wanted to see some "fun" before it was too late. There is an article on the American angle to this here but the principle is a more general one: lives were pointlessly cut short in an unnecessary few hours of fighting that morning, just so that we could experience the powerful symbolism of the sonorous phrase, "The eleventh hour of the eleventh day of the eleventh month."
Pointless loss of life at any time anywhere is shocking and unforgivable. I am as appalled as anyone by the shooting down of MH17 or by the shelling of UN schools in Gaza. Yet there is a particularly vicious pointlessness for me in the loss of life on that morning nearly 100 years ago. There may be a lot of crap going on in the world right now, but for me none of it surpasses the slaughter of young men simply in pursuit of a poignant symbol for the generations to come.
For anyone who has not visited them, the First World War cemeteries of the Western Front are deeply moving places, rich in symbolism. The simple Portland stone headstones stand in silent serried ranks, tucked away in the French and Belgian countryside, mute reminders of the countless thousands who lost their lives there. All follow a similar simple pattern, with a cross near the centre and, at the entrance, a large Portland stone block inscribed with the words "Their Name Liveth For Evermore." Most headstones have a name and a regimental badge, and sometimes a short phrase added at the request of friends or relatives. The most popular is "Greater love hath no man than this, that he lay down his life for his friends." Many are simply inscribed "A Soldier of the Great War. Known unto God," because of course many corpses remained unidentified, sometimes bundled into shell holes during brief breaks in the fighting.
They are beautiful, serene and noble places and they cannot help but make you think higher thoughts, about sacrifice and heroism and loss. There is little concession to the large numbers of non-Christian dead, many from Britain's colonies, but Christian or not, even the cross makes for a solemn symbol of suffering and loss, and the quotation from Ecclesiasticus (the whole verse is, "Their bodies are buried in peace; but their name liveth for evermore.") surely resonates with everyone.
Yet there is one cemetery in Northern France that my late wife and I visited that brings these symbols into focus in a new and disturbing way. I cannot now track it down, but it is an appallingly moving place. On the face of it, it is a Commonwealth War Graves Commission cemetery like any other. It is placed on a low hill overlooking the Somme valley, and at the rear is one of those long walls inscribed with the names of the missing whose bodies were never found. It has the cross and the Portland stone block at the entrance, and the row upon row of silent headstones.
What marked this cemetery out as different, though, was not revealed until we noticed, on the wall of names at the rear, a small sign saying that the monument had been damaged during fierce fighting in the 1939-45 war. There was a tower at the centre of the wall, and all around the small window near the top were the scars of heavy machine-gun rounds. Suddenly I pictured a sniper crouching at that window, firing at soldiers who fought their way through the cemetery. And indeed on closer inspection we found that many of the headstones were chipped and scarred from sniper or machine-gun rounds. Attacking soldiers in the Second World War had used them for cover, of course. Why not?
Yet the most potent symbol was one that I had probably seen before but not noticed. Some of the damage had been carefully repaired, leaving no more than its ghost in the stone, but once you knew what to look for it was obvious. And there, right in the middle of the word "Evermore" on the Portland stone block, was the scar of some substantial explosion. A mortar bomb maybe, fired by troops attacking up the hill, or a shell from a tank.
There is a prevailing sense of doom in the air at the moment, and a sense of lessons unlearned from the past and history repeating itself in the devastation of Gaza. Yet what today's commemoration of the outbreak of World War I reminds me is that, to quote Ecclesiastes this time, "The thing that hath been, it is that which shall be; and that which is done is that which shall be done: and there is no new thing under the sun." And maybe it even shows me that as a species we have progressed rather than declined. Because however awful current affairs may be, there is nothing happening today that is as appallingly, criminally wasteful of young lives as what lies behind precisely those noble symbols of the First World War that we remember with such reverence.