What ever became of all the good in the world?

I am always astonished at the limitless capacity humanity has for intellectualising itself away from care and kindness.

Quick – burn the intruding historian! Avenge ourselves!

School. If you’re accused, you’re guilty!

Many years ago, when I was at school, there was a coat cupboard at the back of the classroom. Next to the cupboard was a trestle table on which had been set a class construction project. The bell went. The class joyously leaped from their chairs and surged to the cupboard, shoving and ramming each other as they fought to get their coats and escape.

I’d hung back to wait for the scrum to clear and saw the cupboard door being forced back by the desperate mob, into the trestle table. I rushed to try and rescue it – too late. The whole lot collapsed to the floor just as I got there. Needless to say, I was blamed. Everybody had seen me standing over the ruin, and it (again) proved what a stupid and worthless child I was; how dare I claim I was trying to save it; I totally deserved what was coming to me.

So much for trying to be a Good Samaritan.

But – but, you say – surely I had rights? No. I had absolutely none. Back then, teachers given power by the system used it to smash those the system had defined as powerless – the kids – and so validate their own sense of worth. If I was seen near a broken table and the teacher decided I had done it – well, then obviously I’d done it, and how dare I protest my innocence.

The main ethical problem with this sort of behaviour is that guilt-on-accusation and summary justice stand not just against the principles of our justice system, but also against the values of care on which Western society prides itself. But that is how society seems to work, certainly these days. We have trial-and-conviction by media before someone accused of a crime has even been charged, to give just one instance.

All of it is a symptom of one side of human nature. A symptom of the way humans intellectualise themselves into unkindness. It stands against what we SHOULD be doing – stands against the values of care, compassion, kindness and tolerance that, surely, must form a cornerstone of any society.

There is only one answer. We have to bring kindness back into the world – together. Who’s with me?

Copyright © Matthew Wright 2015

If writing’s art, what should we deliver?

It’s over a decade since I paid a stupid amount of money to attend a lecture given by Malcolm McLaren – yes, that Malcolm McLaren. It was touted as a ‘cyber lecture’ in which he was going to reveal the philosophy of his approach to art. And after he’d dribbled on about nothing for about four hours, he did.

Never mind the bollocks, here’s my typewriter – yes, it really is mine, on the Wellington Writers Walk.

It was really simple. Deliver paying customers nothing. Emptiness. As an art statement, you understand. He insisted it had underpinned his direction of the ‘Sex Pistols’ back in the seventies. Kind of clever, in a rather anarchic-in-the-UK sort of way.

Alas, as McLaren continued to blather on in verbal circles about what always turned out to be – well, nothing, I realised he’d managed to export that particular art statement to New Zealand. The fact that he was sustaining it for so long made clear that his particular brand of ‘nothing’ was, indeed, very cleverly thought out.

But time was getting towards midnight and, as he showed no signs of flagging in his delivery of empty, I felt I should respond in kind by rising to my feet and engaging in a conceptual ‘nothing march’ to the nearest exit. It wasn’t easy, because a fair number of others in the audience had decided this was also going to be the way they expressed their art. McLaren suddenly realised what was happening. ‘Wait, wait,’ he began calling from the stage. ‘I’ve got more to say.’

Actually, he hadn’t, and the stage manager evidently also thought so because he shortly had the lecture shut down so the stage crew could all go home.

Conceptually, I could see what McLaren was getting at by punking art, just as he had punked music. And art is in the eye of the beholder. But I still felt vaguely ripped off. And that, to me, raises some obvious questions about writing, which is a form of art.

The onus is on writers to produce material that takes their readers on an emotional journey – which isn’t going to be the personal emotional journey the writer has while creating the stuff. The emotional experience a reader has may not even be what the author intended to create in the recipient. But it’s still valid. It’s one of the reasons why writing, by any measure, classifies as art – because it invokes that abstract multi-dimensionality of emotion on so many levels, in both creator and recipient.

The nature of that journey is, very much, up to the writer. That’s how the art of writing is personalised; it’s how it’s given its individual character. The issue is being able to deliver something – an expression of writing as art – that achieves a result, both for the artist (writer) and for the recipient.

I believe, from my own experience, that McLaren chose ‘empty’ as his art expression. That certainly isn’t mine. And there’s no room for pretension or snobbery – not if the artist wants to be genuine. Thoughts?

Copyright © Matthew Wright 2015

How long is the ‘now’ moment we live in?

How long is ‘now’ – you know, the evanescent moment we live in and usually let pass without properly experiencing it?

Now, like time itself, is largely seen as a philosophical issue; a personal perception that stretches or shrinks depending on what we are doing. For a kid, an hour spent in a classroom listening to the teacher drone on about stuff the kid neither knows nor cares about is an eternity; yet an hour hurtling about with friends at play disappears in a flash. Adults have a different perception of time again; that same elasticity flowing from interest and enthusiasm, but metered often by a sense of purpose. Yes, the job’s boring, but it has to be done.

Beyond that is the concept of the ‘moment’ itself. What is ‘now’? In Buddhist philosophy it means being mindful – fully and properly aware of one’s immediate self, immediate place, and immediate environment. It means having awareness of the fullness of the moment, even in its transience, even as we think about past or future.

But what ‘is’ a ‘moment’, scientifically? Research on time perception suggests that a ‘moment’, to most people, lasts two or three seconds. Then that perception of ‘now’ vanishes and is replaced by a new one.

If we match that to attention spans, we find that the typical time spent on any one item on the internet is literally only a couple of ‘moments’. And then we realise how shallow the internet must be.

It also underscores just how important and valuable mindfulness actually is. Because a couple of blinks, literally, and the ‘now’ moment is gone.

Copyright © Matthew Wright 2015

Should we be dispassionate about writing – like Spock?

The other week I argued that Terry Brooks’ Sword of Shannara was a poorly written Tolkien rip-off that put me off the rest of the novels. Responses fell into two camps – people who agreed and thought the whole Shannara series was dismal; and those who were offended.

Fair point. People don’t have to agree – indeed, differing opinions are great, because they push discussion. And maybe something nobody thought of will come out of it. That’s what counts. Good stuff.

But what intrigued me about the discussion was the level of emotion it provoked in one or two places. A couple of the responses were – well, a bit personal. Surely it’s possible to chat about the abstract value or otherwise of books? And then I got thinking. In some ways it isn’t, because the purpose of both reading and writing is emotional.

Authors write because they get an emotional satisfaction from doing so. Readers read because of the emotional journey it produces. By airing the opinion that I – and apparently others – have of Brooks, I’d affirmed one sort of opinion. But I’d also trodden on the toes of others, who got a positive charge from reading his material.

The question, then, is whether writers and readers should step back from the emotion. In some ways I don’t think it’s possible for reading, because the very purpose of reading is to have an emotional experience. People read to become entangled in the emotional journey – be it to learn something, to feel validated, to find place, or simply to be distracted. However, I think it’s essential for writers to step back.

Yes, authors write because they get their own emotional satisfaction from doing so – from producing material that meets a need of their own and which will take others on an emotional journey. But at the same time, the clarity of thought that this process requires demands abstraction. How often have you written something in the heat of a moment and then, later, read through it and realised it’s foolish?

Authors have to be able to not only include the intended emotion, but also to step back from their own entanglements from time to time – to look at what they are producing from a more abstract perspective. Only then can the content and intent become properly clear – and the emotional journey on which they are going to take the reader emerge in balance. Really, we all have to approach writing like Spock would.

Seething with emotion underneath – sure – but not letting that get in the way of careful thought and analysis. Thoughts?

Copyright © Matthew Wright 2015

Do societies re-package their narratives of recent events? And is that ‘history’?

The other day a reader commented on a post I’d written about 9/11 as history, and pointed out, quite rightly, that it doesn’t take long for events to be ‘packaged’ in ways that stand against the historian’s more dispassionate requirement to understand.

The cover of ‘Shattered Glory’. Out of print (sigh…)

I agree. There’s no doubt in my mind that dramatic events affecting whole societies are swiftly re-invented by those who live through them. Not least because of emotional entanglement with what’s just happened. This is normal, historically. I traced just such a re-invention of New Zealand’s 1915 Gallipoli defeat in my book Shattered Glory (Penguin 2010). By April 1916, just five months after the stalled campaign ended in an ignominious retreat, it had been re-cast as a glorious victory, because it was a sacrifice for Empire. This reflected prevailing pop-sentiment of the day towards our place in a wider British Empire and helped address grief at the death toll, which was colossal for a country of just under 1 million souls.

But the conception of Gallipoli as triumph was the exact opposite of the military defeat and human truth; a demonstration of the way societies, en masse, rationalise events to suit immediate emotional needs. And it had an impact on our view of history because, in a demonstration of the stickiness of re-invention, that view is largely what guides the popular conception of New Zealand’s Gallipoli experience today, nearly a century on.

So can we analyse recent events ‘historically’, in the same sense that we can analyse something that happened a century or two ago? History-as-discipline is one of the intellectual pursuits that self-examines its analytical philosophy. Hobsbawm, for instance, didn’t divide history by round-number centuries but by events, typically political and social (‘social’, inevitably, encompasses ‘economic’, which, despite the ‘hardening’ of economics with a mathematical over-gloss since the late 1940s, is at heart about society).

To Hobsbawm, the nineteenth century was ‘long’, book-ended by the French revolution of 1789 and the First World War of 1914; whereas the twentieth century was ‘short’, framed by the outbreak of the First World War in 1914 and the end of the Cold War in 1991.

Those arguments were possible because Hobsbawm stood at the end of the cycles; they were evident to him, and he had the distance to perceive what had happened in fully historical terms, certainly as far as the ‘long’ nineteenth century was concerned. But what about things that have just happened? Things we popularly call ‘historic’ but which still burn fresh in memory and haven’t achieved the more sonorous quiet of a deeper past?

To me there are several issues. The first is the problem of context. Sometimes, the deeper over-arching forces that drive the widest patterns of history – combinations of long-standing technological, social, political, ideological and, it seems, environmental factors – aren’t obvious for decades afterwards. We can’t tell precisely what a particular development may mean until it’s put into the context not only of what went before, but also of what came after – and, usually, some time after. Last week’s, last year’s or even last decade’s news won’t cut it in these terms.

The second issue is the related one of emotional perspective. It takes about 25-30 years, or more, for one generation’s problem to be resolved and replaced by another; and also for the people primarily involved in it to be far enough back to be treated with the (ideally) abstract dispassion of history. It is only now, for instance, that we are seeing treatments of Winston Churchill that move beyond the pro- and anti-partisanship of his life and the decades immediately after his death.

Me, on the Bridge over the River Kwai, a place that brings the human condition into sharp relief. Something happened to me five minutes after this photo was taken that gives the lie to notions of ‘rational egoism’. Ask me in the comments.

Thirdly, there’s the ‘recency’ phenomenon, in which we tend to view events just gone as larger than those further back, at the cost of proportion. This also fuels a tendency to view whatever just happened as the arbiter of the future. Take the Cold War, which – via Hobsbawm’s thesis – was a temporary product of the way the old world collapsed in 1914-19. But you wouldn’t have known that living in the middle of it. And when it did finish, with the predictable collapse of the Communist economy, Francis Fukuyama insisted that history had ended – that Western capitalist ideology, as he defined it, had won, and there would be no further change. Ouch. This was ‘recency’ on full display.

The reality of abstract historical analysis, of course, is that it has nothing to do with ‘direction’ or ‘progress’ towards an inevitable or ideal one-dimensional ‘end’ such as I believe was implied by Fukuyama. Indeed, by definition, history cannot end. It’s a product of human change through time; and the onus is on historians to understand that deeper human condition, the ‘unity in diversity’ beloved of social anthropology, as a pre-requisite to being able to understand how that then expresses itself in ever-smaller scales of detail when framed by a specific society.

I’ve found through my own work in the field that practical changes affecting a specific society usually happen generationally – sometimes imperceptibly, sometimes with sharper impact, as happened in the 1960s when the generation brought up in the wake of the Second World War objected to the philosophy of their parents.

And so we have the tools with which to approach the issue of ‘recent’ history. The pitfalls of those tools may not be fully overcome – indeed, logically, they cannot be; but to know they are there and to understand how these limitations work is, I think, a very great step towards being able to couch recent events in more dispassionate light.

Copyright © Matthew Wright 2015

Thoughts about Shakespeare spurred by Fay Weldon’s ‘Letters to Alice On First Reading Jane Austen’

A while ago I found myself glancing at a copy of Fay Weldon’s Letters To Alice on first reading Jane Austen (1984) and wondering if such a book could ever be published today, mainstream.

William Shakespeare, the ‘Flower’ portrait c1820-1840, public domain via Wikimedia Commons.

Call it meta-literature: a book by a writer thinking about someone’s response to another writer’s book. Which makes this post meta-meta-literature, I suppose – my thoughts on a book by a writer thinking about someone’s response to another writer’s book. Alice – Weldon’s fictive ‘niece’ – doesn’t like being made to read Austen. She also doesn’t actually exist; she is merely a clever device for Weldon to expound her own thoughts on writing, and people, and – of course – why Jane Austen’s novel-writing is interesting.

This ability to point out the interest was a skill totally absent in my high school English teacher, whom we shall refer to as Frog (because everybody at school did). He unerringly failed to tell the class anything about context or meaning – anything at all, in fact, that might have made the work meaningful and thus interesting. In a few short years at senior high he managed to annihilate any interest I might have had in literature. He rendered Catch-22 boring, reduced One Flew Over The Cuckoo’s Nest to an endurance test, and made clear that studying Shakespeare was about as gripping as watching paint dry.

What Frog missed – and what Weldon spends her book pointing out – is that all things have interest. You just have to find it. I won’t rant on about Austen, because Weldon’s already done it. But take Shakespeare, for instance. To the uninitiated teenager his plays are filled with barely-comprehensible Elizabethan slang and strange characters you can’t connect with. But then figure that most of those plays were intended to tweak the beards of the administration and social mores at a time when England was being run as a police state. And they were rude. He used words like fie, for instance. Fie! The naughtiest word of the era.

They were also high entertainment – the blockbuster movie equivalent of the age. Suddenly, Shakespeare isn’t some boring academic study. It’s interesting. It’s not too many steps from there to discovering that most of his plays nailed the human condition pretty closely (so did Austen, Weldon tells us).

Shakespeare’s plays are timeless in that sense, which is why it’s been so easy to adapt them to any setting and time period. Any? Go watch Forbidden Planet (1956). See? Interesting. And that, of course, runs to the heart of all writing – fiction or non-fiction, it has to address the human condition in some way. Austen said so. Shakespeare said so. Weldon said so. I say so. Which, I guess, answers the question about whether Weldon’s book would be picked up by a mainstream publisher today, post-Amazon revolution. I think it would, because Weldon’s meta-story is, by definition, all about the human condition on multiple levels.

Shakespeare? Not so sure. You see, I can download his stuff from MIT.

Copyright © Matthew Wright 2015

The stupidity of Nazi super-science. And hurrah for British boffins!

Anyone remember Nazi super-science? You know, the science for when ordinary super-science isn’t evil enough. I’m not talking about atomic Nazi super-soldiers led by Zombie Robo-Hitler. I’m talking real Nazi ‘super-science’ of the early 1940s – the ‘secret weapons’ Hitler insisted would win the war.

Heinkel He-177 four-engined bomber in Denmark, 1944. The engine arrangement – two DB 601 motors coupled side by side (dubbed ‘DB 606’) in each nacelle – led to chronic engine fires, and the bomber never worked properly. Public domain.

Of course there were a couple of problems. One was that by the time the Nazis ordered German industry to build ‘super’ weapons, the war had already been lost – the tipping point came in mid-1943 when Hitler broke his own army trying to take Kursk against the advice of his generals. The Eastern Front was the decisive front of the war; after the Germans lost Kursk it was only a matter of time before superior Allied production was able to fuel a Soviet drive west.

The other problem was that Nazi super-weapons weren’t very ‘super’, even by 1940s standards. Hitler and his cronies thought they were. But what can you expect from people for whom conviction trumped reason? A regime convinced of its own destiny, buoyed by its sense of exceptionalism, and in which state power pivoted around a tight integration of industrial complex with economy and government.

The main thing the Nazis were good at was evil – epitomised by one of the nastiest super-weapons their science devised: pure methamphetamine (‘P’), exploiting the prior discovery of pep-pills. This was the outcome of their quest to find a drug that could turn their own soldiers into psychotic killers immune to pain, no matter how much damage the drug did. It was actually used in 1944 by the Waffen SS as a combat aid. Alas, the recipe didn’t die with the Nazi regime – meaning ‘P’ is actually a Nazi drug. Uh – thanks, Adolf, Heinrich, et al. Yet another legacy you’re still inflicting on the world.

Messerschmitt Me-262 captured by the Allies, on test flight in the US. Allied pilots during the war referred to these aircraft as ‘blow jobs’, presumably because they flew by jet thrust. Public domain.

The Nazis also encouraged rocketry, thanks to Wernher von Braun, an honorary SS lieutenant and member of the Nazi party who was later responsible for America’s Saturn V Moon rockets. The problem was that the V2 missile project soaked up colossal resources – and lives. More people died making von Braun’s missile than were killed by it. But the rocket was pushed by Hitler’s regime anyway – a symptom of ‘conviction mentality’ presented as ‘logic’ and ‘reason’.

Other Nazi super-weapons that soaked up more than they delivered included August Cönders’ V3 ‘Fleißiges Lieschen’ ultra-long-range gun, which never worked; and Ferdinand Porsche’s 188-tonne Maus tank, which was too heavy for most bridges. Even that was dwarfed by Edward Grotte’s 1000-tonne ‘Ratte’ land battleship, armed with 11-inch naval guns and powered by U-boat motors. Hitler was a fan, but Albert Speer cancelled that particular expression of Nazi megalomania in 1943, before it got to hardware.

Heinkel He-162 ‘Volksjager’ emergency fighter, captured by the US, at Freeman Field in 1945. This wooden jet was meant to be produced in huge numbers to tip the air balance. Actually it was difficult even for experienced pilots to control, and in the hands of the half-trained boys the Nazis intended to use as pilots it would have been a death trap.

Super-weapons that did work included the Fritz-X radio-guided bomb that sank the Italian battleship Roma in 1943, and a plethora of jet and rocket fighter designs since beloved of the “Luftwaffe 1946” fantasy brigade. Of these, the Me-262 made it to combat in 1944-45. These jets were about 100 mph faster than the best Allied piston-engined fighters, such as the P-51 Mustang flown by Chuck Yeager – but he shot down an Me-262 anyway, and damaged two others at the same time for good measure. That’s not hyperbole – here’s his combat report of 6 November 1944.

The Nazis also deployed the Tiger II tank, underpowered but with gun and armour comparable with Cold War tanks into the early 1960s. And other stuff, like tapered-bore guns and automatic rifles – the latter standard in every army since.

All of which, Hitler insisted, would win the war. They didn’t, partly because the real arbiter was industrial scale. In a war of attrition, Germany couldn’t build enough super-weapons that did work to make a difference, and the ones that didn’t soaked up resources. It has to be said that the Allies also pursued dead-ends, such as the giant Panjandrum – but to nothing like the Nazi extent.

Gloster Meteor Mk IIIs on operations in 1944 – yup, the Allies had jet fighters at the same time as the Nazis. Public domain.

Even so, the Allies had it all over the Germans when it came to super-weapons – starting with the atomic bomb, the most powerful weapon in the history of the world. The German effort failed partly because many of their best physicists had fled to the United States in the face of Nazi persecution, and partly because the Nazi bomb program was never fully resourced.

The Allies built two other key war-winning devices using thermionic valve technology – effective radar, based on the British cavity magnetron, and the first radar proximity fuse for anti-aircraft work. General Electric built the latter to a design by British scientist Sir Samuel Curran. As Vannevar Bush pointed out, that fuse was decisive in key ways. The Nazis? Rheinmetall’s parallel effort was cancelled.

That’s apart from Allied jet development, which paralleled the Germans’ – the British had the Gloster Meteor and the Americans the Lockheed P-80. The difference was that the Allies didn’t prioritise them. The RAF whipped the Meteor into service to help meet the V1 threat, but industry otherwise focussed on existing weapons, which could be built in overwhelming numbers. And so – fortunately – the West won the Second World War.

Copyright © Matthew Wright 2014