What ever became of all the good in the world?

I am always astonished at the limitless capacity humanity has for intellectualising itself away from care and kindness.

Quick – burn the intruding historian! Avenge ourselves!

School. If you’re accused, you’re guilty!

Many years ago, when I was at school, there was a coat cupboard at the back of the classroom. Next to the cupboard was a trestle table on which had been set a class construction project. The bell went. The class joyously leaped from their chairs and surged to the cupboard, shoving and ramming each other as they fought to get their coats and escape.

I’d hung back to wait for the scrum to clear and saw the cupboard door being forced back by the desperate mob, into the trestle table. I rushed to try and rescue it – too late. The whole lot collapsed to the floor as I got there. Needless to say, I was blamed. Everybody had seen me standing over the ruin, and it (again) proved what a stupid and worthless child I was – and how dare I claim I was trying to save it; I totally deserved what was coming to me.

So much for trying to be a Good Samaritan.

But, you say, surely I had rights? No. I had absolutely none. Back then, teachers given power by the system used it to smash those the system had defined as powerless – the kids – and so validate their own sense of worth. If I was seen near a broken table and the teacher decided I had done it – well, then obviously I’d done it, and how dare I protest my innocence.

The main ethical problem with this sort of behaviour is that guilt-on-accusation and summary justice stand not just against the principles of our justice system, but also against the values of care on which western society prides itself. Yet that is how society seems to work, certainly these days. We have trial-and-conviction by media before someone accused of a crime has even been charged, to take just one instance.

All of it is a symptom of one side of human nature – of the way humans intellectualise themselves into unkindness. It stands against what we SHOULD be doing – against the values of care, compassion, kindness and tolerance that, surely, must form a cornerstone of any society.

There is only one answer. We have to bring kindness back into the world – together. Who’s with me?

Copyright © Matthew Wright 2015

How long is the ‘now’ moment we live in?

How long is ‘now’ – you know, the evanescent moment we live in and usually let pass without properly experiencing it?

Now, like time itself, is largely seen as a philosophical issue; a personal perception that stretches or shrinks depending on what we are doing. For a kid, an hour spent in a classroom listening to the teacher drone on about stuff the kid neither knows nor cares about is an eternity; yet an hour hurtling about with friends at play disappears in a flash. Adults have a different perception of time again: the same elasticity flowing from interest and enthusiasm, but metered often by a sense of purpose. Yes, the job’s boring, but it has to be done.

Beyond that is the concept of the ‘moment’ itself. What is ‘now’? In Buddhist philosophy it means being mindful – fully and properly aware of one’s immediate self, immediate place, and immediate environment. It means having awareness of the fullness of the moment, even in its transience, even as we think about past or future.

But what ‘is’ a ‘moment’, scientifically? The research reported at the time indicated that a ‘moment’, to most people, lasts two or three seconds. Then that perception of ‘now’ vanishes and is replaced by a new one.

If we match that to attention spans, we find that the typical time spent on any one item on the internet is literally only a couple of ‘moments’. Which makes you realise just how shallow the internet must be.

It also underscores just how important and valuable mindfulness actually is. Because a couple of blinks, literally, and the ‘now’ moment is gone.

Copyright © Matthew Wright 2015

Should we be dispassionate about writing – like Spock?

The other week I argued that Terry Brooks’ Sword of Shannara was a poorly written Tolkien rip-off that put me off the rest of the novels. Responses fell into two camps – people who agreed and thought the whole Shannara series was dismal; and those who were offended.

Fair point. People don’t have to agree – indeed, differing opinions are great, because they push discussion. And maybe something nobody thought of will come out of it. That’s what counts. Good stuff.

But what intrigued me about the discussion was the level of emotion it provoked in one or two places. A couple of the responses were – well, a bit personal. Surely it’s possible to chat about the abstract value or otherwise of books? And then I got thinking. In some ways it isn’t, because the purpose of both reading and writing is emotional.

Authors write because they get an emotional satisfaction from doing so. Readers read because of the emotional journey it produces. By setting out the view I and apparently others have of Brooks, I’d affirmed one camp. But I’d also trodden on the toes of others, who got a positive charge from reading his material.

The question, then, is whether writers and readers should step back from the emotion. In some ways I don’t think it’s possible for reading, because the very purpose of reading is to have an emotional experience. People read to become entangled in the emotional journey – be it to learn something, to feel validated, to find place, or simply to be distracted. However, I think it’s essential for writers to step back.

Yes, authors write because they get their own emotional satisfaction from doing so – from producing material that meets a need of their own and which will take others on an emotional journey. But at the same time, the clarity of thought that this process requires demands abstraction. How often have you written something in the heat of a moment and then, later, read through it and realised it’s foolish?

Authors have to be able not only to include the intended emotion, but also to step back from their own entanglements from time to time – to look at what they are producing from a more abstract perspective. Only then can the content and intent become properly clear – and the emotional journey on which they are going to take the reader emerge in balance. Really, we all have to approach writing like Spock would.

Seething with emotion underneath – sure – but not letting that get in the way of careful thought and analysis. Thoughts?

Copyright © Matthew Wright 2015

Do societies re-package their narratives of recent events? And is that ‘history’?

The other day a reader commented on a post I’d written about 9/11 as history and pointed out, quite rightly, that it doesn’t take long for events to be ‘packaged’ in ways that run against the more dispassionate understanding historians require.

The cover of ‘Shattered Glory’. Out of print (sigh…)

I agree. There’s no doubt in my mind that dramatic events affecting whole societies are swiftly re-invented by those who live through them. Not least because of emotional entanglement with what’s just happened. This is normal, historically. I traced just such a re-invention of New Zealand’s 1915 Gallipoli defeat in my book Shattered Glory (Penguin 2010). By April 1916, just five months after the stalled campaign ended in an ignominious retreat, it had been re-cast as a glorious victory, because it was a sacrifice for Empire. This reflected prevailing pop-sentiment of the day towards our place in a wider British Empire and helped address grief at the death toll, which was colossal for a country of just under 1 million souls. But the conception of Gallipoli as triumph was the exact opposite of the military defeat and human truth; a demonstration of the way societies, en masse, rationalise events to suit immediate emotional needs. And it had an impact on our view of history because, in a demonstration of the stickiness of re-invention, that view is largely what guides the popular conception of New Zealand’s Gallipoli experience today, nearly a century on.

So can we analyse recent events ‘historically’, in the same sense that we can analyse something that happened a century or two ago? History-as-discipline is one of the intellectual pursuits that self-examines its analytical philosophy. Hobsbawm, for instance, didn’t divide history by round-number centuries but by events – typically political and social (‘social’, inevitably, encompasses ‘economic’, which despite the ‘hardening’ of economics with a mathematical over-gloss since the late 1940s, is at heart about society).

To Hobsbawm, the nineteenth century was ‘long’, book-ended by the French revolution of 1789 and the First World War of 1914; whereas the twentieth century was ‘short’, framed by the outbreak of the First World War in 1914 and the end of the Cold War in 1991.

Those arguments were possible because Hobsbawm stood at the end of the cycles; they were evident to him, and he had the distance to perceive what had happened in fully historical terms, certainly as far as the ‘long’ nineteenth century was concerned. But what about things that have just happened? Things we popularly call ‘historic’ but which still burn fresh in memory and haven’t achieved the more sonorous quiet of a deeper past?

To me there are several issues. The first is the problem of context. Sometimes, the deeper over-arching forces that drive the widest patterns of history – combinations of long-standing technological, social, political, ideological and, it seems, environmental factors – aren’t obvious for decades afterwards. We can’t tell precisely what a particular development may mean until it’s put into the context not only of what went before, but also of what came after – and, usually, some time after. Last week’s, last year’s or even last decade’s news won’t cut it in these terms.

The second issue is the related one of emotional perspective. It takes about 25-30 years, or more, for one generation’s problem to be resolved and replaced by another; and also for the people primarily involved in it to be far enough back to be treated with the (ideally) abstract dispassion of history. It is only now, for instance, that we are seeing treatment of Winston Churchill that moves beyond the pro- and anti- partisanship of his life and the immediate decades after his death.

Me, on the Bridge over the River Kwai, a place that brings the human condition into sharp relief. Something happened to me five minutes after this photo was taken that gives the lie to notions of ‘rational egoism’. Ask me in the comments.

Thirdly, there’s the ‘recency’ phenomenon, in which we tend to view events just gone as larger than those further back, at the cost of proportion. This also fuels a tendency to view whatever just happened as the arbiter of the future. Take the Cold War, which – via Hobsbawm’s thesis – was a temporary product of the way the old world collapsed in 1914-19. But you wouldn’t have known that living in the middle of it. And when it did finish with the predictable collapse of the Communist economy, Francis Fukuyama insisted that history had ended – that Western capitalist ideology, as he defined it, had won, and there would be no further change. Ouch. This was ‘recency’ on full display.

The reality of abstract historical analysis, of course, is that it has nothing to do with ‘direction’ or ‘progress’ towards an inevitable or ideal one-dimensional ‘end’ such as I believe was implied by Fukuyama. Indeed, by definition, history cannot end. It’s a product of human change through time; and the onus is on historians to understand that deeper human condition, the ‘unity in diversity’ beloved of social anthropology, as a pre-requisite to being able to understand how that then expresses itself in ever-smaller scales of detail when framed by a specific society.

I’ve found through my own work in the field that practical detail changes affecting a specific society usually happen generationally – sometimes imperceptibly, sometimes with sharper impact, as happened in the 1960s when the generation brought up in the wake of the Second World War objected to the philosophy of their parents.

And so we have the tools with which to approach the issue of ‘recent’ history. The pitfalls of those tools may not be fully overcome – indeed, logically, they cannot be; but to know they are there and to understand how these limitations work is, I think, a very great step towards being able to couch recent events in more dispassionate light.

Copyright © Matthew Wright 2015

Essential writing skills: sharing the writing skill-base

Back in 2011 I wrote a photographic history of my home district, Hawke’s Bay – Historic Hawke’s Bay and East Coast, which Bateman issued in a case-bound edition with slip-jacket, a wonderful example of the art of book-making.

It’s still on sale and, I recently found, has been joined by a book from a local writer who has used functionally the same title, subject matter, concept, format, size and price. None of these things breach copyright, and it’s possible the guy came up with them independently, but his book is so conceptually close to mine I can’t help thinking I am being flattered in a deeply sincere way. The only difference is that he’s appended his name to the title of his version, which I take to be an assertion of ownership.

Curiously, though the author calls himself ‘an Historian’ and ‘author’, I believe he is qualified as an accountant. I’m not aware of any qualification he has in my field. It’s kind of iniquitous. I can’t just announce I’m a chartered accountant and set up in business. I don’t have that qualification. History is one of the few fields where people can assign themselves the label and, it seems, personal possession of the territory.

Hawke’s Bay history. The Masonic Hotel (1932) – early streamline moderne, with the former T&G Building (1936) behind.

The phenomenon isn’t limited to this guy, of course; plenty of people decide to ‘become’ what they call ‘an Historian’, often on the strength of their enthusiasm for, or interest in, their local area. The conceit is, I think, based on the popular notion that history isn’t a skill of its own. Anybody can do history – it’s just collecting fun facts, isn’t it? How hard can it be? The general phenomenon was analysed a few years ago by psychologists David Dunning and Justin Kruger, as the Dunning-Kruger effect. It boils down to the fact that if you don’t know anything about a field of endeavour, you can fool yourself into thinking it’s easy, because you know so little you don’t know what there is to know. Often the people doing it are qualified in an unrelated area and imagine that expertise makes them expert in all areas.

The thing is, the same is also true of writing. A few years ago my wife went to a day course on kids’ books, hosted by New Zealand’s top children’s author, and found herself surrounded by silver-haired retirees who had decided to ‘become’ writers of kids’ books. These would-be authors were apparently bombarding their teacher with questions about the contracts they expected to be offered by eagerly waiting publishers. ‘No no,’ the author apparently said. ‘You have to learn how to write first.’

Too true. The problem, though, with Amazon welcoming all and sundry to publish is that all and sundry promptly do, never realising they don’t have the building blocks. Unconscious incompetence. We’ve all done it, of course, but until recently publishers acted as gatekeepers. Enthusiastic but inexperienced writers were rebuffed, went away, and learned writing – a ten-thousand-hour, million-word task that takes a hopeful author from unconscious incompetence to conscious incompetence, to conscious competence – and finally, nirvana – unconscious competence.

But now those gates have been ripped open and hurled down a nearby ravine, opening the way for what Chuck Wendig calls a ‘shit volcano’. This buries everybody in the noise, including the hopeful writers. And that’s a pity. You see, if they’ve got to the point where they’ve written a whole book – and then self-published – they want to write. And that should be nurtured. But, like history, writing is a learned skill, and learning to write isn’t an easy path. The more you know about it, the more you realise there is to know. As Hemingway said, we are all apprentices.

And for me, writing technique isn’t something that I feel I have to append my name to and assert is mine alone. It’s something to share. I’ve been in the business over thirty years – and I’m going to spread the skills.

Want to know more? Watch this space through 2015.

Copyright © Matthew Wright 2015

How to stoke your Kindle with “Coal”

I’m delighted to announce that my book Coal: the rise and fall of King Coal in New Zealand (Bateman 2014) – which was released in print a few months ago – has also been published internationally through Kindle.

Coal is an irreplaceable resource, formed over millions of years, yet humanity has been burning it as if there is no tomorrow. Today it’s responsible for 43 percent of the world’s greenhouse gases. We stand at a crossroads; and the story of coal – of which the New Zealand side is a microcosm and case-study – plays a large part in the journey.

Reviews of the print edition so far have been excellent:

“There have been many books written about coal mining in New Zealand; however this definitive work by Matthew Wright has certainly set a new benchmark” – Robin Hughes, NZ Booksellers, 13 October 2014.

“a fascinating read, and it is such a good way of understanding NZ history” – “The Library”, 15 October 2014.

“…mines a rich seam of interesting content on many things relative to coal…” – Ted Fox, Otago Daily Times, 24 November 2014.

And so, without further ado – welcome to the Kindle edition.

Copyright © Matthew Wright 2014

Why I run an Apple-free household but am still cool

Apple’s theatricals this week haven’t convinced me to buy an iPhone 6 – which, as Ron Amadeo pointed out, has the same screen size and features as a 2012 Nexus 4. George Takei got it right when he tweeted that he couldn’t remember the last time so many got so excited about 4.7 inches.

Not that this is an admission of being un-cool, though it might seem so to the phanbois. Earlier this week I commented on some guy’s blog that I’m Apple-free. Other products do all I want at less cost, and I’m not interested in the Apple cool factor. Another commenter wondered whether I still watched black and white TV. Absolutely. I watch shows about sarcastic assholes.

Get real folks. Apple isn’t a religion. They make consumer products. For profit.

My Apple-free desk. From the left: ASUS laptop, i7 4771 Windows desktop (yes, the same CPU Apple use in their iMacs), i7 860 Windows desktop.

When I look at the venom displayed on some of the forums and blogs against Apple-critics, I suppose I got off lightly. But as I say, commenting that I don’t buy Apple isn’t licence for the fans to make personal attacks of any sort. Apple are a consumer product company. Competitive. But failing to buy their products doesn’t, by definition, make you a Luddite.

I suppose it’s not surprising, really. Apple’s schtick – originated by their late CEO, Steve Jobs – was an appeal to cool, to the social status that, we are conditioned to think, comes with this consumer product or that one. That approach underlies most big brands, of course – and it certainly worked for Apple. Hugely. In the late 1990s Apple was a dwindling computer company that had failed to compete with Microsoft. Jobs came back on board and reinvented it as a lifestyle choice – a company whose products bypassed the reason circuits and drove straight to the appeal of emotion.

It worked a treat. People didn’t buy Apple because they could get a sharply better phone, or sharply better computer. Apple’s gear was always well engineered, well designed and reliable. But so was the gear sold by other major manufacturers. Most of it was also just as easy to use. That wasn’t why people bought Apple. They bought Apple because it was a statement about themselves. People get drawn into it – I mean, I heard that some guy in Australia microchipped his own hand, on the off-chance that some rumoured feature might be built into the iPhone 6.

It was, by any measure, a brilliant recovery. Genius. But when I look at the sarcasm, the personalised anger with which some respond when anybody questions Apple products – when I suggest that, maybe, other products are as good – I have to wonder. Do people validate their own self-worth by ownership of an Apple product? Is that why they get so angry, sarcastic and abusive? So personal?

Is this where Jobs wanted his customers to go when he reinvented Apple?

For myself, I don’t feel the need to define or validate myself with any consumer product. It’s just stuff, and these days it’s increasingly short-life stuff. For me, phones, tablets and computers are products – things you buy for a purpose, not to make you better than somebody else. That’s the arbiter. Will it do the job I need it for – properly, and without compromise? And at what cost – up-front and lifetime? How reliable is it? Will the maker support it for that lifetime – and a little way beyond – at reasonable cost? If I drop a phone, what will it cost me to replace it?

All these reasons keep intruding whenever I look for any new consumer product. The fact that this path has produced a wholly Apple-free household, I think, speaks for itself.

Copyright © Matthew Wright 2014