Essential writing skills: tackling the invisible hurdle

I’ve been posting these past few weeks about the challenges facing writers in the new environment. The biggest hurdle, of course, is so huge it’s invisible.

Books on sale in a real bookshop. Some of them mine…

Let me explain. A few years ago the challenge authors faced in being published was – being published. The road was strewn with hurdles. A starting author first had to write something good enough to be competitive with the professionals. Then they had to find an agent, who in turn had to get a publisher interested – in circumstances where publishers, more often than not, went with previously published authors who had an established record.

Eventually, if everything went well, the book would appear. And – usually – not do too well. Most books didn’t do much more than break even – and publishers know the odds. The figure I’ve seen is that about one book in ten does really well. The rest don’t, and publishers accept that because having a reasonably broad range of books in their lists is part of the deal.

These days the paradigm’s changed. That world is still there, but authors also have the option of self-publishing through Amazon.

I could hear the cries of ‘squee – no entry barrier!’ all the way down in New Zealand.

There are two problems with this. The first is what Chuck Wendig calls the ‘shit volcano’ quality issue. Everybody can publish, so everybody does. ‘I learned English in school, so I can write…right?’

That sudden flood of authors (no pushing at the back) creates the second issue, which is just as big a barrier as the old agent model. Discovery.

In July this year Amazon listed 32.8 million separate titles of all kinds for sale. In that same month they shifted 120,000 e-books a day off their best-seller lists, of which 31 percent were indie-published. You get the picture. Any individual book is going to be lost in the noise, no matter how good – or bad – it happens to be. Yes, the review system’s there, but a good book that doesn’t get good reviews – perhaps because nobody’s found it – won’t float to the top. That isn’t a problem for Amazon – they profit from the aggregate. But it’s a major issue for any individual author.

So – all that’s happened is that one ‘filter’ has been, effectively, replaced with another. One that cannot be reasoned with because it’s part of the environment, like gravity. The question is what to do about it. How can a writer – armed with an identical tool-kit to every other hopeful out there in internet-land – get found?

And when they are, how can they sell their stuff?

It’s a new paradigm. More soon. Meanwhile – what are your thoughts?

Copyright © Matthew Wright 2014


Ego igitur puniar: my childhood adventures at Nelson Park School, Napier

My old primary school, Nelson Park School, is marking its centenary this weekend. Am I going to the various events? Go figure. My earliest memory there, from 1968, is of being slammed across the face by my teacher. Wham! I’d never been hit before. I was five.

Hi. I’m your teacher…

I have no idea why the teacher hit me, but back then it didn’t take much to evoke the wrath of teachers. A friend of mine from Nelson Park School days, just this year, told me how he was punished for accidentally running into an ‘out of bounds’ area while trying to escape the school bullies. One of my wife’s colleagues, who I didn’t know as a kid – but who went to Nelson Park School at the same time – was punished for skipping for joy in jingly sandals, aged five. I am not joking.

This was the era when school had little to do with nurturing children to learn according to their strengths, and much to do with smashing them into submissive conformity to a prescribed and quiet ‘normal’, via petty army-style ‘bullshit’ routines, worth-denial, nit-picking, sarcasm and class-front humiliation, all backed with a relentless threat of pain.  I still remember the teacher who kept offering to take boys privately out the back where they would be ‘shown’ his personal ‘strap’ – the heavy leather belt with which teachers were allowed to beat children. Other staff didn’t ‘strap’ children in secret – I remember the teacher who used to whip his out and smash kids around the legs with it. The same teacher also prowled the class with a broken blackboard ruler he called his ‘Walking Whacker’. Wham! 

My class at Nelson Park School in 1969, in regulation pose, including the substitute teacher. Can you spot me? Clue: I’m the only one who hasn’t face-planted into a belt-sander.

The doyen of childhood terror at that school was the deputy principal, an archetypal drill sergeant, who belted out orders and whose wrath fell on any kid that did not obey instantly to the letter. Think Gunnery Sergeant Hartman from Kubrick’s Full Metal Jacket. It’s a military technique. But instead of brow-beating adults so they’d walk into gunfire, this teacher used the method to traumatise children into submission. I heard that he even made kids go to the local dairy to buy him Alfino cigars.

Apparently some kids – and parents – admired this teacher for his ‘drill sergeant’ decisiveness, and apparently he had a ‘nice guy’ persona he used to switch on. But I never saw that side, and everyone was terrified of him. Just this year I discussed him with former pupils who had attended Nelson Park School with me over forty years ago. The most complimentary opinion was ‘he was an asshole’.

The school system in action, circa 1970…

It took me years to understand my experience at Nelson Park School – I didn’t really get a handle on it until I researched the school system professionally, publishing my conclusions in 2004 and again in 2013. The problem was that the New Zealand primary school system of the late 1960s was well past its use-by date. It was built around early twentieth century notions of uniformity – a narrowly defined ‘right’ way of doing things; writing in a specific way with a specific hand, and so forth. Woe betide anybody who diverged. Practical human reality, of course, is far broader and more complex – the more so as time goes on and generational change brings new attitudes. But the school system hadn’t caught up, and by the time I got there it was dominated by teachers who had spent a lifetime bashing square pegs into round holes.

School routines clung to the pseudo-military ethos that had characterised the system through both World Wars, when school was looked on as a foundation for cadetship and territorial service. When I was there in 1968-72, children were still made to march into class, in lines, to the strains of marches such as F. J. Ricketts’ Colonel Bogey (1914). If the kids messed up that drill, they were marched into the school-ground and made to practise.

What made the whole thing so destructive was that this setup fostered opportunities for some staff to exploit the power the system gave them over those defined as powerless, the children. A recent – as in 2014 – review of data collected during Stanley Milgram’s 1961 experiments reveals that some ordinary adults become monsters in such circumstances, because dominating those over whom the system has given them total power makes some people feel good about themselves. My own professional work suggests that one does not have to run an experiment to show this. It is part of the wider human condition. And moral compass, alas, is lost by increments.

Doubtless some kids had a good time at Nelson Park School at the turn of the 1970s. Nobody I knew there did, and my left-handedness ensured I also hit the sharp end of a tired system. The sad part is that the staff of Nelson Park School at that time had a choice. They could have tried to be reasonable, tried to view children as human beings and tried to nurture their development. By my measure, they did not. But perhaps these teachers found happiness for themselves later in better and more caring ways. One can but hope.

Copyright © Matthew Wright 2014

Why I run an Apple-free household but am still cool

Apple’s theatricals this week haven’t convinced me to buy an iPhone 6 – which, as Ron Amadeo pointed out, has the same screen size and features as a 2012 Nexus 4. George Takei got it right when he tweeted that he couldn’t remember the last time so many got so excited about 4.7 inches.

Not that this is an admission of being un-cool, though it might seem so to the phanbois. Earlier this week I commented on some guy’s blog that I’m Apple-free. Other products do all I want at less cost, and I’m not interested in the Apple cool factor. Another commenter wondered whether I still watched black and white TV. Absolutely. I watch shows about sarcastic assholes.

Get real folks. Apple isn’t a religion. They make consumer products. For profit.

My Apple-free desk. From the left: ASUS laptop, i7 4771 Windows desktop (yes, the same CPU Apple use in their iMacs), i7 860 Windows desktop.

When I look at the venom displayed on some of the forums and blogs against Apple-critics, I suppose I got off lightly. But commenting that I don’t buy Apple isn’t license for the fans to make personal attacks of any sort. Apple are a consumer product company. Competitive. But declining to buy their products doesn’t, by definition, make you a Luddite.

I suppose it’s not surprising, really. Apple’s schtick – originated by their late CEO, Steve Jobs – was an appeal to cool, to the social status that, we are conditioned to think, comes with this consumer product or that one. That approach underlies most big brands, of course – and it certainly worked for Apple. Hugely. In the late 1990s Apple was a dwindling computer company that had failed to compete with Microsoft. Jobs came back on board and reinvented it as a lifestyle choice – a company whose products bypassed the reason circuits and drove straight for the emotions.

It worked a treat. People didn’t buy Apple because they could get a sharply better phone, or sharply better computer. Apple’s gear was always well engineered, well designed and reliable. But so was the gear sold by other major manufacturers, and most of it was just as easy to use. That wasn’t why people bought Apple. They bought Apple because it was a statement about themselves. People got drawn into it – I mean, I heard that some guy in Australia microchipped his own hand, on the off-chance that some rumoured feature might be built into the iPhone 6.

It was, by any measure, a brilliant recovery. Genius. But when I look at the sarcasm, the personalised anger with which some respond when anybody questions Apple products – when I suggest that, maybe, other products are as good – I have to wonder. Do people validate their own self-worth by ownership of an Apple product? Is that why they get so angry, sarcastic and abusive? So personal?

Is this where Jobs wanted his customers to go when he reinvented Apple?

For myself, I don’t feel the need to define or validate myself with any consumer product. It’s just stuff, and these days it’s increasingly short-life stuff. For me, phones, tablets and computers are products – things you buy for a purpose, not to make you better than somebody else. The arbiter, for me, is a handful of practical questions. Will it do the job I need it for – properly, and without compromise? And at what cost – up-front and lifetime? How reliable is it? Will the maker support it for that lifetime – and a little way beyond – at reasonable cost? If I drop a phone, what will it cost me to replace it?

All these questions keep intruding whenever I look at any new consumer product. The fact that this path has produced a wholly Apple-free household, I think, speaks for itself.

Copyright © Matthew Wright 2014

Can we view 9/11 as history? A Hobsbawmian perspective.

Do you remember what you were doing at the precise moment when you heard about the 11 September 2001 terror attacks on New York and Washington? I do – and I’m not American. I’m a Kiwi. But I remember. Here in New Zealand, on the other side of the date-line, initial news broke in the early hours of 12 September. My wife – listening to overnight talkback radio on earpieces – heard the news and jabbed me in the ribs. ‘Wake up, a plane’s hit a building in New York.’

Thinking it was some tragic accident, we got up to see whether anything was on TV. It was. And then the news got worse. Way worse. The fact that there was live coverage, here in New Zealand, underscored the scale of the tragedy as a world event.

A fireman calls for 10 more colleagues amidst the ruins of the World Trade Centre, 14 September 2001. US Navy, Public Domain, via Wikimedia Commons.

That reveals the huge dimension of those events 13 years ago. A human tragedy of appalling scale that became a defining moment not just for New York, not just for the United States – but for the planet. One that helped shape the first decade of the twenty-first century for everybody in the developed world, not least because of the behaviours, attitudes and oppositions that followed, ranging from tighter security for air travellers to wars in Iraq and Afghanistan.

The time is not yet ripe to consider these events history, for they are not. But when they are – in two, three generations, when young children view 2001 much as we view 1941, a distant time of grandparents and great grandparents – how will we see the 9/11 attacks then?

The answer, to me, emerges from the way that history, for professional historians, is all about meaning – about finding the broad shapes and patterns that turn the world of the past into the world of the present. These patterns seldom match the number system we use to count the passing years.

When we look at history that way we cannot go past the work of Eric Hobsbawm, to my mind perhaps the greatest historian of the twentieth century. I do not make such a statement lightly. He took the long view. The historian’s view. A view divorced from the round-number dates into which we usually divide the past, like the end of a decade or a century.

For Hobsbawm, centuries were defined by the patterns of social and economic trends. That was why he called the nineteenth century a ‘long century’, marked by its ‘age of revolution’. To him, this century began in 1789 with the revolution that ended the ancien regime in France and which began a pattern of industrial-driven social dislocation and revolt. It ended in 1914 when the ‘guns of August’ heralded the end of the old European order in its entirety. Of course the trends that led to these pivotal moments pre-dated the specific instant by decades. Nothing, historically, comes out of a vacuum. But these dates offered defining events that, for Hobsbawm, brought the wider underlying trends into a decisive and overt reality.

USS Arizona, 7 December 1941. Distances of history: in 2087, the tragedy of 9/11 will be as far removed in time as Pearl Harbor is today. How will people view it? Public domain.

Following the same logic, Hobsbawm also argued that the twentieth century was ‘short’ – beginning in 1914, with that collapse of the old order and the rise, in its place, of a tripartite world in which democracy initially seemed to be losing to totalitarian fascism and communism. That contest resolved (luckily) with the victory of democracy – an event Hobsbawm argued was marked by the collapse of the Soviet Union, the revolutionary state that had emerged from the First World War.

The decisive date, for Hobsbawm, was 1991, when the Soviet Union dissolved and the Cold War ended. By this reasoning the twenty-first century began in 1992. But I wonder. We cannot know our future – cannot say whether there will be any long and over-arching socio-political pattern to the twenty-first century. But so far, one does seem to be emerging, for the early part of it at least.

Like Hobsbawm’s long and short centuries, this shape has been defined by trends bubbling away well before the pivotal moment. They were evident for quite some time through the late twentieth century, partially masked by the over-powering priorities of the Cold War. But if we want to point, in Hobsbawmian fashion, to a defining moment – a point where those underlying issues suddenly became present and urgent in everyday consciousness, it has to be 9/11. Sure, that leaves us with a 9-year interregnum after the end of the twentieth century – but, as I say, history at the thematic level never does tidily match up with numeric dates or round numbers.

And will future historians look back on the twenty-first as a long century? A short one? That’s up to us, really – meaning, everybody on the planet – and the choices we make.

Copyright © Matthew Wright 2014

Why celebrity phone hacking is really everyone’s problem

Until last week, I’d never heard of Jennifer Lawrence, still less known that she apparently had salacious selfies on her phone’s cloud account. Now, it seems, everybody in the world has the news, and apparently the stolen pictures will be made into an art exhibition. Do I care (just checking the care-o-meter here)? No.

But what I do care about is the fact that the celebrity selfie hacking scandal is everyone’s problem.

My worry has got nothing to do with the way the public debate has been sidetracked by red-herring arguments, all flowing from the cult of celebrity that began, in the modern sense, as a Hollywood marketing device during the second decade of the twentieth century. That cult is why these pictures get targeted. Hey – get a life. Celebrity Bits are the same as Everybody Else’s Bits. Get over it. Celebrities are also entitled to their privacy and property, just like everybody else.

No – the problem is the principle of data security. Everybody’s data security. It’s an arms race, on-line and off. People store all sorts of things on electronic media these days. Medical records, bank account details, passwords. Some of it ends up in the cloud. Some doesn’t, but even home computers may not be safe. Hacking goes on all the time, often looking for your bank account. It’s a sad indictment of human nature that those perpetrating this vandalism look on it as an assertion of superiority. I believe the term is ‘owned’, spelt ‘pwned’.

Artwork by Plognark http://www.plognark.com/ Creative Commons license

It’s not going to be resolved by passing laws or codes of conduct. Some immoral asshole out there, somewhere, will spoil the party.

All we can do is be vigilant. Various services are introducing two-step authentication, in which you can’t log on with a password alone – you also have to enter a code sent to your phone.

You still need a strong password. I am amazed that the most popular password is – uh – ‘password’, pronounced ‘Yes, I WANT you to steal my stuff’. Other stupid passwords include ‘123456’, the names of pop-culture icons (‘HarryPotter’) or something published elsewhere, like your pet’s name.

But even a password that can’t be associated with you has to meet certain criteria. The reason is mathematical – specifically, combinatorial: the number of possible passwords is the size of the character set raised to the power of the password length. In point of fact, the math of password security gets complex, because any human-generated password won’t be truly random – and terms such as ‘entropy’ enter the mix when figuring crackability. But at the end of the day, the more characters the better, and the more variables per character the better. Check this out:

  1. Any English word. There are around 1,000,000 unique words in English (including ‘callipygian’) but that’s not many for a hack-bot looking for word matches. Your account can be cracked in less than a minute.
  2. Mis-spelt English word. Scarcely better. Hackers expect mis-spellings and number substitutions.
  3. Eight truly random lower case letters. Better. There are 208,827,064,576 combinations of the 26-letter alpha set in lower case.
  4. Eight truly random lower and upper case letters. Even better. These produce 53,459,728,531,456 potential passwords.
  5. Eight truly random keystrokes chosen from the entire available set. Best. There are 645,753,531,245,761 possible passwords.

If you use 10 truly random keystrokes, you end up with 3,255,243,551,009,881,201 possible combinations. But even that is still crackable, given time – so the other step is to change the password. Often.
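For the curious, here’s a minimal Python sketch that reproduces these figures. Each count is simply the character-set size raised to the power of the password length; the ‘entire available set’ numbers above correspond to a 71-character set (an assumption inferred from the arithmetic), and the guessing rate used for the time estimates is purely illustrative, not a measured attack speed.

    # Password search-space sizes, reproducing the figures above.
    # Search space = alphabet_size ** length (permutations with repetition).
    # ASSUMPTIONS: a 71-character 'full' keyboard set, and a brute-force
    # rate of one billion guesses per second -- both illustrative only.

    GUESSES_PER_SECOND = 1e9

    def search_space(alphabet_size: int, length: int) -> int:
        return alphabet_size ** length

    for label, n, k in [
        ('lower case only', 26, 8),
        ('lower and upper case', 52, 8),
        ('full 71-character set', 71, 8),
        ('full 71-character set, 10 characters', 71, 10),
    ]:
        total = search_space(n, k)
        years = total / GUESSES_PER_SECOND / (3600 * 24 * 365.25)
        print(f'{label}: {total:,} passwords (~{years:,.1f} years to exhaust)')

At a billion guesses a second, even the ten-character case takes around a century to exhaust – which is why ‘crackable, given time’ is the operative phrase, and why changing the password often matters.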

Make it a habit. And – just out of interest, seeing as we’re talking about true randomness, does anybody know what the term ‘one time pad’ means?
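For anyone who doesn’t: a one-time pad combines each character of a message with a truly random key of the same length, used exactly once – the one scheme that is provably unbreakable when done properly. A minimal Python sketch, for illustration only:

    import secrets

    def otp_encrypt(message: bytes) -> tuple[bytes, bytes]:
        # The pad must be truly random, as long as the message, and used ONCE.
        pad = secrets.token_bytes(len(message))
        ciphertext = bytes(m ^ p for m, p in zip(message, pad))
        return pad, ciphertext

    def otp_decrypt(pad: bytes, ciphertext: bytes) -> bytes:
        return bytes(c ^ p for c, p in zip(ciphertext, pad))

    pad, ct = otp_encrypt(b'change your password often')
    assert otp_decrypt(pad, ct) == b'change your password often'

The catch, of course, is getting a pad as long as the message to the other party securely – which is why one-time pads are rarely practical, and why true randomness is so prized.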

Copyright © Matthew Wright 2014

The real truth of the First World War

There has been a growing consensus among historians in recent years that the First and Second World Wars were not separate events. They were two acts in a 31-year drama that began in 1914.

Ration party of the Royal Irish Rifles on the Somme, probably 1 July 1916. Public domain, Wikimedia Commons.

Indeed, there are reasons to argue that this war was followed by a third act, set up by the collapse of the old order in the First World War – the rise of Communism, which was not resolved by the Second World War and led to the Cold War. That did not end until 1991. These events defined the society, politics and economics of the twentieth century; and it is for these reasons that Eric Hobsbawm argued that this century – in those terms – was a ‘short’ century, beginning in 1914 and ending in 1991.

I’m inclined to agree. As far as the two World Wars are concerned there is little doubt about the integration between them. Briefly the argument is this. In 1918, the German state collapsed, but the advancing Allies were still – certainly by George Patton’s estimate – a few weeks off being able to beat the German army. The result was that Germany essentially retained an unbroken field army. This was dispersed by Versailles, but the soldiers, brought up like the rest of Germany on the notion of ‘Reich’, felt cheated. Into the breach leaped a shell-shocked veteran of the Ypres front, sporting the Charlie Chaplin moustache he’d devised for gas-mask wear.

SMS Baden, one of the last of Germany’s First World War super-dreadnoughts. Public domain.

It wasn’t difficult for Hitler to whip up support based on the popular sense of injustice and denied destiny, drawing power from disaffected former soldiers who formed a significant demographic group. It was also not hard for him to find a sub-culture within Germany who could be blamed. All of this was wrapped in the guise of a ‘new order’, but actually it was not – the Nazis, in short, did not come out of a vacuum; they merely re-framed an idea that already existed. This connection was realised by the British as the Second World War came to an end and they wondered how to avoid repeating the mistakes of 1919. As early as 1943, Sir Robert Vansittart argued that Hitler was merely a symptom. The deeper problem was that Versailles hadn’t broken eighty-odd years’ worth of Bismarckian ‘Reich’ mentality.

This perspective demands a different view of the First World War. So far, non-military historians in New Zealand – working in ignorance of the military realities – have simply added an intellectual layer to the cliche of the First World War as a psychologically inexplicable void into which the rational world fell as a result of mechanistic international systems, the pig-headedness of stupid governments and the incompetence of Chateau-bound general officers. There has even been an attempt by one New Zealand historian to re-cast Britain and the Allies as the aggressive, evil villains of the piece. Military historians have not been seduced by such fantasies, but have still been captured by a pervasive framework of sadness, remembrance and sacrifice. Into this, again for New Zealand, have been stirred mythologies of nationalism, of the ‘birth’ of today’s nation on the shores of Gallipoli in 1915. The result of this heady mix has been a narrow orthodoxy and an equally narrow exploration of events in terms of that orthodoxy.

Landing on D-Day, 6 June 1944. Photo by Chief Photographer’s Mate (CPHOM) Robert F. Sargent, U.S. Coast Guard. Public Domain.

I question this framework, not least because of the argument that the Second World War was a specific outcome of the First. The implication of the two being different aspects of a single struggle is clear; there are questions yet to be investigated about the ‘why’ of the First World War. The issue is the extent to which the ‘Reich’ mentality was perceived as a genuine threat in 1914 when Britain (in particular) debated whether to enter the conflict, and whether and how that answer drove the Allies to persist even after available offence (infantry) had proven itself inadequate against the defence (wire, machine guns and trenches). We have to remember that fear of German imperialism had already driven Europe’s alliance structures from the 1880s. And, for New Zealand, the question is how did that intersect with – and potentially drive – the sense of pro-British imperialism that did so much to define our mind-set in the generation before 1914?

These sorts of questions are beginning to be asked in British historical circles now. I keep being invited to symposia at various universities over there, where these matters are being discussed. Unfortunately we are a long way off being able to properly pose such queries in New Zealand. Yet, realistically, that interpretation needs to be explored. Perhaps I should do it. What do you think?

Copyright © Matthew Wright 2014

Close encounters of the meteor kind – this weekend

Back in 2013, I wrote a piece that mashed Pope Benedict’s resignation with the science of the meteorite that exploded over Russia. I was Freshly Pressed by WordPress on the back of it. Good stuff.

The fly-by. NASA, public domain.

This weekend a similarly sized chunk of space debris – about 20 metres in diameter – is rolling past Earth, with a closest approach of just 40,200 km, directly over New Zealand, at 6.18 am on Monday 8 September NZT (18:18 Zulu, 7 September).

I use the word rolling deliberately. Everything spins in space.

The meteor’s called 2014 RC and was detected only on 31 August by the Catalina Sky Survey at Tucson, Arizona. And that raises a point. The spectre of Earth being clobbered by even a modest piece of space detritus has haunted science for decades. Right now, we’re doing something about that – scanning near-Earth space in a hunt for likely impactors.

The orbit. NASA, public domain.

What we’d do if we found such a thing, other than despatch Bruce Willis, isn’t clear. Nuking it isn’t an option – the evidence is growing that some of these space rocks are just clumps of loose-ish ice and dirt. In any case, you’d end up with a cloud of debris, still hurtling for Earth and still able to deliver virtually the same kinetic blow to the planet. Personally I think we should splash one side of any likely impactor with black paint, but that method (which exploits asymmetric re-radiation of absorbed thermal energy to nudge the orbit) requires several years’ warning. This new encounter comes just a week after discovery – with all that this implies.

There’s no danger from 2014 RC. It’s got an orbital period of 541.11 days, which is different enough from Earth’s to mean there won’t be another close encounter any time soon. But one day the orbital mechanics will mesh and it’ll be back in our vicinity. It won’t be an impact danger. But we don’t know what else is out there.
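To put rough numbers on that meshing, here’s a back-of-envelope Python sketch using the synodic period – the time it takes Earth to lap the asteroid. It’s a toy calculation that ignores eccentricity and inclination (a genuinely close pass also needs the asteroid near the point where the two orbits cross), so treat it as illustration rather than ephemeris.

    # How often does Earth 'lap' 2014 RC? Toy synodic-period estimate.
    # Ignores eccentricity and inclination; a real close approach also
    # requires the asteroid to be near the crossing point of the orbits.
    EARTH_PERIOD = 365.25     # days
    ASTEROID_PERIOD = 541.11  # days, from the post

    synodic_days = 1 / (1 / EARTH_PERIOD - 1 / ASTEROID_PERIOD)
    print(f'Relative geometry repeats roughly every {synodic_days:.0f} days '
          f'(~{synodic_days / 365.25:.1f} years)')

The alignment recurs every three years or so, but the truly close passes happen only when that alignment coincides with the orbital crossing point – which is why a 40,200 km shave is rare.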

Yup, you’ve got it. That old sci-fi doom scenario involving a meteor suddenly sloshing the Atlantic into the US Eastern Seaboard and Europe? It’s baaaack…

Copyright © Matthew Wright 2014