The stupidity of Nazi super-science. And hurrah for British boffins!

Anyone remember Nazi super-science? You know, the science for when ordinary super-science isn’t evil enough. I’m not talking about atomic Nazi super-soldiers led by Zombie Robo-Hitler. I’m talking real Nazi ‘super-science’ of the early 1940s – the ‘secret weapons’ Hitler insisted would win the war.

Heinkel He-177 four-engined bomber in Denmark, 1944. The engine arrangement – two DB 601 motors in tandem (dubbed ‘DB 606’) per nacelle – virtually guaranteed fires, and the bomber never worked properly. Public domain.

Of course there were a couple of problems. One was that by the time the Nazis ordered German industry to build ‘super’ weapons, the war had already been lost – the tipping point came in mid-1943 when Hitler broke his own army trying to take Kursk against the advice of his generals. The Eastern Front was the decisive front of the war; after the Germans lost Kursk it was only a matter of time before superior Allied production was able to fuel a Soviet drive west.

The other problem was that Nazi super-weapons weren’t very ‘super’, even by 1940s standards. Hitler and his cronies thought they were. But what can you expect from people for whom conviction trumped reason? A regime convinced of its own destiny, buoyed by a sense of exceptionalism, and in which state power pivoted on a tight integration of industry, economy and government.

The main thing the Nazis were good at was evil – epitomised by one of the nastiest super-weapons their science devised: pure methamphetamine (‘P’), exploiting the earlier discovery of pep-pills. This was the outcome of their quest for a drug that could turn their own soldiers into psychotic killers immune to pain, no matter how much damage it did to them. It was actually used in 1944 by the Waffen SS as a combat aid. Alas, the recipe didn’t die with the Nazi regime – meaning ‘P’ is, in effect, a Nazi drug. Uh – thanks, Adolf, Heinrich, et al. Yet another legacy you’re still inflicting on the world.

Messerschmitt Me-262 captured by the Allies, on test flight in the US. Allied pilots during the war referred to these aircraft as ‘blow jobs’, presumably because they flew by jet thrust. Public domain.

The Nazis also encouraged rocketry, thanks to Wernher von Braun, an honorary SS lieutenant and Nazi party member who was later responsible for America’s Saturn V Moon rockets. The problem was that the V2 missile project soaked up colossal resources – and lives. More people died making von Braun’s missile than were killed by it. But Hitler’s regime pushed the rocket anyway – a symptom of ‘conviction mentality’ presented as ‘logic’ and ‘reason’.

Other Nazi super-weapons that soaked up more than they delivered included August Cönders’ V3 ‘Fleißiges Lieschen’ ultra-long-range gun, which never worked; and Ferdinand Porsche’s 188-tonne Maus tank, which was too heavy for most bridges. Even that was dwarfed by Edward Grote’s proposed 1,000-tonne ‘Ratte’ land battleship, armed with 11-inch naval guns and powered by U-boat motors. Hitler was a fan, but Albert Speer cancelled that particular expression of Nazi megalomania in 1943, before it reached hardware.

Heinkel He-162 ‘Volksjäger’, captured by the US, at Freeman Field in 1945. This wooden jet was meant to be produced in huge numbers to tip the air balance. Actually it was difficult even for experienced pilots to control; in the hands of the half-trained boys the Nazis intended to use as pilots, it would have been a death trap.

Super-weapons that did work included the Fritz-X radio-guided bomb that sank the Italian battleship Roma in 1943, and a plethora of jet and rocket fighter designs since beloved of the “Luftwaffe 1946” fantasy brigade. Of these, the Me-262 made it to combat in 1944–45. These jets were about 100 mph faster than the best Allied piston-engined fighters, such as the P-51 Mustang flown by Chuck Yeager – but he shot down an Me-262 anyway, and damaged two others at the same time for good measure. That’s not hyperbole – here’s his combat report of 6 November 1944.

The Nazis also deployed the Tiger II tank, underpowered but with gun and armour comparable with Cold War tanks into the early 1960s. And other hardware, like taper-bore guns and the assault rifle – a concept that has since become standard.

All of which, Hitler insisted, would win the war. They didn’t, partly because the real arbiter was industrial scale. In a war of attrition, Germany couldn’t build enough super-weapons that did work to make a difference, and the ones that didn’t soaked up resources. It has to be said that the Allies also pursued dead-ends, such as the giant Panjandrum – but to nothing like the Nazi extent.

Gloster Meteor Mk IIIs on operations in 1944 – yup, the Allies had jet fighters at the same time as the Nazis. Public domain.

Even so, the Allies had it all over the Germans when it came to super-weapons – starting with the atomic bomb, the most powerful weapon in the history of the world. The German effort failed partly because many of Germany’s best physicists had fled to the United States in the face of Nazi persecution, and partly because the Nazi bomb program was never fully resourced.

The Allies also built two other key war-winning devices: effective radar, based on the British cavity magnetron, and the first radar proximity fuse for anti-aircraft work, built around thermionic valve technology. General Electric produced it to a design by British scientist Sir Samuel Curran. As Vannevar Bush pointed out, that fuse was decisive in key ways. The Nazis? Rheinmetall’s parallel effort was cancelled.

That’s apart from Allied jet development, which paralleled the German – the British had the Gloster Meteor and the Americans the Lockheed P-80. The difference was that the Allies didn’t prioritise them. The RAF whipped the Meteor into service to help meet the V1 threat, but industry otherwise focussed on existing weapons, which could be built in overwhelming numbers. And so – fortunately – the West won the Second World War.

Copyright © Matthew Wright 2014

Collisions of coal: an author’s perspective

My biography of coal in New Zealand was published this month by David Bateman Ltd. It’s a book that takes a ‘thing’ as its subject, but in reality tells the human side of that ‘thing’ in all its dimensions.

Review comments so far have been excellent – ‘this definitive work by Matthew Wright has certainly set a new benchmark’ and ‘a fascinating read…such a good way of understanding NZ history’ among them.

It was certainly fascinating to write. I’ve been going on in this blog about ways and techniques of writing – well, this book represents one way I put those things into practice.

All writing – fiction and non-fiction alike – must have structure, a theme, a dynamic around which to take the reader on an emotional journey. In fiction, that’s the character arc. In non-fiction, the author has to find something else; and for me the obvious angle was the intersection between humanity and this unique – almost chance – product of nature. That gave me the organising principle for the book, the thread around which I could weave the story. To do that I had to draw together a whole lot of thinking in areas that – on the face of it – seem quite disparate, but which in reality are all expressions of the one thing, our relationship with the world and with ourselves.

It was a story of collisions. You can’t tell the story of coal without delving into how it came to be, product of peat swamps and geological processes that, in New Zealand’s case, stretch over sixty million years. To give that context I decided to set it against the span of human existence – which, at best, is a tiny fraction of that time. The time during which we have dug up and burned that coal is shorter still, a tiny eye-blink against the span of years during which our coal resources formed.

This digger at the Stockton open cast coal mine is way bigger than it seems.

The question follows – why have we been so profligate in our burning? The answer, also explored in the book, flows from our nature and the way we think. The mid-to-late nineteenth century, when New Zealand’s coal was first exploited on an industrial scale, was an age of a particular style of thinking. It was common across the industrialising world but particularly evident on the whole colonial frontier from the United States to Australia and South Africa – and one of the key drivers of the headlong pace with which we dug up and burned the coal.

That same thinking also introduced another side of the human story of coal – our attitudes to it; the way we relied on it and yet also saw those who dug it as a social threat; and the way we relentlessly found new ways of exploiting it.

One theme became increasingly clear throughout. We have been digging and burning coal, not just in New Zealand but around the world, at an ever-increasing pace over the last 250 years. The fact that coal is no longer burned in domestic homes has disguised the fact that the pace has skyrocketed, particularly in recent decades.

Today, coal combustion produces around 43 percent of the world’s carbon dioxide emissions from fuel use. Nearly half. And the jury is back on climate change. It’s happening – and it’s an own goal. Big time. Making coal the chief villain.

That was why I ended the book the way I did. Well – you’ll just have to check it out for yourselves.

Copyright © Matthew Wright 2014

The three questions all authors must ask before starting

It’s amazing how many writing lessons I find in music. When I was a kid learning music, there was an attitude that rock musicians were musical Neanderthals who could strum a few chords while making animal noises. ‘Proper’ music was ‘classical’, around which the Royal Schools grade courses I was doing were framed.

The panel of one of my analog synths… dusty, a bit scratched, but still workable. Actually, these weren’t regarded as proper instruments when I was learning music, either…

The criterion for being a ‘proper’ musician, in short, wasn’t whether the performer provoked an emotional response in stadium-sized audiences and became a shaping force in western culture – it was the ability to play 200-year-old dinner muzak penned by Mozart, all built around diatonic chord progressions. Mozart’s Piano Sonata No. 16 in C, K. 545, for instance, uses chords running in descending fifths (vi–ii–V–I). The fact that ‘classical’ structure was a very narrow form of music – as Stockhausen, Cage, Varèse and others revealed – didn’t enter into it.

The kicker? Rock music also uses diatonic chord progression – the usual string is I–V–vi–IV (try it, then sing the Beatles’ ‘Let It Be’, Toto’s ‘Africa’, John Denver’s ‘Take Me Home, Country Roads’, etc). What’s more, the musicians who made it knew very well what they were doing. Some – like Rick Wakeman – were classically trained. When Ken Russell wanted to make a movie mashing rock music with Franz Liszt and Richard Wagner, it was Wakeman who did the adaptations.
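
For the curious, those Roman numerals can be spelled out mechanically – each numeral is just a scale degree, and the triad built on it. Here’s a quick Python sketch of my own (purely illustrative, nothing to do with anyone’s actual arrangements) that turns a progression into real chords in C major:

```python
# Spell a Roman-numeral progression out as actual chords in a major key.
MAJOR_SCALE_STEPS = [0, 2, 4, 5, 7, 9, 11]      # semitones above the tonic
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
QUALITIES = ["", "m", "m", "", "", "m", "dim"]  # diatonic triad quality per degree

def chord(key: str, degree: int) -> str:
    """Diatonic triad on the given scale degree (1-7) of a major key."""
    root = (NOTES.index(key) + MAJOR_SCALE_STEPS[degree - 1]) % 12
    return NOTES[root] + QUALITIES[degree - 1]

for name, degrees in [("I-V-vi-IV", [1, 5, 6, 4]), ("vi-ii-V-I", [6, 2, 5, 1])]:
    print(f"{name} in C:", " ".join(chord("C", d) for d in degrees))
# I-V-vi-IV in C: C G Am F
# vi-ii-V-I in C: Am Dm G C
```

Run it and the pop staple I–V–vi–IV in C comes out as C–G–Am–F – the exact chords of ‘Let It Be’.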

Today? The genre ‘made it’, to my mind, when astrophysicist and Total Rock God Brian May played ‘God Save The Queen’, on electric guitar, on the roof of Buckingham Palace. By invitation. Awesome! Music is music, ‘classical’ is but one corner; and the people who get ahead have got the chops. Here’s Dutch singer Floor Jansen with ‘O Mio Babbino Caro’ from Puccini’s 1918 comic opera Gianni Schicchi. Typical ‘classical’ singing – you know, when they didn’t have microphones and had to be heard over the orchestra.

And here’s Jansen again, with her band ReVamp:

Ernest Hemingway (J. F. Kennedy Presidential Library, released to public domain).

What does this have to do with writing? Attitudes of elitism are true of writing, too. Here in New Zealand, for instance, the academic community – in my experience – take the attitude that authors writing on their subjects for a popular market are not going to innovate; that these authors are ignorant of intellectual technique and not academically capable. I used to get it all the time when I wrote history commercially – a supposition that work had to be judged solely against the narrow criteria demanded of the academy. I was simply an intruding Neanderthal who, presumably, would be better off leaving the territory to the real experts who filled their material with incomprehensible but ego-boosting sentences built around the word ‘discourse’. The fact that books written to academic criteria often don’t innovate – and are virtually unreadable, even to other academics – doesn’t enter the calculation.

The reality – and this is where the rock music lesson comes in – is that most people who can write competently know exactly what they are doing, and can also innovate. It’s part of the territory. What’s more, many have the same qualifications as the academics who diss them. I do, for instance. But I don’t work for a university – or see the need to validate myself in the narrow terms academics use to assert status to each other.

It all comes down to the three basic questions all authors must ask themselves before putting pen to paper (well, finger to keyboard, these days):

1. What is the purpose of this piece of writing?
2. Who is the audience?
3. Why will they want to read this particular piece?

Everything else follows – the pitch, the tone, and the content. Intellectual rigour applies, whichever way the ideas are expressed. And it seems to me that the widest audience won’t be the one that likes reading the word ‘discourse’ when ‘conversation’ means the same thing.

Hemingway summed it up. Why use the ‘ten dollar’ words when there are other and better words that do the same thing?

Quite right, too. And that, I think, is true of all writing whatever the subject or genre.

Copyright © Matthew Wright 2014

Posing the vital question: are writers also readers?

I have a question to put to you. I posted earlier this week on the books I read as a kid, which have stayed with me.

Spot my title in the middle…

The reason a book ‘stays with you’ is its emotional impact at the time – and later. Now, that poses a question. You’d think that – as writers write – they’d draw a deeper emotional response from books and from reading than, perhaps, people who just read. Flip sides of the same experience, but with the writer deeper into it.

I wonder, though. It isn’t true for me. I find music offers the better experience, certainly in terms of engaging with it. Reading simply doesn’t engage me the same way.

But I write. I write a lot.

So I put it to you – does it follow that ‘writers’ must, by nature, draw their best emotional involvement from ‘reading’? Or is writing an expression of an emotional experience that writers draw, more fully, from all things – the world around them, life experiences, music and, in due place, their own reading? In the end, does it come down to individuals?

I draw a distinction here between reading to reverse-engineer how it was done – to examine the way different authors approached their subjects and learn from it – and reading for pleasure. I’m asking about the latter – in short, are writers also readers?

Copyright © Matthew Wright 2014

Can we view 9/11 as history? A Hobsbawmian perspective.

Do you remember what you were doing at the precise moment when you heard about the 11 September 2001 terror attacks on New York and Washington? I do – and I’m not American. I’m a Kiwi. But I remember. Here in New Zealand, on the other side of the date-line, initial news broke in the early hours of 12 September. My wife – listening to overnight talkback radio on earpieces – heard the news and jabbed me in the ribs. ‘Wake up, a plane’s hit a building in New York.’

Thinking it was a tragic accident, we got up to see whether anything was on TV. It was. And then the news got worse. Way worse. The fact that there was live coverage, here in New Zealand, underscored the scale of the tragedy as a world event.

A fireman calls for 10 more colleagues amidst the ruins of the World Trade Centre, September 2001. US Navy, public domain, via Wikimedia Commons.

That reveals the huge dimension of those events 13 years ago. A human tragedy of appalling scale that became a defining moment not just for New York, not just for the United States – but for the planet. One that helped shape the first decade of the twenty-first century for everybody in the developed world, not least because of the behaviours, attitudes and oppositions that followed, ranging from tighter security for air travellers to wars in Iraq and Afghanistan.

The time is not yet ripe to consider these events history, for they are not. But when they are – in two, three generations, when young children view 2001 much as we view 1941, a distant time of grandparents and great grandparents – how will we see the 9/11 attacks then?

The answer, to me, emerges from the way that history, for professional historians, is all about meaning – about finding the broad shapes and patterns that turn the world of the past into the world of the present. These patterns seldom match the number system we use to count the passing years.

When we look at history that way we cannot go past the work of Eric Hobsbawm, to my mind perhaps the greatest historian of the twentieth century. I do not make such a statement lightly. He took the long view. The historian’s view. A view divorced from the round-number dates into which we usually divide the past, like the end of a decade or a century.

For Hobsbawm, centuries were defined by the patterns of social and economic trends. That was why he called the nineteenth century a ‘long century’, marked by its ‘age of revolution’. To him, this century began in 1789 with the revolution that ended the ancien régime in France and began a pattern of industrial-driven social dislocation and revolt. It ended in 1914, when the ‘guns of August’ heralded the end of the old European order in its entirety. Of course the trends that led to these pivotal moments pre-dated the specific instant by decades. Nothing, historically, comes out of a vacuum. But these dates offered defining events that, for Hobsbawm, brought the wider underlying trends into a decisive and overt reality.

USS Arizona, 7 December 1941. Distances of history: in 2074 the tragedy of 9/11 will be as far removed in time as Pearl Harbor is today. How will people view it then? Public domain.

Following the same logic, Hobsbawm also argued that the twentieth century was ‘short’ – beginning in 1914, with that collapse of the old order and the rise, in its place, of a tripartite world in which democracy was initially on the losing side of totalitarian fascism and communism. That resolved with the victory (luckily) of democracy – an event Hobsbawm argued was marked by the collapse of the Soviet Union, the revolutionary state that had emerged from the First World War.

The decisive date, for Hobsbawm, was the collapse of the Soviet Union in 1991. By this reasoning the twenty-first century began in 1992. But I wonder. We cannot know our future – cannot say whether there will be any long and over-arching socio-political pattern to the twenty-first century. But so far, one does seem to be emerging, for the early part of it at least.

Like Hobsbawm’s long and short centuries, this shape has been defined by trends bubbling away well before the pivotal moment. They were evident for quite some time through the late twentieth century, partially masked by the over-powering priorities of the Cold War. But if we want to point, in Hobsbawmian fashion, to a defining moment – a point where those underlying issues suddenly became present and urgent in everyday consciousness, it has to be 9/11. Sure, that leaves us with a 9-year interregnum after the end of the twentieth century – but, as I say, history at the thematic level never does tidily match up with numeric dates or round numbers.

And will future historians look back on the twenty-first as a long century? A short one? That’s up to us, really – meaning, everybody on the planet – and the choices we make.

Copyright © Matthew Wright 2014

Why celebrity phone hacking is really everyone’s problem

Until last week, I’d never heard of Jennifer Lawrence, still less did I know that she apparently had salacious selfies on her phone’s cloud account. Now, it seems, everybody in the world has the news, and apparently the stolen pictures will be made into an art exhibition. Do I care (just checking the care-o-meter here)? No.

But what I do care about is the fact that the celebrity selfie hacking scandal is everyone’s problem.

My worry has got nothing to do with the way the public debate has been sidetracked by red-herring arguments, all flowing from the cult of celebrity that began, in the modern sense, as a Hollywood marketing device during the second decade of the twentieth century. That’s why these pictures get targeted. Hey – get a life. Celebrity Bits are the same as Everybody Else’s Bits. Get over it. Celebrities are also entitled to their privacy and property, just like everybody else.

No – the problem is the principle of data security. Everybody’s data security. It’s an arms race, on-line and off. People store all sorts of things on electronic media these days. Medical records, bank account details, passwords. Some of it ends up in the cloud. Some doesn’t, but even home computers may not be safe. Hacking goes on all the time, often looking for your bank account. It’s a sad indictment of human nature that those perpetrating this vandalism look on it as an assertion of superiority. I believe the term is ‘owned’, spelt ‘pwned’.

Artwork by Plognark http://www.plognark.com/ Creative Commons license

It’s not going to be resolved by passing laws or codes of conduct. Some immoral asshole out there, somewhere, will spoil the party.

All we can do is be vigilant. Various services are introducing two-step authentication, in which you can’t just log on with a password – you also have to enter a code generated by, or sent to, your phone.
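
For the curious, one common variant of that second step is the TOTP (‘authenticator app’) approach, where nothing is sent at all: your phone and the service share a secret once at set-up, and each independently derives a short code from that secret and the current time. Here’s a minimal sketch using the third-party pyotp library – my choice purely for illustration; the services themselves run their own implementations:

```python
import pyotp  # pip install pyotp

# Shared once at set-up time, usually via a QR code
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# Your phone and the service each compute the code independently;
# a match proves you hold the secret without it crossing the wire.
code = totp.now()
print("One-time code:", code)
print("Accepted?", totp.verify(code))  # True inside the ~30-second window
```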

You still need a strong password. I am amazed that the most popular password is – uh – ‘password’, pronounced ‘Yes, I WANT you to steal my stuff’. Other stupid passwords include ‘123456’, the names of pop-culture icons (‘HarryPotter’) or something published elsewhere, like your pet’s name.

But even a password that can’t be associated with you has to meet certain criteria. The reason is mathematical – specifically, combinatorial: the number of possible passwords is the size of the character set raised to the power of the password’s length, so it grows exponentially. In point of fact, the math of password security gets complex, because any human-generated password won’t be truly random – and terms such as ‘entropy’ enter the mix when figuring crackability. But at the end of the day, the more characters the better, and the more variables per character the better. Check this out:

  1. Any English word. There are around 1,000,000 unique words in English (including ‘callipygian’) but that’s not many for a hack-bot looking for word matches. Your account can be cracked in less than a minute.
  2. Mis-spelt English word. Doesn’t raise the odds. Hackers expect mis-spellings or number substitutions.
  3. Eight truly random lower case letters. Better. There are 208,827,064,576 combinations of the 26-letter alpha set in lower case.
  4. Eight truly random lower and upper case letters. Even better. These produce 53,459,728,531,456 potential passwords.
  5. Eight truly random keystrokes chosen from the entire available set. Best. There are 645,753,531,245,761 possible passwords.
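
If you want to verify those figures, the arithmetic is simple enough to script. Here’s a minimal Python sketch – note that the 71-character set is my own inference from the numbers above (71 to the 8th power is exactly 645,753,531,245,761), not a standard keyboard count – which also shows the safe way to generate such a password, with a cryptographically secure random source rather than a home-grown pattern:

```python
import secrets
import string

# Possible passwords = (size of character set) ** (password length)
print(f"8 random lowercase letters:  {26 ** 8:,}")  # 208,827,064,576
print(f"8 random mixed-case letters: {52 ** 8:,}")  # 53,459,728,531,456
print(f"8 random keys, 71-char set:  {71 ** 8:,}")  # 645,753,531,245,761

# Generating one: pick each character independently with a secure RNG.
alphabet = string.ascii_letters + string.digits + "!@#$%^&*("  # 71 characters
password = "".join(secrets.choice(alphabet) for _ in range(10))
print(password)
```

Run the same sum with an exponent of 10 and you get the figure in the next paragraph.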

If you use 10 truly random keystrokes, you end up with 3,255,243,551,009,881,201 possible combinations. But even that is still crackable, given time – so the other step is to change the password. Often.

Make it a habit. And – just out of interest, seeing as we’re talking about true randomness – does anybody know what the term ‘one-time pad’ means?

Copyright © Matthew Wright 2014

The real truth of the First World War

There has been a growing consensus among historians in recent years that the First and Second World Wars were not separate events. They were two acts in a 31-year drama that began in 1914.

Ration party of the Royal Irish Rifles on the Somme, probably 1 July 1916. Public domain, Wikimedia Commons.

Indeed, there are reasons to argue that this war was followed by a third act, set up by the collapse of the old order in the First World War – the rise of Communism, which was not resolved by the Second World War and led to the Cold War. That did not end until 1991. These events defined the society, politics and economics of the twentieth century; and it is for these reasons that Eric Hobsbawm has argued that this century – in those terms – was a ‘short’ century, beginning in 1914 and ending in 1991.

I’m inclined to agree. As far as the two World Wars are concerned there is little doubt about the integration between them. Briefly, the argument is this. In 1918, the German state collapsed, but the advancing Allies were still – certainly by George Patton’s estimate – a few weeks off being able to beat the German army. The result was that Germany essentially retained an unbroken field army. This was dispersed by Versailles, but the soldiers, brought up like the rest of Germany on the notion of ‘Reich’, felt cheated. Into the breach leaped a shell-shocked veteran of the Ypres front, sporting the Charlie Chaplin moustache he’d devised for gas-mask wear.

SMS Baden, one of the last of Germany’s First World War super-dreadnoughts. Public domain.

It wasn’t difficult for Hitler to whip up support based on the popular sense of injustice and denied destiny, drawing power from disaffected former soldiers, who formed a significant demographic group. Nor was it hard for him to find a sub-culture within Germany that could be blamed. All of this was wrapped in the guise of a ‘new order’, but actually it was not – the Nazis, in short, did not come out of a vacuum; they merely re-framed an idea that already existed. The British realised this connection as the Second World War came to an end and they wondered how to avoid repeating the mistakes of 1919. As early as 1943, Sir Robert Vansittart argued that Hitler was merely a symptom. The deeper problem was that Versailles hadn’t broken eighty-odd years’ worth of Bismarckian ‘Reich’ mentality.

This perspective demands a different view of the First World War. So far, non-military historians in New Zealand – working in ignorance of the military realities – have simply added an intellectual layer to the cliché of the First World War as a psychologically inexplicable void into which the rational world fell as a result of mechanistic international systems, the pig-headedness of stupid governments and the incompetence of chateau-bound general officers. There has even been an attempt by one New Zealand historian to re-cast Britain and the Allies as the aggressive, evil villains of the piece. Military historians have not been seduced by such fantasies, but have still been captured by a pervasive framework of sadness, remembrance and sacrifice. Into this, again for New Zealand, have been stirred mythologies of nationalism, of the ‘birth’ of today’s nation on the shores of Gallipoli in 1915. The result of this heady mix has been a narrow orthodoxy and an equally narrow exploration of events in terms of that orthodoxy.

Landing on D-Day, 6 June 1944. Photo by Chief Photographer’s Mate (CPHOM) Robert F. Sargent, U.S. Coast Guard. Public Domain.

I question this framework, not least because of the argument that the Second World War was a specific outcome of the First. The implication of the two being different aspects of a single struggle is clear: there are questions yet to be investigated about the ‘why’ of the First World War. The issue is the extent to which the ‘Reich’ mentality was perceived as a genuine threat in 1914, when Britain (in particular) debated whether to enter the conflict, and whether and how that answer drove the Allies to persist even after the available offence (infantry) had proven itself inadequate against the defence (wire, machine guns and trenches). We have to remember that fear of German imperialism had already driven Europe’s alliance structures from the 1880s. And, for New Zealand, the question is how that threat intersected with – and potentially drove – the sense of pro-British imperialism that did so much to define our mind-set in the generation before 1914.

These sorts of questions are beginning to be asked in British historical circles now. I keep being invited to symposia at various universities over there, where these matters are being discussed. Unfortunately we are a long way off being able to properly pose such queries in New Zealand. Yet, realistically, that interpretation needs to be explored. Perhaps I should do it. What do you think?

Copyright © Matthew Wright 2014