The real truth of the First World War

There has been a growing consensus among historians in recent years that the First and Second World Wars were not separate events. They were two acts in a 31-year drama that began in 1914.

Ration party of the Royal Irish Rifles on the Somme, probably 1 July 1916. Public domain, via Wikimedia Commons.

Indeed, there are reasons to argue that this war was followed by a third act, set up by the collapse of the old order in the First World War – the rise of Communism, which was not resolved by the Second World War and led instead to the Cold War. That did not end until 1991. These events defined the society, politics and economics of the twentieth century; and it is for these reasons that Eric Hobsbawm argued that this century – in those terms – was a ‘short’ century, beginning in 1914 and ending in 1991.

I’m inclined to agree. As far as the two World Wars are concerned there is little doubt about the integration between them. Briefly, the argument is this. In 1918, the German state collapsed, but the advancing Allies were still – certainly by George Patton’s estimate – a few weeks off being able to beat the German army. The result was that Germany essentially retained an unbroken field army. This was dispersed by Versailles, but the soldiers, brought up like the rest of Germany on the notion of ‘Reich’, felt cheated. Into the breach leaped a shell-shocked veteran of the Ypres front, sporting the Charlie Chaplin moustache he’d devised for gas-mask wear.

SMS Baden, one of the last of Germany’s First World War super-dreadnoughts. Public domain.

It wasn’t difficult for Hitler to whip up support based on the popular sense of injustice and denied destiny, drawing power from disaffected former soldiers, who formed a significant demographic group. Nor was it hard for him to find a sub-culture within Germany that could be blamed. All of this was wrapped in the guise of a ‘new order’, but in reality it was nothing of the kind – the Nazis, in short, did not come out of a vacuum; they merely re-framed an idea that already existed. This connection was realised by the British as the Second World War came to an end and they wondered how to avoid repeating the mistakes of 1919. As early as 1943, Sir Robert Vansittart argued that Hitler was merely a symptom. The deeper problem was that Versailles hadn’t broken eighty-odd years’ worth of Bismarckian ‘Reich’ mentality.

This perspective demands a different view of the First World War. So far, non-military historians in New Zealand – working in ignorance of the military realities – have simply added an intellectual layer to the cliché of the First World War as a psychologically inexplicable void into which the rational world fell as a result of mechanistic international systems, the pig-headedness of stupid governments and the incompetence of chateau-bound general officers. There has even been an attempt by one New Zealand historian to re-cast Britain and the Allies as the aggressive, evil villains of the piece. Military historians have not been seduced by such fantasies, but have still been captured by a pervasive framework of sadness, remembrance and sacrifice. Into this, again for New Zealand, have been stirred mythologies of nationalism, of the ‘birth’ of today’s nation on the shores of Gallipoli in 1915. The result of this heady mix has been a narrow orthodoxy and an equally narrow exploration of events in terms of that orthodoxy.

Landing on D-Day, 6 June 1944. Photo by Chief Photographer’s Mate (CPHOM) Robert F. Sargent, U.S. Coast Guard. Public domain.

I question this framework, not least because of the argument that the Second World War was a specific outcome of the First. The implication of the two being different aspects of a single struggle is clear: there are questions yet to be investigated about the ‘why’ of the First World War. The issue is the extent to which the ‘Reich’ mentality was perceived as a genuine threat in 1914, when Britain (in particular) debated whether to enter the conflict, and whether and how that answer drove the Allies to persist even after the available offence (infantry) had proven itself inadequate against the defence (wire, machine guns and trenches). We have to remember that fear of German imperialism had already driven Europe’s alliance structures from the 1880s. And, for New Zealand, the question is how that threat perception intersected with – and potentially drove – the sense of pro-British imperialism that did so much to define our mind-set in the generation before 1914.

These sorts of questions are beginning to be asked in British historical circles now. I keep being invited to symposia at various universities over there, where these matters are being discussed. Unfortunately we are a long way off being able to properly pose such queries in New Zealand. Yet, realistically, that interpretation needs to be explored. Perhaps I should do it. What do you think?

Copyright © Matthew Wright 2014

Fringe thinking fruit-loops or just misunderstood?

I am often bemused at the way some people seem to think. Particularly those who advocate what we might call ‘fringe’ theories.

Moeraki boulders, north of Dunedin – I took this photo in 2007. It’s been argued that they are weights used by Chinese sailors to raise sail. Given their well-documented geological origin – and the evident fact that they are not perfect spheres – that’s not a theory I believe myself, but hey…

These theories are often portrayed in pseudo-scientific terms: there is a hypothesis. Then comes the apparent basis for the hypothesis, frequently explicitly titled ‘the evidence’ or ‘the facts’. And finally, the fringe thinker tells us that this evidence proves the proposal. QED.

All of which sounds suitably watertight, except that – every time – the connection between the hypothesis and the evidence offered to support it is non-existent by actual scientific measure. Or the evidence is presented without proper context.

Some years ago I was asked to review a book which hypothesised that a Chinese civilisation had existed in New Zealand before what they called ‘Maori’ arrived. (I think they mean ‘Polynesians’, but hey…)

This Chinese hypothesis stood against orthodox archaeology, which discredited the notion of a ‘pre-Maori’ settlement as early as 1923 and has since shown that New Zealand was settled by Polynesians around 1280 AD. They were the first humans ever to walk this land. Their Polynesian settler culture later developed into a distinct form whose people called themselves Maori. In other words, the Maori never ‘arrived’ – they were indigenous to New Zealand.

This picture has been built from a multi-disciplinary approach; archaeology, linguistics, genetic analysis, and available oral record. Data from all these different forms of scholarship fits together. It is also consistent with the wider picture of how the South Pacific was settled, including the places the Polynesian settlers came from.

Nonetheless, that didn’t stop someone touring the South Island looking for ‘facts’ to ‘prove’ that a Chinese civilisation had been thriving here before it was (inevitably) conquered by arriving Maori. This ‘evidence’ was packed off to the Rafter Radiocarbon Laboratory in Gracefield, Lower Hutt, for carbon dating. And sure enough, it was of suitable age. Proof, of course, that the hypothesis had been ‘scientifically’ proven. Aha! QED.

Except, of course, it wasn’t proof at all. Like any good journalist I rang the head of the lab and discovered that they’d been given some bagged samples of debris, which they were asked to test. They did, and provided the answer without comment. The problem was that the material had been provided without context. This meant the results were scientifically meaningless.

I’m contemplating writing a book myself on the pseudo-science phenomenon with its hilarious syllogisms and wonderful exploration of every logical fallacy so far discovered. How do these crazy ideas get such traction? Why do they seem to appeal more than the obvious science?

Would anybody be interested if I wrote something on this whole intriguing phenomenon?

Copyright © Matthew Wright 2014



The paradox of Europe’s high-fat, low heart-disease diets

I am always fascinated by the way science occasionally comes up with ‘insoluble questions’ or ‘paradoxes’. After a while, these tricky queries go away because, it turns out, everybody was barking up a tree to which they had been led by an expert whose ideas had captured peer and public attention.

Photo I took of the Rue de Lafayette in central Paris, one night in 2004. I scoffed as much high-fat French cuisine as I could get down this boulevard. And it was delicious.

The big one, these days, is the link between high cholesterol and heart disease. This has been dogma for decades. After the Second World War, US scientists theorised that saturated fats contributed to high cholesterol, hence clogged arteries, and therefore caused heart disease. The idea was enshrined in a US Department of Agriculture guideline in 1980.

Low fat, it seemed, was the way ahead – and it was embraced by the food industry in the US, followed by large parts of the rest of the western world.

Except Europe. They didn’t much change – and traditional French, German and Italian cuisine is awash with saturated fats and high-cholesterol foods. Yet they suffer less heart disease and are less obese than Americans. What’s more, since 1980 obesity has become a major issue in the United States and other countries that have followed the US low-fat lead, such as New Zealand.

A paradox! Something science can’t explain. Or is it?

The problem is that research often tests only what can be funded, something frequently framed by commercial priorities. This framework is further shaped by one of the philosophical flaws of western rational thinking: the notion that complex questions can eventually be reduced to single-cause questions and answers.

Reality is far less co-operative. The real world isn’t black-and-white. It’s not even shades of grey. It’s filled with mathematically complex systems that can sometimes settle into states of meta-stability, or which appear to present superficial patterns to initial human observation – an observation framed by the innate human tendency to see patterns in the first instance.

For me, from my philosophical perspective, it’s intriguing that recent research suggests the link between saturated fat and ischemic (blood-flow related) heart disease is more tenuous than thought. Certainly it’s been well accepted – and was, even fifty years ago when the low-fat message was being developed – that cholesterol is utterly vital. If you had none at all in your system you’d die, because it plays a crucial role in human biochemistry on a number of levels. Cholesterol even makes it possible for you to synthesise Vitamin D when exposed to sunlight. It’s so important that humans produce it themselves – your liver actually makes it, for these reasons.

As I understand it, recent studies suggest that the effort to diagnose and fix the problem of ‘heart attacks’ based on a simplistic mid-twentieth century premise – something picked up by much of western society as dogma – has been one of the factors implicated in a new epidemic of health problems. There is evidence that the current epidemic of diabetes (especially Type 2) and other diseases is one symptom of the way carbohydrates were substituted for fatty foods a generation ago, and of the way food manufacturers also compensated for a reduction in saturated fats by adding sugar or artificial sweeteners. Use of corn syrup in the US, for example, is up by 198 percent on 1970 figures.

I’m not a medical doctor. And from the scientific perspective all this demands testing. But the intellectual mechanisms behind this picture seem obvious to me from the principles of logic and philosophy – I learned the latter, incidentally, at post-grad level from Peter Munz, one of only two students of both Karl Popper (whose falsificationism underpins modern scientific method) and Ludwig Wittgenstein (who theorised that language distorts understanding). I am in no doubt that language alone cannot convey pure concept; and I think the onus is on us to extend our understanding through careful reason – which includes being reasonable.

What am I getting at? Start with a premise and an if-then chain of reasoning, and you can build a compelling argument that is watertight of itself – but it doesn’t mean the answer is right. Data may be incomplete; or the interplay of possibilities may not be fully considered.

What follows? A human failing – self-evident smugness, pride in the ‘discovery’, followed by over-compensation that reverses the old thinking without properly considering the lateral issues. Why? Because very few people are equipped to think ‘sideways’, and scientists aren’t exceptions.

Which would be fine if it was confined to academic papers. But it isn’t. Is it.

Copyright © Matthew Wright 2014



Another counterblast to tobacco

As I enter Grumpy Old Man territory (a tad over 30, and I’m sticking to that) I find myself less and less tolerant of people who smoke around me.

James I of England, portrait by Daniel Mytens, 1621. Public domain, via Wikipedia.

I’ve never smoked. It’s a horrible habit. What’s more, it inflicts itself on other people whether they like it or not, and I don’t see why I need to put up with it. If people want to succumb to their nicotine addiction and kill themselves slowly with some really nasty carcinogens, that’s up to them – but I’d rather they didn’t spew those carcinogens out around me.

I’m not alone. Back in the early 1600s, King James I of England penned a tirade about the latest import from the Americas – tobacco. Smoking had become all the rage in his court, and he hated it. Smoking, he insisted, was a ‘stinking suffumigation’. And this, what’s more, came at a time when attitudes to personal hygiene were split. Everybody said you needed baths. King James said you didn’t. The real question in his court was who might be suffumigated first. But he was King. His ‘Counterblaste to Tobacco’ was one of the first anti-smoking tracts. And it wasn’t the last.

The New Zealand government passed laws forbidding smoking in public places in 2003. A lot of offices have followed suit, with the result that central city shop doorways are usually filled with people loitering in choking clouds of cigarette smoke. Or they light up and wander off down the street, leaving non-smokers behind them to choke in the trail. Certainly in central Wellington, the foot traffic is dense enough to make it very difficult to get past them.

It’s pretty inconsiderate as far as I am concerned. I don’t spit in their faces. Why are they spitting smoke into mine? Grrrr…

Copyright © Matthew Wright 2013

Behold the mighty power of rapatronics

Rapatronics. Sounds like science fiction, doesn’t it – or maybe a berserk new music style. Except it isn’t.

In fact, rapatronics – ‘Rapid Action Electronics’ – was a technology for taking ultra-high speed photographs, invented by Harold Edgerton in the 1940s. That’s right – around 70 years ago. The system used oscillating magnetic fields to polarise and then depolarise a Faraday cell made typically of flint glass, turning it briefly transparent and acting as a shutter with exposure times down to 4 millionths of a second.
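The optics behind that shutter can be sketched in a couple of lines of standard magneto-optics (these equations are textbook physics, not from Edgerton’s own papers): the Faraday cell sits between crossed polarisers, and the pulsed magnetic field rotates the plane of polarisation by an angle proportional to the field.

```latex
% Faraday rotation: the plane of polarisation rotates through
\beta = \mathcal{V} \, B \, d
% where \mathcal{V} is the Verdet constant of the flint glass,
% B is the applied magnetic flux density, and d the optical path length.
% Between crossed polarisers the transmitted fraction of light is
\frac{I}{I_0} = \sin^2 \beta
% so with no field the cell is opaque (\beta = 0), and pulsing the
% field to give \beta = 90^\circ opens the 'shutter' fully.
```

The shutter is thus purely electromagnetic – no moving parts – which is why the exposure can be so much shorter than anything a mechanical blind could achieve.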

Rapatronic picture of an atomic explosion, milliseconds after detonation. Spikes are extensions of the fireball into the guy ropes stabilising the testing tower. Mottling effect is caused by the bomb casing, already vapourised and reflecting off the shock front of the fireball. Public domain, via Wikimedia.

Pretty cool tech even by today’s standards, and what’s even cooler is that the principle of using magnetic fields to polarise material was discovered by Michael Faraday in 1845.

Thing is, with a shutter speed of a few millionths of a second you need a pretty bright flash to properly expose the film. An atomic flash, in fact. In 1947, Edgerton and two friends set up a company, EG&G, to make rapatronic cameras capable of photographing the first microseconds of nuclear test blasts. Each camera was good for one shot – there was no way of transporting roll film fast enough – so Edgerton typically ganged up a rack of them to take a series of shots at millisecond intervals. They were in operation by 1950 and used, for the last time, in 1962. By then the US, Britain and Soviet Union were already talking about a nuclear test ban treaty; it was signed the following year, ending all nuclear tests except those held underground.

Edgerton also used his equipment to photograph hummingbirds in flight for the first time, at much slower shutter speeds, and bullets passing through playing cards; and he was still working on camera systems in the 1980s – notably a strobe system that could take motion pictures of creatures that normally moved too slowly to be detected.

But his ghostly monochrome images of those atomic weapons tests remain perhaps the iconic demonstration of his inventiveness – and a sobering reminder of the wider mind-set of that age. The mid-twentieth century was still the age when humanity believed nature could be conquered. The atomic weapons and cameras used to photograph them ran to the edges of the laws of physics. It was an age when all things ‘atomic’ symbolised high-tech, superiority and power. When bigger was better – including, for a while, atomic bombs.

I still wonder how we got away with the twentieth century – why the world didn’t dissolve into armageddon, probably by accident. But we did get away with it. The dangerous stand-off was defused. Sanity prevailed.

Next time, of course, we may not be so lucky.

Copyright © Matthew Wright

History never repeats, except a bit…

Apple was reportedly the subject of an employee lawsuit last week.

Detail from an engraving of a factory in Soho, Birmingham, c1820. Matthew Wright coll., public domain.

Apparently, workers at their store are searched on leaving the workplace to make sure they haven’t pocketed product.

The action by two former employees is, reportedly, not because this is demeaning and assumes employees are thieves by default. Oh no. It’s because the workers apparently haven’t been paid while waiting to prove their innocence.

Doubtless truth will out, but on the face of the media reports – doesn’t this reverse the principle of innocent-until-proven-guilty that the western justice system rests on? As a friend of mine pointed out, if an employee is asked to submit to a search – meaning their integrity is being questioned – surely the accused can reasonably request that the police be called to properly investigate what is, by any measure, a very serious allegation?

Meanwhile, on the other side of the Atlantic there’s a report in the UK about ‘zero-hours’ employment – a nominally exclusive arrangement (no secondary job) under which the employer picks and chooses the hours the employee works and is paid for. According to the Guardian, Buckingham Palace, among others, uses the system.

To me, that one harks back 200+ years to the early industrial revolution when workers lined up outside factories in the hope of being selected for a day’s work.

I suppose someone will invent work-houses next, places to humiliate and starve those whose misfortune is not of their making, but who can be conveniently blamed for it anyway. ‘More, Mr Twist? You want MORE?’

History never repeats in the specific; cultures change over time, and ideals and values move with them. Still, in the long game of history it is possible to see patterns – to see swings, usually between extremes, punctuated by periods of reason. But underlying human nature doesn’t change, and if we look back we can see the same patterns of power, of injustice, of have and have-not emerging time and again – a common human pattern, irrespective of how it is intellectualised and couched in the moment.

Which makes me wonder. In this age of buzz-words such as ‘solution’, will there be a moment when some wonk, without the slightest trace of irony – and in profound ignorance of what they are actually saying – comes up with a ‘final’ solution to some problem or other?

I think I’d laugh. And then…then I think I’d get very scared.

Copyright © Matthew Wright 2013

Anyone for a PINT? What I dislike about psychometrics

There is an outcry here in New Zealand at the moment about the way psychometric testing is being used to select public servants and others for redundancy. And quite rightly, too. One aggrieved victim has already obtained a $15,000 settlement in the employment court over it.

As far as I am concerned psychometrics are pseudoscience. Some stranger gives you questions based on a pop-theory about human behaviours and characteristics. None of them fit how you think, but you fumble through anyhow.

Then this stranger, who has never met you before and is ignorant of you as a rounded person, informs you what sort of person you Really Are. You’re classified, pigeon-holed and put into your box. Or is that ‘place’?

I recall, years ago, being told what sort of person I was after such a test. When I objected, I was told this was because I was the sort of person who would object. Quite. There are words to describe people who follow this particular tautology.

What I object to is the arbitrariness. Most of these systems are based on how some psychologists imagine people should be. Yes, it fits some broad character archetypes. And people can usually see aspects of themselves in the results, once they’ve heard them (think about what that actually means).

But these tests are framed by the mind-set of those who create them – something defined by time and culture. A lot of psychometrics harks back to thinking of the early-mid twentieth century, with its mechanistic ways of deconstructing and classifying complex systems, its notions of uniformity, and its arbitrary way of handling shades of grey.

Early twentieth century psychology was relentlessly guided by the period need to reduce and systematise humanity, just as the wider world was being systematised. Hence Jung’s work on psychological types and classifications which eventually fed into the Myers-Briggs reduction of complex human reality to just sixteen slots.

Psychometric testing is also culture-centric. The classic example is the IQ test posed in the 1920s to European migrants hoping to enter the US. They were stopped at Ellis Island and tested. One of the questions was a drawing of a house without a chimney; the task was to add the missing item. To those brought up in Eastern Europe the missing item was a cross over the door. But that wasn’t the ‘right’ answer, and they missed other culturally-framed questions the same way – ergo, they were ‘morons’, and were sent away again. Some were killed by the Nazis a few years later.

But the limits of psychometric testing haven’t stopped its adoption by corporates. Why? Because these tests classify people in ways that can be enumerated, like accounts. And the field has attracted a lot of pseudo-scientists – even people with qualifications in psychology – who have filled the market with ingenious, glib and corporate-friendly systems for fitting people into trendy theory. ‘Hey, here’s a test for reducing the human condition to twenty questions and four character types arrayed in a polyhedron.’

I have put much of my adult life into trying to understand the human condition – how it has framed history, how it frames us now; and I think one of our faults is our ability to over-rationalise and lead ourselves down fantasy paths.

Psychometrics. Useful tool – or arbitrary systems for pigeon-holing people that we’ve inherited from an early-mid twentieth century that also brought us eugenics? Your thoughts?

Copyright © Matthew Wright 2013