Should we be dispassionate about writing – like Spock?

The other week I argued that Terry Brooks’ Sword of Shannara was a poorly written Tolkien rip-off that put me off the rest of the novels. Responses fell into two camps – people who agreed and thought the whole Shannara series was dismal; and those who were offended.

Fair point. People don’t have to agree – indeed, differing opinions are great, because they push discussion. And maybe something nobody thought of will come out of it. That’s what counts. Good stuff.

But what intrigued me about the discussion was the level of emotion it provoked in one or two places. A couple of the responses were – well, a bit personal. Surely it’s possible to chat about the abstract value or otherwise of books? And then I got to thinking. In some ways it isn’t, because the purpose of both reading and writing is emotional.

Authors write because they get an emotional satisfaction from doing so. Readers read because of the emotional journey it produces. By stating the opinion that I – and apparently others – have of Brooks, I’d affirmed one camp’s feelings. But I’d also trodden on the toes of others, who get a positive charge from reading his material.

The question, then, is whether writers and readers should step back from the emotion. In some ways I don’t think that’s possible for reading, because the very purpose of reading is to have an emotional experience. People read to become entangled in the emotional journey – be it to learn something, to feel validated, to find a sense of place, or simply to be distracted. For writers, however, I think stepping back is essential.

Yes, authors write because they get their own emotional satisfaction from doing so – from producing material that meets a need of their own and that will take others on an emotional journey. But at the same time, the clarity of thought this process requires demands abstraction. How often have you written something in the heat of the moment and then, later, read through it and realised it was foolish?

Authors have to be able not only to include the intended emotion, but also to step back from their own entanglements from time to time – to look at what they are producing from a more abstract perspective. Only then can the content and intent become properly clear – and the emotional journey on which they are going to take the reader emerge in balance. Really, we all have to approach writing the way Spock would.

Seething with emotion underneath – sure – but not letting that get in the way of careful thought and analysis. Thoughts?

Copyright © Matthew Wright 2015


Do societies re-package their narratives of recent events? And is that ‘history’?

The other day a reader commented on a post I’d written about 9/11 as history and pointed out, quite rightly, that it doesn’t take long for events to be ‘packaged’ in ways that stand against the historian’s more dispassionate requirement to understand them.

The cover of ‘Shattered Glory’. Out of print (sigh…)

I agree. There’s no doubt in my mind that dramatic events affecting whole societies are swiftly re-invented by those who live through them. Not least because of emotional entanglement with what’s just happened. This is normal, historically. I traced just such a re-invention of New Zealand’s 1915 Gallipoli defeat in my book Shattered Glory (Penguin 2010). By April 1916, just five months after the stalled campaign ended in an ignominious retreat, it had been re-cast as a glorious victory, because it was a sacrifice for Empire. This reflected prevailing pop-sentiment of the day towards our place in a wider British Empire and helped address grief at the death toll, which was colossal for a country of just under 1 million souls. But the conception of Gallipoli as triumph was the exact opposite of the military defeat and human truth; a demonstration of the way societies, en masse, rationalise events to suit immediate emotional needs. And it had an impact on our view of history because, in a demonstration of the stickiness of re-invention, that view is largely what guides the popular conception of New Zealand’s Gallipoli experience today, nearly a century on.

So can we analyse recent events ‘historically’, in the same sense that we can analyse something that happened a century or two ago? History-as-discipline is one of the intellectual pursuits that self-examines its analytical philosophy. Hobsbawm, for instance, didn’t divide history by round-number centuries but by events – typically political and social (‘social’ inevitably encompasses ‘economic’, which, despite the ‘hardening’ of economics with a mathematical over-gloss since the late 1940s, is at heart about society).

To Hobsbawm, the nineteenth century was ‘long’, book-ended by the French Revolution of 1789 and the First World War of 1914; the twentieth century, by contrast, was ‘short’, framed by the outbreak of the First World War in 1914 and the end of the Cold War in 1992.

Those arguments were possible because Hobsbawm stood at the end of the cycles; they were evident to him and he had a distance to perceive what had happened in fully historical terms, certainly as far as the ‘long’ nineteenth century was concerned. But what about things that have just happened? Things we popularly call ‘historic’ but which still burn fresh in memory and haven’t achieved the more sonorous quiet of a deeper past?

To me there are several issues. The first is the problem of context. Sometimes, the deeper over-arching forces that drive the widest patterns of history – combinations of long-standing technological, social, political, ideological and, it seems, environmental factors – aren’t obvious for decades afterwards. We can’t tell precisely what a particular development may mean until it’s put into the context not only of what went before, but also of what went after – and, usually, some time after. Last week’s, last year’s or even last decade’s news won’t cut it in these terms.

The second issue is the related one of emotional perspective. It takes about 25–30 years, or more, for one generation’s problem to be resolved and replaced by another; and also for the people primarily involved in it to be far enough back to be treated with the (ideally) abstract dispassion of history. It is only now, for instance, that we are seeing treatment of Winston Churchill that moves beyond the pro- and anti-partisanship of his life and the immediate decades after his death.

Me, on the Bridge over the River Kwai, a place that brings the human condition into sharp relief. Something happened to me five minutes after this photo was taken that gives the lie to notions of ‘rational egoism’. Ask me in the comments.

Thirdly there’s the ‘recency’ phenomenon, in which we tend to view events just gone as larger than those further back, at the cost of proportion. This also fuels a tendency to view whatever just happened as the arbiter of the future. Take the Cold War, which – via Hobsbawm’s thesis – was a temporary product of the way the old world collapsed in 1914–19. But you wouldn’t have known that living in the middle of it. And when it did finish with the predictable collapse of the Communist economy, Francis Fukuyama insisted that history had ended – that Western capitalist ideology, as he defined it, had won, and there would be no further change. Ouch. This was ‘recency’ on full display.

The reality of abstract historical analysis, of course, is that it has nothing to do with ‘direction’ or ‘progress’ towards an inevitable or ideal one-dimensional ‘end’ such as I believe was implied by Fukuyama. Indeed, by definition, history cannot end. It’s a product of human change through time; and the onus is on historians to understand that deeper human condition, the ‘unity in diversity’ beloved of social anthropology, as a pre-requisite to being able to understand how that then expresses itself in ever-smaller scales of detail when framed by a specific society.

I’ve found through my own work in the field that practical detail changes affecting a specific society usually happen generationally – sometimes imperceptibly, sometimes with sharper impact, as happened in the 1960s when the generation brought up in the wake of the Second World War objected to the philosophy of their parents.

And so we have the tools with which to approach the issue of ‘recent’ history. The pitfalls of those tools may not be fully overcome – indeed, logically, they cannot be; but to know they are there, and to understand how these limitations work, is, I think, a very great step towards being able to couch recent events in a more dispassionate light.

Copyright © Matthew Wright 2015


The real truth of the First World War

There has been a growing consensus among historians in recent years that the First and Second World Wars were not separate events. They were two acts in a 31-year drama that began in 1914.

Ration party of the Royal Irish Rifles on the Somme, probably 1 July 1916. Public domain, Wikimedia Commons.

Indeed, there are reasons to argue that this war was followed by a third act, set up by the collapse of the old order in the First World War – the rise of Communism, which was not resolved by the Second World War and led to the Cold War. That did not end until 1992. These events defined the society, politics and economics of the twentieth century; and it is for these reasons that Eric Hobsbawm has argued that this century – in those terms – was a ‘short’ century, beginning in 1914 and ending in 1992.

I’m inclined to agree. As far as the two World Wars are concerned there is little doubt about the integration between them. Briefly the argument is this. In 1918, the German state collapsed, but the advancing Allies were still – certainly by George Patton’s estimate – a few weeks off being able to beat the German army. The result was that Germany essentially retained an unbroken field army. This was dispersed by Versailles, but the soldiers, brought up like the rest of Germany on the notion of ‘Reich’, felt cheated. Into the breach leaped a shell-shocked veteran of the Ypres front, sporting the Charlie Chaplin moustache he’d devised for gas-mask wear.

SMS Baden, one of the last of Germany’s First World War super-dreadnoughts. Public domain.

It wasn’t difficult for Hitler to whip up support based on the popular sense of injustice and denied destiny, drawing power from disaffected former soldiers who formed a significant demographic group. It was also not hard for him to find a sub-culture within Germany that could be blamed. All of this was wrapped in the guise of a ‘new order’, but in reality it was nothing of the sort – the Nazis, in short, did not come out of a vacuum; they merely re-framed an idea that already existed. This connection was realised by the British as the Second World War came to an end and they wondered how to avoid repeating the mistakes of 1919. As early as 1943, Sir Robert Vansittart argued that Hitler was merely a symptom. The deeper problem was that Versailles hadn’t broken eighty-odd years’ worth of Bismarckian ‘Reich’ mentality.

This perspective demands a different view of the First World War. So far, non-military historians in New Zealand – working in ignorance of the military realities – have simply added an intellectual layer to the cliché of the First World War as a psychologically inexplicable void into which the rational world fell as a result of mechanistic international systems, the pig-headedness of stupid governments and the incompetence of Chateau-bound general officers. There has even been an attempt by one New Zealand historian to re-cast Britain and the Allies as the aggressive, evil villains of the piece. Military historians have not been seduced by such fantasies, but have still been captured by a pervasive framework of sadness, remembrance and sacrifice. Into this, again for New Zealand, have been stirred mythologies of nationalism – of the ‘birth’ of today’s nation on the shores of Gallipoli in 1915. The result of this heady mix has been a narrow orthodoxy and an equally narrow exploration of events in terms of that orthodoxy.

Landing on D-Day, 6 June 1944. Photo by Chief Photographer’s Mate (CPHOM) Robert F. Sargent, U.S. Coast Guard. Public Domain.

I question this framework, not least because of the argument that the Second World War was a specific outcome of the First. The implication of the two being different aspects of a single struggle is clear; there are questions yet to be investigated about the ‘why’ of the First World War. The issue is the extent to which the ‘Reich’ mentality was perceived as a genuine threat in 1914, when Britain (in particular) debated whether to enter the conflict – and whether and how that answer drove the Allies to persist even after the available offence (infantry) had proven itself inadequate against the defence (wire, machine guns and trenches). We have to remember that fear of German imperialism had already driven Europe’s alliance structures from the 1880s. And, for New Zealand, the question is how that intersected with – and potentially drove – the sense of pro-British imperialism that did so much to define our mind-set in the generation before 1914.

These sorts of questions are beginning to be asked in British historical circles now. I keep being invited to symposia at various universities over there, where these matters are being discussed. Unfortunately we are a long way off being able to properly pose such queries in New Zealand. Yet, realistically, that interpretation needs to be explored. Perhaps I should do it. What do you think?

Copyright © Matthew Wright 2014

Fringe thinking fruit-loops or just misunderstood?

I am often bemused at the way some people seem to think. Particularly those who advocate what we might call ‘fringe’ theories.

I took this photo of the Moeraki boulders in 2007. The fact that they are not perfect spheres is evident.

Moeraki boulders, north of Dunedin. It’s been argued that they are weights used by Chinese sailors to raise sail. As I know their natural geological origin, that’s not a theory I believe myself, but hey…

Such theories are often portrayed in pseudo-scientific terms: there is a hypothesis. Then comes the apparent basis for the hypothesis, frequently explicitly titled ‘the evidence’ or ‘the facts’. And finally, the fringe thinker tells us that this evidence therefore proves the proposal. QED.

All of which sounds suitably watertight, except that – every time – the connection between the hypothesis and the evidence offered to support it is non-existent by actual scientific measure. Or the evidence is presented without proper context.

Some years ago I was asked to review a book which hypothesised that a Chinese civilisation had existed in New Zealand before what they called ‘Maori’ arrived. (I think they mean ‘Polynesians’, but hey…)

This Chinese hypothesis stood against orthodox archaeology, which discredited the notion of a ‘pre-Maori’ settlement as early as 1923 and has since shown that New Zealand was settled by Polynesians around 1280 AD. They were the first humans ever to walk this land. Their Polynesian settler culture later developed into a distinct form whose people called themselves Maori. In other words, the Maori never ‘arrived’ – they were indigenous to New Zealand.

This picture has been built from a multi-disciplinary approach: archaeology, linguistics, genetic analysis and the available oral record. Data from all these different forms of scholarship fits together. It is also consistent with the wider picture of how the South Pacific was settled, including the places the Polynesian settlers came from.

Nonetheless, that didn’t stop someone touring the South Island looking for ‘facts’ to ‘prove’ that a Chinese civilisation had been thriving here before it was (inevitably) conquered by the arriving Maori. This ‘evidence’ was packed off to the Rafter Radiocarbon Laboratory in Gracefield, Lower Hutt, for carbon dating. And sure enough, it was of suitable age. Proof, of course, that the hypothesis had been ‘scientifically’ proven. Aha! QED.

Except, of course, it wasn’t proof at all. Like any good journalist I rang the head of the lab and discovered that they’d been given some bagged samples of debris, which they were asked to test. They did, and provided the answer without comment. The problem was that the material had been provided without context. This meant the results were scientifically meaningless.

I’m contemplating writing a book myself on the pseudo-science phenomenon with its hilarious syllogisms and wonderful exploration of every logical fallacy so far discovered. How do these crazy ideas get such traction? Why do they seem to appeal more than the obvious science?

Would anybody be interested if I wrote something on this whole intriguing phenomenon?

Copyright © Matthew Wright 2014


The paradox of Europe’s high-fat, low heart-disease diets

I am always fascinated by the way science occasionally comes up with ‘insoluble questions’ or ‘paradoxes’. After a while, these tricky queries go away because, it turns out, everybody was barking up a tree to which they had been led by an expert whose ideas had captured peer and public attention.

Photo I took of the Rue de Lafayette in central Paris, one night in 2004. I scoffed as much high-fat French cuisine as I could get down this boulevard. And it was delicious.

The big one, these days, is the link between high cholesterol and heart disease.  This has been dogma for decades. After the Second World War, US scientists theorised that saturated fats contributed to high cholesterol, hence clogged arteries, and therefore caused heart disease. The idea was enshrined in a US Department of Agriculture guideline in 1980.

Low fat, it seemed, was the way ahead – and it was embraced by the food industry in the US, followed by large parts of the rest of the western world.

Except Europe. They didn’t much change – and traditional French, German and Italian cuisine is awash with saturated fats and high-cholesterol foods. Yet they suffer less heart disease and are less obese than Americans. What’s more, since 1980 obesity has become a major issue in the United States and other countries that have followed the US low-fat lead, such as New Zealand.

A paradox! Something science can’t explain. Or is it?

The problem is that research often tests only what can be funded, something often framed by commercial priorities. This framework is further shaped by one of the philosophical flaws of western rational thinking: the notion that complex questions can eventually be reduced to single-cause questions and answers.

Reality is far less co-operative. The real world isn’t black-and-white. It’s not even shades of grey. It’s filled with mathematically complex systems that can sometimes settle into states of meta-stability, or that appear to present superficial patterns to initial human observation – an observation framed by the innate human tendency to see patterns in the first instance.

For me, from my philosophical perspective, it’s intriguing that recent research suggests that the link between saturated fat and ischemic (blood-flow related) heart disease is more tenuous than thought. Certainly it’s been well accepted – and was, even fifty years ago when the low-fat message was being developed – that types of cholesterol are utterly vital. If you had none at all in your system, you’d die, because it plays a crucial role in human biochemistry on a number of levels. Cholesterol even makes it possible for you to synthesise Vitamin D when exposed to sunlight. It’s one of the things humans can produce – your liver actually makes it, for these reasons.

As I understand it, recent studies suggest that the effort to diagnose and fix the problem of ‘heart attacks’ based on a simplistic mid-twentieth century premise – something picked up by much of western society as dogma – has been one of the factors implicated in a new epidemic of health problems. There is evidence that the current epidemic of diabetes (especially Type 2) and other diseases is one symptom of the way carbohydrates were substituted for fatty foods a generation ago, and of the way food manufacturers also compensated for a reduction in saturated fats by adding sugar or artificial sweeteners. Use of corn syrup in the US, for example, is up by 198 percent on 1970 figures.

I’m not a medical doctor. And from the scientific perspective all this demands testing. But the intellectual mechanisms behind this picture seem obvious to me from the principles of logic and philosophy – I learned the latter, incidentally, at post-grad level from Peter Munz, one of only two students of both Karl Popper (the inventor of modern scientific method) and Ludwig Wittgenstein (who theorised that language distorts understanding). I am in no doubt that language alone cannot convey pure concept; and I think the onus is on us to extend our understanding through careful reason – which includes being reasonable.

What am I getting at? Start with a premise and an if-then chain of reasoning, and you can build a compelling argument that is watertight of itself – but it doesn’t mean the answer is right. Data may be incomplete; or the interplay of possibilities may not be fully considered.

What follows? A human failing – self-evident smugness, pride in the ‘discovery’, followed by over-compensation that reverses the old thinking without properly considering the lateral issues. Why? Because very few people are equipped to think ‘sideways’, and scientists aren’t exceptions.

Which would be fine if it was confined to academic papers. But it isn’t. Is it.

Copyright © Matthew Wright 2014


Another counterblast to tobacco

As I enter Grumpy Old Man territory (a tad over 30, and I’m sticking to that) I find myself less and less tolerant of people who smoke around me.

James I of England, portrait by Daniel Mytens, 1621. Public domain, via Wikipedia.

I’ve never smoked. It’s a horrible habit. What’s more, it inflicts itself on other people whether they like it or not, and I don’t see why I need to put up with it. If people want to succumb to their nicotine addiction and kill themselves slowly with some really nasty carcinogens, that’s up to them – but I’d rather they didn’t spew those carcinogens out around me.

I’m not alone. Back in the early 1600s, King James I of England penned a tirade about the latest import from the Americas – tobacco. Smoking had become all the rage in his court, and he hated it. Smoking, he insisted, was a ‘stinking suffumigation’. And this, what’s more, came at a time when attitudes to personal hygiene were split. Everybody said you needed baths. King James said you didn’t. The real question in his court was who might be suffumigated first. But he was King. His ‘Counterblaste to Tobacco’ was one of the first anti-smoking tracts. And it wasn’t the last.

The New Zealand government passed laws forbidding smoking in public places in 2003. A lot of offices have followed suit, with the result that central city shop doorways are usually filled with people loitering in choking clouds of cigarette smoke. Or they light up and wander off down the street, leaving non-smokers behind them to choke in the trail. Certainly in central Wellington, the foot traffic is dense enough to make it very difficult to get past them.

It’s pretty inconsiderate as far as I am concerned. I don’t spit in their faces. Why are they spitting smoke into mine? Grrrr…

Copyright © Matthew Wright 2013

Behold the mighty power of rapatronics

Rapatronics. Sounds like science fiction, doesn’t it – or maybe a berserk new music style. Except it isn’t.

In fact, rapatronics – ‘Rapid Action Electronics’ – was a technology for taking ultra-high-speed photographs, invented by Harold Edgerton in the 1940s. That’s right – around 70 years ago. The system used a pulsed magnetic field to rotate the plane of polarised light passing through a Faraday cell, typically made of flint glass, sandwiched between crossed polarising filters – briefly letting light through and acting as a shutter with exposure times down to 4 millionths of a second.
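
For the technically minded, the optics can be sketched in general terms – this is standard magneto-optics rather than Edgerton’s specific figures, which I don’t have to hand. The magnetic pulse rotates the plane of the polarised light by an angle

\[
\beta = V B d
\]

where \(V\) is the Verdet constant of the flint glass, \(B\) the applied magnetic flux density and \(d\) the path length through the cell. With the cell sitting between crossed polarising filters, Malus’s law gives the transmitted fraction of light as

\[
T = \sin^{2}\beta
\]

which is effectively zero until the pulse arrives, peaks as \(\beta\) approaches 90 degrees, and falls back as the pulse decays – hence a ‘shutter’ that is open only for as long as the pulse lasts.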

Rapatronic picture of an atomic explosion, milliseconds after detonation. Spikes are extensions of the fireball into the guy ropes stabilising the testing tower. Mottling effect is caused by the bomb casing, already vapourised and reflecting off the shock front of the fireball. Public domain, via Wikimedia.

Pretty cool tech even by today’s standards, and what’s even cooler is that the principle of using magnetic fields to polarise material was discovered by Michael Faraday in 1845.

Thing is, with exposure times measured in millionths of a second you need a pretty bright flash to properly expose the film. An atomic flash, in fact. In 1947, Edgerton and two friends set up a company, EG&G, to make rapatronic cameras capable of photographing the first microseconds of nuclear test blasts. Each camera was good for one shot – there was no way of transporting roll film fast enough, so Edgerton typically ganged up a rack of them to take a series of shots at millisecond intervals. They were in operation by 1950 and used, for the last time, in 1962. By then the US, Britain and the Soviet Union were already talking about a nuclear test ban treaty; it was signed the following year, ending all nuclear tests except those held underground.

Edgerton also used his shutters to photograph hummingbirds in flight for the first time, at much slower shutter speeds; he photographed bullets passing through playing cards, and was still working on camera systems in the 1980s – notably a strobe system that could take motion pictures of creatures that normally moved too slowly to be detected.

But his ghostly monochrome images of those atomic weapons tests remain perhaps the iconic demonstration of his inventiveness – and a sobering reminder of the wider mind-set of that age. The mid-twentieth century was still the age when humanity believed nature could be conquered. The atomic weapons and cameras used to photograph them ran to the edges of the laws of physics. It was an age when all things ‘atomic’ symbolised high-tech, superiority and power. When bigger was better – including, for a while, atomic bombs.

I still wonder how we got away with the twentieth century – why the world didn’t dissolve into armageddon, probably by accident. But we did get away with it. The dangerous stand-off was defused. Sanity prevailed.

Next time, of course, we may not be so lucky.

Copyright © Matthew Wright