3D printed steak chips? It’s enough to make me go all hippy and vegetarian…

Human inventiveness seems limitless these days, so I wasn’t surprised to discover the other week that food technologists have been experimenting with 3D printed meat – currently produced, at astronomical expense, in the shape of chips.

Gallus gallus domesticus on Rarotonga, looking very much like the Red Jungle Fowl (Gallus gallus).

I’ll have my chicken free-range and wild, thanks…

Artificial food has been a long-standing SF staple – brilliantly played by Arthur C. Clarke in his hilarious 1964 satire ‘The Food of the Gods’. All food in this future was synthesised to the point where the very idea of eating something once alive had become offensive. Even the word ‘carnivore’ had to be spelt out, lest it nauseate listeners, and synthetic meat had names unassociated with animals. In classic Clarke fashion, of course, there was a twist. Food synthesisers could produce anything. And there was this synth-meat called ‘Ambrosia Plus’, which sold like hotcakes until a rival company found out what the prototype was… (I won’t spoil the fun other than to point out that there’s a word for a specific sort of meat-eating starting with ‘c’, and it isn’t ‘carnivore’.)

In the real world, 3D printed meat isn’t synthetic – it’s made of actual animal muscle cells which are artificially cultured and then sprayed, in layers, to produce the product. Currently it’s a lab technique, and the obvious challenge for its proponents is to find ways of industrialising it. Also of getting customers past the ‘ewwww’ factor of eating animal tissue bred in a petri dish and vomited into chip shape through a nozzle.

To my mind the key challenge is identifying the total energy requirement – printed meat may NOT be as efficient as current ‘natural’ methods of getting meat to your dinner table, where a large part of the energy comes from sunlight, via a grassy paddock and the digestive systems of ruminants.

Mercifully, we haven’t been told ‘This Is The Way ALL Meat Will Be Eaten In Future’, ‘The Future Is Now’ and other such drivel. Predictions of that sort pivot off the ‘recency effect’, by which whatever just happened is seen as far more important than it really is when set against the wider span of history. We fall into that trap quite often – often, these days, over products launched on the back of commercial ambition. What really happens is that the ‘way of the future’ idea joins a host of others. All of these then blend together and react with society in ways that eventually – and usually generationally – produce changes, but inevitably not the ones predicted by the ‘Future Is Here’ brigade.

In one of the ironies of the way we usually imagine our future, things that do dramatically change the way we live – such as the internet – are often not seen coming, nor touted as game-changers. Certainly not in the way that food pills, flying cars and the cashless society have been.

As for artificial meat – well, I expect that if – IF – it can be industrialised, it’ll find a home in hamburger patties. But there seems little chance of it being mistaken for the real deal, still less supplanting a delicious slab of dead cow – seared sirloin – on the dinner table.

Copyright © Matthew Wright 2015

When ethics overcome history

Another iconic building in my home town, Napier, New Zealand, bit the dust a while back. The Williams building – 103 years old – survived both the devastating 1931 earthquake and the fire that followed.

Panorama I took of Napier’s Hastings Street, Williams Building to the far right.

Now it’s gone down before the wrecking ball. And a good thing too. You see, it apparently only met 5 percent of the current earthquake-proofing standard. Ouch. Surviving the 1931 quake and retaining its structural integrity were, it seems, two different things.

The Williams building going…going… Click to enlarge.

It’s the latest in a succession of quake-risk demolitions around the city. A few structures – such as the Paxie building, centre in the photo above, or the old State Theatre (where I first saw Star Wars in 1977) – have been gutted and the facades preserved. But original ‘deco’ buildings of the 1930s are limited to a couple of city blocks. A single heritage precinct. When I was a kid, deco filled the town.

….and gone…. Click to enlarge

I know, I can hear the howls of protest now. ‘But – but – you’re interested in history…how can you support knocking it down?’

Easy. History is more than the artefacts it leaves behind, anyway, but the real calculation is more immediate. A few years back, Napier’s Anglican Cathedral hall was also under threat of demolition, in part because it was a pre-quake masonry structure. The Historic Places Trust approached me, wanting me to put my authority and repute as a nationally known historian behind their effort to have it listed and legally protected. I was well aware of that history, of course. But I knew the building was a quake risk – and I hadn’t been given any engineering reports on which to base the professional opinion Historic Places was asking me to provide.

The biggest horror story of the 1931 quake was the way a doctor had to euthanise a badly injured woman who was trapped in the ruins of the cathedral – the only way to save her from being burned alive by advancing fires. It was an appalling moment. The decision tore at him for the rest of his life.

I wasn’t going to endorse saving a building where that might happen again. Risking human life or preserving a historic building? It’s a no-brainer, really. So while it was sad to see that building go – and sad, since, to see other structures like the Williams Building disappear – it’s really not a hard choice. What would you do?

Is vandalism part of the human condition?

I have a small gripe. Vandals keep tagging a power pole just along from where I live. Marking territory, animal-fashion. It happens every few weeks. The local council always has it painted out within the day; but it highlights what, for me, is one of the saddest sides of the human moral compass.

Vandalism. If somebody has something, it seems – even something as simple as a nicely painted power pole in a quiet suburban street – somebody else wants to break it, take it away or deny it to them. Anything humans have, it seems, is targeted in its own way. Take computing. Visionaries like Bill Gates and Sir Tim Berners-Lee had a concept for a wonderful and better human world, connected by computer. So what happened? Other people wrote software to damage, steal, or cause inconvenience to users. Vandalism! Somebody trying to take away what you have – these days, usually the contents of your bank account.

I see the same phenomenon in the way academics always respond to others in their territory by denying the worth of the other’s skills and work – vandalising repute in intellectualised terms. To me that is conceptually no different from the way the imbeciles with the paint cans behave – it’s designed to take away something that somebody else has.

It’s been common enough through history. And it always works the same way:

1. “Someone’s got something I don’t have, so I have to show I’m better by breaking it or taking it off them.”
2. “I am marking my place and showing I am more important than others.”
3. “I feel validated by doing so.”

The motives, in short, are entwined with ego, status anxiety, and with validating a sense of self. Most human actions are. However, vandalism is a selfish form of self-validation. It validates by taking away from others. To me this is the exact reverse of the way we should behave.

In fact there are other – and better – ways of validating yourself. Helping others, for instance – being kind, taking a moment to help.

If we work together to build, isn’t that better than trying to tear down what others do? It is the difference between selfishness (vandalism) and generosity (kindness). Bottom line is that kindness is the better path. And I think that, through history, there have been times when society in general has taken that kinder path – overtly and obviously. But the present, as we roll into the twenty-first century, isn’t one of them. And I think we need to change that – to nurture kindness by taking the initiative – by expressing kindness, even in small ways, to each other.

I’ve said all this before, of course, but it’s worth saying again. Your thoughts?

What ever became of all the good in the world?

I am always astonished at the limitless capacity humanity has for intellectualising itself away from care and kindness.

Quick – burn the intruding historian! Avenge ourselves!

School. If you’re accused, you’re guilty!

Many years ago, when I was at school, there was a coat cupboard at the back of the classroom. Next to the cupboard was a trestle table on which had been set a class construction project. The bell went. The class joyously leaped from their chairs and surged to the cupboard, shoving and ramming each other as they fought to get their coats and escape.

I’d hung back to wait for the scrum to clear and saw the cupboard door being forced back by the desperate mob, into the trestle table. I rushed to try and rescue it – too late. The whole lot collapsed to the floor as I got there. Needless to say I was blamed. Everybody had seen me standing over the ruin and it (again) proved what a stupid and worthless child I was, and how dare I claim I was trying to save it, I totally deserved what was coming to me.

So much for trying to be a Good Samaritan.

But – but you say – surely I had rights? No. I had absolutely none. Back then, teachers given power by the system used it to smash those the system had defined as powerless, the kids, and so validate their own sense of worth. If I was seen near a broken table and the teacher decided I had done it – well, then obviously I’d done it, and how dare I protest my innocence.

The main ethical problem with this sort of behaviour is that guilt-on-accusation and summary justice stand not just against the principles of our justice system, but also against the values of care on which western society prides itself. But that is how society seems to work, certainly these days. We have trial-and-conviction by media even before someone accused of a crime has been charged, just as one instance.

All of it is a symptom of one side of human nature. A symptom of the way humans intellectualise themselves into unkindness. It stands against what we SHOULD be doing – stands against the values of care, compassion, kindness and tolerance that, surely, must form a cornerstone of any society.

There is only one answer. We have to bring kindness back into the world – together. Who’s with me?

How long is the ‘now’ moment we live in?

How long is ‘now’ – you know, the evanescent moment we live in and usually let pass without properly experiencing it?

Now, like time itself, is largely seen as a philosophical issue; a personal perception that stretches or shrinks depending on what we are doing. For a kid, an hour spent in a classroom listening to the teacher drone on about stuff the kid neither knows nor cares about is an eternity; yet an hour hurtling about with friends at play disappears in a flash. Adults have a different perception of time again; that same elasticity flowing from interest and enthusiasm, but metered often by a sense of purpose. Yes, the job’s boring, but it has to be done.

Beyond that is the concept of the ‘moment’ itself. What is ‘now’? In Buddhist philosophy it means being mindful – fully and properly aware of one’s immediate self, immediate place, and immediate environment. It means having awareness of the fullness of the moment, even in its transience, even as we think about past or future.

But what ‘is’ a ‘moment’, scientifically? Research reported recently suggests that a ‘moment’, to most people, lasts two or three seconds. Then that perception of ‘now’ vanishes and is replaced by a new one.

If we match that to attention spans, we find that the typical time spent on any one item on the internet is literally only a couple of ‘moments’. And then we realise how shallow the internet must be.

It also underscores just how important and valuable mindfulness actually is. Because a couple of blinks, literally, and the ‘now’ moment is gone.

Should we be dispassionate about writing – like Spock?

The other week I argued that Terry Brooks’ Sword of Shannara was a poorly written Tolkien rip-off that put me off the rest of the novels. Responses fell into two camps – people who agreed and thought the whole Shannara series was dismal; and those who were offended.

Fair point. People don’t have to agree – indeed, differing opinions are great, because they push discussion. And maybe something nobody thought of will come out of it. That’s what counts. Good stuff.

But what intrigued me about the discussion was the level of emotion it provoked in one or two places. A couple of the responses were – well, a bit personal. Surely it’s possible to chat about the abstract value or otherwise of books? And then I got thinking. In some ways it isn’t, because the purpose of both reading and writing is emotional.

Authors write because they get an emotional satisfaction from doing so. Readers read because of the emotional journey it produces. By describing the opinion I and apparently others have of Brooks, I’d affirmed one sort of opinion. But I’d also trodden on the toes of others, who got a positive charge from reading his material.

The question, then, is whether writers and readers should step back from the emotion. In some ways I don’t think it’s possible for reading, because the very purpose of reading is to have an emotional experience. People read to become entangled in the emotional journey – be it to learn something, to feel validated, to find place, or simply to be distracted. However, I think it’s essential for writers to step back.

Yes, authors write because they get their own emotional satisfaction from doing so – from producing material that meets a need of their own and which will take others on an emotional journey. But at the same time, the clarity of thought that this process requires demands abstraction. How often have you written something in the heat of a moment and then, later, read through it and realised it’s foolish?

Authors have to be able to not only include the intended emotion, but also to step back from their own entanglements from time to time – to look at what they are producing from a more abstract perspective. Only then can the content and intent become properly clear – and the emotional journey on which they are going to take the reader emerge in balance. Really, we all have to approach writing like Spock would.

Seething with emotion underneath – sure – but not letting that get in the way of careful thought and analysis. Thoughts?

Do societies re-package their narratives of recent events? And is that ‘history’?

The other day a reader commented on a post I’d written about 9/11 as history and pointed out, quite rightly, that it doesn’t take long for events to be ‘packaged’ in ways that stand against the more dispassionate requirement of historians to understand.

The cover of ‘Shattered Glory’. Out of print (sigh…)

I agree. There’s no doubt in my mind that dramatic events affecting whole societies are swiftly re-invented by those who live through them. Not least because of emotional entanglement with what’s just happened. This is normal, historically. I traced just such a re-invention of New Zealand’s 1915 Gallipoli defeat in my book Shattered Glory (Penguin 2010). By April 1916, just five months after the stalled campaign ended in an ignominious retreat, it had been re-cast as a glorious victory, because it was a sacrifice for Empire. This reflected prevailing pop-sentiment of the day towards our place in a wider British Empire and helped address grief at the death toll, which was colossal for a country of just under 1 million souls. But the conception of Gallipoli as triumph was the exact opposite of the military defeat and human truth; a demonstration of the way societies, en masse, rationalise events to suit immediate emotional needs. And it had an impact on our view of history because, in a demonstration of the stickiness of re-invention, that view is largely what guides the popular conception of New Zealand’s Gallipoli experience today, nearly a century on.

So can we analyse recent events ‘historically’, in the same sense that we can analyse something that happened a century or two ago? History-as-discipline is one of the intellectual pursuits that self-examines its analytical philosophy. Hobsbawm, for instance, didn’t divide history via round-number centuries but by events, typically political and social (‘social’, inevitably, encompasses ‘economic’, which despite the ‘hardening’ of economics with a mathematical over-gloss since the late 1940s, is at heart about society).

To Hobsbawm, the nineteenth century was ‘long’, book-ended by the French revolution of 1789 and the First World War of 1914. Whereas the twentieth century was ‘short’, framed by the outbreak of the First World War in 1914 and the end of the Cold War in 1991.

Those arguments were possible because Hobsbawm stood at the end of the cycles; they were evident to him and he had a distance to perceive what had happened in fully historical terms, certainly as far as the ‘long’ nineteenth century was concerned. But what about things that have just happened? Things we popularly call ‘historic’ but which still burn fresh in memory and haven’t achieved the more sonorous quiet of a deeper past?

To me there are several issues. The first is the problem of context. Sometimes, the deeper over-arching forces that drive the widest patterns of history – combinations of long-standing technological, social, political, ideological and, it seems, environmental factors – aren’t obvious for decades afterwards. We can’t tell precisely what a particular development may mean until it’s placed in the context not only of what went before, but also of what came after – and, usually, some time after. Last week’s, last year’s or even last decade’s news won’t cut it in these terms.

The second issue is the related one of emotional perspective. It takes about 25–30 years, or more, for one generation’s problem to be resolved and replaced by another; and also for the people primarily involved in it to be far enough back to be treated with the (ideally) abstract dispassion of history. It is only now, for instance, that we are seeing treatment of Winston Churchill that moves beyond the pro- and anti- partisanship of his life and the decades immediately after his death.

Me, on the Bridge over the River Kwai, a place that brings the human condition into sharp relief. Something happened to me five minutes after this photo was taken that gives the lie to notions of ‘rational egoism’. Ask me in the comments.

Thirdly there’s the ‘recency’ phenomenon, in which we tend to view events just gone as larger than those further back, to the cost of proportion. This also fuels a tendency to view whatever just happened as the arbiter of the future. Take the Cold War, which – via Hobsbawm’s thesis – was a temporary product of the way the old world collapsed in 1914-19. But you wouldn’t have known that living in the middle of it. And when it did finish with the predictable collapse of the Communist economy, Francis Fukuyama insisted that history had ended – that Western capitalist ideology, as he defined it, had won, and there would be no further change. Ouch. This was ‘recency’ on full display.

The reality of abstract historical analysis, of course, is that it has nothing to do with ‘direction’ or ‘progress’ towards an inevitable or ideal one-dimensional ‘end’ such as I believe was implied by Fukuyama. Indeed, by definition, history cannot end. It’s a product of human change through time; and the onus is on historians to understand that deeper human condition, the ‘unity in diversity’ beloved of social anthropology, as a pre-requisite to being able to understand how that then expresses itself in ever-smaller scales of detail when framed by a specific society.

I’ve found through my own work in the field that practical detail changes affecting a specific society usually happen generationally – sometimes imperceptibly, sometimes with sharper impact, as happened in the 1960s when the generation brought up in the wake of the Second World War objected to the philosophy of their parents.

And so we have the tools with which to approach the issue of ‘recent’ history. The pitfalls of those tools may not be fully overcome – indeed, logically, they cannot be; but to know they are there and to understand how these limitations work is, I think, a very great step towards being able to couch recent events in more dispassionate light.
