Is the APA’s ‘internet gaming disorder’ really a fair label for ordinary gamers?

The American Psychiatric Association recently called for study into a condition they call ‘Internet Gaming Disorder’. My gripe? However much it’s been intellectualised, ‘psychiatry’ is not a science because its diagnoses depend on personal opinion, not on testable (technically, ‘falsifiable’) empirical criteria. Where somebody is obviously in trouble, that’s not a problem. But for normal people who end up labelled ‘faulty’ because their behaviour appears to match whatever society’s latest transient panic happens to be, it is.

Screen shot from Id's classic 1992 shooter Wolfenstein 3D. Which wasn't, actually, in 3D, but hey...

Trust me, I’m a psychologist…

That’s the issue. There are often genuine reasons to be concerned. But social panics are also triggered by nothing more than reaction to change. And all I can see is that the ‘Internet Gaming Disorder’ scale will be turned into yet another intellectualised device for social control by which ‘psychiatrists’ validate their own sense of self-worth at the expense of normal people, this time targeting the behaviour of a generation who spend their time interacting with each other on screen instead of face to face.

Don’t forget, it’s only forty years since the APA tried to classify ‘introversion’ as a disorder.

You can imagine what would have happened if they’d succeeded. Suddenly, introverts – who we know today are a normal part of the human spectrum – would have been told their basic nature was a clinical abnormality. Then they’d be ‘cured’ by relentless assaults on their self-worth and by being forced to spend as much time as possible trying to engage with large groups of people and then told how faulty they were for not coping. After all, it’s ‘normal’ to get energy from socialising in large groups, so just go out and do it, and learn how to make yourself a ‘normal’ person, and it’s your fault if you fail, because it proves you didn’t try hard enough and are personally worthless.

Obviously there are genuine psychiatric illnesses – which are diagnosable and treatable – but I can't help thinking that others are defined by pop-social criteria, given gloss by the unerring ability humanity has to intellectualise itself into fantasy. This was certainly true in the early-to-mid twentieth century, when 'psychology' emerged from a specific German intellectual sub-culture, as a reaction to the pop-social sexual mores of the day. This emerging pseudo-science, styling itself a true science (though not qualifying as one, thanks to its failure to meet falsifiability criteria), keyed into a period mind-set that sought to reduce a multi-shaded universe – including the human condition – to arbitrary and polarised categories.

The key false premise that gave 'psychology' its power was the supposition that everybody – with the exception of the 'psychologist' – was 'psychologically defective'. Neurotic. This was never questioned. When fed into period conformity to social imperatives, it meant that 'psychology' was less a tool for discoveries about the human condition than a means for bullying normal people who didn't exactly meet narrow and often artificially (socially transiently) defined behaviours. That spoke more about the nature of period society and the personal insecurities of the 'psychologists' than about human reality.

The concept of 'psychiatry' emerged, in part, from the union of this pseudo-scientific illusion with medicine; and I am not sure things have changed today. For instance, one available diagnosis is 'ODD' (Oppositional Defiant Disorder) – an obvious label with which a 'psychologist' can invalidate the last-ditch defence of someone who's come to them for help and doesn't submit to their ego and power.

What of the idea that ‘Internet Gaming Disorder’ is worth investigating? In a social sense internet gaming is a specialised framework for interaction – a way in which people, often on different sides of the world, associate with each other. The framework is very specific, and mediated by computer.

To me this is a key issue, because I suspect a lot of gamers are also introverts; and the computer enables them to interact with others without losing energy. Gaming also frames a specific sub-culture. Those in it respect the status of achievement within those terms. The computer enables them to interact, and to validate that particular interaction with people they respect. Of course this doesn’t describe the whole life, personalities or social interactions of people who happen to spend time gaming; but validation in various ways is one of the drivers of the human condition; and another is the desire of strangers to validate themselves by taking that away – bullying, which (alas) I think is probably also innate.

That's why alarm bells start ringing for me when I find the APA trying to call computer gaming a disorder.

Obviously gamers cover a wide spectrum, and no doubt a proportion who focus on it will do it excessively, for various reasons – perhaps including trying to get away from being bullied. But in the main, I suspect life is well in hand and gaming is simply a way of socialising via an abstract medium. The problem I have is that the APA's question risks all gamers being swept up in a catch-all label of 'disorder' – just as 'introverts' nearly were forty years ago, along with left-handers and anybody else who didn't conform to what was 'psychologically' normal.

I should add – I don’t game. I would, if I had the time, the co-ordination skills – and an internet service that had a competitive ping-time. I don’t. But in any event, that’s not the issue I’m concerned with today.

Thoughts?

Copyright © Matthew Wright 2015

Has fast food made us lazy cooks? You decide…

I was presented the other day with a slightly disturbing story about an academic (has a PhD, works at a university) whose day reportedly begins by slothing out of bed around 11.00 am and ambling to the nearest fried chicken joint, swiftly ingesting 7896 calories – nearly four times the 2000 a normal adult needs in a day – along with triple the allowance of salt (without counting nitrites).

I was a little surprised – I mean, strokes, heart disease, diabetes and other problems pale into insignificance beside the possibility of being followed around by Stewie Griffin with his tuba.

So how is it that fast food has got so ubiquitous today? It seems to me we need to go back to the industrial revolution – 250-odd years ago now – with its cousin, the agricultural revolution (think Jethro Tull's seed drill) – to explain it. These shifts eventually solved the eternal human problem: getting enough food. The problem was that food production, in general, also got industrialised and commercialised – and often ended up focussing not on what was good for health, but on what was good for profit. That's true of a lot more than just fast food – but there's a lot more fast food around of late than there used to be, too; and a 2013 WHO report identified deregulation as one of the drivers of a rise in fast food uptake.

Triple cheeseburger by Jpneok, public domain, from https://openclipart.org/detail/204444/fast-food-triple-cheeseburger

The way fast food is advertised to us underlines the fact that it's a commercial product. It's pushed at us for its taste and for its convenience – and let's face it, if you're a harassed parent, coming home tired from work to a house full of kids making noises like steam sirens while bouncing off the walls, isn't it easier to take the family down town for Yummy Duck Bites and Schlob Burger, or get a pizza delivered, or scoff down that ubiquitous box of pressure-fried Gallus gallus domesticus? What's more, it all tastes delicious because it's packed with the things humans are geared to like but couldn't easily get back in hunter-gatherer days – salts, sugars and fats.

It’s easy to see how it’s become so ubiquitous. Problem is, fast food is also packed with deceptively high numbers of calories (thanks to the fats, mainly). And what happens then? Well, let me go pick up that tuba. That’s quite apart from what fast food does to essential gut bacteria. The take-out lesson? I’m sure it’s OK to eat fast food on a ‘sometimes’ basis. But I doubt it’s good to eat every day. Sure, the ingredients that go in are top-quality vegetables, milk, beef, chicken, and so on – but then they’re processed, filled with chemicals to preserve them, to condition the cooking oil, to help ensure a consistency of product, and so forth.

What do I recommend? I have healthy and fast home-cooked ‘Kiwi bloke’ recipes that nobody in my household other than me eats, which I’ll be happy to share in the comments – ask.

Copyright © Matthew Wright 2015

3D printed steak chips? It’s enough to make me go all hippy and vegetarian…

Human inventiveness seems limitless these days, so I wasn't surprised to discover the other week that food technologists have been experimenting with 3D printed meat – currently produced, at astronomical expense, in the shape of chips.

Gallus gallus domesticus on Rarotonga, looking very much like the Red Jungle Fowl (Gallus gallus).

I’ll have my chicken free-range and wild, thanks…

Artificial food has been a long-standing SF staple – brilliantly played by Arthur C. Clarke in his hilarious 1961 satire 'Food Of The Gods'. All food in this future was synthesised to the point where the very idea of eating something once alive had become offensive. Even the word 'carnivore' had to be spelt, lest it nauseate listeners, and synthetic meat had names unassociated with animals. In classic Clarke fashion, of course, there was a twist. Food synthesisers could produce anything. And there was this synth-meat called 'Ambrosia Plus', which sold like hotcakes until a rival company found out what the prototype was… (I won't spoil the fun other than to point out that there's a word for a specific sort of meat-eating starting with 'c', and it isn't 'carnivore'.)

In the real world, 3D printed meat isn't synthetic – it's made of actual animal muscle cells which are artificially cultured and then sprayed, in layers, to produce the product. Currently it's a lab technique, and the obvious challenge for its proponents is to find ways of industrialising it – and of getting customers past the 'ewwww' factor of eating animal tissue bred in a petri dish and vomited into chip shape through a nozzle.

To my mind the key challenge is identifying the total energy requirement – printed meat may NOT be as efficient as current ‘natural’ methods of getting meat to your dinner table, where a large part of the energy comes from sunlight, via a grassy paddock and the digestive systems of ruminants.

Mercifully, we haven't been told 'This Is The Way ALL Meat Will Be Eaten In Future', 'The Future Is Now' and other such drivel. Predictions of that sort pivot off the 'recency effect', by which whatever just happened is seen as far more important than it really is when set against the wider span of history. We fall into that trap quite often – these days, usually over products launched on the back of commercial ambition. What really happens is that the 'way of the future' idea joins a host of others. All of these then blend together and react with society in ways that eventually – and usually generationally – produce changes, but inevitably not the ones predicted by the 'Future Is Here' brigade.

In one of the ironies of the way we usually imagine our future, the things that do dramatically change the way we live – such as the internet – are often neither seen coming nor touted in advance as game-changers. Certainly not in the way that food pills, flying cars and the cashless society have been.

As for artificial meat – well, I expect that if – IF – it can be industrialised, it'll find a home in hamburger patties. But there seems little chance of it being mistaken for the real deal, still less supplanting a delicious slab of dead cow – sorry, seared sirloin – on the dinner table.

Copyright © Matthew Wright 2015

When ethics overcome history

Another iconic building in my home town, Napier, New Zealand, bit the dust a while back. The Williams building – 103 years old – survived both the devastating 1931 earthquake and the fire that followed.

Panorama I took of Napier's Hastings Street, Williams Building to the far right.

Now it’s gone down before the wrecking ball. And a good thing too. You see, it apparently only met 5 percent of the current earthquake-proofing standard. Ouch. Surviving the 1931 quake and retaining its structural integrity were, it seems, two different things.

The Williams building going…going…

It's the latest in a succession of quake-risk demolitions around the city. A few structures – such as the Paxie building, centre in the photo above, or the old State Theatre (where I first saw Star Wars in 1977) – have been gutted and the facades preserved. But original 'deco' buildings of the 1930s are limited to a couple of city blocks. A single heritage precinct. When I was a kid, deco filled the town.

….and gone….

I know, I can hear the howls of protest now. 'But – but – you're interested in history… how can you support knocking it down?'

Easy. History is more than the artefacts it leaves anyway, but the real calculation is more immediate. A few years back, Napier's Anglican Cathedral hall was also under threat of demolition, in part because it was a pre-quake masonry structure. The Historic Places Trust approached me, wanting me to put my authority and repute as a nationally known historian behind their effort to have it listed and legally protected. I was well aware of that history, of course. But I knew the building was a quake risk – and I hadn't been given any engineering reports on which to base the professional opinion Historic Places was asking me to provide.

The biggest horror story of the 1931 quake was the way a doctor had to euthanise a badly injured woman who was trapped in the ruins of the cathedral – the only way to save her from being burned alive by the advancing fires. It was an appalling moment. The decision tore at him for the rest of his life.

I wasn't going to endorse saving a building where that might happen again. Risking human life or preserving a historic building? It's a no-brainer, really. So while it was sad to see that building go – and sad, since then, to see other structures like the Williams Building disappear – it's really not a hard choice. What would you do?

Copyright © Matthew Wright 2015

Is vandalism part of the human condition?

I have a small gripe. Vandals keep tagging a power pole just along from where I live. Marking territory, animal-fashion. It happens every few weeks. The local council always has it painted out within the day; but it highlights what, for me, is one of the saddest sides of the human moral compass.

From http://public-domain.zorger.com

Vandalism. If somebody has something, it seems – even something as simple as a nicely painted power pole in a quiet suburban street – somebody else wants to break it, take it away or deny it to them. Anything humans have, it seems, is targeted in its own way. Take computing. Visionaries like Bill Gates and Sir Tim Berners-Lee had a concept for a wonderful and better human world, connected by computer. So what happened? Other people wrote software to damage, steal, or cause inconvenience to users. Vandalism! Somebody trying to take away what you have – these days, usually the contents of your bank account.

I see the same phenomenon in the way academics always respond to others in their territory by denying the worth of the other's skills and work – vandalising repute in intellectualised terms. To me that is conceptually no different from what the imbeciles with the paint cans do – it's designed to take away something that somebody else has.

It’s been common enough through history. And it always works the same way:

1. “Someone’s got something I don’t have, so I have to show I’m better by breaking it or taking it off them.”
2. “I am marking my place and showing I am more important than others.”
3. "I feel validated by doing so."

The motives, in short, are entwined with ego, status anxiety, and with validating a sense of self. Most human actions are. However, vandalism is a selfish form of self-validation. It validates by taking away from others. To me this is the exact reverse of the way we should behave.

In fact there are other – and better – ways of validating yourself. Helping others, for instance – being kind, taking a moment to help.

If we work together to build, isn't that better than trying to tear down what others do? It is the difference between selfishness (vandalism) and generosity (kindness). Bottom line is that kindness is the better path. And I think that, through history, there have been times when society in general has taken that kinder path – overtly and obviously. But right now, as we roll into the twenty-first century, isn't one of them. And I think we need to change that – to nurture kindness by taking the initiative, by expressing kindness, even in small ways, to each other.

I’ve said all this before, of course, but it’s worth saying again. Your thoughts?

Copyright © Matthew Wright 2015

What ever became of all the good in the world?

I am always astonished at the limitless capacity humanity has for intellectualising itself away from care and kindness.

Quick – burn the intruding historian! Avenge ourselves!

School. If you’re accused, you’re guilty!

Many years ago, when I was at school, there was a coat cupboard at the back of the classroom. Next to the cupboard was a trestle table on which had been set a class construction project. The bell went. The class joyously leaped from their chairs and surged to the cupboard, shoving and ramming each other as they fought to get their coats and escape.

I'd hung back to wait for the scrum to clear and saw the cupboard door being forced back by the desperate mob into the trestle table. I rushed to try and rescue it – too late. The whole lot collapsed to the floor as I got there. Needless to say, I was blamed. Everybody had seen me standing over the ruin and it (again) proved what a stupid and worthless child I was, and how dare I claim I was trying to save it, I totally deserved what was coming to me.

So much for trying to be a Good Samaritan.

But – but, you say – surely I had rights? No. I had absolutely none. Back then, teachers given power by the system used it to smash those the system had defined as powerless – the kids – and so validate their own sense of worth. If I was seen near a broken table and the teacher decided I had done it – well, then obviously I'd done it, and how dare I protest my innocence.

The main ethical problem with this sort of behaviour is that guilt-on-accusation and summary justice stand not just against the principles of our justice system, but also against the values of care on which western society prides itself. Yet that is how society seems to work, certainly these days. We have trial-and-conviction by media before someone accused of a crime has even been charged, to give just one instance.

All of it is a symptom of one side of human nature. A symptom of the way humans intellectualise themselves into unkindness. It stands against what we SHOULD be doing – stands against the values of care, compassion, kindness and tolerance that, surely, must form a cornerstone of any society.

There is only one answer. We have to bring kindness back into the world – together. Who’s with me?

Copyright © Matthew Wright 2015

How long is the ‘now’ moment we live in?

How long is 'now' – you know, the evanescent moment we live in and usually let pass without properly experiencing it?

'Now', like time itself, is largely seen as a philosophical issue: a personal perception that stretches or shrinks depending on what we are doing. For a kid, an hour spent in a classroom listening to the teacher drone on about stuff the kid neither knows nor cares about is an eternity; yet an hour hurtling about with friends at play disappears in a flash. Adults have a different perception of time again: that same elasticity flowing from interest and enthusiasm, but often metered by a sense of purpose. Yes, the job's boring, but it has to be done.

Beyond that is the concept of the ‘moment’ itself. What is ‘now’? In Buddhist philosophy it means being mindful – fully and properly aware of one’s immediate self, immediate place, and immediate environment. It means having awareness of the fullness of the moment, even in its transience, even as we think about past or future.

But what 'is' a 'moment', scientifically? Recently reported research suggests that a 'moment', to most people, lasts two or three seconds. Then that perception of 'now' vanishes and is replaced by a new one.

If we match that to attention spans, we find that the typical time spent on any one item on the internet is literally only a couple of 'moments'. Which makes you realise just how shallow the internet must be.

It also underscores just how important and valuable mindfulness actually is. Because a couple of blinks, literally, and the ‘now’ moment is gone.

Copyright © Matthew Wright 2015
