The Big Bang theory wins again. So does Einstein.

It’s a great time to be a geek. We’re learning all sorts of extreme stuff. There’s a team led by John Kovac, from the Harvard-Smithsonian Center for Astrophysics, who’ve been beavering away on one of the fundamental questions of modern cosmology. The secret has demanded some extreme research in an extreme place. Antarctica. There’s a telescope there, BICEP2, that’s been collecting data on the polarisation of the cosmic microwave background. Last week, the team published their initial results.

Timeline of the universe – with the Wilkinson Microwave Anisotropy Probe at the end. Click to enlarge. Public domain, NASA.

The theory they were testing is as extreme as such things get and goes like this. Straight after the Big Bang, the universe was minuscule and very hot. Then it expanded – unbelievably fast in the first few trillionth-trillionths of a second, but then much more slowly. After a while it was cool enough for the particles we know and love today to be formed. This ‘recombination’ epoch occurred perhaps 380,000 years after the Big Bang. One of the outcomes was that photons were released from the plasma fog – physicists call this ‘photon decoupling’.

What couldn’t quite be proven was that the early rate of expansion – ‘inflation’ – had been very high.

But now it has. And the method combines the very best of cool and of geek. This early universe can still be seen, out at the edge of visibility. That initial photon release is called the ‘cosmic microwave background’ (CMB), first predicted in 1948 by Ralph Alpher and others, and observed in 1965 by accident when it interfered with the reception of a radio antenna being built at Bell Laboratories. That started a flurry of research. Its temperature is around 2.725 kelvin, a shade above absolute zero. It’s that temperature because it’s been red-shifted (the wavelengths radiated from it have stretched, because the universe is expanding, and stuff further away gets stretched more). The equation works backwards from today’s CMB temperature, thus: T = 2.725(1 + z) kelvin, where z is the redshift.
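For the geek-inclined, that relation is a one-liner. Here’s a quick Python sketch – the decoupling redshift of about 1,100 is a standard ballpark figure, not something measured by BICEP2, so treat the output as illustrative:

```python
# Sketch: CMB temperature as a function of redshift, T(z) = T0 * (1 + z).
# T0 is today's measured CMB temperature; the z value is a ballpark figure.

T0 = 2.725  # kelvin, the CMB temperature today

def cmb_temperature(z: float) -> float:
    """Temperature of the CMB radiation at redshift z, in kelvin."""
    return T0 * (1 + z)

# At 'photon decoupling', z was roughly 1100:
print(cmb_temperature(1100))  # ~3000 K, about when the plasma fog cleared
```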

The COBE satellite map of CMB anisotropy. NASA, public domain, via Wikipedia.

The thing is that, way back – we’re talking 13.8 billion years – the universe was a tiny fraction of its current size, and the components were much closer together. Imagine a deflated balloon. Splat paint across the balloon. Now inflate the balloon. See how the paint splats move further apart from each other? But they’re still the original pattern of the splat. In the same sort of way, the CMB background pattern is a snapshot of the way the universe was when ‘photon decoupling’ occurred. It’s crucial to proving the Big Bang theory. It’s long been known that the background is largely homogeneous (proving that it was once all in close proximity) but carries tiny irregularities in the pattern (anisotropy). What the BICEP2 team discovered is that the variations are polarised in a swirling pattern, a so-called B-mode.

The reason the radiation is polarised that way is that early inflation stretched space faster than light-speed, and the gravitational waves within it were stretched too, rippling the fabric of space-time in a particular way and creating the swirls. Discovering the swirls, in short, identifies both the early rate of expansion (which took the universe from a nanometer to 250,000,000 light years diameter in 0.00000000000000000000000000000001 of a second…I think I counted right…) and gives us an indirect view of gravitational waves for the first time. How cool is that?
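If you want to check my counting, here’s a rough back-of-the-envelope sketch of the expansion factor those figures imply – the nanometre and 250-million-light-year endpoints are the illustrative numbers quoted above, not precise measurements:

```python
# Rough sketch of the expansion factor implied by the figures quoted above.
# A nanometre to 250 million light years in ~1e-32 seconds - all illustrative.

LIGHT_YEAR_M = 9.4607e15          # metres in one light year

start_size = 1e-9                  # a nanometre, in metres
end_size = 250e6 * LIGHT_YEAR_M    # 250 million light years, in metres
duration = 1e-32                   # seconds

print(f"Expansion factor: {end_size / start_size:.1e}")  # ~2.4e33
print(f"...in {duration} seconds")
```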

Albert Einstein lecturing in 1921 – after he’d published both the Special and General Theories of Relativity. Public domain, via Wikimedia Commons.

What are ‘gravitational waves’? They were first predicted nearly a century ago by Albert Einstein, whose General Theory of Relativity of 1915 was actually a theory of gravity. According to Einstein, space and time are an entwined ‘fabric’. Energy and mass (which, themselves, are the same thing) distort that fabric. Think of a thin rubber sheet (space-time), then drop a marble (mass/energy) into it. The marble will sink, stretching the sheet. Gravitational waves? Einstein’s theory made clear that these waves had to exist. They’re ripples in the fabric.

One of the outcomes of last week’s discovery is the implication that ‘multiverses’ exist. Another is that there is not only a particle to transmit gravity, a ‘graviton’, but also an ‘inflaton’ which pushes the universe apart. Theorists suspect that ‘inflatons’ have a half-life and they were prevalent only in the very early universe.

There’s more to come from this, including new questions. But one thing is certain. Einstein’s been proven right. Again.

Copyright © Matthew Wright 2014

Coming up: More geekery, fun writing tips, and more.

Write it now: do writers always perch on a soap-box?

Back in the early 1980s, when I was a history student at Victoria University, one of the other students took me aside and nodded towards the lecturer. ‘D’you know he’s really a Liberal?’

Hmmn

The Professor in question was one of New Zealand’s leading historians of the day on the Liberal party, which was in government 1891-1912 and imploded in the early 1920s. The world had long since moved on, rendering interest in them academic. Which, I suppose, is why this Professor was studying them.

That didn’t make him a Liberal, personally. But the distinction, it seemed, was lost on his students, to whom interest and personal advocacy were one and the same. The idea’s not unique to universities – though in my experience the angry, sanctimonious and half-educated youth who inhabited the history department at the time set the gold standard.

Post-Vietnam anti-war rhetoric was well entrenched. Post-colonial thinking was on the rise. Failure to advocate it was a fast road to social ostracism, buoyed on unsubtle intellectual bullying that enforced conformity to the breathless ‘new order’. Those who failed to conform lost out socially and found that career doors were not opened.

Conflation of interest with advocacy happens in the real world too – for writers it’s an occupational hazard. Freelance journos are bound to crash into the social no-no du jour sooner or later – they write on such a wide range of subjects, and even those who focus their brand into a particular subject get tarred eventually. Non-fiction book writers hit it. Want to write a book on how the Nazis took over Germany? Be careful.

Novelists hit it – I recall reading that Jerry Pournelle and Larry Niven took a lot of stick for setting The Mote In God’s Eye in a human Empire. Were they advocating Imperialism? Not at all. This was simply the setting.

That’s not to say that writing can’t be a soap-box. Often it is. But it can also be abstract – and it’s important for the writer to understand how that works – to signal the difference. Also for readers to appreciate it.

For me the trick is stepping away from the bus. Looking back and figuring out just what it is that frames the way we think. It doesn’t mean rejecting that – but it does mean understanding it. From that, it’s possible to be properly abstract. Or, indeed, to get back on the soap box, this time in an informed way.

Your thoughts?

Copyright © Matthew Wright 2014

Coming up: More writing tips, science geekery and fun. Check it out.

I miss my future. It’s been taken from me.

I miss my future. When I was a kid, 21st-century food was going to be pre-packaged space pap. We would all, inevitably, be eating paste out of tubes. It was futuristic. It was progress.

The future of 1970: a Mars mission, 1981 style. Concept via NASA.

Today? We’re in that future. And I still cook fresh veggies and steak. Some of it from the garden (the veggies, not the steak).

When I was a teenager, plastic cards were going to kill cash. In the 21st century we’d just have cards. It was inevitable. It was the future. Get with the program. Today? We use more cash than ever, but chequebooks died.

When I was in my twenties, video was going to kill the movies. It was inevitable. We just had to accept it. When I last looked, movies were bigger than ever – didn’t The Hobbit, Part 2,889,332 just rake in a billion at the box office?

And, of course, personal computers were going to give us the paperless office. Except that today every office is awash with …yup, paper, generated by what we produce on computer, churning out of giant multi-function copiers that run endlessly, every second the office is open.

Did we fail to adopt all these things hard or fast enough? Is it just that technology hasn’t quite delivered what was expected – but it will, it will? No. The problem is with the way we think – with the faulty way we imagine change occurs over time with technology and people. With the way we assume any novelty will dominate our whole future. With the way we inevitably home in on single-cause reasons for change, when in reality anything to do with human society is going to exist in many more than fifty shades of grey. The problem is a fundamental misunderstanding – driven by the simplistic ‘progressive’ mind-set that has so dominated popular thinking since the Age of Reason.

I know all that. But still…I miss my future.

Copyright © Matthew Wright 2014

Coming up: More writing tips, science, history and more. Watch this space.

Looking for the missing spirit of Christmas…with zombies…

We went to the local mall on Sunday. It was packed, of course, with the usual shopping zombies, their minds destroyed by the glitz and glam.

The Zombie Christmas Maul

Whenever we visit the mall, She Who Must Be Obeyed forbids me to shuffle along behind them, matching their gait and murmuring “braaaaiiins….”

Well, I’m not forbidden, but she won’t walk hand-in-hand if I do; instead she’s on the other side of the mall saying things like ‘I don’t know that weird guy.’

Being the weekend-before-the-weekend-before Christmas, there were a LOT of people shopping last Sunday, interspersed with cellphone-toting teens whose minds were miles away, and toddlers drifting aimlessly around the whole lot like the wayward satellites of some Jovian supergiant. Every so often, one of the squidlings would squeal with the exact pitch and timbre of a gym shoe being scraped across a polished floor.

Looking at the way everybody had been reduced to brainlessness by the pressure to buy, buy, buy for Christmas, I couldn’t help thinking we’ve lost something.

It’s Christmas. It’s a time for caring. A time for families. A time to think of others. A time – well, it’s Christmas Spirit, isn’t it.

What’s it become? A marketing frenzy. A shallow exercise in consumerism. A concerted effort to extract as much cash as possible from the wallets of many who cannot really afford it.

Here in New Zealand, the shops will be open right through Christmas Eve – and open again on Boxing Day when, inevitably, it will be ‘sale time’. I believe that’s true elsewhere too.

Where has the spirit of care gone? Your thoughts?

Copyright © Matthew Wright 2013

Coming up: Fun holiday stuff – with some history, geekery and writing stuff. Regular writing tips, science geekery, history… and more… returns in the new year. Watch this space.

Into deepest time with the REAL big bang theory

My wife occasionally calls herself ‘Penny’, as in Penny off The Big Bang Theory. Especially when I get together with my mathematician friends and we talk geek.

I’m not sure which of us is meant to be Sheldon. Anyway, the ‘big bang’ theory itself was first proposed in 1927 by a Catholic priest, Monseigneur Georges Henri Joseph Édouard Lemaître (1894-1966). He was trying to explain Vesto Slipher’s discovery that distant galaxies were retreating. And he was ignored. Then, in 1929, Edwin Hubble (1889-1953) suggested the same thing. Like most academic fields, physics is all to do with in-crowds; when Hubble spoke, other physicists pricked up their ears.

Timeline of the universe – with the Wilkinson Microwave Anisotropy Probe at the end. Click to enlarge. Public domain, NASA.

Their logic went like this. Distant galaxies appear redder than they should. This is because the wavelengths of light and other electromagnetic emissions from them are being stretched from our perspective, meaning they must be moving away. The effect was first described by Christian Doppler, who realised this was why fast-moving vehicles go ‘neeeeoooww’. The sound waves are being stretched from the perspective of a stationary listener as the source moves away, so to them the pitch appears to drop. (You can buy a Sheldon costume so you can be the Doppler Effect, like he was in Series 1 Ep. 6…here.)

It works the same with electromagnetic emissions, and red has a longer wavelength than other visible light, so things moving away appear redder to us. Hence the term ‘red shift’. It’s used to describe the phenomenon even if the wavelength isn’t visible light. (No costumes for this one.)
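If you like your red shift quantified, it’s just the fractional stretch in wavelength. A tiny sketch, with made-up illustrative wavelengths:

```python
# Sketch: redshift z as the fractional stretch of a wavelength,
# z = (observed - emitted) / emitted. The wavelengths below are illustrative.

def redshift(emitted_nm: float, observed_nm: float) -> float:
    """Redshift z, given emitted and observed wavelengths in the same units."""
    return (observed_nm - emitted_nm) / emitted_nm

# A line emitted at 656.3 nm (hydrogen-alpha), observed at 728.5 nm:
print(redshift(656.3, 728.5))  # ~0.11 - the source is receding
```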

Hubble discovered not only that distant galaxies retreat from us, but that the further away they are, the faster they retreat. Hubble’s Law followed: v = H₀D, where v is the velocity of recession, H₀ is Hubble’s constant, and D is the proper distance. The value of Hubble’s constant has never been agreed, but recent work suggests it might be 71 +/- 7 km/sec per megaparsec. Probably. A bit.
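Hubble’s Law is a one-liner too. A quick sketch, using the approximate H₀ value quoted above:

```python
# Sketch: Hubble's Law, v = H0 * D, with the approximate value quoted above.

H0 = 71.0  # km/s per megaparsec, give or take 7

def recession_velocity(distance_mpc: float) -> float:
    """Recession velocity (km/s) for a galaxy at proper distance D (Mpc)."""
    return H0 * distance_mpc

# A galaxy 100 megaparsecs away recedes at roughly 7,100 km/s:
print(recession_velocity(100.0))
```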

It also turned out that distant galaxies are moving away from us whichever way we look, showing that space-time itself is expanding. Imagine a rubber balloon with equidistant dots on it. Inflate the balloon. The dots move apart equally – and the distant ones are moving away faster. That holds true for space-time.

The conclusion was that the universe had been smaller – a mathematical point, in fact, from which everything exploded into the reality we know and love today. Pretty much like the opening credits on The Big Bang Theory, in fact.

Of course, it wasn’t expansion into a void. It was an expansion of space-time itself. The very fabric of physical reality.

It was a kind of cool idea, but nobody had any way of proving it. Physicists argued over whether there had been a ‘big bang’, or whether the universe operated by a modified ‘steady state’ of constant but expanding existence. Then, in 1948, Ralph Alpher and Robert Herman predicted that we should be able to see cosmic background radiation from the ‘big bang’ – and it was found in 1965. The radiation has a black body (idealised) temperature of 2.725 kelvin, give or take a tad (I define +/- 0.00057 kelvin as a ‘tad’).

Into deepest space: Hubble Space Telescope image of galaxies from the early universe. Public domain, NASA.

And you know the coolest part? Albert Einstein figured it all out in 1917, before any of the evidence was available. His General Theory of Relativity made clear the universe couldn’t be static – it had to be expanding or contracting. Einstein thought that had to be wrong, so he added a ‘cosmological constant’ to eliminate the expansion. But expansion was true, and he later admitted the ‘constant’ fudge was a mistake. His original equations held good.

Einstein had, in short, figured out how the universe worked – so completely that his theory explained the bits that hadn’t been discovered yet.

How cool is that?

Copyright © Matthew Wright 2013

Coming up soon: ‘Write it now’ and ‘Sixty Second Writing tips’, more humour, more science…and, more.

Can dyslexics become great writers? Totally.

I discovered the other day that Agatha Christie was dyslexic. She was also one of the best writers and literary stylists around in early twentieth century Britain.

Jules Verne, public domain from Wikimedia.

Other dyslexic authors include Stephen J. Cannell, Jeanne Betancourt, Jules Verne and Gustave Flaubert, among others. Here’s a list of 25 famous dyslexic authors.

That’s no paradox. ‘Dyslexic’ doesn’t mean ‘stupid’. Those who have it innately process certain things in a different way from others, which often appears as problems with western reading, writing and spelling. The underlying issue can also manifest as problems with number order (dyscalculia), motor co-ordination (dyspraxia), or disentangling phonemes when someone speaks (dysaudia). Really these are aspects of the same thing, but western rationalism conditions us to divide concepts into little boxes that miss the connections.

Some dyslexics can read just fine, but have difficulty typing letters in the correct order. Usually ‘dyslexia’ is a combination of all these. It varies individually. The issue is also involved with short-term memory. The thing to understand is that western-style reading, spelling, number-order and hand-writing are not impossible for dyslexics – just slow and demanding of energy. This is because the processing of the detail has to be done in a different way.

A lot of dyslexics never get identified. Certainly when I was at school, they were usually chewed up by an intolerant and vicious education system that held them responsible for a difficulty not of their making, judged them stupid, lazy and worthless, taught in ways that didn’t work, and spat them out in ruin.

Nikola Tesla in 1895. Public domain, from Wikimedia Commons.

It’s often considered a ‘disorder’, though it should not be. The perception, including the medical definition and word ‘disorder’, comes from the problems dyslexics have with western constructs such as left-to-right writing and the culture-specific measure of ‘success’ that follows. Dyslexia doesn’t seem to affect Asian linguistic groups so much.

Also consider this. If people with dyslexia were the majority instead of 5-to-20 percent of the western population, dyslexia wouldn’t be a ‘disability’, it would be normal.

And what an amazing normal that would be.

Although dyslexics are often held to have the usual range of possible IQs, that isn’t actually a useful measure of intellect (I’ll explain why in the comments, if anybody asks…) and dyslexia is more often associated with creative thinking and an ability to conceive concepts, ideas and lateral connections. This is because thought is often in the form of pictures and inter-plays of shape, which many people with dyslexia do quickly, naturally, and easily – and in ways that those who don’t have it can’t.

The point being that this is where our civilisation has come from.

Famous dyslexics include Albert Einstein, who figured out how the universe worked. Alexander Graham Bell, Thomas Edison and Michael Faraday – among other scientists – are considered to have been so. My favourite is Nikola Tesla, who gave us the modern world. All of it. Electric motors? Tesla. Neon lights? Tesla. Your household mains power supply? Tesla. X-rays? Tesla. Any electromagnetic broadcast – meaning all radio, TV, radar, Bluetooth, your microwave oven, your wireless connection – basically, everything we associate with today’s living? Nikola Tesla. Need I go on?

Nikola Tesla with some of his gear in action. Public domain.

I suspect this same ability to conceptualise is also why some dyslexics go into acting – Marlon Brando, Harrison Ford, Dustin Hoffman, Bob Hoskins, Fred Astaire, Henry Winkler, Liv Tyler, Orlando Bloom, Keira Knightley and Susan Hampshire are listed among them. Acting is about creating and transferring an emotional response to an audience, which is inexpressible in words and better conceptualised as shapes and patterns. It’s worth remembering, too, that Daniel Radcliffe is reported to be dyspraxic – a related issue.

The thing is that despite being superficially about words, writing is about shapes and patterns – about expressing the inexpressible, the intangibles of emotion, through the terribly flawed and inadequate medium of words. All writers have to strive for these ends, but dealing with shapes, patterns and concepts is something people with dyslexia do naturally, easily and quickly. So from this perspective it’s not surprising that Agatha Christie was a great writer. So were (and are) F. Scott Fitzgerald, W. B. Yeats, Jules Verne, John Irving, Richard Ford and George Bernard Shaw, among many others.

What surprises me, in fact, is that writing hasn’t attracted more people who are dyslexic.

What are your thoughts?

Copyright © Matthew Wright 2013

Coming up: ‘Write it now’, humour posts and – well, watch this space.

Why is the weather going mad? Humanity’s limitless stupidity, that’s why

The weather these past years seems to have gone mad, and not just in New Zealand – though here it’s been bad enough: we’ve had a succession of intense storms with record-breaking wind speeds.

Wellington was in chaos for days after a ‘one in a century’ storm in June – our third in a decade – knocked out power to tens of thousands of homes, felled trees and smashed commuter infrastructure.

Two mornings after, and still raining: my photo of debris on Petone Beach, June 2013. Storm surges drove timber from the Hutt river right up on to the road here.

The Dutch half of my family tell me that, over in the Netherlands, winter decided to give spring and summer a miss. It never warmed up until a couple of weeks before summer was due to end. Nothing seemed to stop the rain.

Drought 2013: the Hutt river, looking south towards the rail bridge. Usually there’s a lot more water in it than this.

This week Boulder, Colorado, was awash with 1-in-1000 year floods – I picked the story up via blogs, and then news came of a couple of Kiwis living there who had to flee before the deluge. (Check out Susie Lindau’s blog, in my links, and Phil Plait’s awesome science blog.)

Meanwhile Japan – including the damaged reactor at Fukushima – is being hammered by Typhoon Man-Yi. Half a million people have been ordered to evacuate.

I have an interest in understanding this because I’ve been writing a book on coal, environment and our attitudes (coming out next year). So is all this global storminess a coincidence? Mathematically, that’s possible. Random events – to human perception – appear to cluster. But there is a common cause. A recent analysis attributed about half the recent extreme weather to human-created climate change. Bearing in mind that ‘climate’ and ‘weather’ are not the same thing, we’re facing the first obvious consequence of our 250-year crusade to dump fossil carbon into the atmosphere.

I’ll blog later about the science of climate change. To me, though, the way things are panning out reveals a great deal about the human condition.

My reasoning at the broadest level is this. We’ve been playing our usual trick of exploiting resources until they’re gone. That was an essential survival skill in the last Ice Age. Other species of human – the Neanderthals, the Denisovans, the ‘Hobbits’ – all died out. H. sapiens alone survived – we had, it seemed, the ’tude (apparently a function of our greater ‘working memory’).

A diagram I made of where we think everybody was, mostly, using my trusty Celestia installation and some painting tools.

It worked a treat when the human population was a few thousand. When environments were exploited, people moved on – or dwindled, as on Easter Island. But it got industrialised. World population was around a billion in 1800. Factories, locomotives, ships and households in burgeoning cities began pouring coal smoke into the air. Humanity began exploiting the environment not on a regional scale, but globally.

There was but one outcome – the biggest ‘own goal’ in the history of the world, and we’re staring down that barrel now. Into the mix, as far as I can tell, has swept that other component of the human condition: stupidity – intellectualised, given traction by its rational gloss. But still stupidity.

It’s evident in the way we’ve reacted to climate change. It’s been emotionalised, rationalised, politicised, reduced to catechisms, polarised between ‘warmists’ and ‘deniers’. All for reasons that have little to do with science, and a lot to do with vested interest, political need, even personal conviction over what constitutes reality. All of it slowing efforts to understand what is happening – then take steps to fix it.

Look at it this way. Past biomass – mostly plants – built up over tens and hundreds of millions of years, has been dug up as coal, gas and oil, then burned in what, by geological standards, is an eye-blink. We’ve dumped the waste products of all those millions of years’ worth of ancient ecosystems into Earth’s current system in just 250 years – which, when we’re thinking on these scales, amounts to one swift hit. It’s like taking a century’s worth of household rubbish and trying to jam it into a bag that’s only good to hold the rubbish from this morning. And then we try to rationalise our way out of the consequences?

I mean – duh! What did we think was going to happen?

The people at the receiving end of unprecedented weather events are the first victims.

Copyright © Matthew Wright 2013

Coming up this weekend: “Write It Now” and “Sixty Second Writing Tips” return.

Black Friday, paraskevidekatriaphobia, and the origin of OMG

I have never quite understood why Friday 13th is viewed with such foreboding.

HMS Invincible – the first battlecruiser, invented by Jack Fisher (along with ‘OMG’) and absolutely not going to sail on a Friday 13th in 1914.

From the science perspective it’s no different from any other day. The Earth rotates on its axis, creating the illusion of the sun rising and setting – but one rotation, surely, isn’t any different from another. Arbitrary dates and divisions we make up in western society, surely, are just that? (OMG, I sound like Spock.)

Lots of people beg to differ, though. We are, it seems, often paraskevidekatriaphobics – including the man who invented OMG. I’ll explain. On 1 November 1914, a German cruiser squadron under Vice-Admiral Maximilian Reichsgraf von Spee shattered a British force under Rear-Admiral Sir Christopher Cradock, off Coronel.

The British Admiralty – under their volcanic First Sea Lord, Admiral Sir John Arbuthnot Fisher – responded decisively.

First ever use of OMG, in a letter from Sir John Fisher to Winston Churchill, 1917, published two years later; from p.78 of my copy of Fisher’s ‘Memories’ (Hodder & Stoughton, 1919). Click to enlarge.

Fisher was an incredible character – deeply devout, creative, brilliant, egotistical, paranoid and prone to pursuing feuds – the man who invented not only the battlecruiser but also the term OMG that we know and love today. Seriously! I have the original publication: “O.M.G. (Oh! My God!)” And as First Sea Lord, he wasn’t going to stand for any rubbish from the Germans.

On the back of von Spee’s Coronel victory, a massive force, including two battlecruisers, was ordered to hunt down and destroy his cruiser squadron. But then it turned out that Invincible needed dockyard work at Devonport and would not be ready to sail before Friday 13 November. Fisher discovered this and declared to Winston Churchill, then his political counterpart at the Admiralty, ‘Friday 13th! What a day to choose!’

Churchill thought so too, though for other reasons than those of a superstitious sailor. Britain was at war, and as far as he was concerned there was no excuse for dockyard slackness. The ships, he insisted, would leave on Wednesday 11th – even if it meant sending dock workers with the Invincible.

They did. And it turned out to be very bad luck for von Spee, who was caught and annihilated off the Falkland Islands on 8 December.

Do you believe in Friday 13th – or other omens?

Copyright © Matthew Wright 2013

Our fascination with Diana just doesn’t go away

It’s the sixteenth anniversary of Princess Diana’s death in Paris this week.

It was first reported in New Zealand, mid-afternoon on that August day in 1997, as ‘breaking news’ that she had been injured in a motor accident. In our household we were cynical about media beat-ups of Diana’s adventures – a woman being presented not as a some-time part of a key British governmental institution with a thousand-year history, but as celebrity gossip magazine fodder, with all that this implied for manufactured drama.

‘Pah,’ I snorted to She Who Must Be Obeyed. ‘Probably chipped a pinkie nail.’

We were due to have dinner with my wife’s parents that evening. By the time we got there, the news was out. Since then I’ve been through the Pont de l’Alma tunnel where Diana’s car crashed. There is a small sculpted flame on a plinth at the eastern end, in memoriam.

OK, not the tunnel. I didn’t take a photo of the tunnel. This is one I took from the top of the Arc de Triomphe, on film, using 200 ASA Fuji Superia colour stock. Haussmann’s wide boulevards feature, designed to give clear lines of fire against mobs.

What intrigues me about the whole tragic affair is that it hasn’t gone away. The outpouring of grief during those late August days of 1997 was unprecedented. Ironically, I suspect Diana had captured the hearts and minds of people partly because the media circus that pursued her, even as she lay dying in the wreck, had also brought her into every household.

That did not reduce the disgraceful voyeurism. A circus motivated not by the values I learned journalists should have – fair investigation, getting the stories that are important for society – but by baser need: personal profit leeched off the fortunes and misfortunes of others, fuelled by the fascination society has been conditioned to have with celebrity.

The media fascination with Diana hasn’t gone away – expressed, still, in relentless talk of conspiracies, of plots, of secrets known only through whispered stories published in gossip magazines. Naturally. Celebrities can’t die in mundane and banal ways, can they, or by the same sorts of accidents that affect the real world – though we can be fairly sure that is exactly what happened.

The media money circus grinds on. It’s grubby. It’s demeaning. And it’s probably not going to stop for a while.

What are your thoughts? And what were you doing when you heard the news of Diana’s death?

Copyright © Matthew Wright 2013 

Anyone for a PINT? What I dislike about psychometrics

There is a scream here in New Zealand at the moment about the way psychometric testing is being used to select public servants and others for redundancy. And quite rightly, too. One aggrieved victim has already obtained a $15,000 settlement in the employment court over it.

As far as I am concerned psychometrics are pseudoscience. Some stranger gives you questions based on a pop-theory about human behaviours and characteristics. None of them fit how you think, but you fumble through anyhow.

Then this stranger, who has never met you before and is ignorant of you as a rounded person, informs you what sort of person you Really Are. You’re classified, pigeon-holed and put into your box. Or is that ‘place’?

I recall, years ago, being told what sort of person I was after such a test. When I objected, I was told this was because I was the sort of person who would object. Quite. There are words to describe people who follow this particular tautology.

What I object to is the arbitrariness. Most of these systems are based on how some psychologists imagine people should be. Yes, the approach fits some broad character archetypes. And people can usually see aspects of themselves in the results, once they’ve heard them (think about what that actually means).

But these tests are framed by the mind-set of those who create them – something defined by time and culture. A lot of psychometrics harks back to thinking of the early-mid twentieth century, with its mechanistic ways of deconstructing and classifying complex systems, notions of uniformity, and its arbitrary way of handling shades of grey.

Early twentieth century psychology was relentlessly guided by the period need to reduce and systematise humanity, just as the wider world was being systematised. Hence Jung’s work on psychological types and classifications, which eventually fed into the Myers-Briggs reduction of complex human reality to just sixteen slots.

Psychometric testing is also culture-centric. The classic example is the IQ test posed in the 1920s to European migrants hoping to enter the US. They were stopped at Ellis Island and tested. One of the questions was a drawing of a house without a chimney: add the missing item. To those brought up in Eastern Europe the missing item was a cross over the door. But that wasn’t the right answer, and they missed other culturally-framed questions the same way – ergo, they were morons and were sent away again. Some were killed by the Nazis a few years later.

But the limits of psychometric testing haven’t stopped adoption by corporates. Why? Because these tests classify people in ways that can be enumerated, like accounts. And the field has attracted a lot of pseudo-science – even from people with qualifications in psychology – who have filled the market with ingenious, glib and corporate-friendly systems for fitting people into trendy theory. ‘Hey, here’s a test for reducing the human condition to twenty questions and four character types arrayed in a polyhedron.’

I have put much of my adult life into trying to understand the human condition – how it has framed history, how it frames us now; and I think one of our faults is our ability to over-rationalise and lead ourselves down fantasy paths.

Psychometrics. Useful tool – or arbitrary systems for pigeon-holing people that we’ve inherited from an early-mid twentieth century that also brought us eugenics? Your thoughts?

Copyright © Matthew Wright 2013