I have always been intrigued by the way we elevate specialists. Anybody prominent in one of the half-dozen fields that western society exalts is, we are told, somebody to be looked up to – and somebody who is never wrong.
That’s particularly true of medical specialists, where – certainly here in New Zealand – their accountability is so well protected that when one of them stuffs up, it’s treated as the patient’s fault. The best the patient can expect is that, after a long, expensive and emotionally draining battle, the Medical Association might possibly think about, perhaps, wafting a slightly dampened bus ticket in the direction of the misbehaving specialist. Maybe. But really it’s the patient’s fault.
Such a culture, inevitably, means incompetence is never picked up. But there is a much wider lesson than this. To me, the fact that specialists do stuff up – and that intellectualised systems are set up to hide the fact – underscores one of the main problems with the way western society, particularly, is generally conditioned to think. We associate possession of knowledge – and particularly expert knowledge – with intelligence and with quality of thinking.
That’s something driven home from primary school where – certainly when I was there – chanting back the ‘times table’ was what got you brownie points, whether you actually knew how to multiply anything or not. It’s what TV ‘quiz games’ pivot on: if you happen, often by chance, to know some factoid or other that answers a specific question, you’re ‘smart’.
Really? My computer can store and churn back factoids far more accurately than I can, and in vastly greater quantity than I could ever rote-memorise. And yet a flatworm is way smarter than my computer. That underscores the issue. Knowledge of itself – the ‘facts’ that we are told define our ‘intelligence’ – is meaningless until it’s synthesised. Knowledge means nothing until the meaning of the isolated facts is understood and the underlying patterns revealed. And then the onus is on to make sure that this analysis has some sort of robust quality to it. That’s another skill in itself.
The main weakness of specialists is the fact that they are just that. Specialists. They focus on one tiny part of the pattern and, in my experience, often miss the true big picture. Sometimes, they don’t even know they’ve missed it, because it’s so far outside their understanding – the fact that somebody has a PhD or medical degree doesn’t mean that they’re immune to the Dunning-Kruger effect. Quite the opposite, in fact.
Sometimes that doesn’t matter. Sometimes all that counts is that tiny issue. But other times, the big picture is what’s important. And when a specialist fails to recognise that wider context – that’s when they get things wrong; and the worst of it is when their ego won’t let them admit that, maybe, ‘specialism’ and ‘wisdom’ are two different things.
The issue highlights the debate between ‘generalists’ and ‘specialists’. We are conditioned to look down on generalism: ‘jacks of all trades, masters of none’. It’s a pejorative – subtle, but still a put-down. And yet generalism, really, should itself be classed as a specialty. It means somebody has knowledge of a LOT of things.
There’s another ingredient, of course, before a generalist can see the wider patterns that follow. Thinking. Western education systems usually don’t teach people how to think – by which I mean, how to structure analysis. Too often, the fact of having knowledge is mistaken for analysis, leading to ‘cookie-cutter’ conclusions that may well work most of the time.
But when deeper analysis is needed – crossing subjects, encompassing wider thought than a specific issue – that ‘cookie cutter’ approach often fails. Spectacularly.
So from this perspective, specialising in generalism (if you get what I mean) – coupled with knowing how to think (how to analyse) – creates huge advantages. It makes it possible to direct learning in specific directions – towards the points of intersection between ‘specialties’, thus making the interconnections visible. It means that the ‘big picture’ then also becomes visible – and the patterns it points to can be identified, properly.
Once that ‘big picture’ is available – including the ‘operating principle’ around which whatever’s being looked at is structured – then it becomes possible to focus in on the specific details and understand them properly. It’s something that specialists, who don’t consider big-picture stuff, often miss.
I’ve deliberately kept this discussion general, because I think the principle applies in all forms of human endeavour, particularly when it comes to understanding the human condition. Thoughts?
Copyright © Matthew Wright 2017