Back when I was at primary school one of the many risks kids faced was being seen near other kids who drew the attention of a teacher and were punished. The thing was that anybody in the area was – by definition – part of the group and would be scooped up and also punished. If you protested and said you’d just been nearby, you were doubly punished for lying.
The trouble was that just about anything invoked punishment – this was a school that punished one kid, I recall, for skipping for joy in jingly sandals. So it was better to keep your head down, keep well away from even the slightest hint of anything, and if the teacher held up four fingers and told you it was five – well, it was five.
I later discovered that this sort of experience was typical of any institution or system where one group has been given total authority over another. It doesn’t take long for the authority group to lose its moral compass – to exploit that power for its own benefit and pleasure. When applied to societies – as happened particularly during the world’s ‘totalitarian’ period after the First World War – it provokes deep fear in the disempowered and deep corruption in the beneficiaries of the system.
But it also highlighted a common human cognitive flaw. We tend to see patterns in everything, even where none exist. It’s been postulated that the cause is a legacy survival technique: back in hunter-gatherer days, early humans lived longer if they could spot the threat in the long grass from its shadow – and it was better to have a false alarm than to miss something real. So the ‘pattern assignation’ system evolved with a default of ‘on’ and a hyper-alert sensitivity. It also makes processes that create patterns (‘associations’) more convincing for us, cognitively, than processes that do not.
The problem I have is that ‘association’ profiling is not unique to authoritarian dystopias or primary schools. The same logic drives the algorithms that major internet companies use to analyse behaviours and serve up targeted ads: if you look for something, you must be interested in it. The flaw in that logic is equally evident – do a single Google search for (say) a shower fitting and for the next eight months you’re barraged with advertisements for plumbing, whether you need any more shower fittings or not.
That’s also how the ‘self-reinforcing bubbles’ work – Facebook, particularly, serves up an increasingly narrow range of content, based on what you appear to want, and it takes quite a bit of effort to break clear of it. People with eclectic interests, it seems, confuse the system.
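The narrowing effect can be sketched in a few lines of code. This is a toy model of my own – the scoring rule, item names, and numbers are illustrative assumptions, not any real platform’s algorithm – but it shows how one stray signal can lock a feedback loop onto an ever-narrower range of content:

```python
from collections import Counter

def recommend(interest_counts, catalogue, top_n=3):
    """Rank items by how often the user has already engaged with their topic.

    A Counter returns 0 for unseen topics, so items with no history
    sort to the bottom. This 'more of the same' rule is the toy stand-in
    for an association-profiling algorithm.
    """
    return sorted(catalogue, key=lambda item: interest_counts[item],
                  reverse=True)[:top_n]

# Hypothetical catalogue of content topics.
catalogue = ["plumbing", "politics", "gardening", "music", "sport"]

# One search for a shower fitting seeds the profile.
interests = Counter({"plumbing": 1})

# Simulate the feedback loop: whatever is served is also engaged with,
# so its weight grows and the served range never widens again.
for _ in range(5):
    served = recommend(interests, catalogue, top_n=2)
    interests.update(served)
```

After the very first round the loop converges: the same two topics are served every time, and the three the user never happened to click are shut out entirely – the ‘bubble’ in miniature.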
What worries me – harking back to my school adventures – is the way this profiling can lead to false matches. That isn’t much of a problem when it comes to being barraged with plumbing advertisements. But it is a concern if society turns to its darker side, as actually happened in the totalitarian societies of the twentieth century. Many innocent people were punished for no better reason than inadvertently matching aspects of a profile that labelled them targets of the regime.
The fact that the ‘profiling’ is now being done via what one company or another sees you doing online makes no difference to the underlying principle. And it’s too easy for that to go wrong – badly wrong – if society as a whole loses its moral compass.
Copyright © Matthew Wright 2017