The problem with self-appointed online experts

Something I’ve noticed online of late is the number of people who believe they are experts in fields in which they’re manifestly not qualified. The ones I’ve run into clearly believe themselves to ‘know more’ even than those qualified and professionally working in that territory.

In my experience, such people don’t want to engage in reasonable discussion: their purpose is to use their ‘knowledge’ to assert the place they have assigned themselves. As far as I can tell, such ‘experts’ are also usually so ignorant of the basic methodologies of whichever field they’ve adopted as their own that they aren’t even aware of that ignorance. David Dunning and Justin Kruger had something to say about that: a friend of mine refers to it as the ‘Dunny Clogger Effect’, which is a pretty apt description.

Sometimes that’s mixed with a related fallacy by which people professionally qualified in one field think their own training equips them to make expert judgements in a totally different field, as Randall Munroe of XKCD has long since observed:

[XKCD comic omitted; reproduced via the Creative Commons Attribution-NonCommercial 2.5 licence.]

That particular phenomenon is known as ultracrepidarianism: the fallacy by which somebody imagines that being an expert in one field makes them expert in a totally unrelated area, often more so than anybody actually qualified in the latter.

I’m not the only one to notice this either: someone I know who’s a qualified hydraulic engineer made the same observation about his field when encountering online ‘experts’ – people who, he pointed out, don’t respect even leading figures in the field. The issue is exemplified, these days, by the way random nobodies on the internet have all suddenly become expert epidemiologists.

It’s not a new phenomenon. Years ago, before social media, I used to get the same from ‘local historians’: people not qualified in history but who’d picked it up as a hobby and who regarded the subject as exclusive property. The way in which these people weaponised raw data as a device for defaming qualified experts, apparently in order to destroy their income and drive them from the territory, was extraordinary.

Essentially, then, there’s nothing new under the sun. What interests me is the apparent ubiquity of the issue, and the fact that it encompasses so many cognitive biases: everything from Dunning-Kruger to naive realism/objectivity illusions (the belief that raw ‘facts’ are objective), extension neglect (ignoring wider principles in favour of detail), and, of course, ultracrepidarianism.

I suspect the underlying force driving the whole phenomenon, certainly online, is self-validation: people hook their self-worth to their assumed status in a field of interest, and the internet has provided a vehicle that is essentially cost-free when it comes to poor behaviours. To me the ubiquity of the problem probably also says something about human nature – I mean, could these cognitive fallacies have been a survival advantage, way back when?

Have you run into this issue of ‘online experts’ in your own fields?

Copyright © Matthew Wright 2022


7 thoughts on “The problem with self-appointed online experts”

  1. One thing I’ve learned since becoming a netizen is to check the background and qualifications of people trotted out as experts in a field. All science ain’t the same, to misquote an old fuel commercial. :/

1. I think some general skill sets are transferable: the critical thinking attached to the humanities, for example; but even there, being expert in one topic doesn’t automatically make somebody expert in all. I don’t pretend to be qualified in ancient history, for instance – apart from Asterix books and having to wade through both Herodotus and Tacitus at honours level, I’ve not particularly engaged with it. And yet, time and again, I get people ‘correcting’ me over data points in the interpretative field that I have explored for a living for 30-odd years. One of my favourites came from an engineer who informed me I was ‘wrong’ to use the official casualty figure of the 1931 Hawke’s Bay quake. This was because he had calculated a different figure that included deaths from a plane crash a few days later. Proof of my incompetence, apparently. I have no doubt that had I used his figure I’d have had a barrage of letters ‘correcting’ me on the basis that the official figure differed…

      1. -blinks- how on earth did he justify including the plane crash? Quite apart from the fact that it occurred in the air, why include something that happened /after/ the quake?
        Sometimes generalists can see the forest rather than the trees, and sometimes experts can be too blinkered by their own area of expertise to connect up the dots from ‘outside’, but I believe the current situation is the result of charlatans deliberately muddying the waters for their own gain. 😦

  2. It could be that some believe their advanced degree qualifies them as critical thinkers and experts at creating and testing hypotheses. They can therefore work with any old set of data and come up with credible conclusions.
    I’m suspicious of my own quick conclusions about anything even moderately complex.

    1. I think to an extent critical thinking is a transferable skill – in general it’s the basis of the humanities and there’s not a lot to choose between (say) ethnography and sociology even in terms of approach. History is a little different although, broadly, it’s a sub-set of sociology (and I’d argue that economics SHOULD be too). On the other hand, some disciplines don’t properly teach it. Engineering, for instance, where my own experience of the field indicates there’s a predominance of people who believe that (a) any apparently empirical data is by definition an immutable and objective truth, and (b) anything else is ‘opinion’. The problem with (a) is that it doesn’t question whether the data is truly empirical – for which critical thinking provides a tool-set, and (b) critical thinking also provides tools to test and frame the analysis. Nor is an empirical and objective outcome possible when looking at human societies. And yet I’ve repeatedly run into engineers who believe they are genius historians and economists because they’ve obtained what they imagine to be ‘objective’ data from a book or some website. I very much hope this mind-set isn’t extended into their own field; get something wrong, and people die.

  3. Your observations are entirely appropriate. I’ve seen it too often in subjects which, like you, I have put a lot of effort into, but someone with a smattering of information hears nothing you (I) have to tell them. One of my former colleagues used to call the mindset you describe “invincible ignorance”. The online world is drowning in it.
