In 1995, McArthur Wheeler robbed two Pittsburgh banks at gunpoint in broad daylight, wearing no mask and turning to smile at the security cameras. When - not surprisingly - the police turned up at his house to arrest him, his reaction was one of amazement; he was convinced the camera footage must have been doctored. "But I wore the juice," he muttered.
Wheeler was not insane - he was just suffering from a cognitive bias. He believed that since lemon juice could be used as invisible ink, it must confer invisibility on anything it coated, so he smeared it on his face before entering the banks, confident that the security cameras would not be able to photograph him.
This case prompted psychologists David Dunning and Justin Kruger to investigate the effect whereby people with limited competence in a domain tend to estimate their own abilities to be much higher than they actually are. The results of their research are now known as the Dunning-Kruger effect.
The Dunning-Kruger effect is a cognitive bias that has little to do with intelligence or stupidity; it is associated with very low levels of expertise. For example, 80% of drivers believe themselves to be better than the median driver - a statistical impossibility. And contestants on national TV talent shows believe they are exceptional singers when in truth they can barely hold a note.
Software engineering is not immune to this effect; we have all come across programmers with wildly inflated estimates of their own abilities. The less that is known about a given task, the more likely it is that over-confident claims will cause it to run over time and budget, or not work at all once delivered.
There's a second part to the Dunning-Kruger effect that gets less attention, illustrated by the well-known graph:

[Graph: confidence plotted against competence - confidence peaks when competence is very low, falls sharply as real learning begins, then climbs slowly again with genuine expertise.]
In his recent Dev.to article How My Terrible Memory Makes Me a Better Developer, Devon Campbell explains how having a terrible memory encourages better programming practices - most notably efficient and comprehensive documentation. While I suspect his memory is probably no worse than anyone else's, the points he makes are both interesting and highly relevant.
Without explicitly saying so, Devon's article neatly highlights the second part of the Dunning-Kruger effect, which is that as people become more expert in a subject they tend to under-play their own abilities. If you or I find something easy or self-evident, we tend to assume that others find it equally so. A moment's thought reveals this reasoning to be highly suspect, but for many of us it's hard to accept that we really are the experts and that others don't even come close.
The graph shows that ignorance is accompanied by absolute certainty, whereas knowledge qualifies itself. An ignorant person, who doesn't know what he doesn't know, is often 90-100% confident that something is so, while an expert will give himself some leeway and claim only 70% certainty. 90% beats 70%, right? So who is the majority going to listen to - the one who is certain, or the one who acknowledges he doesn't have all the answers?
This is most visible in politics; for example in the UK's ongoing Brexit saga, where those wanting to leave speak with certainty but rarely present any coherent arguments as to what precise benefits will accrue, whereas those wishing to stay use words like "may" and "should" when positing scenarios they claim will follow. This same distinction can easily be seen in the press; you only have to look at the utter conviction expressed in one newspaper and compare it to the nuanced argument presented in another.
So how does all this help us in software engineering? For one thing, we should each place ourselves somewhere on the Dunning-Kruger graph and then behave accordingly. As I said above, when you don't know much at all it's hard to know how much knowledge you are lacking, so the most important action is to go on learning. Don't assume that having found the answer to one problem, the same answer will apply to every other. And try to avoid conveying more certainty than you actually have. It's not a failing to be circumspect; after all, circumspection is one of the things that marks out an expert. People who are always sure of themselves may fool the general population, but they shouldn't be able to fool those of us with more experience.
And if you really are an expert, accept the fact. No false modesty, please. It's not obnoxious to be right, as long as you don't put others down in the process. Learn to judge accurately how good you are compared to others.
One purely practical piece of advice concerns the phrase "read the code", which is too often used to dismiss an inquiry from someone who has been unable to find adequate documentation. Those who come after you may be able to read code fluently, or they may not - it isn't a universal skill. If you have it you are most fortunate, but many others - myself included - find it extremely difficult and rely on documentation to point the way. By omitting documentation you close the door of understanding to anyone with less ability than yourself - and that's before you even consider the advantage you had in being the author of the code.
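To make that concrete, here's a minimal sketch - the function, its name and its business rules are all invented purely for illustration - of the kind of documentation that spares the next reader from having to reverse-engineer intent from the code itself:

```javascript
/**
 * Apply a percentage discount to a price.
 *
 * Why it looks like this: order totals must never go negative, and
 * upstream systems occasionally send discount rates outside 0-1,
 * so the rate is clamped before use. (Hypothetical rules, for illustration.)
 *
 * @param {number} priceCents - original price in integer cents
 * @param {number} rate - discount as a fraction, e.g. 0.15 for 15%
 * @returns {number} discounted price in cents, never below zero
 */
function applyDiscount(priceCents, rate) {
  const clamped = Math.min(Math.max(rate, 0), 1); // guard against bad input
  return Math.max(Math.round(priceCents * (1 - clamped)), 0);
}

console.log(applyDiscount(1000, 0.15)); // 850
```

The body takes seconds to read; what the comment preserves is the why - the reason the rate is clamped at all - which no amount of code-reading can recover.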
Photo by Joao Tzanno on Unsplash
Top comments (2)
Thank you Graham for raising such an interesting and important matter. I feel this has a lot to do (in our computer science domain) with the fact that we expect people to be rapidly operational with never-ending new tools and patterns. I guess people try to survive by adopting a confident posture towards a given technology rather than admitting weakness in it. This is the kind of advice Jeff Bezos gave: "fake the job, then learn how to do it". I think it is absurd, unless you need an army of moderately competent people to do a very time-consuming job.
We should build on solid, battle-tested principles and patterns, and teach people to look to the long term rather than pursuing a chimera made of frameworks with "revolutionary ideas" but poor long-term vision. That is why I so appreciate your work on EasyCoder: it is the kind of technology that answers the need for efficient and sustainable tools to do the job.
Keep up the great articles.
These points are well made. I know I'm somewhat more senior than most (to be honest, that's pretty well 'everyone') but I'm not really conscious of having any more memory problems than in the past. However, the rate of introduction of new technologies and techniques is leaving me stranded, and I imagine a lot of others feel the same. If you aren't living and breathing the stuff you rapidly lose your edge.
The problem may be that I have to do a lot of varied things these days, each one requiring its own knowledge base. I might write JavaScript for a few weeks, then have to spend a couple of weeks on Python. Those few weeks are enough for carefully constructed mental maps to begin to unwind.
I lived on the French/Italian border for several years and, perhaps because I'm not advanced in either language, I noticed a similar effect: it wasn't easy to change gear when moving from one to the other. If the daily routine means using both there's less switching overhead, but going without one of them for even a month or two makes it hard to get instantly back in again.
Human languages don't take on new features as enthusiastically as computer ones, so the problem may well be even harder for us programmers. It takes me many hours to get back into my own code after I haven't seen it for a while, and the problem is much worse when it's code written by someone else (especially if - as noted in my article - they assume I can read it as easily as they can).
I've come to the conclusion that no amount of effort will ever bring me up to speed with all of [Angular, React, Laravel, Zend, Spring, ...], and there's no way of knowing which of them will be called for by the next job, so I side-step the issue by sticking with the vanilla versions of each language, using wherever possible home-grown libraries that have a similar flavor in each and so give me a chance of achieving results. I accept that this way I will never get a job at Facebook or in a big finance house, but I don't actually want to work in a software factory; I want a life.