Cognitive Humility is a concept we never talk about in our HF/E education or training, but I think it is much more important than we realize. I am going to define it a little more specifically than either Annie Murphy Paul or David Brooks do.
Brooks, the New York Times op-ed columnist, has for the past couple of years taught a course called “Humility” at Yale… The purpose of the course, according to its description in the catalog, is to study “traditions of modesty and humility in character building and political leadership,” and to explore “the premise that human beings are blessed with many talents but are also burdened by sinfulness, ignorance, and weakness.”
We know that there are cognitive processes that evolved over many thousands of years to help us with our information processing. For most of that time, speed was often more important than accuracy, and there were only short-term results to worry about. Tomorrow we could be dead.
Now, in modern environments where we have long-term concerns, where accuracy is often more important than speed, and where there is a ton of information available to process, these processes can steer us wrong. But I don’t agree with the trend of calling them biases, because they are still helpful much of the time. It is not just that they often don’t steer us wrong; they can actually outperform more conscious, deliberative information processing. But I have written about that before (and I am sure I will again), so I will skip that rabbit hole today.
In general, people are better at seeing these heuristics in others than in themselves. We can tell when someone else is self-deluding or cherry-picking their evidence. But the processes are quite subtle and happen in brain regions that are not consciously accessible, so we often miss them in ourselves. We might see them in retrospect during an incident investigation or a debrief. But in real time it is really hard.
In Annie Murphy Paul and David Brooks’ definitions of cognitive humility, the first aspect is the metacognitive awareness to realize that you are susceptible to these error-inducing processes. The second aspect is the metacognitive skill to recognize in advance the kinds of environments where they might appear, so that you can look out for them. The third aspect is the metacognitive insight to diagnose them in real time so that you can avoid or suppress them.
But there is another aspect that I think gets at the heart of cognitive humility better than any of these. Realizing that you are susceptible is different from truly admitting it to yourself and to others around you. Too many of us (and by us I specifically mean those in HF/E and other behavioral sciences) carry a false humility about these processes. We might admit in theory that we are susceptible, but do we really believe it in our heart of hearts? In a really visceral way? Some of you might. But I am not always convinced.