It’s useful to be clear about our decision-making so that we can improve on the poor decisions and repeat the good ones. Even if we don’t know how a decision will turn out, it’s valuable to be explicit about the thinking behind it so that we can understand it better in future.
Recently I read an old article on the HBR blog which discussed this. The authors first introduced the two kinds of thinking we do: System 1 and System 2.
The first, System 1 thinking, is automatic, instinctive, and emotional. It relies on mental shortcuts that generate intuitive answers to problems as they arise. The second, System 2, is slow, logical, and deliberate.
They went on to say that while System 1 thinking is useful (it’s fast, and uses little energy) it’s subject to subtle cognitive biases that are very difficult to spot at the time, and can lead to very bad outcomes. They continue:
We do not mean to suggest that System 1 should be entirely suppressed in order to promote sound decisions. The intuitive reactions of System 1 serve as important inputs in the decision-making process.
The trick is to spot when a System 1 decision is being made—particularly for a decision that might be significant—and subject it to a bit of System 2 scrutiny.
Let me give a couple of examples.
I often sit in a meeting of technical people discussing potential solutions to a problem, and at one suggestion (quite possibly mine) someone will say, “That’s a terrible idea”. That seems like a System 1 reaction to me, even though they might be right. Usually I ask why, even if I think I know the answer, although I tend to use less combative words: “Can you talk us through the thinking behind that?” I want them not only to justify their decision, but also to help others learn how they think. That’s especially useful if they’re more experienced than others in the room. If their thinking includes certain assumptions then we can check whether those assumptions are true, and if the assumptions are fair but turn out later to be misinformed then it’s easier to know that we should go back and revise the decision.
Another example is in recruiting. I have occasionally come out of interviewing someone and thought to myself “They’re a No…” with some vague reason, such as “…they just didn’t do it for me.” That’s definitely System 1 thinking coming out. It’s also very unfair to the candidate. Having to give clear feedback is a good way to subject that thought to some rigorous System 2 analysis. Why didn’t they do it for me? What was I expecting? What were they doing when I first had that thought? When real answers emerge I’m better able to say what I’m looking for (to the candidate and/or the recruitment agent), as well as check my thinking with colleagues. Sometimes I learn a bit more about my own biases. In all cases the recruitment process improves.
Our instincts aren’t bad things—they exist for a reason. Capturing them and exposing them allows us to handle them better.