How does a “jump to conclusion” actually work?
Like a pane of glass framing and subtly distorting our vision, mental models determine what we see. We all use cognitive “mental maps” to navigate through the complex environments of our world. And all of our mental maps are flawed in some way.
Research in brain science suggests that mental maps play a major role in perception and behaviour: by some estimates, as much as 60% of what we see is a reprocessing of information already in our brains. We live in a world of self-generated beliefs, and these usually remain largely untested. We adopt those beliefs because they are based on conclusions, which are inferred from what we observe, plus our past experience. Our ability to achieve the results we truly desire is eroded by our feelings that:
- Our beliefs are the truth.
- The truth is obvious.
- Our beliefs are based on real data.
- The data we select are the real data.
For example:
I am making a sales pitch to an executive team. They all seem engaged and interested except for one guy, sitting near the door, who seems a little agitated. He looks at his watch and shakes his head. He asks no questions until I am almost finished, at which point he breaks in and suggests I send in a proposal. Everyone starts to shuffle their papers and put their notes away. I move towards my seat.
My presentation has bombed just because of one negative guy. He has obviously written me off. And now he has cut off my opportunity to reach the team. He must be trying to ensure I don’t get access to this decision-making group. This is a shame, because the people reporting to this person have told me that these ideas are exactly what his department needs. In fact, when I think about it, he really tried to dissuade his people from talking to me in the first place. He must be threatened by what I have to offer. And now, because I have rubbed him up the wrong way (which in fact was unavoidable), he has closed off any opportunity I may have had to work with this organisation. What a pain! Another closed door. Let me gather whatever pride and respect are still left to me and get out of here as fast as I can.
I decide to give up. By the time I have sat down I have decided to drop this group as a potential client. It would be too hard to convince this guy that I have something to offer. It is too bad I have an enemy who can change the thinking in the team.
In the few seconds before I take my seat, I have climbed up the “ladder of inference,” a set of mental steps of increasing abstraction, often leading to misguided beliefs:
Let’s trace the steps up the ladder of inference:
- The first step was the request for a proposal. This is clear data, which would show up the same to everyone on a video recording of the presentation.
- I selected specific details from the way he spoke and the way he turned away as he spoke and shook his head and looked at his watch. (I did not notice him glancing at his associate and nodding a moment before …)
- I added some meanings of my own, based on my past experience. (“People are threatened by what I am selling…”). And I remain unaware that this person has dropped an important meeting to hear what I have to say, and has to rush off to another.
- I make an assumption about what this person is thinking. (“He is threatened…”)
- I draw conclusions. (“This is another lost prospect.”)
- I adopt or reinforce a belief. (“I can’t sell this stuff…”)
- When I reach the top of the ladder I am ready to walk away from this group, even though in reality, they are quite keen to hear more from me.
It all seems so reasonable. But it is wrong. And it happens so quickly. I am not even aware I’ve done it. Neither is anyone else. The only parts visible to anyone else are the directly observable data at the bottom, and my own decision to take action at the top. Stepping up the ladder takes place in my head, abstract, unseen, unquestioned. If we allow this to be so.
We all spend a lot of time on the ladder of inference. With each conclusion we reinforce a previously held belief. The more we decide we cannot make a sale to certain people, based on filtered data, added meanings, untested assumptions and our own conclusions, the more we reinforce our belief that we cannot sell to this type of person. This is tight, tautological reasoning, a deadly embrace in our own mind. To make it worse, our actions often elicit the very negative behaviours that we expect to see.
Intuition is important for effective living. But untested intuition can lead us into confusion. Imagine discussing a sensitive issue in a team, with all of our untested assumptions, meanings, conclusions and beliefs. The air grows thick with misunderstandings and weak compromises, and communication breaks down, again. We all walk away feeling disheartened, with a vague sense that we should have been able to make something more positive come out of our discussion. Inspecting our ladder of inference is one of the ways we can review our intuitions, both internally and with a group of people.
Why did that person look at his watch? Was he bored with my presentation? Was he threatened or critical? Or was he facing another, totally different crisis? We will not know until we find a way to check these assumptions. Stopping to check others’ thinking is not always appropriate, or effective. Stopping my presentation to comment on clock-watching may have created diversions and upsets, perhaps without either of us getting any nearer the truth. The “Fundamental Attribution Error” tells us that we usually attribute what we think are negative actions by someone else to a flaw in their character. Sometimes it is enough to know about the ladder of inference, collect our observations, and reflect on them after the event.
The ladder of inference can help us in three ways to clear all of this up:
- Becoming aware of our own thinking through reflection
- Making our thinking and reasoning more visible to others through advocacy
- Learning more about others’ thinking, through inquiry.
We can do this by asking questions such as:
- What part of this presentation did you like the most?
- What is the observable data behind that statement?
- Can you run me through that reasoning?
- How did you get from the (observable data) to those conclusions?
- What assumptions did you make to get there?
- When you said “[your inference]” did you mean “[my interpretation of it]”?
- You can ask for data through open questions: “What was your reaction to my presentation?”
- You can test assumptions: “Is this not what you expected?”
- You can test observable data: “I notice you are looking at your watch.”
- Or you can name your own move up the ladder: “I am moving up the ladder to these conclusions, and maybe we all are. Let’s share our views. What is the observable data?”
When we all use the ladder it becomes a powerful tool for healthy communication. It is energising to show others the links in your reasoning. Even if they do not agree with you, they can see how you got to your thinking, and they can show you where they are going. You may also surprise yourself when you understand how you got to where you are.
The Ladder of Inference is a useful model to help us understand how we think. You can read about more conversation models here.
You can also find out more about how we assist with conversations on the StrategyWorks website. While you are there, why not sign up for my monthly newsletter? You can see some examples here.
Hi Stephen
We discussed this model in 2004, and when you mentioned it today (in 2011) I recognised the title but not the detail. It was well worth another careful read, and I can see why this post is so popular and so often downloaded. A strong connection exists to the Crucial Conversations/Crucial Confrontations model, which describes the path from “see or hear” to “my story” to an emotion (often negative in the extreme) to action, often with disastrous results.
Many thanks for the insightful conversation!