My notes and other stuff


Paper: Anticipatory Thinking

This week's post is taken from older notes I had on Gary Klein's paper titled Anticipatory Thinking. Anticipatory thinking is the process of recognizing difficult challenges that may not be clearly understood until they're encountered, and preparing for them. Overall, it's a sign of expertise in most domains.

Klein mentions that it would be a mistake to conflate "anticipatory thinking" with "predicting what will happen"; in fact, the concept is framed as "gambling with our attention to monitor certain kinds of events and ignore or downplay others." It's deeply tied to uncertainty and ambiguity, and to blending them with various concepts and experiences.

He mentions, for example, that experienced drivers actively scan for hazards and potential trouble spots, whereas beginners just try to stick to specific lanes. The experienced drivers are not expecting or predicting hazards; they are managing their attention knowing hazards could appear.

Part of that focus is not purely probabilistic, and some of it is aimed at low-probability but high-threat events. Prediction is directed at guessing future states of the world, whereas anticipatory thinking is about preparing to respond, not just predicting. The whole activity is centred on what the people doing it could do.

The paper notes three common forms of anticipatory thinking, while also specifying that more forms are expected to be found:

Pattern Matching

Pattern matching takes circumstances of the present situation and brings out similar cues and events from the past. Experts have a repertoire of these and can react instantly:

We will sense that something doesn’t feel right, or that we need to be more vigilant. Greater experience and higher levels of expertise make it more likely that our anticipatory thinking will be accurate and successful. [...] They also carry a danger, namely overconfidence in our experience may lead us to make a diagnosis, but miss something new or novel that may be seen by the naïve observer.

Trajectory Tracking

This one is about being 'ahead of the curve'—looking at where events are going and preparing ourselves for how long it will take to react:

[A]nticipatory thinking here blends our assessment of external events with the preparations we make to handle these events.

Trajectory tracking is different than pattern matching. It requires us to compare what is expected with what is observed. The process of tracking a trajectory and making comparisons is more difficult than directly associating a cue with a threatening outcome.


Convergence

This one is about seeing connections between events:

Instead of responding to a cue, as in pattern matching or to a trajectory, we also need to appreciate the implications of different events and their interdependencies.

The paper mentions a long example about how often this does not happen due to communication and contextual challenges, and contrasts it with a cool example of when it does:

In one high-level Marine Corps exercise [...] The plan left a very weak defense against an attack from the north so the controllers got ready to launch precisely this type of attack. They were gleefully awaiting the panicked response of the Marine unit they were going to punish. However, the Marines had augmented their staff with some experienced Colonels who had formerly been on active duty but now in the reserves.


One of them noted a situation report that an enemy mechanized brigade had just moved its position. That was odd – this unit only moved at night, and it was daytime. He wondered if it might be on an accelerated time schedule and was getting ready to attack.


Checking further, the Colonel talked to the Senior Intelligence Watch Officer who was also suspicious, not because of any event but because of a non-event. The rate of enemy messages had suddenly declined. This looked like the enemy was maintaining radio silence. Based on these kinds of fragments, the Colonel sounded an alert and the unit rapidly generated a plan to counter the attack—just in time. The Colonel didn’t predict the enemy attack; he put together different cues and discovered a vulnerability in his unit’s defenses.


These three mechanisms play together to let the decision maker mentally simulate courses of action and know what sort of problems might arise. Anticipatory thinking is also essential for teamwork and coordination: for teams to be effective, they need to be able to predict each other's actions and reactions to unexpected events.

The author reiterates that problem detection isn't just about accumulating more and more discrepancies until you tip the scales; in most cases it requires re-framing the situation to grasp the significance of things. Anticipatory thinking is part of the mental simulation that helps generate expectations.

To be surprised means you had to have anticipated things (albeit wrongly), and it's a sign that you need to repair common ground/shared understanding. And there are barriers that make such failures more likely.

Common ones are fixation (maintaining a frame despite evidence that it is entirely wrong; think of the Chernobyl operators believing it was impossible the reactor had blown up despite graphite on the ground), explaining away inconsistencies with other knowledge, or being overconfident in your abilities (and therefore badly evaluating risk).

At the organizational level, there are extra barriers around policies that hide weak signals, perverse incentives, gaps between the people with the data and those who know how to interpret it, and challenges in directing people's attention.

In fact, the authors ran experiments to improve anticipatory thinking techniques and noted:

Each scenario included weak signals – to determine if and when the team noticed these signals and their implications.


A key finding is that at least one individual in every group did notice the weak signals and their implications and typically half the group noticed the weak signals, based on the individual notes. However, no team took these early signs seriously. Usually, they weren’t mentioned at all. If mentioned, they were dismissed. So the groups themselves did not "consciously" pick up or act on those signals. Therefore, the challenge shifts from helping people recognize weak signals to helping their groups and organizations take advantage of the anticipatory thinking of individuals.

Improving Anticipatory Thinking

Gotta love papers that tell you how to fix things, rather than just "shit's hard, tough luck". A lot of the suggested fixes seem to refer to Cynefin.

For fixation, they state that you want an outside view to provide a reality check. There's also value in bringing in someone with fresh eyes who isn't stuck in the current interpretation of a situation; a devil's advocate doesn't provide that. Fresh eyes are needed for authentic dissent.

Weak mental models are going to be a limit as well. They refer to a lot of exercises, including one they dub attractors/barriers (which I googled to learn more about: barriers seem to be about creating simple rules to always respect, and attractors about embracing positive reactions in people). They also mention the Future Backwards exercise to increase the lessons learned in sessions. Another trick is to slow down how often people rotate through responsibilities so they have more time to gain experience.

For organizational barriers, Klein breaks them out into two categories.

For between-organization barriers, they mention things that help the flow of ideas, interpretations, and information. One example is creating new units/teams from people who were in competing groups, under a unified hierarchy, so they can share their experience without competing anymore. This is judged more effective than another tip, which is to create "liaison officers" whose role is to ease communication.

For within-organization barriers, they want people to voice unpopular concerns. They once again say that devil's advocates don't work. They mention good results from organizations ritualizing dissent, specifically suggesting PreMortems, a practice where you assume your project has failed and investigate that virtual failure to find plausible mechanisms behind it. Another view comes from high-reliability organizations, which stay mindful of and active toward potential problems rather than dismissive of them.

Another blocker is going to be complexity:

Military organizations try to overcome complexity by structuring situations. The costs of this structuring process include a difficulty in seeing connections that cut across the boundaries and a vulnerability to situations that don’t fit the pre-existing structure.

Automation can also be a blocker. Here they recommend the approach of requiring an active mindset, which means automation should support cognitive work by augmenting people rather than replacing them (which we covered a lot in these various notes).

For team coordination, they mention that it's essential to have a see-attend-act approach, meaning you have to be aware that the people who see a threat, the people in command over it, and the people who can solve it may all be different people.

And that's about it for the paper! It adds a section going deeper, commenting on books and resources that add weight to the idea that anticipatory thinking is very much part of sensemaking but distinct from other macrocognitive functions (decision making, planning, coordination), even if it intersects with all of them.