Jacob Kaplan-Moss

Thinking About Risk:

Sidebar #2: The Swiss Cheese Model

This is the fourth part of my series on thinking about risk, and the second “sidebar” to part 1, my basic introduction to risk. It’ll make the most sense if you’ve read that piece and understand the terms risk, likelihood, and impact discussed there.

Why do accidents happen?

One of the riskiest near misses1 I’ve experienced came on a late fall trip to the Boundary Waters, when a member of our group developed mild hypothermia2. Hypothermia is nearly always a result of poor risk management: this type develops gradually from extended exposure to cold, and while it’s easy to prevent (stop and warm up before you get there!), it becomes increasingly hard to treat the worse it gets. In the end everything turned out okay: we were able to rewarm them with sleeping bags and warm sugary drinks. Still, it was quite spooky, so we cut our trip short and exited the next morning.

Several things went wrong to get us to this point:

  • We chose to take a trip in northern Minnesota late into the shoulder season (this was the first week of October, if I recall correctly), when weather is unpredictable and can get quite cold.
  • While we had reasonable gear (warm layers, sleeping bags) for the conditions, we didn’t have much in the way of spare warm clothes, thus relying on keeping the warm layers we had dry.
  • We flipped a boat earlier that day – a simple accident, non-consequential in the moment.
  • Because it was sunny and warm when we swamped, we decided to continue in wet clothes, relying on the sun staying out so we could dry our clothes in camp in a few hours.
  • But the weather changed rapidly (a surprise, but a predictable one), so it was cold and raining by the time we arrived in camp.
  • Our campsite had very little in the way of good fire material, so we had limited rewarming options.

I tell this story because it follows an extremely common pattern. True fluke accidents are incredibly rare; most of the time, accidents happen not because of a single isolated incident, but because of a series of small missteps that add up to serious consequences.

The Swiss Cheese Model

In risk management, we refer to this as the Swiss cheese model: we imagine human systems as a series of slices of Swiss cheese. Each hole in each slice of cheese is a gap in our risk management. Often, that gap is mitigated by another layer with a hole in a different place. But sometimes, all the holes “line up”, and an accident happens:

Three slices of cheese with holes, red arrows running from left to right. Most arrows don't go all the way through, but for one, all the holes line up, and the arrow goes all the way through all three slices.

By Wikipedia user BenAveling, CC BY-SA 4.0.

This shows up incredibly often, in all different fields. Perhaps the example most relevant to my readers: in information security, it’s terrifically common for breaches to occur when an attacker chains together several low-severity issues.

This concept is important to wrap your mind around, because it complicates dealing with “low risk” situations! It’s very tempting to look at a low risk issue and decide the risk is acceptable – after all, there’s not much risk. But we have to look at the big picture, and think about how this small risk might combine with other minor issues to cause something much more serious.

This is what I mean when I say that hypothermia is a risk management failure: not stopping to get someone out of wet clothes seems like a very low risk move, particularly on a warm day, but failing to consider other factors (changing weather, ease of making a fire, availability of dry clothes, etc.) means that multiple low risks compound.

A difficulty you might be noting here is: how, exactly, can we aggregate multiple risks? That is, if we’re dealing with three “lows” and two “mediums”, what’s the overall risk? We can’t exactly do the math: 3 * Low + 2 * Medium = ???.

We’ll come back to this when we revisit quantitative measurement (which will be sidebar #4), but here’s a simple heuristic I like: “three yellows make a red”. The idea is that any given warning sign (a “yellow flag”) might not be enough to change your behavior, but three (or more) warning signs all at once should add up to a “red flag”, a major signal to stop immediately and change course.
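If it helps to see the heuristic written out concretely, here’s a minimal sketch in code. The function name, flag labels, and threshold are all illustrative inventions, not a formal model from this series – just the “count the yellows” rule made explicit:

```python
def aggregate_flags(flags, threshold=3):
    """Apply the "three yellows make a red" heuristic.

    Any single red flag, or `threshold` (or more) yellow flags at
    once, signals "red": stop immediately and change course.
    """
    yellows = sum(1 for f in flags if f == "yellow")
    if "red" in flags or yellows >= threshold:
        return "red"
    return "yellow" if yellows else "green"

# Wet clothes + changing weather + poor fire material: three yellows.
print(aggregate_flags(["yellow", "yellow", "yellow"]))  # red
print(aggregate_flags(["yellow", "yellow"]))            # yellow
```

The point isn’t the arithmetic – it’s that the rule forces you to count warning signs together rather than dismissing each one in isolation.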

This rule is a very simple way of aggregating multiple risks – but it points to the most important takeaway from the Swiss Cheese Model: you can’t consider risks in isolation. You have to look at the entirety of the situation, and how multiple risks might align to create a situation that’s worse than the sum of its parts.


You can find all posts in this series here and follow me in various ways. And if you’ve got questions, or topics you’d like to see covered in this series, please get in touch.


  1. In risk circles, we use the term “near miss” to describe an event that turned out fine but was scarily close to being serious and consequential. ↩︎

  2. Despite the name, “mild” hypothermia is quite serious – a life-threatening condition. It’s very different from just being cold: once someone progresses to bonafide hypothermia, they’re unable to rewarm on their own without immediate and extensive external help – and this isn’t always possible in the wilderness. ↩︎