Jacob Kaplan-Moss

Psychological safety in the InfoSec industry

I wrote this post in 2016, more than 8 years ago. It may be very out of date, partially or totally incorrect. I may even no longer agree with this, or might approach things differently if I wrote this post today. I rarely edit posts after writing them, but if I have there'll be a note at the bottom about what I changed and why. If something in this post is actively harmful or dangerous please get in touch and I'll fix it.

My co-worker Eric Mill recently brought up the topic of psychological safety. Referencing a study by Google that points to psychological safety as a key factor in successful teams, Eric wrote:

Maybe these situations sound familiar to others (they definitely both are to me):

Did you feel like you could ask what the goal was without the risk of sounding like you’re the only one out of the loop? Or did you opt for continuing without clarifying anything, in order to avoid being perceived as someone who is unaware?

[…]

It feels to me like our national security and information safety are directly connected to the psychological safety of our federal staff.

This touched a nerve for me. It’s not just government, either: I believe that the general lack of psychological safety is at the root of many problems in InfoSec.

In particular, there are three problems with our modern security practice that I think come down to a lack of psychological safety:

  1. A culture of shame around personal security practices.
  2. A “blameful culture” that focuses on individual failures rather than systemic ones.
  3. An industry that’s even less diverse than the already-not-great tech industry (while possibly needing diversity even more).

More on each of these points below:

The culture of shame around personal security practices

I come to security from a non-traditional background: I started as a web developer, then moved into operations as “DevOps” became A Thing. From there, I sorta got pulled into security without really planning it. So at this point, although I have an impressive-sounding resume, my knowledge is deep in some areas but still shallow in more areas than not – e.g. I’m sure Eric knows more about PKI and TLS than I ever will, despite my nifty-sounding job titles.

I mention the resume because of a thing that happens to me frequently: because I have “security” in my job title, people seek me out in private and ask for security advice (e.g. how to use password managers, how to make safe backups, etc.). Almost all the time, they sound super-embarrassed asking questions, and apologize profusely for what they call “stupid” questions.

To be clear: these aren’t stupid questions at all – they’re smart ones, demonstrating a strong desire to have better personal security practices. And these are super-smart people, deeply accomplished and able to figure out complex problems in many different disciplines. We’ve created a world where people are super-scared about talking about security things, and feel embarrassed about their personal security practices. They’re not comfortable admitting their lack of knowledge in public, and they’ve held off getting better until they can find a so-called “expert” to ask quietly.

[Not to mention: how many people aren’t asking at all, because they feel too ashamed even to get over the hump of asking one single person for help?]

But here’s the thing: we need communication around security to be open and frequent; these conversations can’t keep happening in private! We rely on our (non-security-expert) staff to be part of our security posture and part of our detection system. That is, researchers repeatedly find that the most effective early-warning system for security incidents is random staff members noticing something suspicious and contacting security (see, for example, Verizon’s DBIR, which makes this point nearly every year).

This far-too-common culture of secrecy and shame around security directly prevents this kind of staff buy-in and open communication.

Security’s blameful culture

Worse still, the lack of psychological safety often means we don’t learn from our mistakes. Coming from an operational background, I’m astounded at how the security culture continues to cling to finger-pointing and sword-falling. Ops folks learned a long time ago that blameless postmortems are the most effective tool in reducing future mistakes. Dig into any successful operations team and you’ll find a culture that allows for a deep and honest analysis of failure, without fear of retribution. (See Sidney Dekker’s The Field Guide to Understanding ‘Human Error’ for a longer discussion of this concept.)

Yet it doesn’t seem that the security industry has learned this lesson: all too often, breach response is an exercise in finding out who to fire. We remain wedded to the “bad apple” theory of error, rather than building a safe environment. If we don’t have a culture that allows us to really learn from our mistakes, we’re going to keep making those same mistakes again and again.

Security’s lack of diversity

Finally, there’s an adversarial nature to security work that tends to bring out the worst of the tech industry’s already-excessive machismo. Overall, as the tech industry grapples with its diversity problems, we’re finally recognizing that we can’t tolerate antisocial and outright abusive attitudes. But the security field has lagged behind substantially.

The security industry is even less diverse than tech overall, and that’s saddening. Psychological safety is a major factor here: safe teams are more welcoming, especially to marginalized people, because members can be more confident that discrimination will be taken seriously. It’s no accident that safe teams correlate strongly with diverse ones.

This is a big deal for security: we desperately need diverse thinking! Securing systems is an especially wicked problem, and these kinds of problems are best solved by diverse thinking. We get diverse thinking by building diverse teams (see Scott Page’s The Difference), so I’d argue that one reason we keep failing is that our teams just aren’t diverse enough to tackle the huge problems we face.

We’re not going to be able to innovate without making security more welcoming, and psychological safety is a huge part of that.