Unpacking Interview Questions:
Interview Question Series Wrap Up
This is a wrap-up to my unpacking interview questions series, where I dissected some of my favorite questions to ask in job interviews. If you haven’t read the series, you may want to do that first and come back here for the wrap-up.
I’ve spent at least the last decade gradually getting obsessed with improving engineering hiring. It’s perhaps the most impactful thing a manager does. The crux of hiring is the interview, and yet there’s actually very little great advice on interviewing out there. You’ll find plenty of lists of questions, but very few that are developed to the level of rigor that I think is appropriate.
So, I wanted to write this series for a couple of reasons¹:
I wanted to share some of the questions I use. They’ve been developed carefully, tested frequently, refined, and they have proven results. I’m pretty proud of the work that’s gone into them, and I wanted to show off a bit!
I wanted to demonstrate, by example, the level of rigor I think is appropriate when developing interview questions. My hope is that even if folks don’t use my questions, they’ll be inspired to put in similar work.
If there’s one thing you take away from this series, I hope it’s this: spend more time putting together standardized, tested, rigorous interview guides and rubrics. Doing so will make your hiring process more accurate and more fair. Hiring managers: this is the most important part of your job. Don’t half-ass it.
That’s what most of this post is about: I’ll step back a bit and talk about developing your own questions. There’s also a brief FAQ at the end, from questions I got throughout the week.
Writing your own questions
You may have noticed that all the questions in this series have the same structure. This is by design; it’s intended to give you a template for writing your own questions. Let’s make this structure explicit; here are the parts:
The value or skill that the question is designed to measure. This should be something concrete and measurable, and of course something required for the job. Ideally, it will be the same as an area that will later be used in this role’s performance review. That way, you can correlate performance on interview questions with performance in the job.
The question: I’ll be writing more about different types of questions, and which are most effective, in the future. Short version: behavioral questions (“tell me about a time when…”) are the gold standard; hypothetical questions (“tell me what you would do if…”) can be OK too. (See also this discussion of behavioral questions from the disagreement question.)
Follow-ups: what other questions might you want to ask after the main question? These should help guide candidates towards the important areas to talk about, help interviewers dig deeper, etc.
Behaviors to look for: what should an interviewer be paying attention to as they assess a candidate’s answer? More on what I mean by “behaviors” below.
Positive signs / red flags: what are the strongest signals of a positive or a negative answer?
The key to all of this is getting detailed and explicit about what, specifically, you’re looking for. This is especially true of the behaviors, positive signs, and red flags. The more you can standardize, the more effective the question will be. A standard rubric will let you more accurately compare candidates to each other; make you more confident about your decision to hire or pass; help you spread the load of interviewing across your team; and so much more.
On “Behaviors”
I use the word “behavior” a lot above, and throughout the series, so it’s worth expanding on what I mean. A behavior is an observable action someone takes: what they do, what they say, and how they say it.
In interviews, it’s important to measure behaviors, and not your interpretation of those behaviors. You can’t know what someone is thinking or feeling on the inside; you can only observe what they say and do.
For example, you might observe that a candidate stumbles over words, frequently stops mid-sentence and corrects themselves, and looks flushed. Those are behaviors: you can observe them directly. You might reasonably conclude from those behaviors that the candidate is nervous, but “is nervous” is not a behavior! If you were writing a question that selected for confidence in speaking (maybe for a sales or developer relations role), you should avoid having “confidence” or “nervousness” in your answer rubrics: this leaves room for interpretation differences among different interviewers and candidates. Instead, you should try to unpack what behaviors signal confidence, and seek to measure those.
Tone is a behavior - but be careful
You’ll notice I include how people speak as a behavior. That’s because it is: tone is something directly observable, and it is an appropriate thing to be looking for. For example, in my question on explaining a topic, I have this red flag:
- 🚩 Patronizing tone, especially when explaining “down”
This is quite important to look out for – if a candidate can’t explain a topic without using a patronizing or condescending tone, that’s a really bad sign.
However, as you might imagine, interpreting tone is fairly hard. What, specifically, do I mean by “patronizing?” Most people would probably know it when they hear it, but that’s somewhat dangerous to depend on. This red flag would be better if I could unpack specifically what I mean by “patronizing”.
This is especially important when dealing with tone as it intersects bias. There are some specific ways that our perception of tone can be colored by racism and sexism. For example, white people tend to perceive Black people as “angry” more easily than they do other white people.
It is appropriate – and useful – to measure tone. But, be very careful when including tone as a behavior in your own questions, and question whether it’s appropriate to include any specific tone or whether it’s too likely to be colored by bias.
FAQ
Finally, answers to some common questions I got about the series:
Can I use these questions in my own interviews?
Please do!
If you use these in any sort of internal guides you develop, I’d appreciate it if you cite the source in those guides. I can’t and won’t enforce this, but please make an attempt.
And, if you do use any of these questions, I’d love to hear about how it went!
“But diversity is bad actually”
First off, this isn’t a question. But I want to address it anyway:
This series was posted to Hacker News, and as you might guess, many people there objected to asking about diversity, equity, and inclusion. I don’t care to argue with people who don’t believe that the tech industry is a hostile place for underrepresented people, or who don’t believe the research showing that diverse teams are more successful, or who think that we don’t have an obligation to work towards equity.
But I do think this objection misses the point of asking about values, and that’s worth digging into. As I wrote in that post:
There’s a class of organizational values that you might consider non-negotiable – everything from DEI to agile development, use of CI/CD, “everyone wears the pager”-style help rotations, etc. In these cases, […] you need to be sure that everyone understands the value and is aligned with it.
All organizations have some basic values that everyone needs to align with. People who aren’t aligned with those values aren’t going to be successful (and will probably be quite miserable). This is precisely the point of an interview: to figure out if someone will be successful. So, if you have deal-breaker values like these, they need to be covered in the interview.
In fact, the objections to asking this question kinda prove why it’s so important. Diversity, equity, and inclusion are important values at the organizations where I choose to work. Someone who’s not aligned with those values is going to hate working for me. If I didn’t ask, someone from the “diversity is bad actually” camp might end up getting hired, and we’d both be pretty unhappy about that.
Many of these questions ask about unverifiable facts; aren’t you afraid that candidates will lie?
Some people worry that the kind of questions I ask give candidates the opportunity to lie. For example, when I ask about a disagreement at work, how do I know that the candidate didn’t just make up something they thought I’d want to hear?
Mostly, I just don’t worry about lying in job interviews. It’s safe to assume that candidates will be mostly honest during job interviews. They’ll of course tend to select stories that paint them in the best light possible, but I don’t really consider that dishonesty. It’s just good editing. In 15 years as a hiring manager, and hundreds of interviews, I’m aware of only two candidates outright lying. I just don’t see the point of spending much effort trying to control for something that happens less than 1% of the time.
I do, however, push for specific details as a general practice. So if someone says they deployed a site to AWS, I might have a line of questions about specific services, what problems they ran into and how they solved them, how specific details were configured, etc. This is mostly to help me assess the depth of a candidate’s experience, but it can also show if someone is exaggerating their experience.
Mostly, though, the best way to control for honesty is through reference checks – but that’s a topic for another time.
What’s next?
That’s all for the series, for now. I’ll be writing more about interviewing and hiring in the future, so if you have a topic that you’d like me to address, tweet at or DM me, or send me an email (jacob at this domain).
I also had a more internal, process goal: I’m trying to write more this year, and I wanted to challenge myself to publish something every day for a week. This series was a good fit for that process challenge. ↩︎