Introduction to Work Sample Tests
Earlier this year, I wrote a series on interview questions. Good interview questions are one key to hiring well, but they’re not the only key. Today, I’m starting a new series on another critical factor in effective hiring: using work sample tests, aka practical exercises.
Why do we need work sample tests?
Interviewing is a weird exercise in prediction. We’re looking to select someone who’s going to be good at Tasks A, B, and C – the duties of the job – but we ask them to perform Tasks D, E, and F – the parts of our interview process. Then we hope that performance on D, E, and F predicts performance on A, B, and C.
Interview questions are a version of this prediction exercise. We ask people questions about Tasks A, B, and C, about times they’ve performed those tasks in the past, and use that to make some predictions about how they’ll perform in the future. Done right, there is good predictive value here. There’s evidence that behavioral interview questions do predict job performance. I’ve tracked the performance of many of my interview questions, including the ones I’ve shared here, and that data shows that interviews are effective. But they’re far from perfect: I’ve hired candidates who did well on interview questions but still struggled to deliver. And I’m sure I’ve rejected candidates who would have done just fine but for whatever reason interviewed poorly. It’s an imperfect correlation, to be sure.
If you take a step back, the concept of interview questions is weird. The way to know with absolute certainty that someone can do the job is to see them do the job. If we’re trying to hire a Django developer to work on our app, why not just have them dive in and work on the app?
Of course, it’s not that simple. If it were, we wouldn’t need job interviews. Software development is complex. It can take hours or even days just to get a development environment set up. It usually takes weeks or months for a developer to learn enough about an existing codebase to be productive. And software development isn’t just about writing code: there are all the interpersonal and teamwork skills too, and it can take weeks or months for a new person to gel with the team. We can’t ask candidates to spend weeks or months working with us for free before we decide to hire them: that would make hiring take forever and would be unfair to candidates.
But if “actually work together” is the gold standard, what if we found a way to do it in a compact, timely fashion? That’s exactly the line of thinking that leads to work sample tests.
What are work sample tests?
Work sample tests[^1] are exercises: simulations, small slices of real day-to-day work that we ask candidates to perform. They’re practical, hands-on, and very close (or even identical) to actual tasks the person would perform if hired. They’re also small, constrained, and simplified enough to be fair to include in a job selection process.
To give you a more concrete idea of what I’m talking about, here are several examples of work sample tests I’ve used:
| Role | Work Sample Test |
|---|---|
| Penetration Tester | Find vulnerabilities in a deliberately vulnerable example app, like Gruyere |
| Software Engineer | Review a pull request (real or a constructed sample) |
| Technical Writer | Document a small bit of code, like a CLI app or a small API |
| Infrastructure Engineer | Write Terraform/CloudFormation/etc. code to produce an environment matching a given diagram or documentation |
| Software Engineer | Given a bug report and a codebase, write a test for and fix that bug (see the sketch after this table) |
| Developer Relations | Write and deliver a short presentation |
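
To make one of these concrete, here’s a minimal sketch of the bug-fix exercise in Python. Everything in it is invented for illustration: the `slugify` helper, the bug report, and the tests are all hypothetical, not from any real codebase. The candidate’s job is to turn the bug report into a failing regression test, then make it pass.

```python
# Hypothetical bug report: "slugify() drops the last character of any title
# that ends in a letter or digit." A candidate's first move is a failing
# regression test; then comes the fix.

import re


def slugify(title: str) -> str:
    """Convert a title to a URL slug, e.g. 'Hello, World!' -> 'hello-world'."""
    cleaned = re.sub(r"[^a-z0-9]+", "-", title.lower())
    # The (invented) buggy version returned cleaned[:-1], which chopped the
    # final character whenever the title ended with a letter or digit.
    # strip("-") is the fix: it removes only separator padding at the ends.
    return cleaned.strip("-")


def test_slugify_keeps_final_character():
    # Fails against the buggy version ("hello-worl"), passes after the fix.
    assert slugify("Hello, World") == "hello-world"


def test_slugify_collapses_punctuation_runs():
    assert slugify("Work -- Sample... Tests") == "work-sample-tests"


if __name__ == "__main__":
    test_slugify_keeps_final_character()
    test_slugify_collapses_punctuation_runs()
    print("ok")
```

An exercise like this stays small enough to be fair in a hiring process, yet it touches real day-to-day skills: reading a bug report, writing a regression test, and making a minimal, targeted fix.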
Hopefully, this gives you a flavor of what “work sample test” means. The rest of this series will get much deeper into more examples and how to use them.
Interviews aren’t enough: hiring effectively requires work sample tests
The worst hiring experience I ever had involved a skipped work sample test. We hired an infrastructure engineer who claimed several years of AWS experience, and part of the hiring process was supposed to be a practical infrastructure exercise. So imagine my surprise when he started and I discovered that he couldn’t accomplish even the simplest AWS tasks (it took him over a week to create an S3 bucket). After a frustrating six weeks of discovering just how deep his lack of skills went, I found out that he’d never actually been given the work sample test. If he had, we’d have discovered, before hiring him, that his resume listed skills he didn’t have.
Lots of things went wrong here other than the skipped work sample test: the rest of our interviews were insufficiently rigorous, and we were sloppy about checking references, so we never discovered that his resume was an outright fabrication listing work history that wasn’t true (he claimed employment at a company, and we never called to check). So we can’t pin this bad experience on the missing work sample test alone. But still: a practical exercise would have caught this fabrication!
This incident was an extreme outlier, but the pattern behind it isn’t isolated: when I review the people I’ve managed with the worst performance, the unifying factor is that none of them were asked to perform work sample tests. I’ve never seen domain-specific performance issues (in engineering, security, etc.) from someone who passed a work sample test. I’ve had many “near misses”, too, where a work sample test kept us from hiring someone who turned out to be a good interviewee but a bad engineer. And on the flip side, I’ve hired some fantastic candidates on the strength of their work sample tests, despite mediocre interview performance.
The fact is that asking questions about experience can only get you so far. Interviewing fundamentally measures a candidate’s skill at interviewing! Done right, that skill correlates with job performance, but only imperfectly. There’s a small but significant number of people who can talk a great game but can’t execute, and you’ll miss out on people who would be great at the job but interview poorly. If you’re hiring for a role that needs specific skills, asking questions about those skills isn’t enough. Only by asking candidates to actually use those skills can you weed out the bullshit artists.
Hopefully, this has convinced you that some form of work sample test is critical to hiring well. (If not: email me!) However, work sample tests are also a minefield: the space is littered with silly practices like whiteboarding, FizzBuzz, Leetcode, and “reverse a linked list”-style bullshit. The point of this series is to separate these silly practices from the good ones and to give you a framework and several examples to use in your hiring rounds.
The next couple of parts of this series will explain the principles behind effective work sample tests (and why I call things like FizzBuzz “silly”). After that, I’ll share four or five examples of effective work sample tests that you can use in your own hiring processes. This series will mostly cover engineering and engineering-adjacent work sample tests, since that’s what I know best, but I think the principles should apply to most roles.
Watch this space, and if you want to be notified when those drop, you can follow me on Twitter or subscribe to my RSS feed or newsletter.
[^1]: Other terms that people sometimes use are “technical exercise”, “practical exercise”, “practical interview”, etc. I’ve chosen “work sample test” for this series for a couple of reasons. First, it seems the most inclusive of different kinds of exercises; not all work sample tests are “technical” (whatever that means). Second, it appears to be the term most commonly used in the academic literature and the HR field, so using it will help people new to the idea perform literature searches and the like.