That Wild Ask A Manager Story
You may have seen this wild story from Ask A Manager a week ago: the new hire who showed up is not the same person we interviewed. If not, take a minute to read the story; it’s quite a ride. In short: the company interviewed someone, hired him, but when he showed up for his first day of work … it was a totally different person.
The story made the rounds of all my online social groups, where we mostly just laughed at how bizarre it was. But in one Slack, the conversation continued, and we started taking it seriously, discussing how we’d respond if this happened to us. A friend asked, “if this was your hire, and your manager asked you to change your hiring practices to prevent this, what would you do?”
Nothing. I would do nothing. I would not try to prevent candidates from getting someone else to pose as them during interviews.
The premise here is simple: designing a human process around pathological cases leads to processes that are themselves pathological.
Hiring processes are designed to measure candidates on a “fit for the role” spectrum; they try to predict how well (or poorly) a candidate would perform, if hired. So our selection process measures behaviors correlated with job performance. But this behavior – having someone else pretend to be you during a job interview – is so far outside anything that normal people would consider that it’s simply not measurable on the same axis as anything job-related.
If we start imagining steps we could add to the interview process to protect against an imposter candidate, the “solutions” we come up with are quite aggressive. We could ask candidates, on video (or in person), to show a photo ID, and match the ID against the resume. But this would seem very weird. It starts the interview off in a hostile manner, and sends a strong message of distrust. Honest candidates – who are, remember, the vast majority – will wonder why the heck this company is acting so strangely, and will rightly see it as a red flag about the company culture. There will be other negative consequences for your hiring practices, too. For example, anyone who goes by a name that doesn’t match their government ID could be forced into an uncomfortable explanation. Congratulations: your attempt to identify fraudsters has accidentally created a transphobic hiring practice.
When we design software systems, we learn to think deeply about corner cases. We code defensively, making sure our systems handle all sorts of uncommon events. Amazon promises 99.999999999% durability for objects stored in S3, and even then I’ve written code that handles the 0.000000001% chance that an object disappears.
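That kind of defensive check can be sketched in a few lines. This is an illustrative toy, not my actual code: a plain in-memory dictionary stands in for S3, and the function and key names are made up for the example.

```python
def fetch_with_fallback(fetch, key, fallback):
    """Defensively fetch an object; if it has vanished (the vanishingly
    rare case), return a fallback instead of crashing."""
    try:
        return fetch(key)
    except KeyError:
        # The "0.000000001% chance" branch: the object is gone.
        return fallback

# A toy in-memory "store" standing in for S3:
store = {"report.csv": b"a,b\n1,2\n"}

def fetch(key):
    return store[key]  # raises KeyError if the object disappeared

present = fetch_with_fallback(fetch, "report.csv", b"")
missing = fetch_with_fallback(fetch, "missing.csv", b"")
```

The point isn’t the code itself; it’s that the extra branch costs nothing, and the dictionary doesn’t mind being double-checked.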
Designing human systems is different. Computers don’t have emotions; I don’t need to worry about insulting the vast majority of S3 objects when I defensively check integrity every time. But humans do have emotions; when we design a human system around uncommon cases, we need to consider the ramifications for the majority. There are times – and this is one of them – when addressing outlandish behavior requires steps that are simply unacceptable.
This isn’t always the case — preventing financial fraud, for example, often leans on practices like surprise audits, which are uncomfortable for honest people and fraudsters alike. But the cost of embezzlement is high enough that organizations can justify audits.
But the cost of fraud in this hiring case is … pretty low actually? In fact, the system basically worked: this person showed up on his first day, was immediately recognized as someone else, and quit within the week.
When something goes wrong, our impulse is to make some sort of change that prevents the problem from ever happening again. But when we’re talking about human systems, we need to measure the cost. Preventing pathological behavior in human systems often creates highly uncomfortable and off-putting processes. In human systems, not all corner cases need fixes. Sometimes the best response is just to roll your eyes, say “people, am I right?” and get back to work.