
4 Steps To Make Your Interviews Suck Less

Tech interviews should be fun. They're not. They feel like interrogations. Worse, many of the questions asked in them amount to simple ego stroking and reveal little about the depth of a candidate's abilities.

Tech interviews suck.

What do we need to learn

In an interview, we have a few key things we need to learn about a candidate:

  • Can they do the job we would ask them to do?
  • Would they enjoy doing the job?
  • Are they a good culture fit?
  • How passionate are they about learning and growing?
  • How prepared are we to foster that growth?

To pick these all up in an hour or two is hard, but not unrealistic.

Following a few obvious-sounding but very difficult-to-master steps will do a lot to move our industry forward. It will also help us identify valuable talent, even early in a candidate's career.

Rule 1: Know what every question you ask tells you

Ask yourself what each of your interview questions tells you about a candidate. If your answer is "it tells me if they know ____", we have a problem. Instead, every question should reveal a candidate's level of understanding across a range of topics. How about a couple of examples?

Let's start with a bad interview question I used to ask.

After drawing the CSS box model, I would ask a candidate to identify the border, margin, and padding in the box. Then I would ask them to indicate which part of the box was used to calculate an element's width.
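
For context, the answer that question fishes for is small enough to spell out. A quick sketch, with made-up numbers, assuming the browser default of content-box sizing:

    // Under the default `box-sizing: content-box`, the CSS `width` property sets
    // only the content area; padding and border are painted on top of it, and
    // margin sits outside the box entirely.
    function renderedBoxWidth(contentWidth: number, padding: number, border: number): number {
      return contentWidth + 2 * padding + 2 * border; // margin never contributes
    }

    console.log(renderedBoxWidth(200, 16, 1)); // 234 -> a "width: 200px" element paints 234px wide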

This question sucks. It sucks because the only thing it tells me is whether the candidate knows the box model. I am left to extrapolate whether they can actually apply that knowledge. Conversely, I have to assume that not knowing the box model means a candidate is bad at CSS. Both assumptions. Both bets I would be slow to put money on.

Sucks.

So let's look at an example of a far better interview question:

I want you to build me a blog. I have some requirements in mind. How would you build it?

This is a fascinating question because it is very open-ended. We stand to learn a number of things about a candidate: Are they user-focused or tech-focused? Do they gravitate to front-end or back-end design? Which tech do they choose? How well do they elicit requirements? How well do they adjust for changes? Do they mention usability principles? How about architectural merits? How well do they seem to understand database structures?
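
To make that concrete, here is a purely hypothetical sketch of the kind of data model a candidate might describe, talked through out loud rather than typed. The entities and fields are illustrative, not the answer I am looking for:

    // Illustrative only: one plausible starting point for a blog's data model.
    interface Author {
      id: string;
      name: string;
    }

    interface Post {
      id: string;
      authorId: string;    // references Author.id
      title: string;
      body: string;        // e.g. markdown, rendered at read time
      publishedAt?: Date;  // absent while the post is still a draft
    }

    interface Comment {
      id: string;
      postId: string;      // references Post.id
      authorName: string;
      body: string;
      createdAt: Date;
    }

Every follow-up (add tags, support multiple authors, and so on) then forces the candidate to reshape this model out loud and explain the trade-offs.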

Note that none of those things we learn have a binary "they know/don't know" answer. Instead, we get a feel for a candidate's ability to think through a complex scenario. We get to ask follow-up after follow-up and move the blog forward with a number of changes and see how they adapt. We get to build a system together.

And, bonus, they don't have to write a line of code (more on this later).

Rule 2: Always ask the exact same questions

You have to ask roughly 5 candidates an interview question before it begins to tell you anything about any of them. During this time you will learn how to present the problem best, where candidates will trip up, and whether the problem is too easy or too difficult. Once you have adjusted your problem so that it is both presented clearly and difficult enough that a bell curve of answers starts to appear, stop changing things. For good.

The real value of interview questions is the ability to compare candidates across years. Much the same way that NFL teams compare college athletes to past and present players, you will compare each candidate to those who came before them.

"Alice is stronger than Bob with databases and is about the same culture fit as Eve." - you

In addition to helping you reason about candidates, there is no easier way I have found to bring peers up to speed on prospective hires.

Rule 3: Measure skill adjusted for personality

Personalities are nuanced. They do, and will, affect how a candidate performs in your interview. If you don't manage this, and account for it, you will be letting excellent candidates slip through your fingers.

This is most often seen through nervousness. I can't count the number of times I have seen a quivering hand as a candidate goes to start writing an answer.

I have settled on a couple of ways to avoid this. For starters, I always begin the interview with a question I have already asked the candidate to answer via email. I ask them to explain how they got into tech. Why have they stayed? What do they enjoy about it? These questions are intentionally simple. Not only do they start to put a picture of the person together for me, they also open the candidate up and make them feel more comfortable.

Next, and this will strike many of you as wildly unprofessional, I make a point to swear early on in an interview. That's right. While reciting some anecdote related to their "why tech" answer, I make sure to work a "fuck" or a "bullshit" into my story. The result is remarkably predictable: candidates will double-take at the comment, then end up smiling or laughing.

It may seem boorish, but the results speak for themselves. A candidate who is laughing and relaxed will give answers far more representative of who they really are. That is what we are after. So keep the entire interview conversational.

Which brings me to a previous point about not writing code. During a one-hour interview, I expect a candidate to write around 7 lines of code. All of that code is included in a single algorithmic question, and it is the final question I ask. This is specifically to keep the interview in a style of conversational problem solving. I want to hear how a person thinks. I want to read the excitement in their voice. I can always see more of their code by asking them to code, on their machine, in an environment that will represent them accurately.
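
For a sense of scale, here is a hypothetical example of roughly that size. It is not the actual question I ask; it only illustrates how small the exercise is:

    // Hypothetical stand-in for the final exercise: count word frequencies in a
    // sentence. Small enough to talk through and finish in a few minutes.
    function wordCounts(text: string): Map<string, number> {
      const counts = new Map<string, number>();
      for (const word of text.toLowerCase().split(/\s+/).filter(Boolean)) {
        counts.set(word, (counts.get(word) ?? 0) + 1);
      }
      return counts;
    }

    console.log(wordCounts("the quick brown fox jumps over the lazy fox"));
    // Map { "the" => 2, "quick" => 1, "brown" => 1, "fox" => 2, ... }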

Besides, when was the last time you figured out a hard problem by coding, with semicolons, on a whiteboard? Yeah, me neither.

Rule 4: Rate yourself over time

Finally, it is critical that you are honest with yourself about all of your past hires. Keep a mental list of the hires you consider to be your best and worst. Remember how each group performed throughout your interview process. What parts did you correctly identify and which facets did you miss? Why did you hire the worst, and what did you underestimate about the best?

Personally, I have shown a tendency to hire people with stellar attitudes and drive, even when they underperform on my technical questions. This is a pattern I recognized years ago, and I have adjusted my interpretation of candidates accordingly. I consciously shift how I weight technical performance to account for my personal bias.

Suck less

Truthfully, there is so much nuance in interviews that these rules are more checkpoints than a recipe for surefire success. We all have biases and will naturally gravitate toward certain skill sets and personalities. Being critical about your interview questions is necessary. Scrutinizing your reasons for turning away people is just as important.

While interviewing is subjective, most would agree that tech interviews tend to be objectively bad. I once asked an interviewer "but what does this question actually tell you about me?" and he froze. We need to break that trend. If not for the companies we are hiring for, then break it for the poor person you are tasked with evaluating. You were them once. Now is your chance to make the industry a little better than it was when you arrived.

Written by Ben
Ben is the co-founder of Skyward. He has spent the last 10 years building products and working with startups.