Massimo for Plank

Hiring without Algorithm Quizzes

The Problem

I have a strong dislike for algorithm-style technical interviews. I understand their relevance in large enterprises as a baseline test of engineering ability. However, at Plank, we're a tight-knit team of 13 developers, and algorithm-style questions don't align with what we're looking for. We want developers with tangible skills that bring value to our clients, and since each of our developers eventually takes on project leadership and architecture, we need to evaluate abstract skills that go beyond solving narrowly defined problems.

As we hire a new developer or two per year, we prefer to keep our technical assessments more open-ended. However, for our latest hire, I stumbled upon an approach that I believe holds value and could develop into an interesting procedure for future hires.

The Solution

I decided to model the technical assessment after a real-world problem a developer might encounter – nothing revolutionary there. Essentially, I ask the candidate to implement a super simplified version of a project we've previously shipped, like The Canadian Encyclopedia (check out our case study on it here). Since doing all that work alone would be overwhelming, we collaborate with the candidate to build out a prototype together.

I can already hear you shouting, "But Massimo, how could you possibly handle all that work for so many different candidates?!" Well, my friend, the answer is pretty obvious – I automated it!

Our procedure involves having the candidate create a repository with a fresh Laravel install and add me as a collaborator. Once that's done, I kick off a script that opens three issues describing features that need to be added. Next, it pushes two new branches with code that solves two of the three issues and opens pull requests (PRs) for those branches, explaining the added features to varying degrees of detail. Finally, it's the candidate's job to review the two PRs and provide the kind of input they would give if they were hired, and then open a PR of their own for the final issue.
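
For context, here's a rough sketch of what the candidate-side setup can look like using the GitHub CLI. The project name and the reviewer username are placeholders, and any equivalent workflow (including the GitHub web UI) works just as well:

# Fresh Laravel install
composer create-project laravel/laravel autodev-interview
cd autodev-interview

# Publish it as a new private repository on GitHub
git init && git add . && git commit -m 'chore: fresh Laravel install'
gh repo create <your-username>/autodev-interview --private --source=. --push

# Add the reviewer as a collaborator (placeholder username)
gh api -X PUT "repos/<your-username>/autodev-interview/collaborators/<reviewer>"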

I've made an effort to capture the experience of working at Plank so that candidates can get a taste of what it's like while also giving them numerous opportunities to shine in various ways.

So What About the Automation?

Ah, the automation! As I mentioned earlier, I kept it simple for the first iteration. I wrote a shell script that clones the candidate's project, uses Git to commit files and create branches, and leverages the GitHub CLI (gh) to create the issues and PRs (among other things). I named this script "Autodev." It may not be much, but I absolutely loved using it. Now, let's take a closer look at what it does.

The Script

First things first, we need to grab the repository and create the issues:

#!/bin/zsh

echo "Enter the target repository name (e.g., plank/repo-name):"
# shellcheck disable=SC2162
read repo

# Grab the repo
cd target || exit
git clone "git@github.com:$repo.git" project
cd project || exit

# Create issues to be solved.
gh issue create --title "Users need to be able to view or create articles" --body-file ../../issue_1.md  --assignee "@me"
gh issue create --title "Users need to take and create quizzes" --body-file ../../issue_2.md --assignee "@me"
gh issue create --title "Users need to be able to browse event timelines" --body-file ../../issue_3.md
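
A quick note on the file layout: the relative paths above and below (../../issue_1.md, ../../pr_1/*, ../../pr_1_body.md) imply a working directory that looks roughly like this. This is my reconstruction from those paths, not the exact repository:

autodev/
├── autodev.sh         # the script itself (name assumed)
├── issue_1.md         # issue body files
├── issue_2.md
├── issue_3.md
├── pr_1/              # prepared files for PR #1 (app/, database/, routes/routes.stub, ...)
├── pr_1_body.md       # PR #1 description
├── pr_2/              # prepared files for PR #2
├── pr_2_body.md
└── target/
    └── project/       # the candidate's repo gets cloned here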

As you can see, I used "body files" to store the issue descriptions (and, later, the PR descriptions). I wanted to keep as much as possible outside of the script, making it easy to repurpose in the future. However, the more I think about it, the more I'm inclined to turn this into a fully fledged web tool or CLI utility. But that's a plan for the future. Next, we needed to create the two PRs, so I added two blocks that look like this:

# ---- PR #1 ----

# Check out a new branch & set upstream
git checkout -b 1-article-feature
git push --set-upstream origin 1-article-feature

# Commit each feature one by one
## Copy everything for this PR into the target folder (the dir we're in); the routes stub gets appended to web.php below rather than overwriting it
rsync -av ../../pr_1/* ./

git add app/Models database
git commit -m 'feat: add definition of articles model, migration, and factory'

git add app/Http
git commit -m 'feat: add controller functions for interacting with articles'

cat ./routes/routes.stub >> ./routes/web.php
rm -rf ./routes/routes.stub
git add routes
git commit -m 'feat: register controller routes with RESTful methods'

# Push changes
git push

# Create PR
gh pr create --title "Add Article CRUD feature" --body-file ../../pr_1_body.md

git checkout main

In this code block, I pull my changed files from a folder named pr_1 (pr_2 for the second PR, and so on). I simply copy the files over to the target project and add some atomic-ish commits. Ideally, I'd have a way to list these commits externally, perhaps in a JSON file, and iterate over them. That would save me from having to update the script itself in the future. Following the same procedure as the issues, I use a body file for each PR.
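
For what it's worth, here's a rough sketch of what that could look like, assuming each pr_n folder also ships a commits.json and that jq is available on the machine running the script. The current script doesn't do any of this; it's just the direction I'd take it:

# commits.json (hypothetical format), one entry per atomic commit:
# [
#   { "paths": ["app/Models", "database"], "message": "feat: add definition of articles model, migration, and factory" },
#   { "paths": ["app/Http"], "message": "feat: add controller functions for interacting with articles" }
# ]

# Replay each entry as a staged commit
jq -c '.[]' ../../pr_1/commits.json | while read -r step; do
  message=$(echo "$step" | jq -r '.message')
  echo "$step" | jq -r '.paths[]' | xargs git add
  git commit -m "$message"
done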

Finally, we clean up with:

# Delete target repo folder
cd ../
rm -rf project

And call it a day!

Results

I won't lie, I was shocked at how well this approach worked. The script never failed on me, even with the variability of what candidates could have committed. It sparked plenty of meaningful discussions with candidates and, I hope, provided them with an enjoyable challenge.

During the interview, we would go through their review comments and discuss how they would guide their pair programmer to improve the lines they flagged. We would also review the feature they implemented, highlighting what they felt they did well and what they would improve. To add another layer, I asked candidates to help me debug an issue I ran into while testing their feature, since I genuinely believe debugging is an essential skill to evaluate in an interview.

One thing I've learned about hiring is that there isn't just one "right" way to approach it. It's an inherently human process, and as much as I enjoy trying (and failing) to automate away my problems, it's crucial to strike a balance. I believe this approach achieves that balance by reducing excessive evaluation work while still maintaining a personal touch.

We believe in the power of human connection and continuous improvement, and as we evolve our hiring process, we remain committed to creating an environment where both evaluators and candidates can flourish. We encourage everyone reading this to share their own hiring processes as well: it's in all of our interest to compare notes and improve the experience for evaluators and candidates alike.
