Humanizing asynchronous video interviews for job seekers.
UX Cabin is a remote-first design agency that provides design, research, and web development services for clients.
Async is an in-house product concept for UX Cabin. It aims to use asynchronous video interviews (AVIs) to improve efficiency in recruiting and hiring UX professionals. Job seekers are able to get feedback on their public pre-recorded videos, allowing them to improve their presentation and show their growth to employers.
Job candidates tend to find these interviews impersonal and stressful, because they often don’t receive feedback from hiring teams. They also find it hard to determine if there’s a culture fit with the employer.
UX hiring managers want to know if job seekers have their desired “soft” skills, which is why they’re interested in using asynchronous (pre-recorded) video interviews to assess candidates.
How might we humanize asynchronous (“one-way” pre-recorded) video interviews?
We started the project with the goal of understanding how asynchronous video interviews (AVIs) are perceived. As a research co-lead, I created a weekly research map and oversaw our interview guides, concept tests, and data analysis.
Asynchronous Video Interviews (AVIs) are automated interviews where candidates are guided through text-based questions and prompted to record videos of their answers.
During the initial stakeholder meeting, our client mentioned that as an employer, he liked the efficiency of AVIs. However, he struggled to get candidates to actually record and submit these videos.
Through a competitive analysis of 11 AVI providers, our team identified pain points and preferences among both candidates and hiring managers.
We realized that multiple hiring manager pain points (candidates lying about experience, recruiters without UX experience, and losing candidates due to slow hiring processes) were out of our control. I met with the Design and Product leads, and we jointly decided to focus on the job candidate experience for the rest of our project.
With a narrower scope, we designed our platform around the three main needs of job candidates: quality feedback, role-specific practice questions, and a way to evaluate the company for culture and values fit.
We handed off our personas and main interview insights to the design team so that they could work on ideation and mid-fidelity prototypes. At the same time, the research team developed sketches of main screens for the Async product – we wanted to evaluate our idea before committing to creating high-fidelity screens.
Over remote moderated video calls, we concept tested three low-fidelity ideas with 9 job candidates and received positive feedback on all three.
One concept was an introductory video about the role from a hiring manager or team member, which would give the candidate more insight into the company's culture and values.
Job candidates liked the idea of recording themselves and practicing answers to track improvement. They also preferred role-specific questions to better prepare.
There was a strong desire for substantive feedback on their videos, like the example circled in yellow.
To distinguish Async from existing AVI platforms, we emphasized human connection throughout the process. Employers play a more active role by sharing a video about their team, while job candidates get a platform that helps them practice and improve their video interview skills.
At the end of our two months, we defined the current AVI space, understood how job candidates and other hiring managers feel about the use of AVIs, and prioritized three core features for Async. The Design team created high-fidelity versions of the three tested concepts for our MVP, shown below.
One major limitation of this project was the lack of quantitative research. While we gathered rich insights from user interviews and concept testing, we didn't run a survey, which would have strengthened the validity of our findings.
Another limitation is the challenge of convincing hiring managers to use this platform over other AVI services. Employers would have to take an extra step to record information about their open roles, a concept we would need to test before implementing this product.
This was my first time co-leading a research team, and I really enjoyed planning the research stages and being involved in every part of the process. I gained experience in managing stakeholders, facilitating cross-team collaboration, and conducting Agile research.
Working with a client taught me how to leverage storytelling and prioritize client participation in the design process. By presenting our work in product-friendly language, we were able to get more feedback from UX Cabin, which helped us define our MVP scope to match their needs.