
Asynchronous video interview assessment tool

Practical guides

This publication forms the second of three short guides focusing on the responsible procurement and use of specific data-driven recruitment tools. It is intended to be read in conjunction with the general AI recruitment guidance published in December 2021, which provides detailed steps to determine whether a tool is fit for purpose.

This guide focuses on additional considerations specific to asynchronous (recorded) video interview assessment tools.

The guidance has been developed jointly by the Recruitment and Employment Confederation (REC) and the Centre for Data Ethics and Innovation (CDEI). The CDEI leads the UK Government’s work to enable trustworthy innovation using data and AI. More information about the CDEI can be found on the CDEI web page or by contacting cdei@cdei.gov.uk.

Asynchronous (recorded) video interview assessment tools

Background

Asynchronous video assessments are interviews in which applicants are asked to submit a video recording of themselves answering a set of predetermined questions, with the answers subsequently analysed through automated technology (such as natural language processing), human review, or a combination of the two. This differs from a traditional interview format because there is no human interviewer present.

In a small number of cases, these recordings are then reviewed by the hiring manager. However, in the majority of cases, artificial intelligence (AI) and facial analysis are used to evaluate candidates and provide a score of suitability to the hiring manager, who may then choose to review the highest scoring interviews. In recent years, these tools have played an increasing role in the recruitment process, and the reduction in face-to-face interaction during the Covid-19 pandemic has brought about an even greater reliance on them.

For recruiters and firms, these tools offer the opportunity for a faster and more efficient hiring process (in particular for mass recruitment rounds, such as graduate schemes). Additionally, they allow the reviewer to watch the interview again without relying on notes. By removing the need for geographical proximity (or ability to travel), these tools enable firms to interview a wider pool of candidates; this is particularly advantageous for roles that are expected to be carried out in a remote or hybrid setting. They also bring an element of flexibility for applicants, who can choose to undertake the interview at a time and place that suits them best.

However, automating the assessment of video interviews is one of the most contested uses of AI in the recruitment process, because these tools can be built on technology that often has embedded bias and is not grounded in sound science. When considering a tool of this kind, meticulous due diligence is required, as is effective communication of its use and its role in the decision-making process. It is also important to consider the experience of the candidate; some applicants may disengage from the process if they feel that it has been dehumanised and they are not given the opportunity to ask questions about the company or organisation.

We recommend against using tools that incorporate emotion or expression recognition technology as these have been widely shown to be inaccurate and unfair.

1. Evaluation of value-add and effectiveness

Asynchronous video interview tools may allow candidates to record videos of themselves answering standardised questions, e.g. “tell us about a time when you overcame a challenge”. The candidate may have a set amount of time for each answer, and a set number of tries to record an answer. In general, this type of tool may make interviews more structured, give candidates an opportunity to prepare answers, and may lead to greater efficiency as a result of standardising interviews and automating analysis. Tools of this kind may automatically transcribe the video interview and also analyse the transcribed content.

It is important to establish whether the asynchronous video tool you are considering is fit for purpose, suitable for your recruitment process, and if so, how it will fit into your wider recruitment process.

➔ Evaluate where the tool will be used in the recruitment process and how it will inform decision making.

  • Consider how else you are communicating with the applicant and where you are giving them the opportunity for human contact.

➔ Ask the vendor how these pre-recorded video interviews are analysed. What are the technical processes used?

  • E.g. is it analysing the words the candidate used in their answers, the tone and intonation of their speech, or the facial expressions and body language they displayed while answering?

➔ Ask:

  • What data is collected and analysed?
  • If used, on what basis are facial expressions evaluated?
  • How is a “good” response defined? What are the criteria for success?
  • Which competencies are assessed? Are these relevant to the job / role?
  • How do they map to the job description criteria?

2. Manage and mitigate risks

2.1  Legitimate and accurate assumptions

Some video interview tools claim to interpret candidates’ emotions and score them as part of the interviewing process. You should be aware that these tools are highly controversial; research has shown that data-driven tools are currently not able to perceive emotions and that these claims are based on poor science.[1] We recommend that you do not procure tools that incorporate emotion or expression recognition technologies.

 

[1] AI Now Institute, AI Now 2019 Report: https://ainowinstitute.org/AI_Now_2019_Report.pdf

➔ Seek information from the vendor about the underlying science behind the tool and the decision-making process.

  • Is there a clear evidence base?
  • Ask the vendor to cite the scientific studies that the tool is based upon.
➔ Do not use tools that rest upon pseudo-scientific assumptions (e.g. that emotions can be fully read from facial expressions).
➔ Evaluate whether your team is comfortable with the links the tool makes. For instance, if the tool is assessing communication skills, ensure that the factors that go into making that assessment are legitimate and relevant to the job / role in question.
➔ Ask for copies of auditing documentation to understand whether the outcomes produced by the tool have been evaluated and confirmed to be accurate.

2.2 Bias and discrimination

Discrimination can arise from the use of asynchronous video interview tools, as the technology can unfairly disadvantage candidates who do not fit the typical profile that the tool has been trained to identify.

These tools also risk introducing new bias, or reinforcing existing bias, in the recruitment process. For example, if the tool monitors whether a candidate looks into their webcam, and takes this into account when providing a score of suitability, it may produce discriminatory outcomes for a candidate who has autism (as they may find it more difficult to maintain eye contact over a sustained period).

If the tool uses speech-to-text transcription, fairness risks may arise from inaccurate transcriptions, particularly for people with regional accents (compared to, for example, those who speak with Received Pronunciation - RP).
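To illustrate how a recruiter or vendor might check for this, the sketch below compares word error rates on manually checked transcripts grouped by accent. It is a minimal, hypothetical Python example rather than a method taken from this guidance: the accent group labels and sample transcripts are placeholders, and any real check would need a representative, consented sample of interviews.

```python
# Minimal, hypothetical sketch: compare speech-to-text word error rate (WER)
# across accent groups, using a small sample of human-checked transcripts.
# Group labels and transcript pairs below are placeholders, not real data.

def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = word-level edit distance divided by the number of reference words."""
    ref, hyp = reference.lower().split(), hypothesis.lower().split()
    # Standard dynamic-programming edit distance over words.
    dist = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dist[i][0] = i
    for j in range(len(hyp) + 1):
        dist[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dist[i][j] = min(
                dist[i - 1][j] + 1,         # word dropped by the transcriber
                dist[i][j - 1] + 1,         # word inserted by the transcriber
                dist[i - 1][j - 1] + cost,  # word substituted (or matched)
            )
    return dist[len(ref)][len(hyp)] / max(len(ref), 1)

# (accent_group, human-checked transcript, tool-generated transcript)
samples = [
    ("accent_group_a", "tell us about a time you overcame a challenge",
     "tell us about a time you overcame a challenge"),
    ("accent_group_b", "tell us about a time you overcame a challenge",
     "tell us about a time you overcame the challenge"),
]

by_group = {}  # accent group -> list of WER values
for group, ref, hyp in samples:
    by_group.setdefault(group, []).append(word_error_rate(ref, hyp))

for group, rates in by_group.items():
    print(f"{group}: mean WER = {sum(rates) / len(rates):.1%}")
```

A consistently higher error rate for one accent group would suggest that transcription quality, rather than candidate ability, is shaping downstream scores.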

Technology often absorbs and replicates biases that exist in the environment in which it has been designed. If the asynchronous video tool analyses the content of the speech, bias may be embedded further, because the tool may base its assessment of the applicant on ingrained assumptions about what constitutes a desirable attribute.

A structured approach to assess the risk of discrimination:

➔ Consider what competencies and skills are relevant to be assessed as part of the video interview process.

  • Outline the potential side effects of using your chosen competencies and skills. (E.g. if the tool provides a higher score for maintaining focus during the interview, you may miss out on qualified candidates who, through no fault of their own, are distracted by their children or other dependents).
  • Mitigate these side effects where necessary. E.g. include a process for human review of the interview/transcript where an outlier score is recorded.

➔ Think about which groups of people you might be excluding from your recruitment round if you use an asynchronous video interview tool.

  • For example, a candidate who does not see themselves as ‘tech-savvy’ may be put off by the process, and this is likely to mean fewer applications from older candidates (aged 55 and over).3 If the role does not require technology skills, consider whether this is the most appropriate mechanism to assess candidates.
➔ Consider the reputation and history of the video interview tool provider.

➔ Understand whether there is unfairness:

  • Collecting information about the candidates who were scored highly by the tool, compared to the wider pool of candidates who applied, may indicate whether there is a bias in the way the tool is scoring candidates (a simple illustration of this kind of check is sketched after this list).
➔ An Equality Impact Assessment may be a useful tool for considering the important issues relating to discrimination and is strongly recommended for tools that are likely to have a significant impact on decision making, particularly where a large number of applicants are involved.
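
As referenced above, one way of turning that kind of monitoring into a concrete check is to compare the rate at which each group of applicants is scored highly by the tool and flag large disparities for further investigation. The Python sketch below is illustrative only: the group names and counts are hypothetical, the widely cited “four-fifths rule” is used purely as a rough screen, and a flagged disparity is a prompt for further analysis rather than proof of discrimination.

```python
# Illustrative sketch (not from the guidance): compare "scored highly" rates
# across candidate groups. Group names and counts are hypothetical placeholders;
# real monitoring should use your own applicant data and appropriate advice.

applicants = {
    # group -> (number who applied, number scored highly by the tool)
    "group_a": (200, 60),
    "group_b": (180, 36),
}

rates = {group: highly / applied for group, (applied, highly) in applicants.items()}
highest_rate = max(rates.values())

for group, rate in rates.items():
    ratio = rate / highest_rate
    # The "four-fifths rule" (ratio below 0.8) is only a rough screen, not proof of bias.
    flag = "  <- investigate further" if ratio < 0.8 else ""
    print(f"{group}: scored-highly rate {rate:.1%}, ratio vs highest group {ratio:.2f}{flag}")
```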

Accessibility

Ensure, to the extent possible, that video interview tools are accessible to candidates with disabilities.

➔ Engage directly and meaningfully with people who may be affected, by piloting tools and soliciting feedback on the accessibility of the tools.

  • Be aware that there are many types of disability, and that relying only on a statistical method of monitoring the impact of the tool may not show its actual impact on individuals.

➔ Offer reasonable adjustments for those who need them, and ensure that the reasonable adjustments pathway is fair and equivalent to the non-adjusted route.

  • Clearly indicate where candidates can request reasonable adjustments.
  • For example, if an applicant has a stammer, allow them extra time to submit their answers.
➔ Continuously monitor the accessibility of the tool and platform throughout deployment, and create channels to receive additional feedback down the line.

2.3 Transparency

Both candidates and recruiters may be nervous about the dehumanisation of the recruitment process: clear communication and transparency may help to mitigate some of these concerns.

Steps to build transparency into the process:

➔ Clearly explain to candidates that an asynchronous video interview tool will be used as part of the process. Include information on:

  • What the criteria for success are.
  • How the score from the tool will contribute to the overall assessment of the candidate.
  • How candidates can request reasonable adjustments for the interview process.
  • How candidates can get in touch to appeal or challenge a recruitment decision that has involved an asynchronous video interview tool.
  • How candidates can provide feedback about their experience.

➔ Take appropriate measures to ensure that GDPR duties are met, including conducting a Data Protection Impact Assessment (DPIA).

  • If the video interview tool autonomously makes a decision about candidates based on the interview score, you will need to take extra steps to ensure you are fulfilling Article 22 duties. More information can be found on the ICO website.

 

3 https://ageing-better.org.uk/blogs/how-are-older-people-adapting-digital-technology-during-covid-19-pandem

2.4 Data protection

Asynchronous video interview tools engage data protection law because they involve personal data (relating to an identifiable individual). It is advisable to seek assurance that your chosen vendor’s practices comply with UK GDPR requirements.

Your data protection responsibilities will be determined by the nature of your relationship with the technology vendor. As the recruiter, you are likely to be a data controller (determining the purposes for which the data are processed and the means of processing). If the vendor also uses the candidates’ data from your interviews to train their algorithm, they would also become a controller, as they are determining the purposes and means of that further processing, and so acquire the relevant responsibilities.

Steps to check for compliance with data protection law:
➔ Agree who has the role of data controller (vendor, recruiter, or both) before a procurement contract is signed - controllers are responsible for complying with, and demonstrating compliance with, UK GDPR.

➔ Ensure that the vendor’s practices comply with UK GDPR requirements, paying particular attention to the guidance that applies to special category data. This includes:

  • Understanding how your candidates' data will be used in the future. For example, if it is used to train the vendor’s model, consent will be needed from candidates before they interact with the tool.

➔ Complete a Data Protection Impact Assessment (DPIA) - you may want to complete this with the vendor.

  • It is good practice to publish the DPIA so it is accessible to applicants and workers.

3. Communication and building trust

Proactive communication and transparency around how a recruitment round is being conducted are important for inclusivity. This is particularly true when a data-driven tool is involved in the process, as candidates may feel unclear about how decisions are being made.

➔ Practically, this could involve making a document available to applicants which outlines:

  • The justification for the use of an asynchronous video interview tool.
  • The scientific evidence that underpins the tool.
  • How the tool will form part of the decision making process.
  • The criteria for success for the specific role.
  • How to request reasonable adjustments.
  • How candidates can get in touch to appeal a recruitment decision that has involved an asynchronous video interview tool.
  • How candidates can provide feedback about their experience.
➔ Once the tool is in use, continually monitor and evaluate its impact, in line with the key considerations listed above.

Other guides in this series