This blog was written by Johannah Palmer, Edgecumbe’s Data Protection Officer and Information Security Administrator. This is the fifth blog in our GDPR series. In blog #1, we highlighted a range of things you need to consider when using psychometrics for employment purposes. Blog #2 explored why you must treat psychometrics as special category data and the implications for HR professionals. Blog #3 explored how to deal with the security of psychometric data. In blog #4, we delved into the roles of Controllers and Processors. In this fifth and final blog of the series, we look at the UK GDPR rules on profiling and automated decision-making.

Psychometric testing is a valuable tool that can be used to analyse various aspects of an individual’s personality, behaviour, interests, and habits to make predictions – for example, regarding likely job performance – or decisions, such as whether to hire them. In this blog, we will explore what profiling and automated decision-making mean, the implications under UK GDPR, and how they can be used ethically and legally.

Understanding profiling and automated decision-making

Profiling involves the automated processing of personal data to evaluate certain aspects of an individual, such as work performance, preferences, behaviour, and so on. This process aims to identify correlations and predict behaviour or draw other conclusions about an individual based on their personal data.

Automated decision-making, on the other hand, refers to making decisions without any human involvement, often utilising pre-programmed algorithms and criteria.

The UK GDPR and its restrictions

Under the UK GDPR, profiling and automated decision-making are recognised as high-risk activities, and additional protections are therefore built into the regulations. Whilst you can use profiling to make predictions about people, they have a right to object to you using their data for profiling purposes under Article 21 of the UK GDPR. If you wish to use their data in this way, you must inform them that you intend to do so and make it easy for them to object; if they object, you must immediately stop using their data for profiling.

The restrictions on automated decision-making are even more stringent: individuals have the right not to be subjected to solely automated decisions that produce legal or similarly significant effects, including effects on their employment opportunities. There are exceptions to this prohibition (set out in Article 22), but for reasons we outline in the next section, it appears unlikely that these would apply in the context of employment decisions. This means that, in the UK, people effectively have the right not to be subjected to automated decision-making when it comes to hiring decisions.

With regard to the right not to be subjected to ‘solely’ automated decisions, ‘solely’ refers to decision-making processes that are entirely automated and exclude any human influence whatsoever. Decisions made by a human being are not automated decisions, even if the decision is informed by an automated profiling process.

Ethical and legal use of profiling and automated decision-making

The restrictions on automated decision-making are strengthened even further when the decisions are based on special category data such as psychometrics. In these circumstances, this kind of processing is only permitted where you either have the individual’s explicit consent, or where there is a substantial public interest (e.g. the prevention of criminal activity). In the case of recruitment, neither of these would appear applicable: it is unlikely that a hiring decision will qualify as fulfilling a substantial public interest and, as we outlined in our earlier blog, explicit consent (which must be freely given, specific, informed and unambiguous) is unlikely to provide an appropriate legal basis for processing special category data. Consent cannot be freely given because of the unequal bargaining power between employer and candidate, and, given the complex nature of psychometrics and how they are used to inform decisions, it is unlikely that an individual can reasonably be expected to give informed consent.

We therefore conclude that whilst automated profiling may legitimately be used to inform human decisions about employment, automated decision-making (e.g. screening out applicants on the basis of an automatic scoring of application data or psychometrics) cannot. This may surprise you, and we are aware of several companies, including well-known high street names, that appear to act contrary to this provision of the UK GDPR.

From a legal perspective, the most important safeguard to apply when using automated profiling is to complete a Data Protection Impact Assessment (DPIA). This impact assessment will prompt you to consider the risks and potential impacts on individuals of your planned profiling activities, and help you assess whether you have taken appropriate steps to manage those risks. Note: We do not consider there to be suitable means to mitigate the legal risks of automated decision-making in employment decisions.

Ethical practice means going beyond the legal minimum requirements and this is where offering candidates access to feedback and interpretation of their psychometric test results can play a crucial role. It is essential to communicate to candidates that test results are meant to be considered alongside other relevant information and not treated as standalone assessments. Providing feedback and interpretation helps individuals understand the results and promotes transparency in the decision-making process.

Using psychometric testing effectively while complying with the UK GDPR is essential to protect individuals’ privacy and ensure fair and transparent decision-making. By providing feedback and interpretation, involving human professionals, and considering multiple data sources, psychometric testing can be used as a valuable tool within the bounds of the UK GDPR rules on profiling and automated decision-making. Following these guidelines will help you foster trust, maintain legal compliance, and enable you to access the benefits of psychometric testing responsibly.

This concludes our series of blogs on using psychometrics safely under the UK GDPR. If you would like to learn more about the ways in which we support organisations, please get in touch via our website or by emailing us at enquiries@edgecumbe.co.uk.

Want to find out more?

We are hosting a webinar on Wednesday 27 September 2023 on using psychometrics for recruitment and talent management and how to do this safely under the UK GDPR. If you would like to join us, you can sign up here.