This is part two of a five-part series on the privacy rights of employees in the context of data-driven HR. Click here to read part one, about discrimination by AI.
The golden age of data-driven HR is upon us. We are rapidly becoming much better at understanding what happens between your ears. In my recently defended doctoral dissertation I contribute to some of the psychometric models that predict attitudes and behaviour on the work floor. Or should we say behind the work screen, as few of us share a work floor in 2022. As with any revolutionary new technology, there are pitfalls and dangers we have to learn to live with. The fundamental questions are “What are the privacy rights of employees?” and, as an HR department or HR consultant, “What do we have the right to measure?”. What follows is a redacted extract of the ethical considerations in my dissertation that dive deeply into these questions. This blog post is part 2: opportunity inequality.
Our work contributes to the emergent practice of data-driven HR by providing a taxonomy and dimensions along which to analyze it. Some describe the advent of data-driven HR practices as an integral part of the so-called “industry 4.0” (Sivathanu & Pillai, 2018). Also in 2018, the futurist Bernard Marr wrote a book on data-driven HR (Marr, 2018). In it he identifies four purposes for the use of data:
“1: Using data to make better decisions
2: Using data to improve operations
3: Using data to better understand your employees
4: Monetization of data” (Marr, 2018)
Our work would be used in the third category, “to better understand your employees”, but subsequently also in the first one, “to make better decisions”. Right now decision making in HR is messy and inefficient: there are a myriad of selection criteria and clues to look for in a handful of attributes such as conscientiousness, motivation and intelligence. Because the process is messy, the outcomes are noisy.

Let’s do a thought experiment. Say we are hypothetically looking for conscientiousness and intelligence. Imagine three candidates: candidate A, who would score high on these attributes if we could measure them perfectly; candidate B, who would score a bit lower; and candidate C, who would score the lowest. Based on an interview and the CV we may have a 50% chance of selecting candidate A, a 30% chance of selecting candidate B and a 20% chance of selecting candidate C (I’m making up reasonable numbers for the sake of the thought experiment). Suppose we get better at measuring, perhaps with an IQ test and a Big Five personality analysis. Then the percentages might shift to A: 70%, B: 20%, and C: 10%. Suppose we become really good at psychometric analysis and measure things near perfection. Then we will hire A 100% of the time, and B and C have no chance. If all companies do this, all companies will be going after the same employees and C will never get a job.

Of course, this is a simplified example with arbitrary selection criteria: intelligence and conscientiousness. But we could also apply big data analysis to learn exactly what the ideal psychometric attributes are for a given function. Initially there may be some discrepancy between the algorithmic models, but as they get better they will become more and more uniform, identifying the same ideal psychometric profile. For a while there will be a competitive advantage for the recruiters with the best models. But like everything in tech, access to the technology will democratize in short order.
Soon everyone will have the same excellent open-source model, and at some point only one specific profile can get a specific type of job. This would mean that B and C never get the job, only the A types.
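The thought experiment above can be sketched as a small simulation. The “true” scores and noise levels below are made-up illustrative values, not real assessment data: each candidate has a latent score, the recruiter observes it with some measurement noise, and we count who gets hired. As the noise shrinks (measurement improves), the hiring rates concentrate on candidate A.

```python
import random

# Hypothetical latent "true" scores for candidates A, B and C
# (illustrative numbers chosen for this sketch, not from any real test).
TRUE_SCORES = {"A": 0.9, "B": 0.7, "C": 0.5}

def hire_once(noise, rng):
    """Hire whoever looks best after adding Gaussian measurement noise."""
    observed = {c: s + rng.gauss(0, noise) for c, s in TRUE_SCORES.items()}
    return max(observed, key=observed.get)

def hiring_rates(noise, trials=100_000, seed=42):
    """Estimate how often each candidate is hired at a given noise level."""
    rng = random.Random(seed)
    counts = {c: 0 for c in TRUE_SCORES}
    for _ in range(trials):
        counts[hire_once(noise, rng)] += 1
    return {c: n / trials for c, n in counts.items()}

# Noisy interview -> psychometric test -> near-perfect measurement.
for noise in (0.4, 0.15, 0.01):
    rates = hiring_rates(noise)
    print(f"noise={noise}: " + ", ".join(f"{c}: {rates[c]:.0%}" for c in "ABC"))
```

With large noise, B and C still win a meaningful share of the hires; with near-zero noise, A wins essentially every time, which is the "100% A, 0% B and C" endpoint of the argument.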
I guess most of us recognize that at some point in our careers we were offered an opportunity that was a bit of a jump for us, perhaps because someone subjectively believed in us. Hopefully that belief became a self-fulfilling prophecy and we grew into the new role. This dynamic of imperfect selection scatters opportunity across everyone. Yes, sometimes we hire the wrong person for the job, but these exceptional opportunities also create growth. If all of these decisions are made by data-driven algorithms, there will be massive opportunity inequality and individuals will lose the freedom to try to “wing it” in different roles. This will push up the price of the A types and push down the salaries of B and C, causing more inequality and decreasing social mobility.
There is hope in the market mechanism: if, in a competitive labor market, there is an incentive to identify alternative indicators of performance potential and possible development roadmaps for B and C types, then the recruiters who do so will gain a competitive advantage.
A side effect of this will be that future participants in the labor market will train, or be trained, to profile themselves to match the desired profile of the algorithms. This would cause extensive social desirability bias in all psychometric tests. And if subjects are not answering the questions honestly but rather trying to guess what the algorithm wants to hear, the tests lose all their value.
I would therefore argue that while we may be aided in the decision-making process by data-driven tools, we should leave some room for human intuition. Messy as it may be, it creates opportunities for individuals and companies, and it will hopefully keep respondents honest and counter the gaming of the algorithm.
Of course subjective intuition is highly biased, and maybe letting the computer decide is more objective and more meritocratic. But maybe a little bit of chaos gives everyone a chance?
What is SARA and how can she deliver your data-driven HR methodologies?
SARA stands for Survey Analysis and Reporting Automation. It is a platform where HR consultants can implement their data-driven methodologies and automate their workflows. It is used by top consultancy firms around the world to deliver team assessments, psychometric tests, 360-degree feedback, cultural analyses and other analytical HR tools. SARA is the AI you need to be at the cutting edge of HR tech.
What else does Codific build with privacy-by-design principles?
Codific is a team of security software engineers who leverage privacy-by-design principles to build secure cloud solutions. We build applications in different verticals such as HR-tech, Ed-Tech and Med-Tech. Secure collaboration and secure sharing are at the core of our solutions.
Videolab is used by top universities, academies and hospitals to put the care in healthcare. Communication skills, empathy and other soft skills are trained by sharing recordings of patient interviews for feedback.
SAMMY is a Software Assurance Maturity Model management tool. It enables companies to formulate and implement a security assurance program tuned to the risks they are facing. That way we help other companies build a simple and safe digital future. Obviously, our own AppSec program, and SAMMY itself, are built on top of it.
We believe in collaboration and open innovation, and we would love to hear about your projects and see how we can contribute to developing secure software and privacy-by-design architectures. Contact us.