Algorithms and artificial intelligence are being used to screen job candidates. The problem for jobseekers is that they have no way of knowing whether the systems are open or fair - and they're starting to fight back.
Tech businesses are offering automated services to the 'pre-hire assessment' recruitment market, saving global companies the time and expense of sifting through large numbers of job applications.
Stephen Buranyi, who’s written on the emerging trend for The Observer, says it’s “early days” and many in the recruitment industry are sceptical, but companies using the technology think it’s improving their hiring.
Buranyi says the artificial intelligence-based systems use personality tests but also analyse facial expressions, tone of voice, speed of speaking and the words people are using.
To build a model of the employee a company wants, the tests are given to people already working there and the results are correlated with their job performance.
That gives them "an example of what a good employee looks like", so the same test can then be given to job applicants.
“The programme might not know why certain facial expressions or certain answers predict good performance in employees, but they do know that good employees of the company behave in that way.”
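In outline, this is ordinary supervised learning: fit a model on current employees' assessment scores against a performance label, then score applicants with it. The sketch below is purely illustrative - the feature names, data and model choice are assumptions, since the vendors' actual systems are proprietary.

```python
# Hypothetical sketch of the approach Buranyi describes: score current
# employees on assessment features, label them by job performance, fit
# a model, then score applicants the same way. All names and numbers
# here are invented; real vendor systems are proprietary.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: one current employee's assessment results, e.g.
# (personality score, smile frequency, speech rate, word-choice score)
employee_features = np.array([
    [0.7, 0.3, 1.2, 0.8],
    [0.4, 0.6, 0.9, 0.5],
    [0.9, 0.2, 1.4, 0.9],
    [0.3, 0.7, 0.8, 0.4],
])
# 1 = rated a high performer by the company, 0 = not
performance_labels = np.array([1, 0, 1, 0])

# The model learns which assessment patterns correlate with performance
# without "knowing" why - exactly the opacity Buranyi points to.
model = LogisticRegression().fit(employee_features, performance_labels)

# An applicant's assessment is then scored against that profile.
applicant = np.array([[0.6, 0.4, 1.1, 0.7]])
print(model.predict_proba(applicant)[0, 1])  # "good employee" likelihood
```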
Buranyi says the AI-based tests and algorithms are considered the private intellectual property of the companies that make them, so any research or evidence will remain in-house.
But companies are “diving in more and more”, he says.
“The companies that have been using the technology do seem to think that it is, by whatever metrics they’re using, producing a better hiring process,” he says.
There are worries about hidden biases. “The bias thing is real,” says Buranyi, “but of course people who are proponents of this sort of technology will argue that humans are extremely biased, and biased in some pretty dark ways, and by using this sort of technology you can circumvent that.
“If you are noticing the programme is selecting too many men or too many non-white applicants, you can fix that.”
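How would anyone notice such a skew? One common audit - not necessarily what any vendor actually runs, since that isn't public - is to compare selection rates across groups, as in this sketch using the US EEOC's four-fifths guideline with invented numbers.

```python
# Minimal sketch of a selection-rate audit. The four-fifths rule is a
# standard US EEOC guideline for spotting adverse impact; the applicant
# data below is invented for illustration.
from collections import Counter

applicants = [  # (group, selected_by_algorithm)
    ("men", True), ("men", True), ("men", True), ("men", False),
    ("women", True), ("women", False), ("women", False), ("women", False),
]

totals, selected = Counter(), Counter()
for group, was_selected in applicants:
    totals[group] += 1
    selected[group] += was_selected  # True counts as 1

rates = {g: selected[g] / totals[g] for g in totals}
best = max(rates.values())
for group, rate in rates.items():
    # A group selected at under 80% of the top group's rate is
    # evidence of adverse impact worth investigating.
    flag = "possible adverse impact" if rate < 0.8 * best else "ok"
    print(f"{group}: {rate:.0%} selected - {flag}")
```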
It can be difficult to find out how a bias in a programme is operating, because most are private property. “It’s not in any way transparent to anyone outside the programme.”
Automated hiring also raises questions of equality of access to information and openness about decisions.
“When there’s a decision … can you get someone to explain to you how it happened – especially if it’s something as serious as you being turned down for a loan or a job.”
In the European Union, the General Data Protection Regulation (GDPR), which comes into force this year, will allow people to challenge decisions made automatically “and appeal to have a human intervene in these things.”
Job counsellors at universities might be preparing people for this type of recruitment, but other job seekers, who don’t have the same resources, go in blind, he says.
The programmes are "gameable", he says. Internet forums are springing up where people trade tips and answers to questions, and younger people especially are finding ways to get through the systems.
“To their credit they do a very good job – they approach it very scientifically. They make fake accounts, they do as many applications as possible, they try to see what works and what doesn’t.”
In his Observer article Buranyi writes: “One HR employee for a major technology company recommends slipping the words ‘Oxford’ or ‘Cambridge’ into a CV in invisible white text, to pass the automated screening.”