AI-powered software used by a growing number of businesses to analyse candidates promises more diversity. Researchers are skeptical.

Artificial intelligence used by recruitment platforms to help boost diversity in the workplace is discriminating against people over random changes in clothing or lighting, a new study has warned.
Vendors of such software claim it helps combat unconscious bias and helps firms identify the most compatible personality for a role. But the new study, from the University of Cambridge, warns that these systems can be swayed by irrelevant details such as the clothes people wear, the lighting in the room and even what appears in the background during video interviews.
The study, by the University of Cambridge's Centre for Gender Studies and published in the journal Philosophy and Technology, said the use of such AI tools was a dangerous example of "technosolutionism": turning to technology for a quick fix to a problem that is deep-rooted and requires sustained time and investment.
"We are concerned that some vendors are wrapping 'snake oil' products in a shiny package and selling them to unsuspecting customers," study co-author Dr Eleanor Drage said.