Author Topic: What's the worst that could happen? (AI interviewer edition)  (Read 823 times)

TheDrake

What's the worst that could happen? (AI interviewer edition)
« on: October 22, 2019, 02:33:55 PM »
Quote
HireVue uses a combination of proprietary voice recognition software and licensed facial recognition software in tandem with a ranking algorithm to determine which candidates most resemble the ideal candidate. The ideal candidate is a composite of traits triggered by body language, tone, and key words gathered from analyses of the existing best members of a particular role.

After the algorithm lets the recruiter know which candidates are at the top of the heap, the recruiter can then choose to spend more time going through the answers of these particular applicants and determine who should move on to the next round, usually for an in-person interview.

As if resume-parsing algorithms haven't been bad enough. This is fraught with problems, not the least of which is unintentional bias based on race, religion, politics, or age. And there's no feedback mechanism that would ever correct such problems, because candidates who get false negatives or low scores simply aren't interviewed and are forgotten.
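
To make that last point concrete, here's a minimal Python sketch (all names, scores, and thresholds made up; this is not HireVue's actual model) of why the loop can't self-correct: the only candidates who ever generate outcome data are the ones the algorithm already liked.

Code:
# Sketch of the censored-feedback problem: the screener only ever sees
# outcomes for candidates it lets through, so a wrongly rejected person
# (a false negative) can never show up in any retraining data.

def score(candidate):
    # Stand-in for the vendor's ranking model: similarity to an "ideal" profile.
    ideal = {"tone": 0.9, "keywords": 0.8}
    return sum(min(candidate.get(k, 0.0), v) for k, v in ideal.items())

applicants = [
    {"name": "A", "tone": 0.9, "keywords": 0.8, "would_do_great": True},
    {"name": "B", "tone": 0.3, "keywords": 0.9, "would_do_great": True},   # false negative
    {"name": "C", "tone": 0.9, "keywords": 0.2, "would_do_great": False},
]

threshold = 1.2
advanced = [a for a in applicants if score(a) >= threshold]

# Only advanced candidates ever produce outcome data the model could learn from;
# everyone screened out disappears, along with any evidence they were misjudged.
training_feedback = [(a["name"], a["would_do_great"]) for a in advanced]
print(training_feedback)   # [('A', True)] -- B's wrong rejection never shows up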

D.W.

Re: What's the worst that could happen? (AI interviewer edition)
« Reply #1 on: October 22, 2019, 05:17:22 PM »
Ya, my friends in the security field are still... unimpressed with facial recognition tech when it comes to non-white faces.   :-\

If you are letting automation decide who to hire, that only tells me what you are looking for is an automaton, not a person.   ::)