AI Job Interview Software Can't Even Tell If You're Speaking English, Tests Find


AI-powered job interview software may be just as dubious as you suspect, according to tests run by MIT Technology Review's "In Machines We Trust" podcast, which found that two companies' software gave good marks to a candidate who responded to an English-language interview in German. Companies that advertise machine-learning tools for screening job applicants promise efficiency, effectiveness, fairness, and the elimination of shoddy human decision-making. In some cases, the software merely reads resumes or cover letters to quickly determine whether an applicant's work experience appears right for the job. But a growing number of tools require job-seekers to navigate a hellish series of tasks before they even come close to a phone interview. These can range from conversations with a chatbot to submitting to voice and face recognition and predictive-analytics algorithms that judge candidates on their behavior, tone, and appearance.


We tested AI interview tools. Here's what we found.

MIT Technology Review

After more than a year of the covid-19 pandemic, millions of people are searching for employment in the United States. AI-powered interview software claims to help employers sift through applications to find the best people for the job. Companies specializing in this technology reported a surge in business during the pandemic. But as demand for these technologies grows, so do questions about their accuracy and reliability. In the latest episode of MIT Technology Review's podcast "In Machines We Trust," we tested software from two firms specializing in AI job interviews, MyInterview and Curious Thing. We found variations in the predictions and job-matching scores that raise concerns about what exactly these algorithms are evaluating.