Human resource managers are using artificial intelligence tools and other machine learning software to assist in the hiring and interview process. According to The Washington Post, which cited a January survey, HR managers also plan to use those same tools when making decisions about layoffs. The survey of 300 human resources leaders at U.S. companies found that 98 percent of them say software and algorithms will help them make layoff decisions this year.
If you would like more context on this matter, please consider Vikram R. Bhargava, assistant professor of strategic management and public policy at the George Washington University School of Business. His research centers on artificial intelligence, the future of work, technology addiction, mass social media outrage, autonomous vehicles, and other topics in digital technology policy.
Bhargava’s latest work focuses on this very topic. His paper, “Hiring, Algorithms, and Choice: Why Interviews Still Matter,” was published last week in the journal Business Ethics Quarterly.
“Much of the discomfort with HR managers deferring to algorithms is due to worries about bad outcomes: Did the algorithm make the right call? Was there bad data? Were there any untoward racial or gender biases reflected in the data?” Bhargava says. “But even if these outcomes are ultimately improved through an engineering solution, it still doesn’t settle the question of whether HR managers should defer to algorithms. This is not because our gut instincts are far superior—often they’re not.”
“Rather, this is because there are important (and overlooked) ethical values created through us making choices—including choices about whom to work with or not work with—that would be jeopardized, were HR managers to abdicate that choice to an algorithm. This is so, no matter how sophisticated algorithms ultimately become at predicting the fit and performance of an employee.”
If you would like to speak with Prof. Bhargava, please contact GW Media Relations Specialist Cate Douglass at [email protected].
-GW-