Business · Society

Workplace Diversity: A Numbers Game

By Conner Gordon
24 Jun 2015

Anyone who has applied for a job is likely familiar with the stress it can bring. Governed by unspoken rules and guidelines that at times seem arbitrary, the hiring process has traditionally been seen as an anxiety-producing but necessary part of starting a career. For some, however, this process is stressful for an entirely different reason: the fear of discrimination by employers. How, then, should the process be reformed to provide a more equitable environment?

According to The Atlantic’s Bourree Lam, the answer may lie in algorithms. Surveys and company-specific hiring algorithms have already proven successful at companies like Google, where employers sought to improve the diversity of their workforce. Such programs have significantly reduced discriminatory hiring practices, boosting the hiring of African Americans by as much as 60 percent in some cases. By standardizing the criteria employers are looking for, these algorithms stand to cut down on the race-based discrimination that human judgment can introduce.
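
To make the idea of "standardizing the criteria" concrete, one can imagine such a system as a scoring function that evaluates every applicant against the same weighted, job-relevant rubric and is never shown demographic fields. The sketch below is purely illustrative: the criteria, weights, and field names are invented for this example and do not describe Google's or any real company's system.

```python
# Illustrative sketch only: a standardized rubric applied identically to every
# applicant. Criteria, weights, and field names are hypothetical.

# Criteria and weights an employer might fix in advance (assumed values).
RUBRIC = {
    "years_experience": 0.3,
    "skills_test_score": 0.5,
    "structured_interview_score": 0.2,
}

def score_applicant(responses: dict) -> float:
    """Apply the same weighted rubric to every applicant.

    `responses` contains only job-relevant fields; name, race, and other
    demographic information are never passed in, so they cannot directly
    influence the score.
    """
    return sum(weight * responses[criterion] for criterion, weight in RUBRIC.items())

# Example with made-up inputs normalized to a 0-1 scale:
candidate = {
    "years_experience": 0.6,
    "skills_test_score": 0.8,
    "structured_interview_score": 0.7,
}
print(score_applicant(candidate))  # one comparable number per applicant
```

Of course, as the rest of the piece argues, choosing the criteria and weights is itself a human decision, which is where bias can re-enter.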

The pitfalls of human influence on hiring practices are well documented. Applicants with “white-sounding” names receive 50 percent more callbacks on job applications than those with “black-sounding” names, according to research conducted in 2003. And in a high-profile social experiment, a man named Jose, who had been unsuccessful in a lengthy job search, removed the “s” from his name on applications and immediately began receiving callbacks. A hiring system with less human input, then, may help cut down on discriminatory practices.

However, as Lam’s article also points out, the process could have its downsides. It is entirely possible, she writes, that embracing an algorithm-driven approach could actually further entrench discriminatory hiring practices behind the appearance of objectivity. Even the most complex algorithms are ultimately human creations, so the possibility of discrimination being written into them cannot be ruled out.

It could also be argued that human elements of the job search, such as interviews, play a positive role. While interviews open the door to discrimination, they also give candidates a point of human contact. In this way, they offer a sense of comfort to unsuccessful job-seekers: failure to get a job can be blamed on a poor answer to one question, or on a superficial factor that affected the interview. In contrast, the air of objectivity that surveys and algorithms bring may suggest to unsuccessful candidates that they are objectively and definitively not good enough. Combined with the other stresses of job-seeking, such algorithms may ultimately make the job search even more stress-inducing than it already is.

In formulating algorithm-based hiring, then, programmers and employers must be conscious of such human influences. As Lam’s article points out, such algorithms stand to reduce discriminatory hiring practices by a remarkable degree. However, implementing them without proper care could lead to a dehumanizing environment for job-seekers, or even an entrenchment of the discrimination they aim to prevent. The benefits of such algorithms, then, must be weighed against a crucial fact: the job search is still a human practice, and respect for those involved in it is necessary.


Conner was a Graduate Fellow at the Prindle Institute from 2016-2018. Conner's writing focuses on memory, politics and culture. He is currently an MFA candidate at the University of Oregon.