
Judged by Algorithms

By Amy Brown
14 Jan 2016

The Chinese government announced in October that it is setting up a “social credit” system designed to measure trustworthiness. Every citizen will be entered into a database that uses fiscal and government information – including online purchases – to produce a trustworthiness ranking. The inputs range from traffic tickets to academic degrees to whether a woman has taken birth control. Citizens currently treat it like a game, comparing scores with others in an attempt to post the highest number in their social circle. Critics call the move “dystopian,” but this is only the latest algorithm designed to judge people without face-to-face interaction.

The United States also uses algorithms to evaluate people. The most common example is the credit rating: your FICO score, compiled with a secret algorithm, determines whether or not you are eligible for a loan. This type of algorithm is so prominent that people who do not use credit cards have an extremely difficult time getting loans when they need one. Employers screen potential hires using their social media profiles to determine whether they are a good fit, sometimes before ever interacting with the applicant. The government monitors social media sites for national security purposes. The secrecy behind these algorithms encourages conformity, as people are afraid of unknown systems labeling them or placing them on a certain list. It creates a kind of self-censorship, as people worry that something they say may be taken the wrong way by an algorithm. Although China’s algorithms are viewed as more intense and stifling than their American counterparts, some have argued that without more oversight and transparency, American systems could evolve into something closer to China’s.

China’s new approach has been considered even more invasive than common U.S. practices, but supporters of the plan say it is good for China. Since many Chinese do not own houses or cars, the algorithm uses other information to set up a different form of credit system. It is modeled on the American FICO system and is therefore similar in design, just with different inputs suited to China. Supporters also say it builds trust among citizens, allowing for a more convenient and open society. Participation is currently voluntary, although the system will become mandatory in 2020. The popular system now in use, called Sesame Credit, is simply one of the test runs for the eventual government program. In the U.S., it is already illegal for such systems to produce discriminatory outcomes, even if discrimination is not programmed into the algorithms intentionally, so certain safeguards are in place to prevent a dystopian scenario in American algorithms.
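To make the abstraction concrete, a scoring system of this kind can be pictured as a weighted sum over whatever inputs its designers choose, mapped onto a familiar-looking scale. The sketch below is purely illustrative: the inputs, weights, and scale are invented for this example, since the actual FICO and Sesame Credit formulas are proprietary and not public.

```python
# Toy illustration of an opaque weighted scoring system.
# All inputs and weights are hypothetical, not the real FICO or Sesame Credit formula.

WEIGHTS = {
    "on_time_payments": 0.35,    # share of bills paid on time (hypothetical input)
    "debt_ratio": -0.30,         # debt relative to income (hypothetical input)
    "account_age_years": 0.15,   # normalized account history (hypothetical input)
    "traffic_violations": -0.10, # normalized count of violations (hypothetical input)
    "verified_degree": 0.10,     # 1.0 if an academic degree is on record
}

def trust_score(person: dict) -> float:
    """Combine a person's attributes into a single score using hidden weights."""
    raw = sum(WEIGHTS[key] * person.get(key, 0.0) for key in WEIGHTS)
    # Rescale to a 300-850 band, an arbitrary choice made only to look familiar.
    return 300 + 550 * max(0.0, min(1.0, (raw + 1) / 2))

if __name__ == "__main__":
    applicant = {
        "on_time_payments": 0.9,
        "debt_ratio": 0.4,
        "account_age_years": 0.5,
        "traffic_violations": 0.0,
        "verified_degree": 1.0,
    }
    # The applicant sees only the final number, never the weights behind it.
    print(round(trust_score(applicant)))
```

The point of the sketch is not the arithmetic but the opacity: someone being scored can see the output without ever knowing which behaviors are weighted, or by how much.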

Is it ethical to keep the algorithms that judge us completely secret, leaving people with no idea what the criteria are or how to meet them? Is it ethical to allow automated algorithms to judge people at all? Are the current safeguards enough, or does more need to be done to ensure algorithms are used ethically? In the age of technology, with less and less face-to-face communication, it seems crucial for societies to decide how to ethically handle algorithms and the judgments they produce.

Amy graduated from DePauw University in 2017, and was a Hillman Intern and the Digital Media Assistant Managing Editor at the Prindle Institute for Ethics. At DePauw, she was an Honor Scholar and Political Science major with a Russian studies minor. She has spent time abroad in the Czech Republic and now works in Washington, D.C.