Opinion

The Dangers and Ethics of Social Media Censorship

By Marko Mavrovic
19 Sep 2018
"Alex Jones" by Sean P. Anderson licensed under CC BY 2.0 (via Flickr).

Alex Jones was removed from YouTube and other major social networks for repeatedly violating their community guidelines. Among other things, YouTube’s community guidelines prohibit nudity or sexual content, harmful or dangerous content, violent or graphic content, and, most relevant to this situation, hateful content and harassment. While the site describes its products as “platforms for free expression,” it also states in the same policy section that it does not permit hate speech. How both can be true simultaneously is not entirely clear to me.

Hate speech, however repugnant, is a form of speech and expression. Therefore, if you limit it, you cannot claim to support free speech; you have crossed the threshold into controlled speech. The ethical tension in this situation, then, is between the prevention of hateful, hurtful speech and the protection of free speech. If you are in favor of limiting hate speech, you must acknowledge that you are in favor of controlling speech, even if only to a small degree.

YouTube defines hate speech as “content that promotes violence” or “has the primary purpose of inciting hatred against individuals or groups” based on attributes such as race, ethnicity, religion, disability, gender, age, veteran status, sexual orientation, or gender identity, to name a few. I have a couple of qualms with this definition, one that is fairly consistent with the broader public discourse.

Firstly, it is difficult to prove intent. Unless the content creator is explicit that the content’s primary purpose is to promote or incite violence and/or hatred against individuals or groups, it is difficult to consistently and accurately discern the purpose of the content. Therefore, enforcing a standard based on an individual’s purpose for their speech is impractical.

Secondly, and more importantly, the implications of this definition of hate speech are troubling. It would prevent valuable and, in my opinion, necessary criticism. Suppose a group of people is engaging in actual, physical violence: destroying property and killing people. If this group claims any religious affiliation or describes itself as a racial or ethnic group, I risk being unable to criticize it. Indeed, I would be unable to harshly condemn white supremacists for their actions because, after all, they are a group whose existence and mission are based on the race of their members. Likewise, I would be unable to propose combating terrorism perpetrated by an Islamist group because it self-identifies as a religious entity.

YouTube’s definition of harassment is equally troubling. The policy states: “Harassment may include…making hurtful and negative comments/videos about another person.” One immediate and perhaps weak rebuttal to this statement is that there is no reliably objective standard of what counts as hurtful; it is a subjective response. The threshold for censorship is so low that someone need only express offense or emotional pain at my comment for it to be censored. That is frightening.

The stronger response to this policy focuses on the “negative” part. I often find conspiracy theories distasteful, and those frequently propagated by Alex Jones uniquely abhorrent. Jones and those of his ilk require scrutiny and sometimes criticism. Yet, under this social media policy, my opinion warrants censorship because it is a negative condemnation of another person. This is but one example of how the principle of preventing harmful speech can unintentionally clamp down on other valuable forms of speech.

My major objection lies with the subjectivity of these policies. While YouTube may have been noble in its intent to protect certain people from verbal attacks, it is not unreasonable to expect that the language of YouTube’s policy could be deliberately used to prevent particular political perspectives from being expressed in the digital town square. Some argue that this censorship has already taken place.

When a social media company is the sole arbiter of what speech, and consequently what views, are objectionable, the company can finagle the subjectivity of the descriptors “hurtful” and “hateful” to stifle political perspectives opposed to its own. In fact, any viewpoint – beyond simply political ones – could effectively be shut down if the company can demonstrate that it is hurtful or harmful to someone. This is the ultimate dystopia of controlled speech: the control of thought and debate in a space created precisely for thought and debate.

Circling back to Jones, I do not believe those who create and defend conspiracy theories should be prevented from doing so. They should be given the same protection to express their theories as I am given to criticize them. A world where I can demonstrate through argument that Jones’ theories are wrong is preferable to a world in which I risk being prevented from expressing my thoughts simply because a company or another individual deems them hurtful or hateful. The major social media companies evidently disagree.

Facebook and Twitter have jumped on the bandwagon, banning Alex Jones’ content from their sites. Facebook stated that Jones was “glorifying violence” and “using dehumanizing language to describe people who are transgender, Muslims, and immigrants.”

Certainly, few can defend the repugnance of the theories and comments he spouts, and it may be tempting to call for a prohibition on such language. But the moment that a social media company or a society calls for a ban on a type of speech, it has crossed the threshold into not valuing free speech. Once you no longer value free speech, it becomes much easier to justify eliminating speech that you simply disagree with or believe should not exist. It is an intellectually lazy and dangerous response to speech one finds (subjectively) disagreeable.

Opponents of Jones should not seek to muffle his ideas in the public domain and simply pretend such opinions do not exist. The implications are too grave. Instead, Jones’ opponents should aggressively confront his ideas with better ones. They should demonstrate the abhorrence of his ideas and the value of their own. We should battle it out in the Thunderdome of Free Thought, where the most meritorious idea leaves victorious.

Marko graduated summa cum laude from DePauw University. He holds an MSc in International Relations from the London School of Economics. His main interests are in issues of nationalism, sovereignty, and epistemology.