
Why Bother with Political Arguments?

photograph of protestors marching

Moral arguments can seem pointless in our hyper-polarized, post-truth society. People rarely seem to change their minds on hot-button political issues, like abortion, gun control, or climate change. Yet Congress recently passed a bill protecting same-sex marriage, with the support of many Republicans. That bipartisan effort would have been impossible ten years ago.

Does social change like this result from moral arguments, though? Moral reasoning gets a bad rap, but it truly fuels moral progress, so long as it comes from a place of mutual trust and respect.

Ethics professors (like us) might be prone to valorize moral reasoning; after all, we study moral arguments for a living. Yet we don’t appear to be more ethical than other people. We’re just skilled at poking holes in opposing moral views or coming up with reasons to support our own.

If anything, arguing about politics only seems to make each side dig in their heels (or worse, become more extreme). Cognitive science reveals that, when it comes to ethics and politics, we regularly use reasoning to rationalize the values we already hold. Climate activists often assume that skeptics just don’t understand the science. But research increasingly suggests that the more skeptics know about the science, the less they think climate change is a serious threat.

Yet some political partisans do change their minds in light of arguments. For years, Jerry Taylor, a former fellow of the Cato Institute, churned out talking points for climate skeptics. Respected peers gradually convinced him of flaws in his sources and holes in his arguments. Eventually, Taylor’s skepticism eroded, and he left Cato to become a climate activist.

It’s not just conservatives who change their minds in response to reasons. The science writer Bethany Brookshire once posted a viral tweet about the apparent sexism hitting her inbox. Of the scientists who corresponded with her, men seemed much less likely to begin their emails with her proper title, “Dr. Brookshire.” However, going back through her emails revealed that it was actually women who were slightly less likely to use the more formal and respectful title. So Brookshire publicly corrected her mistake.

Even if some people are persuaded by rational argument, aren’t these cases incredibly rare? These stories are anecdotes, but they make vivid a tendency present in us all that’s usually just blocked by other factors. As Julia Galef puts it, although we commonly adopt a “soldier mindset,” hellbent on defending core beliefs at all costs, it isn’t inevitable. We are able to adopt a “scout mindset” aimed at forming an accurate map of the complex terrain.

Recent experiments suggest ordinary people’s attitudes and behavior can respond to arguments about contemporary moral issues. One intriguing study found that students in a college ethics class who studied a famous moral argument for vegetarianism purchased less meat from the dining hall, compared to another group of students who were randomly assigned to study an argument in favor of donating more to charity. Another series of experiments provided participants an opportunity to donate part of a bonus to a charitable organization. The researchers found that reading moral arguments could increase charitable giving, provided the arguments are engaging. These and other studies provide evidence that moral reasoning can change real moral behavior, not just self-reported attitudes.

The trick is to recognize the many forms of reasoning. Moral arguments can be presented as a boring set of premises that logically lead to a conclusion, or lobbed at opponents within a culture of contempt. But these aren’t the only, or the most effective, forms of moral reasoning.

We humans are not cold robots designed to dispassionately reason alone. Moral reasoning evolved among social primates to resolve problems of interdependent living. Competing ideas arise in dialogue with others, and you’re not going to buy an argument from someone you despise or distrust, much less from someone who treats you with contempt. Mutual trust and respect are required for arguments to be taken seriously in the first place.

Should we conclude, then, that emotions rather than reasons drive social change? In truth, it’s both: mutual trust and respect enable reasoning to do its work of changing moral attitudes.

Consider one reason support for same-sex marriage has increased dramatically in the past few decades. Many people — including Republicans like Rob Portman and Dick Cheney — have discovered that a beloved friend or family member is gay. Existing empathy and respect for a loved one remove barriers to understanding the oppression of gay people and to seeing true love between same-sex partners. People have reasoned that if their loved one doesn’t deserve discrimination and stigma, then other gay people don’t either. Consistency reasoning of this sort is ubiquitous in moral life.

Moral arguments from the opposing side are certainly hard to swallow, for they often conflict with our values and challenge our identities. But when we deride reasoning in politics, we’re no better than a physician who concludes that a drug is ineffective because patients refuse to take it. As Taylor emphasizes, once he heard arguments from people he trusted and respected, he opened up; over time, his skepticism weakened because he came to appreciate the force of those arguments.

Even when moral arguments are planted in fertile ground, they are still only sown seeds. And we’re not talking about jalapeño seeds, which produce peppers in a few months. Think avocados, which can take a dozen years to bear fruit. During that time, the ground must remain fertile as well. Combative arguments brimming with contempt can poison the well and yield crop failure.

Moral reasoning so conceived is truly a driving force for social change. Without it, progress is impossible. The key is patience, persistence, and mutual respect. Under the right conditions, moral arguments can move mountains — slowly but surely.

Monarchy and Moral Equality

photograph of Queen's Guard in formation at Buckingham Palace

In a recent column, Nicholas Kreuder argues that the very idea of monarchy is incompatible with the moral equality of persons. His argument is straightforward. He claims that to be compatible with moral equality, a hierarchy of esteem must meet two conditions. First, the person esteemed must be estimable — in other words, esteem for her must be earned, or at least deserved. Second, deferential conduct toward the esteemed person must not be coerced or otherwise involuntary. But the deference demanded by a monarch is neither warranted nor voluntarily given: monarchs are esteemed only for their royal pedigree, and their subjects are expected to show esteem even though they are not, at least in the typical case, subjects by choice. Therefore, the hierarchy of esteem between monarch and subject is fundamentally incompatible with moral equality.

This argument is compelling, and as a confirmed republican, I confess bewilderment at the practice of paying a woman to live in fabulous wealth for a century so that she can christen the nation’s boats.

Nevertheless, for the sake of argument, I would like to critically examine Kreuder’s premises to see whether they really establish his sweeping conclusion.

The first question to consider is a simple one: what is a monarch? The argument against monarchy from moral equality appears to assume that monarchies are by definition hereditary, and that they are never elective. In fact, elective or non-hereditary monarchies are not unusual in human history. In Ancient Greece, the kings of Macedon and Epirus were elected by the army. Alexander Hamilton argued for an elective monarchy in a speech before the Constitutional Convention of 1787; he thought the American monarch should have life tenure and extensive powers.

In truth, authoritative sources seem confused about just what a monarchy is. For instance, the Encyclopedia Britannica defines “monarchy” as “a political system based upon the undivided sovereignty or rule of a single person.” Yet the accompanying article acknowledges that in constitutional monarchies, the monarch has “transfer[red] [her] authority to various societal groups . . . political authority is exercised by elected politicians.” That does not sound like undivided sovereignty to me.

My conclusion is that “monarch” is a label promiscuously affixed to wildly different kinds of regimes, leaving the concept monarch without much determinate content. A monarchy can be limited or absolute, elective or hereditary.

It’s difficult to argue that a concept with little determinate content is incompatible with moral equality. However, if we restrict the argument to hereditary monarchies, then it appears to get back on track. If the monarch is not elected, then the deference she demands is not voluntary. And if her claim to esteem is inherited, then it is certainly not deserved.

Yet even when restricted to hereditary monarchies, the argument does not seem entirely plausible. The problem is that in some cases, the hierarchy of esteem between a particular hereditary monarch and her subjects seems voluntary. Consider the United Kingdom’s hereditary but constitutional monarchy. The citizens of that country appear to have widely divergent views about both their monarchs and their monarchy. Some people detest the newly crowned King Charles III, yet have no qualms about the monarchical institution. Some liked Queen Elizabeth on a personal level but are staunch republicans. Moreover, Britons do not keep their opinions on this score a secret, and they are not generally thrown in jail for publicly criticizing the monarch in the harshest terms. (Though I admit that reports of anti-royal protestors being arrested on bogus charges of breaching the peace give me pause.)

No one who doesn’t wish to sing “God Save the Queen” is forced to do so. In short, in the U.K., deference to the monarch may be encouraged, but it is certainly not required. A Briton can thrive in her society without ever showing the slightest deference to her monarch.

With respect to the hierarchy of esteem between the U.K.’s monarch and her subjects — as opposed to her constitutional functions or public prominence — the situation seems somewhat akin to the relationship between Catholic priests and the rest of society in the United States. Even non-Catholics regularly refer to priests as “father,” a gesture of deference that is less required than customary. (That said, I refrain from this practice if at all possible; it mildly affronts my democratic temperament. This did not go over particularly well at Notre Dame.)

It is also doubtful that no hereditary monarch deserves esteem. Many Britons seem to think that, with her stoicism and quiet dignity, Queen Elizabeth provided stability over the course of a turbulent twentieth century. I take no stance on that proposition, but it certainly seems conceivable that a monarch could come to earn esteem through her exemplary conduct, either before she ascends to the throne or while she serves as monarch.

Thus, the extent to which monarchy cuts against moral equality really depends on the conditions of the society in which a particular monarchical institution exists.

The concept of a hereditary monarchy might seem incompatible with moral equality at a very high level of abstraction, but some of its instantiations may be perfectly consonant with it.

This does, however, lead me to a more philosophical point. In the argument against monarchy from moral equality, a hierarchy of esteem is morally legitimate only if it meets both conditions: deservedness and voluntariness. But it appears that voluntariness alone is sufficient to make such a hierarchy compatible with moral equality. If I routinely genuflect before my girlfriend because I believe her gorgeous auburn hair possesses mystical powers, that does not seem particularly demeaning to my dignity so long as my delusional belief cannot be said to undermine the voluntariness of my deferential act — even though the deference is wholly undeserved. Likewise, so long as a Briton is not forced to pay obeisance to King Charles III, her acts of deference seem to be compatible with her dignity even if the king doesn’t deserve them.

Indeed, voluntariness seems not only sufficient to legitimize a hierarchy of esteem, but also necessary. Martin Luther King, Jr. and Malcolm X are both, in my view, figures richly deserving of esteem. Yet if I were forced to regularly kiss their feet, that hierarchy of esteem would be an insult to my dignity as a moral equal.

Philosophers love abstractions, and I am no exception. Sometimes, however, what appears to be a strong argument at a high level of abstraction loses some of its luster once the messy reality of human existence is brought into view. Such is the case, I think, with the claim that monarchy is per se incompatible with human equality.

Should Monarchies Be Abolished?

photograph of British monarchy crown

On September 8th, 2022, Queen Elizabeth II of the United Kingdom died at the age of 96. She held the crown for 70 years, making her the longest-reigning monarch in the history of Britain. Her son, now King Charles III, will likely be crowned in mid-2023.

The death of the British monarch has drawn a number of reactions. Most public officials and organizations have expressed respect for the former monarch and sympathy towards her family. However, others have offered criticism of both the Queen and the monarchy itself. Multiple people have been arrested in the U.K. for anti-royal protests. Negative sentiment has been particularly strong in nations that were previously British colonies – many have taken to social media to critique the Crown’s role in colonialism: the Economic Freedom Fighters, a minority party in South Africa’s parliament, released a statement saying they will “not mourn the death of Elizabeth,” and Irish soccer fans chanted “Lizzy’s in a box.” Professor Maya Jasanoff bridged the two positions, writing that, while Queen Elizabeth II was committed to her duties and ought to be mourned as a person, she “helped obscure a bloody history of decolonization whose proportions and legacies have yet to be adequately acknowledged.”

My goal in this article is to reflect on monarchies, and their role in contemporary societies. I will not focus on any specific monarch. So, my claims here will be compatible with “good” and “bad” monarchs. Further, I will not consider any particular nation’s monarchy. Rather, I want to focus on the idea of monarchy. Thus, my analysis does not rely on historical events. I argue that monarchies, even in concept, are incompatible with the moral tenets of democratic societies and ought to be abolished as a result.

Democratic societies accept as fundamentally true that all people are moral equals. It is this equality that grounds the right to equal participation in government.

Equal relations stand in contrast to hierarchical relationships. Hierarchies occur when one individual is considered “above” some other(s) in at least one respect. In Private Government, Elizabeth Anderson distinguishes between multiple varieties of hierarchy. Particularly relevant here are hierarchies of esteem. A hierarchy of esteem occurs when some individuals are required to show deference to (an) other(s). This deference may take various forms, such as referring to others through titles or engaging in gestures like bowing or prostration that show inferiority.

Hierarchies of esteem are not automatically impermissible. One might opt into some. For instance, you might have to call your boss “Mrs. Last-Name,” athletes may have to use the title “coach” rather than a first name, and so on. Yet, provided that one freely enters into these relationships, such hierarchies need not be troubling. Further, hierarchies of esteem may be part of some relationships that one does not voluntarily enter but that are nonetheless morally justifiable – children, for instance, are generally required to show some level of deference to their parents (provided that the parents are caring, have their child’s best interests in mind, etc.).

The problem with the monarchy is not that it establishes a hierarchy of esteem, but rather that it establishes a mandatory, unearned hierarchy between otherwise equal citizens.

To live in a country with a monarch is to have an individual person and family deemed your social superiors, a group to whom you are expected to show deference, despite your moral equality. This is not a relationship you choose, but rather one that is thrust upon you. Further, neither the deference we are said to owe monarchs nor their higher status is earned. Rather, these are things monarchs are claimed to deserve simply by virtue of who their parents are, who in turn owe their elevated status to their lineage. Finally, beyond merely commanding deference, monarchs are born into a life of luxury; they live in castles, they travel the world meeting foreign dignitaries, and their deaths may grind a country to a halt as part of a period of mourning.

So, in sum, monarchies undermine the moral foundation of our democracies. We value democratic regimes (in part) because they recognize our equal moral standing. By picking out some individuals and labeling them superiors in a hierarchy of deference due to nothing but their ancestry, monarchies are incompatible with the idea that all people are equal.

However, there are some obvious ways one might try to respond. One could object on economic grounds: there is room to argue that monarchies produce economic benefits. Royals may serve as a tourist attraction or, if internationally popular, might raise the profile and favorability of the nation, thus increasing the desirability of its products and culture. So perhaps monarchies are justified because they are on the whole beneficial.

The problem with this argument is that it compares the incommensurable. It responds to a moral concern by pointing out economic benefits.

My claim is not that monarchy is bad in every respect. Indeed, we can grant for the sake of argument that having a monarchy produces economic benefits. My claim, however, is that it undermines the moral justification of democracy.

Without a larger argument, it does not follow that economic benefits are sufficient to outweigh moral concerns. This would be like arguing that we should legalize vote-selling because of its economic benefits – it misses the moral reasons why we structure public institutions the way that we do.

Another objection may be grounded in culture. Perhaps monarchies are woven into the cultural fabric of the societies in which they exist; they are part of proud traditions that extend back hundreds or even thousands of years. To abolish a monarchy would be to erase part of a people’s culture.

While it’s true that monarchies are long traditions in many nations, this argument only gets one so far. A practice being part of a people’s culture does not make it immune to critique. Had the Roman practice of gladiatorial combat to the death for the sake of entertainment survived to this day, we would (hopefully) think it ought to be eliminated, despite thousands of years of cultural history.

When a practice violates our society’s foundational moral principles, it ought to be abolished no matter how attached to it we have become.

Finally, one might argue that abolition is unnecessary. Compared to their status throughout history, monarchies have fallen out of favor in the 20th and 21st centuries. Of the nations with monarchies, few have a monarch who wields anything but symbolic power (although there are notable exceptions). This argument relies on a distinction between what we might call monarchs-as-sovereigns and monarchs-as-figureheads. Monarchs-as-sovereigns violate the fundamental tenets of democracy by denying citizens the right to participate in government, while monarchs-as-figureheads, wielding only symbolic power, do not, or so the argument goes.

The issue with this argument is that it underappreciates the full extent of what democracy demands. It does get things right by recognizing that the commitment to democracy arises from the belief that people deserve a say in a government that rules over them. However, it is not just that all citizens deserve some say; rather, all citizens deserve an equal say. One person, one vote.

Part of the justification for democracy is that individuals ought to be able to shape their lives, and thus deserve a say in the institutions that affect them.

Although individuals may vary in their knowledge or other capabilities, to give some greater say in our decision-making is to give them disproportionate power to shape the lives of others. No one individual should automatically be someone to whom we all must defer. We might collectively agree to, say, regard someone as an expert in a particular matter relevant to the public good and thus defer to her. However, this only occurs after we collectively agree to it in a process where we all have equal say, either by voting directly for her or by voting for the person who appoints her. Unless we have parity of power in this process, we diminish the ability of some to shape their own lives.

On these grounds, perhaps a monarchy could be justified if the citizens of a nation voted the monarch into power. This would simply be another means of collective deferment. But since electorates are constantly changing, there would need to be regular votes on this to ensure the voters still want to defer to this monarch. Yet current monarchies, by elevating the monarch (and family) above others while leaving this outside the realm of collective decision-making, violate the moral justification of democracy – some are made superior by default in the hierarchy of esteem. The establishment of democracy and abolition of all monarchy are proverbial branches that stem from the same tree. Our recognition of human equality should lead us to reject monarchy in even innocuous, purely symbolic forms.

Great Man Syndrome and the Social Bases of Self-Respect

black and white photograph of David statue with shadow on wall

“Am I good enough?” “Was someone else smarter or more talented than me?” “Am I just lazy or incompetent?” You might find these thoughts familiar. It’s an anxiety that I have felt many times in graduate school, but I don’t think it’s a unique experience. It seems to show up in other activities, including applying for college and graduate school, pursuing a career in the arts, vying for a tenure-track academic job, and trying to secure grants for scientific research. This anxiety is a moral problem, because it can perpetuate imposter syndrome – feelings of failure and a sense of worthlessness when neither is warranted.

The source of this anxiety is something that I would like to call “great man syndrome.” The “great man” could be a man, woman, or non-binary person. What is important is the idea that there are some extra-capable individuals who can transcend the field through sheer force of innate ability or character. Gender, race, and other social categories matter for understanding social conceptions of who has innate ability and character, which can help to explain who is more likely to suffer from this angst, but “great man syndrome” can target people from any social class.

It functions primarily by conflating innate ability or character with professional success, where that professional success is hard to come by. For those of us whose self-conceptions are built around being academics, artists, scientists, or high achievers and whose professional success is uncertain, “great man syndrome” can generate uncertainty about our basic self-worth and identity. On the other hand, those who achieve professional success can easily start to think that they are inherently superior to others.

What does “great man syndrome” look like, psychologically? First, in order to continue pursuing professional success, it’s almost necessary to be prideful and think that, because I’m inherently better than others in my field in some way, I can still achieve one of the few sought-after positions. Second, the sheer difficulty and lack of control over being professionally recognized creates constant anxiety about not producing enough or not hitting all of the nigh-unattainable markers of “great-man-ness.” Third, these myths tie our sense of well-being to our work and professional success in a way that is antithetical to proper self-respect. This results in feelings of euphoria when we are recognized professionally, but deep shame and failure when we are not. “Great man syndrome” negatively impacts our flourishing.

My concept of “great man syndrome” is closely related to Thomas Carlyle’s 19th century “great man theory” of history, which posits that history is largely explained by the impacts of “great men,” who, by their superior innate qualities, were able to make a great impact on the world. There are several reasons to reject Carlyle’s theory: “great men” achieve success with the help of a large host of people whose contributions often go unrecognized; focusing on innate qualities prevents us from seeing how we can grow and improve; and there are multiple examples of successful individuals who do not have the qualities we would expect of “great men.”

Even if one rejects “great man theory,” it can still be easy to fall into “great man syndrome.” Why is this the case? The answer has to do with structural issues common to the fields and practices listed above. Each example I gave above — scientific enterprises, artistic achievement, higher educational attainment, and the academic job market — has the following features. First, each of these environments is highly competitive. Second, they contain members whose identities are tied up with that field of practice. Third, if one fails to land one of the scarce, sought-after positions, there are few alternative avenues of gainful employment that allow one to maintain that social identity.

The underlying problem that generates “great man syndrome” isn’t really the competition or the fact that people’s identities are tied up with these pursuits; the problem is that there are only so many positions within those fields that ensure “the social bases of self-respect.” On John Rawls’s view, “the social bases of self-respect” are aspects of institutions that support individuals by providing adequate material means for personal independence and giving them a secure sense that their aims and pursuits are valuable. To be recognized as equal citizens, people need to be structurally and socially supported in ways that promote self-respect and respect from others.

This explains why “great man syndrome” strikes at our basic self-worth — there are only so many positions that provide “the social bases of self-respect.” So, most of the people involved in those pursuits will never achieve the basic conditions of social respect so long as they stay in their field. This can be especially troubling for members of social classes that are not commonly provided “the social bases of self-respect.” Furthermore, because these areas are intrinsically valuable and tied to identity, it can be very hard to leave. Leaving can feel like failing or giving up, and those who point out the structural problems are often labeled as pessimistic or failing to see the true value of the field.

How do we solve this problem? There are a few things that we as individuals can do, and that many people within these areas are already doing. We can change how we talk about the contributions of individuals to these fields and emphasize that we are first and foremost engaged in a collective enterprise which requires that we learn from and care for each other. We can reaffirm to each other that we are worthy of respect and love as human beings regardless of how well we perform under conditions of scarcity. We can also try to reach the halls of power ourselves to change the structures that fail to provide adequate material support for those pursuing these aims.

The difficulty with these solutions is that they do not fundamentally change the underlying institutional failures to provide “the social bases of self-respect.” Some change may be effected by individuals, especially those who attain positions of power, but it will not solve the core issue. To stably ensure that all members of our society have the institutional prerequisites needed for well-being, we need to collectively reaffirm our commitment to respecting each other and providing for each other’s material needs. Only then can we ensure that “the social bases of self-respect” will be preserved over time.

Collective action of this kind itself undermines the core myth of “great man syndrome,” as it shows that change rests in the power of organization and solidarity. In the end, we must build real political and economic power to ensure that everyone has access to “the social bases of self-respect,” and that is something we can only do together.

Politics and Respect in the Wake of Mass Shootings

An aerial photo of the Las Vegas Strip, where the 2017 shooting occurred.

On October 1, a gunman opened fire on a country music festival on the Las Vegas Strip. Almost immediately following news of the shooting, prominent politicians such as Hillary Clinton and Bernie Sanders tweeted pleas for stronger gun control. These tweets drew harsh criticism regarding the politicization of mass shootings. Such criticism routinely appears in the wake of mass shootings, as people debate when it is too soon to start discussing gun control and what can be done to prevent such tragedies in the future.

Should Hugh Hefner be Buried Next to Marilyn Monroe?

An old snapshot of Hugh Hefner smoking a pipe.

Hugh Hefner, the founder of Playboy magazine, died on September 27 at the age of 91. A leader in the sexual revolution of the second half of the twentieth century, Hefner was a controversial figure throughout his life. Playboy magazine launched in 1953 and propelled his empire into mainstream success that spanned multiple media, later including television shows, clubs, restaurants, and the notoriously excessive Playboy Mansion.


Respecting the Dead: The Case of Charles Byrne, the Irish Giant

Charles Byrne died quite young, at the age of 22, and quite tall, at approximately seven feet, eight inches. That would still be tall today, but it must have been even more impressive during Mr. Byrne’s short life in the late 18th century. According to an Ohio State University researcher, the average height for men in Northern Europe in the 17th and 18th centuries was only about five feet, five inches. Today, the average height for men in Northern Ireland has been calculated to be about five feet, 10 inches.
