
On the Appropriateness of Shame

photograph of the Statue of Cain in Paris

Shame has taken a prominent role in public discourse recently. For instance, The Atlantic’s Conor Friedersdorf recently tweeted, arguing that Americans have an obligation to right past wrongs but not to feel shame over “wrongs perpetrated before our births.” Shame also plays a role in discourse about the pandemic. Earlier on, people might have felt shame over getting COVID-19: “If someone who thought they were being careful got the virus, well…maybe they weren’t being so careful.” And now the issue of vaccine shaming arises, with debates over whether people should be shamed for not getting the vaccine.

But shame is a nuanced thing. It is an emotion we feel, but it is also something we do to other people. I might feel shame, but I might also try to get you to feel shame: I shame you. This leads to two different questions: When is it appropriate to feel shame? When is it appropriate to shame somebody?

One mistake, a mistake that Friedersdorf makes, is to tie shame too tightly to wrongdoing. Some emotions are linked to wrongdoing. For instance, guilt tends to be linked to having done something morally wrong. And you certainly can be ashamed of your own wrongdoing. But there are more things in heaven and earth than moral rightness and wrongness. Some things are ugly, pitiful, or bad in non-moral ways. You might also be ashamed that you have a large nose, or you might be ashamed that you were too cowardly to take an exciting opportunity.

If shame were tied only to your own wrongdoing, then shame over wrongs perpetrated before your birth would be nonsensical. But shame isn’t even just tied to what you have done, hence the possibility of being ashamed of your nose. Shame is instead based on who we are. And shame is distinctly interpersonal: much of the time we feel shame because we know others think poorly of us (perhaps because of our looks or our inability to better ourselves). Further, who we are is based on our broader connections to other people: being in a family, being a fan of a certain sports team, or being a citizen of someplace or other.

So, you might be ashamed not of your own wrongdoing, but of the wrongdoing of your father. And you might be ashamed of your country, too. Nikole Hannah-Jones said that she was ashamed of America’s bombing of Hiroshima.

Now, you might question whether we should feel ashamed by things we haven’t done, by things we are merely associated with. For one, it seems perfectly reasonable to care about our non-moral qualities and to care about what others think of us. Secondly, shame and pride come hand-in-hand. Parents are proud of what their kids have done, and people are proud of their country’s achievements. Hannah-Jones was right when, responding to Friedersdorf, she pointed out that if you want to feel proud of your country – for what it does well now, and what it has done well through its history – you better be willing to be ashamed of it, too, for what it does badly and what it did badly in the past.

So, we can be ashamed of many things, including things we haven’t done. What about shaming somebody else? When should we shame people? Perhaps the obvious answer is: when they have done something shameful.

Though there might be a variety of forms of shaming, how shaming works should be fairly obvious: if you fail to meet certain standards, other people – remember, shame is interpersonal – can point out that they think less of you. For this to be effective, you need to then reflect on your failures, and this can involve feeling shame: you see why they think less of you, and you think less of yourself for it. Perhaps this process even must involve shame: to fully appreciate your failure might require that you do in fact feel ashamed of it.

So, when should we shame people? Again, the obvious answer is “when they do something shameful,” but that is too simple. It can depend on the relationship between people. You – a serial card cheat – might have no right to tell me that it’s wrong to count cards. You – a stranger on the street – might have no right to tell me not to be so rude to my wife (whereas our friends can step in and say something). So, shaming might be inappropriate if you are a hypocrite or if you have no business in judging me, whereas if you are a respected member of my community and my actions negatively affect my community, you might be well placed to shame me.

We must also keep in mind that some forms of shaming might carry costs: rather than making somebody feel mildly ashamed for a past misdeed, you might make them feel awful. And we need to be careful, as Kenneth Boyd noted in this venue, because shaming can be unfair, singling out individuals for something that was more acceptable at the time, and it can be a tool of bigotry, shaming people for being a minority and perpetuating harmful systems of oppression.

So, should we shame people for not getting vaccinated? Firstly, not all the unvaccinated have acted shamefully. In places where it can be hard to get time off of work to get the jab (or where people are not aware that they are entitled to time off), or in places where misinformation is rife, perhaps they are meeting or exceeding the standards we should expect of them as fellow members of the public. Or they may have genuine, conscientious objections.

But it is more likely that opposition to “vaccine shaming” turns on the idea that shaming is ineffective. Somebody might be acting shamefully: they might be failing to protect others, relying upon an overly individualized notion of rights (and failing to recognize how they interact with others in a society), and failing to evaluate the evidence properly because – though they should know better – they have been captured by petty, angry politics. It can be frustrating to be told not to shame these people. But if our aim is to get them to take the vaccine, we need to find an alternative strategy that doesn’t prompt a retreat into deeper skepticism.

Or, so the argument goes. But maybe that argument is wrong: there is some evidence that appealing to the sense of shame or embarrassment someone would feel if they spread COVID to a loved one is somewhat effective at increasing the vaccination rate. Ultimately, I don’t know when Americans should feel shame for what happened in the past. And I don’t know when we should shame people for their behavior in this pandemic. I do know that to have a well-informed public discussion, we need to understand the many facets of shame.

The Moral Danger of Conservative Nostalgia

photograph of old movie projector displaying blank image on screen

When the news recently broke that a remake of the 1992 film The Bodyguard (originally starring Whitney Houston and Kevin Costner) is in the works, the internet quickly jumped to imagine who might play the lead roles (one popular trend is pushing for a Lizzo/Chris Evans team-up). Against the excitement, however, were critics arguing that the original film was artistically unique (the soundtrack, led by Houston, is hailed as an extraordinary achievement) and that it would be a mistake to try and recreate it without the incomparable Houston at the helm. Nevertheless, the project is moving forward with the award-winning playwright Matthew López writing the script.

There’s a kind of nostalgia at work in this story that, I want to argue, is not only aesthetically questionable, but can (at least potentially) pose serious moral dangers for a culture enamored with “the good ol’ days.”

It’s become something of a cliché to whine about the deluge of remakes, adaptations, reboots, sequels, and reenvisionings coming from Hollywood. The last year alone has seen new versions of Space Jam, Coming to America, Mulan, and Bill and Ted’s Excellent Adventure released, and sequels to Top Gun, The Matrix, Ghostbusters, and Scream are finishing production (with reboots of everything from Twister to Pirates of the Caribbean to The Passion of the Christ and more in the works). Similarly, television shows like Full House, Who’s the Boss, Gilmore Girls, Saved by the Bell, and The Wonder Years have all recently returned with new episodes. (None of these lists are comprehensive.) When considered alongside the distinct trend of constructing an interwoven narrative across multiple films — most famously demonstrated by the Marvel Cinematic Universe — it could very well seem like “no one in Hollywood now has an original notion in their heads.”

But, importantly, this is not a new phenomenon: as most any historian of film will attest, Hollywood has always been in the business of re-telling pre-existing stories. One of the earliest remakes (a film called L’Arroseur) was released in 1896, making reboots older than the Titanic, sliced bread, and the state of Oklahoma. Remember that many films now considered classics — such as The Wizard of Oz, The Godfather, Psycho, and Mean Girls — were adaptations of already-published books; others like Scarface, The Maltese Falcon, and Angels in the Outfield were themselves remakes of already-released films. The upcoming He’s All That is, most directly, a remake of the 1999 movie She’s All That, but that was first a re-envisioning of 1964’s My Fair Lady which was itself an adaptation of George Bernard Shaw’s Pygmalion.

Rachel Fraser has described a “nostalgic culture” as “one bogged down in its own history” — the repeated retelling of familiar stories (and the market-driven incentives that motivate studio execs to continually greenlight such projects) is indeed one way that our present age is one “driven” by nostalgia.

But it is not the dangerous one I have in mind.

In general, ‘nostalgia’ describes an experience of sentimental or mournful longing for one’s past; in the words of philosopher Paula Sweeney, it is an “emotional response to change.” One way a person might so respond is by clinging to their memories, perhaps seeking to recapitulate a particular experience (now with updated special effects, pop culture references, and box office returns). But another is to enshrine the details of one’s memory in a manner that sacralizes the past event such that alterations to its contemporary retelling become offensive (or even heretical). The first kind of nostalgia motivates audiences to want a remake of The Bodyguard; the second kind decrees that no such remake could possibly compare to the original. If the first kind of nostalgia is analogous to addiction, the second might be comparable to idolatry.

While both kinds of nostalgia are focused on the past, only one drives people to try and bring that past (however modulated) into the present; call this form of nostalgia (that stokes the fires of reboots galore) repetitive nostalgia. The other kind of nostalgia promotes precisely the opposite, explicitly prohibiting any contemporary recreations of the past that could potentially alter things for the subjective worst; this kind of conservative nostalgia instead seeks to preserve a crystallized form of what the person remembers in order to protect them from the emotional damage of new changes.

Conservative nostalgia is, I contend, a key factor in the phenomenon of “toxic” fandoms, wherein devoted admirers of some element of pop culture bully other people for the sake of preserving their particular perspective on what they love. Sometimes, this is in response to a perceived attack on the object of their affection, as when one fan tweeted a mild criticism of rapper Nicki Minaj in June 2018 and then faced days of verbal attacks online that escalated to serious privacy violations and the loss of her job. Other times, toxic fandom is triggered alongside other biases, such as racism or sexism: consider the irrational backlash from some circles to Brie Larson’s MCU character or Kelly Marie Tran’s and John Boyega’s Star Wars roles. In each case, conservative nostalgia provokes fans to interpret new (sometimes quite minute) changes as threats, thereby prompting them to act in wildly inappropriate (and sometimes literally life-threatening) ways.

In a different way, something like conservative nostalgia seems to be the foundation of many defenses of preserving the statues built in the early-20th century to honor the failed leaders of the Confederate States of America. Although it might seem odd for citizens of a country to want to honor domestic terrorists who formerly attacked that same country, the fact that these monuments have been standing for decades means that, for many people, the visual experience of those statues on roadsides or in town squares is a regular (and perhaps even comforting) element of familiar routines, regardless of who or what the monuments commemorate. To remove them is to make a change that can provoke the sorts of threat responses inherent to experiences of conservative nostalgia.

(To be clear, I do not mean to suggest that conservative nostalgia is the only relevant factor in the debate over Confederate statue removal, but rather that its affective consequences are one important — and perhaps often overlooked — element that should be considered.)

Online bullying and the perpetuation of oppressive propaganda, though seriously morally problematic, still might not count as dangerous (in the standard sense of the term). But when we consider the Capitol Riot of January 2021, we can indeed see extreme ramifications of conservative nostalgia that are easily recognizable as alarmingly unsafe: violent insurrection in an attempt to prevent undesired change. Misled by tangled webs of conspiracy theories and spurred on by multiple prominent figures, the failed insurrectionists sought to protect a false-but-subjectively-comforting narrative for themselves about the outcome of the 2020 election and the continued political career of Donald Trump, even if that meant seriously harming others, destroying historic property, and violating scores of laws. Though the rioters’ nostalgia seems to have been rooted in a mixture of ideologies and beliefs ranging from Christian nationalism to white supremacy to neoliberalism to “home-grown fascism,” one common thread was a fearful resistance to the administrative changes taking place inside the building they were storming — it was a conservative nostalgia that responded to change in an abjectly violent way. (And, concerningly, some politicians and media figures are already starting to reference the Capitol Riot positively, further sedimenting this conservative nostalgia within their own brands to, presumably, further weaponize it for self-serving political and financial support.)

Nevertheless, I don’t think that nostalgia — neither repetitive nor conservative — is necessarily bad: it’s an emotional response to triggers that can motivate further action, but it is those triggers and actions that are directly morally assessable. Still, to overlook the role played by our affective systems is to ignore an important element of human life that can have a massive influence on our thoughts and behavior — and we shouldn’t forget that.

The Knowledge Norms of Emotions

simple single-line drawing of person curled up in bed

This post begins with a sad backstory. A little while back my wife and I had a miscarriage of our first child. There was a lot that was terrible in the experience, but in this post I want to address a certain oddity that I noticed about grief.

Due to a range of bureaucratic complications, it took about a week from when we first suspected a miscarriage to when we had final confirmation. During that week, our confidence that we had miscarried grew, but throughout the period it remained a likelihood rather than a certainty.

What surprised me, during that week, was that the uncertainty made it difficult to grieve. Even when I was pretty sure we had lost the child, it felt ‘out of whack’ to grieve the loss, since there was a chance the child was still alive. It was a terrible week, and I was extremely sad, but it felt out of joint to grieve for the child while recognizing the chance that all might be well. There was no obstacle to feeling anxious, there was nothing out of joint about feeling trepidation, but outright grief felt strange. And it continued to feel strange until we received confirmation of the loss.

This, eventually, got me wondering: is grief characterized by a knowledge norm? In philosophy, a knowledge norm is a normative rule which says that knowledge of something is required for an action or mental state to be appropriate. For example, there seems to be a knowledge norm on assertion: you should only tell someone something if you know that thing is true. This explains, for instance, why if I say “it will rain tomorrow” it is appropriate for you to ask “how do you know?” Or why saying “I don’t know” is an appropriate response if someone asks you a question. (For a thorough defense of a knowledge norm of assertion see Timothy Williamson’s “Knowing and Asserting.”)

Many philosophers also argue that there is a knowledge norm of belief: you should only believe X if you know X is true. Thus, Williamson argues in his book Knowledge and its Limits:

“Knowledge sets the standard of appropriateness for belief. That does not imply that all cases of knowing are paradigmatic cases of believing, for one might know p while in a sense treating p as if one did not know p—that is, while treating p in ways untypical of those in which subjects treat what they know. Nevertheless, as a crude generalization, the further one is from knowing p, the less appropriate it is to believe p. Knowing is in that sense the best kind of believing. Mere believing is a kind of botched knowing. In short, belief aims at knowledge (not just truth).”

There also seems to be a knowledge norm of certain actions. For instance, it seems like you should only punish someone if you know they are guilty, and only chastise someone if you know they did wrong. Some philosophers have gone even further and suggested that there is a general knowledge norm on all action: you should only treat X as a reason for action if you know X to be true.

My own experience with grief seems to suggest that there might also be a knowledge norm on various emotions; but as far as I know that topic has not yet been seriously investigated by philosophers.

My experience of the miscarriage suggested there was a knowledge norm on grief because it felt wrong to grieve our child’s death as long as I recognized that the child might still be alive. This is parallel to how I couldn’t know the child had died as long as I recognized that the child might still be alive. In some sense, what is characteristic of knowledge is the elimination of all relevant alternatives. As long as those relevant alternatives remained, we did not know, nor did it feel quite right to grieve.

Here is another reason for thinking that grief is characterized by a knowledge norm: it is hard to fit probabilities with the emotion of grief. It would be weird to think that as I grow more certain, my grief grows proportionally. I do not grieve a small amount at a 5% chance that my spouse has died, nor would my grief double as my confidence grows to 10%. I grieve less for less bad things, not for lower probabilities of equally bad things. But it would be equally weird to think that there is some probabilistic threshold at which grief suddenly becomes appropriate. It is not as though when I go from 94% confident my child died to 96% confident my child died that suddenly grief goes from inappropriate to appropriate.

But if grief neither scales with probability, nor requires a certain probabilistic threshold, then it seems like grief is responsive to a standard other than probabilistic credence, and the natural alternative is that it is responsive to knowledge.

Other emotions also seem to be knowledge normed in this way. It is hard to feel grateful because you think it is likely that someone brought you a present. Normally gratitude is a response to the knowledge that someone did something for you. Jonathan Adler makes a point along these lines about resentment: “Mild resentment is never resentment caused by what one judges to be a serious offense directed toward oneself tempered by one’s degree of uncertainty in that judgment.”

Now, some other emotions at first blush seem different. I can be worried about something without knowing that thing will occur. Similarly, I can be hopeful of something without knowledge it will occur. Yet, even here, it seems that there might be some knowledge norm at play. For instance, it seems weird to be worried about or hope for something you know is impossible. Thus, it might be that you must know that something is possible before you can worry about it or hope for it.

If this is right, does it suggest a general pattern? I think it does. Emotions have appropriateness conditions. Resentment is an appropriate response to being wronged. Gratitude is an appropriate response to being given a gift. Hope is an appropriate response to the possibility of certain goods, as worry is an appropriate response to the possibility of certain bads. In each of these cases, what is required to rightly feel the emotion is knowledge.

That, then, is why grieving felt strange. I didn’t yet know if my grief was appropriate since I lacked knowledge of the tragedy to which my grief was a response.

A Problem with Emotions

abstract acrylic painting of divided canvas

There is a certain challenge to the adequacy of our emotional reactions — especially those reactions, like grief and joy, which feel ‘called for’ at certain times. Suppose a family has a child who falls grievously ill. After many sleepless nights, the child stabilizes and eventually recovers. There are appropriate emotional responses to this sequence; the parents will, and should, feel relieved and joyed at the child’s recovery. Now suppose another family has a child who similarly falls grievously ill. Except this child does not recover and eventually dies. Again, there are appropriate emotional responses. The parents will, and should, feel grieved and heartbroken at the child’s death.

So far, there is no challenge. But now suppose that instead of two different families, it was one family with two children — one recovers, one dies. Here, what are the parents supposed to feel? There are a couple of options.

Perhaps they should feel a sort of moderated grief. After all, something wonderful has happened (a child has recovered) and something terrible has happened (a child has died). Do they partially cancel out (but maybe weighted in the direction of grief since ‘bad is stronger than good’)? The problem with this answer is that the grief is a response to the tragedy of the child’s death. And that child’s death is no less a tragedy just because the other child survived. Moderation would be appropriate if something happened to moderate the tragedy of the child’s death — such as the child being spared death and instead placed within an enchanted sleep — but it does not seem like the appropriate response to some other good thing occurring.

Perhaps then, you just need to feel either emotion. Both grief and joy are appropriate — so long as you feel one, then you are feeling well. But this won’t do either. There is something wrong with the parent who feels nothing for the recovery of their child, just as there is something wrong with the parent who feels nothing for the child’s death.

In fact, the only response that seems appropriate to the situation is to feel both grief and joy. You ought to be grieved at the one child’s death and joyed at the other child’s recovery.

But here is the issue. It doesn’t seem possible to fully feel both at once. Feelings, unlike some other mental states, compete with each other. When I feel happy about one thing, it pushes sadness about other things to the periphery. This is unlike, say, beliefs. The parents can fully believe that one child recovered while, at the same moment, fully believing that the other child died. This is because beliefs do not require active attention. Moments ago, you believed all sorts of things about your former elementary school, but I expect until you read this sentence you were not actively attending to any of those beliefs.

Emotions, however, do require attention. If I can become fully absorbed in my work, then for a time my grief will retreat. (Of course, one of the frustrating things about grief is the way that it maintains a ‘grip’ on your attention — forcing your thoughts to circle back and return again, and again, to the tragedy.)

So, to fully feel the grief at the one child’s death, and to fully feel the joy at the other child’s recovery, would require me to keep my full attention on both at the same time. But we can’t do that: attention is a limited resource. It can only be fully engaged in one direction.

The best we can do, then, is a sort of ping-ponging back and forth between grief and joy. Feeling complete grief when attending to the death, feeling thankful and relieved when attending to the recovery. But at no point, it seems, can my emotions be completely responsive to what is called for.

Berislav Marušić, in his essay “Do Reasons Expire?”, considers a related puzzle:

“Grief is, plausibly, a response to reasons; the reason for my grief was my mother’s death; her death does not change over time; but it is not wrong for me to grieve less over time. Yet how could the diminution of grief not be wrong, if my reason for grief stays the same?”

The reason the problem is similar is that there is a disconnect between the response demanded by the event (the tragedy of someone’s death) and the psychological realities of our capacity to have emotions. You just can’t indefinitely grieve, and in turn you don’t indefinitely grieve. But doesn’t it seem as if there is a sense in which you ought to?

There is a conflict, then, between the psychological realities that constrain our emotions and the appropriateness conditions surrounding what emotions we ‘ought’ to feel.

This is an important conflict to think about. One reason is that it helps us recognize exactly why we need to be so skeptical of grounding our moral decisions simply on emotions like anger or grief. Since we can only feel some emotions to an extent, our emotional responses, at a given time, are usually not responsive to the full range of relevant considerations. You can feel outrage about an injustice, or hopeful at political progress that has been made, but you can’t feel both at the same time to the appropriate extent. Given that psychological reality, basing policy recommendations on emotions of rage or optimistic hope is likely to be morally dangerous.

This does not mean that emotions should play no role in our moral decision-making. Emotions are important. Instead, what this means is that we need to be extremely cautious when acting on our emotional reactions. We should always bear in mind that emotions are likely to not be reflective of the full range of complexities in any given case.

Immoral Emotions, Intentionality, and Insurrection

photograph of Capitol mob being tear gassed outside

Psychologists believe that emotions — those physical reactions and expressive behaviors that accompany feelings like fear, disgust, and joy — are, in and of themselves, neither moral nor immoral, and neither ethical nor unethical. Rather, we assess the behaviors motivated by, or following from, emotions as being either healthy or unhealthy for the individual. Emotions in this sense serve as coping mechanisms (or ego defenses), and are identified as being positive or negative. When these behaviors are deemed negative, they result in unhealthy outcomes for the person.

One major research question that psychologists often face when studying emotional development asks which comes first: thinking or feeling? Does our physiological activity precede conscious awareness, or is it the other way around? The current research tends to suggest the latter: cognition precedes emotion. The Schachter two-factor theory of emotion states that an emotion, say anger, is recognized as the emotion of anger only after we cognitively interpret it within the immediate environment and then cognitively label it as anger. (I might label the emotion as anger, rage, or annoyance, depending on the circumstance of the immediate environment, since all three of these emotions are related. Rage, for example, is an intensification of anger (the basic emotion), while annoyance is anger to a much lesser degree.) While anger might lead one to act immorally, the emotion itself is not considered good or bad.

But this view that cognition precedes emotion might seem to put pressure on the idea that we should regard emotions as being neither moral nor immoral. For example, philosopher Martha Nussbaum believes that if emotions do have a cognitive component, then they must be taken into account when evaluating ethical judgments (intentions) made that precede behaviors. Jonathan Haidt goes even further by labeling emotions as either moral or immoral depending on how prosocial the resulting behaviors are, and on whether the emotion was elicited by concern for others or strictly out of self-interest. If emotions are labeled as such, then the most recent events at the Capitol can be interpreted in this context.

On January 6, a mob stormed the U.S. Capitol building, ransacking offices of lawmakers, hunting for specific government officials, seeking to harm them and to cause physical destruction to the building itself. They were seeking to stop the official count of the Electoral College that would certify the election of a new POTUS, even if it meant that the Vice President and Speaker of the House had to be executed. It’s easy to point to this aggressive behavior as being the result of political polarization, the in-group vs. out-group phenomenon, and the effects of social media on collectives. Each of these explanations refers to group behavior, collections of people who must be brought to justice. But what role did individuals play in the fomenting of such behaviors?

Often, when individuals moralize and then find others of kindred attitudes, a moral convergence is formed. Furthermore, it is known that when the kindling of moralization and moral convergence is present, aggressive behaviors often follow, but there must be a spark to ignite the kindling. It is important to note, however, that the opposite occurs equally often with non-violent protest groups. There the kindling is present, but there is no violent behavior by the group or any individual within the group; the igniting spark is absent. What makes up this so-called spark? Perhaps the answer can be found by a closer inspection of immoral emotions.

Prior to the attack on the Capitol, the mob met at the Ellipse in front of the White House where the group heard emotionally charged speeches from POTUS and his attorney Rudy Giuliani for over an hour. The speeches conveyed to the group a message that the election had been rigged and stolen from their candidate, and by extension, from them. An emotion of contempt for those responsible for this supposed theft could quite reasonably have been cognitively identified by the persons making up the mob. The speech-makers used terms like “cheaters” and “liars” to generate just such an emotional response.

Anger is elicited when one sees that something is in the way of completing desired goals. If the anger is based in self-interest, then the pro-sociality of the action tendency is low, and the emotion, by Haidt’s definition, is immoral. The speeches were angry ones in the sense that they conveyed the idea that the perceived common goal of re-electing the sitting president was being thwarted by cheating and lying enemies of democracy. The mob was in an environment where it was easy for the individual members to experience anger and contempt as the speeches progressed. In addition, they were under the impression, according to the speech given by the sitting president, that the theft was being carried out just up the street. Anger plus anticipation most often results in aggressive behavior. The kindling was laid, and the spark that lit it came in the form of these emotion-laden speeches filled with words indicative of the emotions of anger, fear, and contempt. Giuliani’s cry for “trial by combat,” coupled with their president’s words suggesting that after the count had been interrupted they would be “the happiest people,” and that what was required was a bit of courage “because you’ll never take back our country with weakness. You have to show strength, and you have to be strong,” could very well have lit the already-present kindling. If the group saw this as a moral issue (“save your country!”), a rights issue, and an issue worth fighting for, then the mob was primed to commit these violent acts. As Milgram and others showed us long ago, humans are not above inflicting harm on others as long as an authority figure encourages them to do so.

But do the emotions experienced by the Capitol mob need to be labeled as immoral in order to explain their egregious behavior? Do we need to follow Haidt and Nussbaum in condemning the emotion and not just the resulting act? Emotions serve as coping strategies or ego defense mechanisms that motivate behavioral responses. The coping strategies used to deal with the mob’s conflicting emotions, and the ego-defensive behaviors they exhibited, can be explained more parsimoniously by the cognitive theory of emotion: there is an emotion present, anger (but what to do about it?); there is a behavior, attack (but whom?); the function of the attack is destruction; and the ego defense is displacement (attacking something weaker than oneself), in this case a few unarmed lawmakers. Emotions were no doubt manipulated and contributed to the mayhem, but they aren’t the primary suspect.

Feel This

Much has been written about the appalling, depressing, and infuriating case concerning Brock Turner and his unnamed victim. I won’t rehearse the case, nor the dialectic it has sparked between those sympathetic to the victim and those outraged that sympathy can ever be extended to perpetrators of crime, especially when such perpetrators are members of a hyper-privileged class such as that to which Turner belongs.
