
The Ethics of Weed-Out Courses

photograph of students attending class in lecture hall

On October 3rd, The New York Times reported that organic chemistry adjunct Professor Maitland Jones, Jr. had been fired by N.Y.U. after 82 out of 350 of his students signed a petition against him. Students complained that their grades did not reflect the amount of time and effort they put into the course and that Jones did not prioritize student well-being and learning. (Jones, meanwhile, reported a significant drop-off in student performance in the past decade, and especially after the pandemic.) Before firing Jones, university officials had first offered to review student grades and allowed students to withdraw from the class retroactively.

Immediate responses varied: Jones supporters protested the decision; some students who had a bad experience in Jones’ class celebrated the decision in online reviews; and faculty critiqued the decision by administration to appease students as tuition-payers.

More broadly, there have been a wide range of takes offered on the whole situation: Jones’ firing illustrates the precarity of contingent faculty and an administration run amok; “weed-out” classes like organic chemistry exacerbate student inequalities; students are becoming coddled and entitled, which will make them bad doctors; academic degrees are becoming a consumer product – and the consumers with financial power are the parents, not the students; organic chemistry isn’t actually necessary to be a good doctor, and it only became a weed-out course through policy decisions to limit the number of new doctors, leaving us with a shortage of physicians; the systemic and structural factors that create out-of-touch professors, entitled students, and pandering administrators are what we should actually blame.

This case raises rich possibilities for discussion, but I would like to focus on the following question: What purpose do weed-out courses actually serve, and is it a purpose we can get behind?

I will limit the discussion for now to pre-med classes, but the question could be asked of other disciplines as well.

Let’s start with the more positive aspects of weed-out courses. The main purpose of such a course seems to be to allow students to assess whether they have the necessary aptitude for surviving medical school and becoming good doctors. Ideally, a professor would facilitate this task by ensuring the class has adequate rigor to support students in their pursuits, while also kindly counseling struggling students to seek another career path.

One apparent benefit of this kind of course would be to prevent students from spending a great deal of time and money pursuing a career they are increasingly unlikely to attain.

Unfortunately, it is a hard truth that effort alone is not enough to get one through medical school, even though dedication and determination are necessary ingredients.

Another benefit of this kind of course would be to encourage students to cultivate the studying and test-taking skills they will need to do well in medical school and to become good doctors.

These considerations seem reasonable to me, but I’m not sure the language of “weeding out” best captures this set of aims. Instead, it suggests hoops that students are required to jump through in order to demonstrate their commitment and thus be allowed to continue along their career path. There are plenty of questions here as to which courses should serve as the key benchmarks for success as a physician (a bioethics course might be on the list of courses that should be included, and an organic chemistry class might be less central than an immunology class), but having such benchmarks does not itself seem to be a problem.

Doctors need to have a variety of skills to be effective physicians: from the people-skills required in doctor/patient interactions, to a good problem-solving ability to catch diseases and health conditions before they progress, to the vast memorization needed to keep up with best practices and treatments. These are all abilities that should be fostered by pre-med education. If a student lacks one or more of these core capacities, it seems best for them (and their potential patients) to turn to another career path where their abilities might shine.

At the same time, we need to ensure that students of all backgrounds receive the resources (and opportunity) needed to acquire these skills during the course of their undergraduate education so that we do not simply reify existing inequalities.

So, let’s turn to the more negative aspects of weed-out courses. Often, it seems that the goal of a weed-out course is to get a certain portion of the class to withdraw or fail. Even if the express reason for this design is to promote rigor and provide a benchmark for student success, the learning environment can become toxic in several ways. If the professor sets up the class so that only the “truly bright” students can pass and treats student confusion as a sign of laziness or stupidity, this creates a host of problems.

First, students who can keep up with the learning environment, whether through advantages in past tutelage or an ability to more quickly grasp the material, may start to see themselves as superior to those who do not do as well in the class. Second, students of all aptitudes may feel immense pressure and dedicate excessive time to studying in order to succeed in the class, contributing to mental distress. Third, students who do not do well in the class despite putting in the same intensive effort may see themselves as failures or as less worthy than other students.

What this kind of weed-out mentality amounts to is a kind of bullying that identifies some people as superior and others as inferior, only loosely tracking a student’s academic merit.

This can create problems not only in the pre-med weed-out courses but also in medical school and beyond. Hierarchies might arise between different medical subspecialties, with physicians in some elite residencies seeing themselves as superior to those who did not make the cut. These dynamics might also lead to epistemic overconfidence in practicing physicians, causing disruptions in doctor/patient interactions and negatively impacting the quality of patient care.

More specifically, I worry that some of our initial defenses of these weed-out classes tend to reify bullying practices rather than establish the necessary benchmarks one needs to meet in order to be a good physician – in the same way that there are certain benchmarks that one should be able to meet to be a good teacher, a good lawyer, a good journalist, a good businessperson, a good caretaker, and more. While the pandemic has negatively impacted student learning and well-being, the student petition can be read as reflecting an unwillingness to put up with a certain kind of bullying and as a demand for better institutional support.

The pandemic tested us all in a number of ways, and it has made apparent to many of us that some forms of treatment are untenable, especially in times of crisis.

If you take a look at the last 10 years of comments about Jones’ teaching performance on RateMyProfessors (for whatever the review site is worth), negative student ratings of Jones’ classes have been fairly consistent in quantity and quality over time. Students have raised the same concerns again and again, regardless of grades earned: no partial credit on tests, the necessity of studying excessive amounts of time compared to other organic chemistry classes, accusations that Jones did not respect students or respond well to questions, consistently low test averages (there was conflicting information about whether tests were curved), and high drop rates. Students of all different academic backgrounds reported feeling excessively stressed out by the course, and many complained that the organic chemistry course was made intentionally difficult. While other students gave glowing reviews, it is clear that the instructional problems raised in the petition are not new.

In the end, we’re left – like some of Jones’ students – with what feels like an impossible task: How can we design weed-out classes to be sufficiently rigorous and supportive? And how would we know when we’ve done it right?

Some University of Chicago Students Prove Lukewarm on Free Speech

photograph of University of Chicago ivy-covered Gothic buildings

The University of Chicago is known as a bastion for, and important intellectual proponent of, free speech and academic freedom. Its “Chicago Statement,” released in 2015 and since adopted by over eighty academic institutions across the country, is considered the gold standard for university free speech policy statements. Yet a recent incident involving its student newspaper, The Chicago Maroon, shows that a university’s commitment to free speech is only as robust as that of its members — including its students.

On January 26, 2022, the University of Chicago chapter of Students for Justice in Palestine (SJP UChicago) posted a call to boycott “Sh*tty Zionist Classes” on its Instagram page. Although the boycott included within its ambit any class at Chicago “on Israel or those taught by Israeli fellows,” it was apparently aimed at three particular classes, whose course descriptions the post reproduced along with critical annotations. “By attending these classes,” the post argued, “you are participating in a propaganda campaign that creates complicity in the continuation of Israel’s occupation of Palestine.”

Almost a month later, the Maroon published an op-ed entitled “We Must Condemn the SJP’s Online Anti-Semitism.” Notably, its subheading inaccurately described SJP UChicago’s boycott as aimed at “Jewish-taught and -related classes.” The op-ed itself argued that based on the lunisolar Hebrew calendar, SJP UChicago had posted its boycott demand on Holocaust Remembrance Day, which the authors claimed “was done to isolate and alienate the Jewish population at UChicago and to interfere with a day of mourning.” It also claimed that “the targeting of classes taught specifically by Israeli fellows is xenophobic” and that because all of the courses singled out in the post were housed within the university’s Center for Jewish Studies, the post “furthers the trope that Jewish courses and professors work to contribute to propaganda for Israel.” Finally, it denounced SJP UChicago’s attempt to persuade students to avoid or drop certain classes as a violation of the university’s discrimination and harassment policies, since Israeli faculty were “directly discriminated against,” Jewish students were “indirectly” discriminated against, and the harassment policy “states that any organization that uses social media . . . in order to interfere with the education of students is harassment [sic].”

The op-ed’s first two arguments are fine as far as they go: substantively they’re thin gruel, but they’re firmly in line with the Chicago Statement’s view that the best antidote to “offensive, unwise, immoral, or wrong-headed” speech is more speech. By contrast, the argument that the SJP UChicago boycott announcement violates the university’s discrimination and harassment policies was a blatant attempt by the authors to pressure the university into sanctioning other students for political speech under the flimsy pretext of “harassment” and “discrimination.” This is clearly contrary to the letter and spirit of the Statement, which states that

it is for the individual members of the University community, not for the University as an institution, to make those judgments [about which ideas are offensive, unwise, immoral, or wrong-headed] themselves, and to act on those judgments not by seeking to suppress speech, but by openly and vigorously contesting the ideas that they oppose.

As a threshold matter, it’s unclear whether student choices concerning what classes to take, or speech directed at influencing such choices, fall within the scope of UChicago’s discrimination policy. Even if they do, SJP UChicago’s boycott demand was clearly based on the ideological content of the courses or the instructors’ institutional affiliations, not their national origins or religion. Assuming arguendo that the boycott announcement constituted or encouraged discrimination based on instructor national origin or faith, it could not constitute harassment unless, in addition to being based on a proscribed factor such as national origin, it unreasonably interfered with a person’s work performance or educational program participation or created an intimidating, hostile, or offensive work environment. Finally, because it plausibly occurred in “an academic context,” to qualify as harassment it also had to be directed at specific persons, constitute abuse, and serve no bona fide academic purpose. Clearly, SJP UChicago’s boycott announcement ticks none of these boxes.

If the op-ed itself didn’t excel in free speech terms, what happened next was no better and suggested that SJP UChicago and some editors at the Maroon would probably benefit from reading the Statement again.

On April 2 the editors of Viewpoints, the Maroon’s opinions page, decided to retract the op-ed, citing “factual inaccuracies.” In a long and rambling explanatory note, the editors said that these inaccuracies “flattened dialogue and perpetuated hate toward [SJP UChicago], Palestine, Palestinian students, and those on campus who support the Palestinian liberation struggle,” and apologized to SJP UChicago and “all others affected by this decision.” However, the editors identified only four inaccuracies: the characterization of the boycott as targeting “Jewish-taught and -related classes,” which did not even appear in the op-ed itself but in its subheading; another description of the boycott as targeting classes “taught by Israeli professors,” rather than Israeli fellows affiliated with the Israel Institute; the claim that the post was deliberately published on Holocaust Remembrance Day; and the claim that SJP UChicago members had approached students on the quad about boycotting the classes. At key points, the editors appeared to rely upon information provided by SJP UChicago, rather than any independent reporting, to correct the op-ed’s claims. Notably, the retraction note included something like a disclaimer from the editor-in-chief and managing editor of the Maroon pointedly stating that “the following apology does not constitute an institutional perspective and represents only the views of the current Viewpoints Head Editors.”

Thus, apparently under pressure from SJP UChicago and its allies, the Viewpoints editors retracted the op-ed under a thin pretext of concern for four factual inaccuracies. One of these inaccuracies was not even the responsibility of the op-ed’s authors, while others were only inaccurate by the lights of SJP UChicago’s own account of events. Moreover, the Viewpoints editors had other, less dramatic options available to them to address what factual inaccuracies existed, such as publishing corrections or inviting a rebuttal from an SJP UChicago member.

Even if the factual inaccuracies were more significant, however, the crucial question the retraction raises is the extent to which a newspaper is responsible for the factual inaccuracies that appear in the opinion pieces it chooses to run.

On its face, it would seem that since the purpose of an opinions page is to provide a forum for community voices rather than news coverage, ensuring the factual accuracy of the former is a lesser priority. It is true that some factual inaccuracies may be so glaring that they either undermine an op-ed’s main claims or arguments or they amount to pernicious disinformation. In these cases, factual inaccuracies may sap an opinion piece of its value in fostering debate and discussion because they render the piece, in some important sense, irrelevant. That does not seem to be the case here.

In addition, the Viewpoints editors trotted out the specter of the “harm” caused by the op-ed to justify its retraction. The implication, it seems, is that speech must be harmless to be publishable. Some defenders of free speech tend to downplay the harm caused by it, arguing that belief in speech’s harmfulness is based on “cognitive distortions.” However, as I have argued before, the best argument for tolerating offensive, wrong-headed, hateful, or immoral speech is not that it is harmless. For example, the U.S. Supreme Court did not hold that journalists are immune from suit for negligent defamation of public officials because the latter are psychologically maladjusted snowflakes whose reputations are not really harmed by defamatory falsehoods broadcast about them by major news outlets. Instead, its rationale was that the costs of allowing journalists to be sued for negligent defamation — and in particular, the so-called “chilling effects” on politically important speech — substantially outweigh the benefits. By the same token, newspapers like the Maroon should publish potentially harmful speech at least partly because accepting the editorial principle that speech is publishable only if it cannot inflict any degree of harm upon any person at any time would have a devastating effect on a newspaper’s ability to serve as a forum for lively, relevant, and politically engaged debate and discussion.

If, as the original op-ed amply demonstrates, some are already tempted to use institutional discrimination and harassment policies to silence others’ speech, consider what a gift it would be to these censors manqués if everyone accepted that narrow principle of publishable speech.

The University of Chicago has much to be proud of in its record of defending free speech against the rising tide of illiberalism on both the right and left. But as Hannah Arendt reminded us, in every generation, civilization is invaded by barbarians — we call them “children.” Among the most important duties of the university in a liberal society is to inculcate in each new class of undergraduates the disposition to critically evaluate deeply offensive speech without invoking some institutional lever to censor the speaker. Apparently, in this respect even UChicago can stand to do better.

Do Grades Make Our Lives Worse?

photograph of old-fashioned elementary report card

It’s nearing the end of the semester, and many students will be waiting on the edge of their seats to receive their final grades. For those who seek higher education, their GPA will matter for their applications to med school, law school, and other graduate schools. This numerical representation of a student’s academic achievement allows institutions, like universities and medical schools, to have some objective measure by which to discriminate between applicants. And perceptive students can figure out ways to maximize their GPA.

A numerical representation of academic performance is a good thing, right? It is legible to institutions, and it gives students a concrete target to pursue. However, if we look at contemporary philosopher C. Thi Nguyen’s work on value capture, the answer might not be so clear. According to Nguyen, “value capture occurs when: 1. Our values are, at first, rich and subtle. 2. We encounter simplified (often quantified) versions of those values. 3. Those simplified versions take the place of our richer values in our reasoning and motivation. 4. Our lives get worse.”

To see how this process works, take Nguyen’s example of the Fitbit. Say that I’m trying to start off the year healthier and increase my exercise. My thoughtful mother buys me a Fitbit so that I can track my steps and try to meet a goal of 10,000 steps a day. After a while, I find myself motivated to get 10,000 steps in a day, but that motivation has now replaced my earlier motivation to be healthier and get more exercise. I may be walking more, but I might be neglecting other forms of movement and a more holistic practice of promoting health to meet the clear and concrete goal of meeting my step count. Depending on how obsessed I am with the 10,000 steps number, my life has probably gotten worse. This is the process of value capture.

Are grades subject to value capture? Let’s start with the first step. What are the prior values we are trying to measure with grades? At the broadest level, it seems that grades are meant to capture how well a student is performing given the standards of the class, which are subsequently determined by the standards of the discipline. Given the complexity of any given subject and the many ways that subject could be broken down into a class, it’s very difficult to give a clear and easy explanation of what any given grade is trying to capture. And the same grade could mean different things — two students could be performing equally well in a class but each have different strengths. The values that grading tries to capture are evidently rich and subtle values. Step 1 is complete.

Do grades represent simpler (and sometimes quantified) versions of these rich values? Yes. Grades compress student performance into a single number that can be bureaucratically sorted through at an institutional level. This has certain benefits — a law school can quickly do an initial sift through applicants to ensure that they have a sufficiently high GPA and LSAT score. But it also has its drawbacks. It doesn’t capture, for instance, that a slightly lower grade in a very hard class represents better student performance than a higher grade in an easier class. Given the standardized format of grades, a student’s scores may also do a poor job at representing personal growth and achievement that may vary based on the social and educational starting points of different students. Steps 1 and 2 are complete.

What about step 3? Do grades take the place of our richer values in our reasoning and motivation? It seems that often they do. This is in part because of external motivations, such as the importance of grades for employment or getting into a certain program. But it is also in part because of the ways in which we tend to start valuing the grade for its own sake. Think about, for instance, the parent who wants their child to succeed. Instead of focusing on the actual progress their child is making given the challenges their classes present, that parent can easily be seduced by the clarity and seeming objectivity of their child’s grades. The goalposts can quickly shift from “being a good student” to “making good grades.”

This shift can happen for students as well. Grades are often the most tangible feedback they get from their instructors, even though they may sometimes receive qualitative assessments. Grades may feel like a more real and concrete measure of academic performance, especially because they are the record that remains after the course. Students who start off valuing education may easily get sucked into primarily working to maximize their grades rather than to maximize their learning. It is worth noting that Nguyen himself thinks that this motivational shift happens with grades, noting that “students go to school for the sake of gaining knowledge, and come out focused on maximizing their GPA.” Steps 1, 2, and 3 are all complete.

What about step 4? Do grades make our lives worse? This is a hard question to answer, as it’s an empirical question that depends on a myriad of different personal experiences. In my own experience, focusing on getting a higher grade has often interfered with my ability to learn in a course. Instead of diving into the material itself, I often got stuck at the level of trying to figure out how to make sure that I got that A. In harder courses, this would make me very stressed as I worked exceptionally hard to meet the requirements. In easier courses, this would mean that I often slacked off and did not perform as well as I could have, since it was an easy A. And, as much as I tried to shake the motivational pull of grades, it was always there. Grades made my educational experience worse.

What should we do about this problem? Given the potential for value capture, grades are a powerful tool, and teachers should be careful to create an assessment structure that more closely incentivizes an engagement with the rich, pluralistic values that students should come to appreciate. This is a difficult task, as often those values cannot be easily translated into a grading system that is legible to the institution (and to other people across institutions). Because grades provide an easy way to communicate information, it’s unlikely that getting rid of them would make things better, at least in the short-term.

One solution might be to retain the current numerical/letter grade assignments but to add on a short paragraph qualitatively assessing the student’s performance throughout the course. This could be fraught for a number of reasons (including implicit bias, the bureaucratic logistics of tracking such information, and the additional work for teachers), but that extra information would help to contextualize the numbers on the page and provide a richer understanding of a student’s performance, both for that student and for those assessing the student as an applicant. This solution is far from perfect, but it might provide one step towards recapturing our motivation to track the rich values we started with.

Digital Degrees and Depersonalization

photograph of college student stressing over books and laptop

In an article titled “A ‘Stunning’ Level of Student Disconnection,” Beth McMurtrie of the Chronicle of Higher Education analyzes the current state of student disengagement in higher education. The article solicits the personal experiences and observations of college and university faculty, as well as student-facing administrative officers and guidance counselors. Faculty members cite myriad causes of the general malaise they see among the students in their classes: classes switching back and forth between virtual and remote settings; global unrest and existential anxiety, stemming from COVID-19 and the recent war between Ukraine and Russia; interrupted high school years that leave young adults unprepared for the specific challenges and demands of college life; the social isolation of quarantines and lockdowns that filled nearly two years of their lives. Some of these circumstances are unavoidable (e.g., global unrest), while others seem to be improving (classroom uncertainty, lockdowns, and mask mandates). Still, student performance and mental health continue to suffer as badly as they did two years ago, and college enrollment is nearly as low as it was at the start of the pandemic.

McMurtrie also takes the time to interview some college students on their experience. The students point to a common element that draws together all the previously-mentioned variables suspected of causing student disengagement: prolonged, almost unceasing, engagement with technology. One college junior quoted in the article, Lyman, describes her sophomore year as a blur, remembering only snippets of early morning Zoom classes, half-slept-through, with camera off, before falling back asleep. Each day seemed to consist of a flow between moments of sleep, internet browsing, and virtual classes. When COVID-19 restrictions subsided and classrooms returned to more of a traditional format, the excessive use of technology that had been mandatory for the past two years left an indelible psychological mark.

As she returned to the classroom, Lyman found that many professors had come to rely more heavily on technology, such as asking everyone to get online to do an activity. Nor do many of her courses have group activities or discussions, which has the effect of making them still seem virtual. ‘I want so badly to be active in my classroom, but everything just still feels, like, fake almost.’

Numerous scientific studies offer empirical support for the observation that more frequent virtual immersion is positively correlated with higher levels of depersonalization — a psychological condition characterized by the persistent or repeated feeling that “you’re observing yourself from outside your body or you have a sense that things around you aren’t real, or both.” In an article published last month in Scientific Studies, researchers reported the following:

We found that increased use of digital media-based activities and online social e-meetings correlated with higher feelings of depersonalisation. We also found that the participants reporting higher experiences of depersonalisation, also reported enhanced vividness of negative emotions (as opposed to positive emotions).

They further remarked that the study “points to potential risks related to overly sedentary, and hyper-digitalized lifestyle habits that may induce feelings of living in one’s ‘head’ (mind), disconnected from one’s body, self and the world.” In short, spending more time online entails spending more time in one’s “head,” making a greater percentage of one’s life purely cerebral rather than physical. This can lead to a feeling of disconnect between the mind and the body, making all of one’s experiences feel exactly as the undergraduate student described her life during and after the pandemic: unreal.

If the increase and extreme utilization of technology in higher education is even partly to blame for the current student psychological disconnect, instructors and university administrators face a difficult dilemma: should we reduce the use of technology in classes, or not? The answer may at first appear to be an obvious “yes”; after all, if such constant virtual existence is taking a psychological toll on college students, then it seems the right move would be to reduce the amount of online presence required to participate in the coursework. But the problem is complicated by the fact that depersonalization makes interacting with humans in the “real world” extremely psychologically taxing — far more taxing than interacting with others, or completing coursework, online. This fact helps explain the rapidly increasing demand over the past two years for online degrees and online course offerings, the decrease in class attendance for in-person classes, and the rising rates of anxiety and depression among young college students on campus. After being forced into a nearly continuous online existence (the average time spent on social media alone — not counting virtual classes — for young people in the United States is 9 hours per day) we feel wrenched out of the physical world, making reentering the world all the more exhausting. We prefer digital existence because the depersonalization has rendered us unable to process anything else.

Some philosophers, like Martha Nussbaum, refer to these kinds of preferences as “adaptive preferences” — things we begin to prefer as a way of adapting to some non-ideal circumstances. One of Nussbaum’s cases focuses on impoverished women in India who were routinely physically abused by their husbands, but preferred to stay married. Some of the women acknowledge that the abuse was “painful and bad, but, still, a part of women’s lot in life, just something women have to put up with as part of being a woman dependent on men.” Another philosopher, Jon Elster, calls these kinds of desires “sour grapes,” because a fox that originally desires grapes may convince himself the grapes he previously wanted were sour (and therefore not to be desired) if he finds himself unable to access them.

Are in-person classes, social engagement, and physical existence on campus becoming “sour grapes” to us? If we have, to some extent, lost the ability to navigate these spaces with psychological ease, we may convince ourselves that these kinds of interactions are not valuable at all. But as we move further and further from regular (non-virtual) physical interactions with others, the depersonalization continues and deepens. It may be a self-perpetuating problem, with no clear path forward for either students or instructors. Should instructors prioritize meeting students where they are currently and providing virtual education as far as possible? Or should they prioritize moving away from virtual education with hope for long-term benefits? This is a question that higher education will likely continue to grapple with for many years to come.

Against Abstinence-Based COVID-19 Policies

black-and-white photograph of group of students studying outside

There are at least two things that are true about abstinence from sexual activity:

  1. If one wishes to avoid pregnancy and STD-transmission, abstinence is the most effective choice, and
  2. Abstinence is insufficient as a policy category if policy-makers wish to effectively promote pregnancy-avoidance and to prevent STD-transmission within a population.

I take it that (1) is straightforward: if someone wishes to avoid the risks of an activity (including sex), then abstention from that activity is the best way to do so. By (2), I simply mean that prescribing abstinence from sexual activity (and championing its effectiveness) is often not enough to convince people to actually choose to avoid sex. For example, the data on the relative effectiveness of various sex-education programs is consistent and clear: programs that prioritize (primarily or exclusively) abstinence-only lessons are regularly the least effective at actually reducing teen pregnancies and the like. Instead, pragmatic approaches to sex education that discuss abstinence comprehensively alongside topics like contraceptive use are demonstrably more effective at limiting many of the most negative potential outcomes of sexual activity. Of course, some might argue in response that, even if they are less effective, abstinence-only programs are nevertheless preferable on moral grounds, given that they emphasize moral decision-making for their students.

It is an open question whether policy-makers should try to impose their own moral beliefs onto the people affected by their policies, just as it is debatable whether good policy-making could somehow produce good people; the importance of making policy based on evidence, however, is inarguable. And the evidence strongly suggests that abstinence-based sex education does not accomplish the goals typically laid out by sex education programs. Regarding such courses, Laura Lindberg — co-author of a 2017 report in the Journal of Adolescent Health on the impact of “Abstinence-Only-Until-Marriage” (AOUM) sex ed programs in the US — argues that such an approach is “not just unrealistic…[but]…violates medical ethics and harms young people.”

In this article, I’m interested less in questions of sex education than I am in questions of responsibility for the outcomes of ineffective public policies. I think it’s uncontroversial to say that, in many cases of pregnancy, the people most responsible for creating a pregnancy (that results from sexual activity) are the sexual partners themselves. However, it also seems right to think that authority figures who knowingly enact policies that are severely unlikely to effectively prevent some undesirable outcome carry at least some responsibility for that resulting outcome (if it’s true that the outcome would have probably been prevented if the officials had implemented a different policy). I take it that this concern is ultimately what fuels both Lindberg’s criticism of AOUM programs and the widespread support for comprehensive sex-education methods.

Consider now the contemporary situation facing colleges and universities in the United States: despite the persistent spread of the coronavirus pandemic over the previous several months, many institutions of higher education have elected to resume face-to-face instruction in at least some capacity this fall. Across the country, university administrators have developed intricate policies to ensure the safety and security of their campus communities that could, in theory, prevent a need to eventually shift entirely to remote instructional methods. From mask mandates to on-campus testing and temperature checks to limited class sizes to hybrid course delivery models and more, colleges have demonstrated no shortage of creativity in crafting policies to preserve some semblance of normalcy this semester.

But these policies are failing — and we should not be surprised that this is so.

After only a week or two of courses resuming, many campuses (and the communities surrounding them) are already seeing spikes in COVID-19 cases, and several universities have already been forced to alter their previous operating plans in response. After one week of classes, the University of North Carolina at Chapel Hill abruptly decided to shift to fully-remote instruction for the remainder of the semester, a decision mirrored by Michigan State University and (at least temporarily, as of this writing) Notre Dame and Temple University. Others, like the University of Iowa, the University of South Carolina, and the Ohio State University, have simply pushed ahead with their initial plans despite the rise in positive cases, but the long-term viability of that approach looks bleak. Indeed, as the semester progresses, it seems clear that many more colleges will be disrupted by a mid-semester shift, regardless of the policies they had previously developed to prevent one.

This is, of course, unsurprising, given the realities of life on a college campus. Dormitories, dining halls, and Greek life houses are designed to encourage social gatherings and interactions of precisely the sort that coronavirus-prevention recommendations forbid. Furthermore, many college students expect (an expectation fueled explicitly by official university marketing) that such social functions are a key element of the “college experience.” (And, of course, this is aggravated all the more by the general fearlessness commonly evidenced by 18-25 year-olds, which provokes them into riskier behavior than other age groups.) Regardless of how many signs are put up in classrooms reminding people to wear masks, and no matter the number of patronizing emails sent to chastise students (or local businesses) into “acting responsibly,” it is, at best, naive of university administrators to expect their student bodies to suddenly enter a pandemic-preventing mindset (at least at the compliance rates that would be necessary to actually protect the community as a whole).

Basically, on the whole, colleges have pursued COVID-19-prevention policies based on the irrational hope that their students would exercise precisely the sort of abstinence that college administrators know better than to expect (and, for years leading up to this spring, actively discouraged). As with abstinence-based sex education, two things are true here also:

  1. If one wishes to avoid spreading the coronavirus, constantly wearing masks, washing hands, and avoiding social gathering are crucial behavioral choices, and
  2. Recommending (and even requiring upon pain of punishment) the behaviors described in (1) is insufficient as a policy category if university administrators wish to effectively prevent the spread of the coronavirus on their campuses.

We are already seeing the unfortunate truth of (2) grow more salient by the day.

And, as with sex education, on one level we can rightfully blame college students (and their choices to attend parties or to not wear masks) for these outbreaks on college campuses. But the administrators and other officials who insisted on opening those campuses in the first place cannot sensibly avoid responsibility for those choices or their consequences either. Just as with abstinence-only sex education programs, it seems right to hold officials responsible for enacting policies with which compliance is wildly unlikely, no matter how effective those fanciful policies might be if people were to just follow the rules.

This seems especially true in this case given the (in one sense) higher stakes of the COVID-19 pandemic. Because the coronavirus is transmitted far more quickly and easily than STDs or pregnancies, it is even more crucial to create prevention strategies that are likely to succeed; in a related way, this also makes tracking responsibility for the spread of the virus far more complicated. At least with a pregnancy, one can point to the people who chose to have sex as shouldering much of the responsibility for the pregnancy itself; with COVID-19, a particular college student could follow every university policy perfectly and, nevertheless, contract the virus simply by coming into contact with a classmate who has not. In such a case, it seems like the responsible student can rightfully blame both her irresponsible classmate and the institution that created the conditions of her exposure by insisting that their campus open for business while knowingly opting for unrealistic policies.

Put differently: imagine how different sex education might look if you could “catch” a pregnancy literally just by walking too close to other people. In such a world, simply preaching “abstinence!” would be even less defensible than it already is; nevertheless, that approach is not far from the current state of many COVID-19-prevention policies on college campuses. The only thing this kind of rhetoric ultimately protects is the institution’s legal liability (and even that is up for debate).

In early July, the University of Southern California announced that it would offer no in-person classes for its fall semester, electing instead for entirely remote course-delivery options. At the time, some responded to this announcement with ridicule, suggesting that it was a costly overreaction. Nevertheless, USC’s choice to ensure that its students, staff, and faculty be protected by barriers of distance has meant not only that its semester has been able to proceed as planned, but that the university has not been linked to the same level of case spikes as other institutions (though, even with such a move, outbreaks are bubbling).

As with so much about the novel coronavirus, it remains to be seen what the full extent of its spread will look like. But one thing is clear already: treating irresponsible college students as scapegoats for poorly-conceived policies that justified the risky move of opening (or mostly-opening) campuses is transparently wrong. It oversimplifies the complicated relationship of policy-makers and constituents, even as it misrepresents the nature of moral responsibility for public action, particularly on the part of those in charge. The adults choosing to attend college parties are indeed to blame for doing so, but those parties wouldn’t be happening at all if other adults had made different choices about what this semester was going to look like in the first place.

In short, if college administrators really expect abstinence to be an effective tool to combat COVID-19, then they should be the ones to use it by canceling events, closing campuses, and wrapping up this semester (and, potentially, the next) online.

Universities and the Burdens of Risk

photograph of exterior of classroom building at Harvard

To bring students back to the university is to knowingly expose them to the risk of a dangerous disease when such exposure is avoidable. This is morally objectionable on a variety of fronts. The risk of contracting COVID-19 and the seriousness of its potential health outcomes make it very different from the risks we typically accept by engaging in other everyday activities. It is clear that COVID-19 poses a higher risk of death than other coronaviruses. A variety of underlying conditions can lead to deadly outcomes, and we do not have a comprehensive understanding of the conditions that may make the virus lethal. Even when the virus does not cause death, its respiratory impact has put a significant burden on patients’ long-term health and can lead to the need for hospitalization and intubation for breathing support. The long-term effects of the illness, even for those lucky enough to avoid these outcomes, are still unknown, but they appear to persist past initial recovery and seem to include lung damage and potential stroke and brain complications.

One of the most concerning things, given these serious outcomes, is how contagious the virus is. Because of this, there have been efforts across the globe to physically distance the members of affected societies (with the US notoriously falling behind).

Despite the serious consequences of contracting the virus, multiple segments of society need to continue to interact with one another and the public in order to keep society safe and healthy. There are, of course, risks for pharmacists, doctors, grocery store workers, and the other essential workers who produce and distribute the products that keep a society running.

When such risks are necessary, society has a responsibility to support the people who expose themselves to those risks on behalf of the members of society who require their services to remain healthy and safe. When someone takes on a burden in order to keep you safe and healthy, we typically think either that a moral obligation is formed, or, more minimally, that gratitude would be appropriate, or, at the very least, that you are obligated not to put that person in a scenario where they must accrue further risks in order to maintain your safety.

We can consider non-pandemic scenarios that support these intuitions. An extreme case would be choosing to sky-dive (knowingly taking on a risk) with a tandem guide. The guide jumps with you, exposing themselves to risk in order to keep you safe. As a beginner, you rely on the tandem guide for your safety. It would be morally wrong of you to act so as to put the guide at further risk.

In circumstances where others are placing themselves at risk for your benefit and you knowingly accept that relationship, it is wrong to exacerbate that exchange of burdens (their risk) and benefits (the service they are offering at their sacrifice).

This minimization of risk exposure supports narrowing the operations of businesses in our society until we can mitigate the risk that gathering together poses to one another. By opening your doors for business, you pose a risk to your employees, and by frequenting a business, you pose a risk to its patrons and employees. With risks like those associated with COVID-19, this threat is significant enough that such behavior, when it is avoidable and unnecessary, is morally problematic.

When a group of people comes together for activities like taking a cruise, or attending a university, the moral assessment of risk is different than for these essential operations. Universities expose students, faculty, and staff to a high risk of contracting the disease because, like cruise ships, the amount of personnel required to keep food, board, courses, and administration functioning is immense and it all occurs in relatively small areas, every day. These are specialized activities that are voluntary, and so they significantly differ from the necessary operations that provide food and services to a society to keep people safe and healthy.

Universities have acknowledged the liability issues involved in reopening this Fall, perhaps most obviously by seeking legal shields or waivers from students returning to campus. However, at the end of May, according to a survey conducted by the Chronicle of Higher Education, over 2/3 of universities planned to bring students back to campus for the upcoming term. This strategy attempts to redirect the institutional burdens of risk assessment and decision-making back onto individuals.

This parallels the situation in which individual businesses are placed when there is no governmental or higher-level policy for managing risk. Without a policy dictating when it is permissible for non-essential services to reenter the risk-exchange of societal functioning, individual businesses are left to weigh the risk to their employees, their impact on societal spread, and so on. Government oversight makes the decision on the basis of the overall risk that society faces, which is the level at which the risk of disease exists. When individuals must instead determine what risks they are willing to bear against their other priorities, their choices become coerced – by the cost of closing due to lack of government assistance, by the pressure to open when competing businesses are doing so, etc. By converting the systemic problem of risk to society into individual problems of how to navigate that risk based on individual priorities, privileges, and disadvantages, we create structural injustice.

Universities face this very problem in determining the just distribution of systemic risk. Should they pursue universal policies to protect everyone regardless of privileges, priorities, or disadvantages, or should they leave individuals to navigate these decisions themselves? Giving individuals the opportunity to choose a remote-learning track does not mitigate the moral burden of universities offering face-to-face (on-campus) learning. In offering this choice, universities have simply transformed an institutional obstacle into a problem for individuals to navigate on their own. Nor can this choice be read as an assumption of risk on the individual’s part; the options are not commensurable. University systems were designed for those able and willing to opt for on-campus, f2f learning, which the university signals to be optimal.

The instructors who have opted for f2f teaching have created a difference in course delivery that burdens students who would ideally choose not to return to campus in their selection of courses. The disparity in support services that are best delivered on campus would also create distance between the students who return and those who cannot, or who would choose to avoid the risk of returning to campuses that admit the risk to which they are exposing everyone present.

A statement from the American Anthropological Association emphasized how default f2f policies undermine equitable access to education for minority and underserved populations:

“Given the disproportionate representation of COVID-19 infection and death in Black and brown communities, university policies and practices that emphasize in-person work and teaching run the risk of compounding the impact of racial inequity. These policies also risk endangering already-marginalized members of university communities, including staff and contingent faculty, who are less likely to have the option to take time away from work. As a matter of equity and ethics, while we acknowledge the financial challenges colleges and universities face because of the pandemic, we encourage university administrators to keep the health and safety of marginalized people at the forefront of their decisions.”

Finally, there is the question of liability on the part of universities for allowing students back on their campuses. As noted above, some universities are seeking “liability shields” for the health risks facing their students, staff, and instructors this Fall. Despite taking precautions against the contagious virus, there are no foolproof measures that can be taken against contracting this illness, especially at a campus with students living, eating, and studying in such close proximity. It is difficult to imagine such a group acting in ways outside of the classroom that would significantly reduce the spread of the virus when research has shown that, among the young, this disease has not been taken very seriously since its very onset.

But these failings do not absolve the universities of liability for what happens on their campus. What students do in their lives has a different legal status than what they do in sanctioned activities and conditions condoned by an institution. Further, by acknowledging the likelihood of risky behavior on the part of students, a university also acknowledges that it is putting staff and instructors at greater risk than if they did not return to campus.

There is a legal and moral responsibility to provide a working environment that is safe to employees. The risk of contracting this virus is significant, due to its rate of contagion and health outcomes. With this risk of contracting a serious illness, and the coercive environment created by the justice issues raised above, universities do not satisfy this condition of safe work environments by having students, staff, and instructors return to campus. At a time when we have a moral obligation to behave in ways to mitigate the spread of the virus, or at the very least not exacerbate its spread, 2/3 of universities are taking steps to actively put students, staff, and instructors in positions that make them more vulnerable to contracting and spreading this illness.

In Washington, D.C., A March Against Fear

Collage of three people from the March

Reporting by Eleanor Price, Photos by Conner Gordon

On February 14, 2018, a gunman walked into Marjory Stoneman Douglas High School in Parkland, Florida, and shot 17 of his former classmates to death. Six weeks later, the survivors of the shooting led over 200,000 people through the streets of Washington, D.C., to call for gun safety measures at the March For Our Lives. At the time, Congress was in recess; many of the country’s leaders were either back in their districts or overseas, far from the streets where their constituents were demanding change.

Many of the march’s attendees were students themselves, outraged at how routine shootings have become in their schools and neighborhoods. Others had felt the impact of gun violence from afar — a mass shooting on the news, an ever-present worry that they or their families could someday be a target. The people we spoke to gave voice to these fears. But each attendee also made one thing clear: though their leaders may be absent, inaction is no longer acceptable.


The 21st-Century Valedictorian and the Battle for First Place

An image of high school graduates during a commencement ceremony.

According to 16-year-old Ryan Walters of North Carolina, abolishing the title of valedictorian in high schools only serves to “recogniz[e] mediocrity, not greatness.” Ryan was interviewed for a Wall Street Journal article about ridding schools of valedictorian titles, and he provides a voice of disapproval and disappointment. After working toward the glorious title for many years, Ryan has seen his dream end, as his high school has decided to do away with recognizing the top performer in each graduating class. The Heritage High School junior’s harsh critique may have some validity, but it can also be refuted.

Across the country, high school administrators are beginning to question the productivity of declaring a valedictorian every year. Many students work toward the title of valedictorian from a young age; it is a testament to perseverance, intelligence, and hard work. However, it can also create extreme competition among students and determine one’s value based heavily upon grades. Some school administrators argue that the title of valedictorian motivates students to study harder and achieve more academically. Others argue that declaring a valedictorian promotes unhealthy competition and does more to harm students than to help them. This debate raises the question: is it ethical for high school administrations to declare a valedictorian each year?

The critics of the valedictorian system argue that recognizing a valedictorian places an unhealthy amount of pressure on students. This is a large reason why around half of the schools in the country have eliminated the title. According to the National Institute of Mental Health, 8 percent of high schoolers are diagnosed with some form of anxiety. Suicide was the second leading cause of death among teenagers 15-19 years old in 2014. Although a direct correlation between the stress of school and suicide cannot be drawn, the anxiety developed because of academic pressures surely contributes. School counselors have expressed concern about the impact that pressure to perform is having on adolescent anxiety. In an article in The Atlantic, Kirkwood High School counselor Amber Lutz said, “high performance expectations surrounding school and sports often result in stress and, in turn, anxiety.”

Declaring a valedictorian increases competition among students. As classmates vie for first in their class, the emphasis can be taken off of learning and bettering oneself and placed upon winning. If a student aims for valedictorian but does not achieve it, they may lose appreciation for their accomplishments and simply focus on the fact that they “lost.” In addition, a GPA is not a reflection of one’s high school experience. It does not capture creativity, learning style, experience, or passion for certain subjects. It is a number, not a holistic view of an individual. The title of valedictorian separates one student from peers who may have worked just as hard or be of equal intelligence. Many factors affect a grade, including the distribution of points, class load, grading rubrics, and more. A GPA is too narrow a summary of achievement, and too dependent on these other factors, to declare the best student in a class of many.

A question follows this conclusion: should schools be comparing their students to one another at all? Is ranking adolescents based on GPA an exercise that will push students to do their best work? Or is it counterproductive to development?

Competition can be productive. Advancements are made because of competition, and individuals are pushed to achieve more when they are not the only ones aiming for a goal. Certain aspects of society do not function without competition. A customer is not going to buy all five versions of a laptop; rather, they are going to buy what they consider the best option. Competition is also the reason there are five laptops to choose from. In the same way, a technology company is not going to hire every applicant for an open software developer position. It is able to hire the best developer of the five and create a better laptop because of competition. It is important that students are aware of competition and the ways it manifests within society. However, declaring a valedictorian is not the sole method by which this can be taught.

Many high school students play sports in which they win or lose. One may question how this is different from declaring a valedictorian. This question requires examining the purpose of education. Schools must decide whether education is meant to increase equality or to separate “the best” from the rest. Patrick J. Mannarino, superintendent of North Hills High School in Pittsburgh, rid his school of the valedictorian designation and said: “Education’s not a game. It’s not about ‘I finished first and you finished second.’ That high school diploma declares you all winners.” If a sports game ends in defeat for a teenager, they are surely upset, but their entire athletic career is not judged on a single game. A class ranking, however, does summarize a student’s academic career; therefore, the title must have a greater impact on a student’s self-esteem than the outcome of a sports game.

A compromise has been implemented across the country. In recent years, schools have started declaring multiple valedictorians in an effort to recognize more than one high-achieving student. Some argue this solution minimizes the glory that one valedictorian could have and harms the motivation to work hard. Others argue that it presents the same dilemmas as declaring a single valedictorian. The difference between one and seven valedictorians is nonexistent, in the sense that it still separates students and equates the value of each student with their GPA.

The tradition of declaring a valedictorian has been passed down for generations, and valedictorians go on to make great contributions to society. But if the title of valedictorian were taken away, would the future success of those students be affected? Would students lose motivation to work hard? Or would schools adopt a more inclusive environment in which students are intrinsically motivated and want to work together? It may be time for schools to reconsider what environment is best for producing intelligent, hardworking students who appreciate what they have accomplished and do not need to compete to have those accomplishments recognized.

Perhaps declaring a valedictorian provides a healthy dose of competition to schools around the country. Maybe it is teaching students to work hard and preparing them for adult life. Or perhaps ranking adolescents based on their academic performance is contributing to the growing rates of anxiety and depression in the United States. Maybe declaring a valedictorian is taking the emphasis off of learning and placing it on competing.

The Snooze Button: Should School Start Earlier?

Currently, 75-100% of public schools in 42 states have start times earlier than 8:30 a.m., with Louisiana leading the pack with an average school start time of 7:40 a.m. The push for schools to adopt later start times enters the political sphere periodically, as new studies come out annually suggesting that sleep has become both a hot commodity and a scarcity for students.


Should Professors Ban Student Emails?

A few weeks ago, Spring-Serenity Duvall, an assistant professor of communications at Salem College, banned student emails. It began with a syllabus policy identifying when students could email her, the idea being that students simply shouldn’t email about things they could easily find out for themselves. But the list became complicated, and finally she decided to…ban all student emails, unless the email was to set up an appointment to come see her. As draconian as it sounds, Duvall self-reports success, and even reports that her students enjoyed the policy. As this article notes:

After one semester, Duvall said, the email policy has been an “unqualified success.” She reported spending less time filtering through “hundreds of brief, inconsequential emails,” and noticed that students came to class better-prepared and wrote better papers. She allowed one exception to the rule — students emailing her content relevant to the course. During her decadelong career as a college instructor, Duvall said, she has never received more phone calls and more student visits during her office hours. 

Students, in turn, gave the course better evaluations than previous cohorts, and rated Duvall’s concern for their progress and efforts to make herself accessible as “excellent.” Only one student out of 48 had something to say about the email policy — a quibble about not being able to ask simple yes-no questions — but even that student endorsed Duvall’s preference for in-person meetings.

But is this really a good policy? Should professors ban student emails? Danielle DeRise responds to the above article, arguing that they should not. Interestingly, this is not because she thinks the policy is draconian, but because she thinks students ought to figure out email etiquette for themselves. As she notes:

But isn’t there something to be said for letting young adults — especially those enrolled in a communications course — navigate the delicate rules of student-professor etiquette on their own? For letting them fail at it even? Suppose you email about a problem your professor deems trifling. The two worst consequences are (a) no response or (b) a snippy response. In my own college days, I sent emails that at the time seemed vital but that I now recognize as self-absorbed and/or irritatingly Type A. After a few terse one-liners from professors I admired, I became a less zealous emailer.

What do you all think? Is it permissible for professors to ban email, or not?