
The Perils of Perfectionism in Energy Policy

nuclear power plant tucked in rolling green hills

Last month, Germany closed its three remaining nuclear power plants, eliciting an open letter of protest from two Nobel laureates, senior professors, and climate scientists. Nuclear energy is one of the least carbon-intensive power sources – perhaps the least – and its compact footprint gives it a smaller environmental impact than some other low-carbon alternatives. However, Germany has struggled to replace its fossil fuel plants with greener options. Consequently, phasing out nuclear energy will require burning more coal and gas, increasing emissions of CO2 and deadly air pollutants.

Ironically, the political movement against German nuclear power was led by ecological activists and the Green Party. According to their election manifesto, nuclear energy is “a high-risk technology.” Steffi Lemke, Federal Minister for the Environment and Nuclear Safety, argued, “The phase-out of nuclear power makes our country safer; ultimately, the risks of nuclear power are uncontrollable.”

While there is some risk associated with nuclear energy, as evidenced by disasters like Chernobyl, the question remains: Are the German Greens justified in shutting down nuclear power plants due to these risks?

Risks, even very deadly ones, can be justified if the benefits are significant and the chance of a bad outcome is sufficiently low. The tradeoff with nuclear power is receiving energy at some level of associated risk, such as a nuclear meltdown or terrorist attack. Despite these risks, having access to energy is crucial for maintaining modern life and its conveniences – lights, computers, the internet. In fact, our lives might be more dangerous without energy, as our society would be much poorer and less capable of caring for its citizens.

It might be argued that another energy source could provide the same benefits without the risks of nuclear power. However, it is essential to gain perspective on the relative risks involved. Despite the fixation on nuclear meltdowns, nuclear power is significantly less risky than alternatives.

For every terawatt hour (TWh) produced, coal energy, still widely used in Germany, causes an estimated 25 deaths through accidents and air pollution. Natural gas, which is growing in German energy production, is safer, causing around three deaths per TWh. In contrast, nuclear power results in only 0.07 deaths/TWh, making it 467 times safer than brown coal and 40 times safer than natural gas. Accounting for deaths linked to climate change would further widen these disparities. A coal plant emits 273 times more CO2 (and 100 times more radiation) than a similar-sized nuclear plant. By eliminating the risks of nuclear energy, Germany inadvertently takes on even greater environmental and health risks.
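The cited ratios follow from slightly more precise per-TWh estimates than the rounded figures quoted above – roughly 32.7 deaths per TWh for brown coal and 2.8 for natural gas, the commonly cited Our World in Data values. A minimal calculation, assuming those figures, reproduces the comparison:

```python
# A quick check of the ratios cited above, using approximate per-terawatt-hour
# death rates (accidents plus air pollution). These are the commonly cited
# Our World in Data estimates; treat them as rough, not authoritative.
deaths_per_twh = {
    "brown coal": 32.7,   # the ~25 figure quoted above is for coal overall
    "natural gas": 2.8,
    "nuclear": 0.07,
}

nuclear = deaths_per_twh["nuclear"]
for source, rate in deaths_per_twh.items():
    if source != "nuclear":
        print(f"{source}: roughly {rate / nuclear:.0f} times the death rate of nuclear")

# Output:
#   brown coal: roughly 467 times the death rate of nuclear
#   natural gas: roughly 40 times the death rate of nuclear
```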

Germany is in the process of transitioning to renewable energy sources, such as wind and solar. Shutting down nuclear power and eliminating its associated risks might be justifiable if nuclear power were being entirely replaced with renewable sources. However, as of 2021, 75% of German energy came from fossil fuels. Had Germany maintained its nuclear power plants, its growing renewables could be replacing much more fossil fuel energy production. Replacing good with good is not as impactful as replacing bad with good.

The German Greens are correct that nuclear power has some associated environmental and health risks. They chose a strategy of moral perfectionism, doing whatever was necessary to eliminate those risks.

But pushing to eliminate nuclear energy, in the name of safety and environmentalism, has inadvertently led to increased reliance on fossil fuels and heightened environmental and health risks. This demonstrates the potential pitfalls of adhering to our principles and values without considering compromises and trade-offs.

We should, however, be cautious. Just as moral perfectionism can lead us astray, too easily abandoning our principles in the name of pragmatism risks ethical failures of other kinds.

Act consequentialism is probably the most “pragmatic” moral theory. It posits that the right action is whatever creates the best consequences. You should lie, steal, and kill whenever it produces the best outcome (although it rarely does).

Critics of consequentialism argue that it leaves little room for individuals to maintain their integrity or act on their personal values. The philosopher Bernard Williams provided an illustration: Jim, a tourist in a small South American town, finds himself facing a terrible choice. He can kill one innocent villager himself, or he can let the local captain kill all twenty villagers. The utilitarian answer is clear: Jim should kill one villager to save the others, as it produces the best outcome. However, Williams argued that we could understand if Jim couldn’t bring himself to kill the innocent villager. If Jim refused, we might not blame him, or at least not blame him harshly. Yet utilitarianism suggests that, in refusing, Jim would be doing as much wrong as if he had personally killed nineteen villagers, since his refusal results in nineteen more deaths than the alternative. This example demonstrates the extreme moral pragmatism of consequentialism, which seemingly overlooks the importance of personal integrity and living according to one’s beliefs and values. This is the danger of taking moral pragmatism too far.

But the anti-nuclear Greens may provide an example of moral perfectionism going too far. Morality is not solely about sticking to your principles. Balancing costs and benefits, compromising, and prioritizing are all equally important. We cannot afford to let the pursuit of perfection prevent us from doing the good we can. But neither can we entirely abandon our personal values and principles, as doing so risks devaluing the personal factors that allow us to make sense of our lives. Perhaps there is some room, in some cases, for acting on principle even if it doesn’t result in the best consequences.

A Game Worth Dying For?

image of "Game Over" screen displayed on monitor

There’s a game mechanic called permadeath. The idea behind it is simple. If your character – be that on a computer, board, tabletop, or any other medium – dies, they stay dead. So instead of the standard gaming fare of having extra lives or being revived at a save point, in games with permadeath you lose all your equipment, merch, coins, etc. and are considered entirely dead. Some of the most famous games that use this feature include The Long Dark, XCOM: Enemy Unknown, and DayZ.

The purpose of permadeath is relatively simple. It drives up the tension by driving up the stakes.

If you know your character comes back to life when they’re killed, then there’s little risk. The time you invest in a game is safe because it won’t be lost when you get hit by a fireball or trip into a bottomless pit. You can simply dust yourself off and try again.

But, if you’re at risk of losing that progress, the time, effort, and emotions you’ve put into a game become far more precious. Knowing that one wrong move means all that progress gets thrown into the bin means that every step, every look around the corner, and every opening of a mysterious box has tension. Knowing that in-game death means starting over again after spending days reaching a game’s final stages means your investment skyrockets.

However, a game’s stakes are rarely anything more valuable than time. Sure, losing all your progress can be frustrating when a ghoul kills your character in the game’s final moments, but you’re still able to get up and walk away.

While your character may face oblivion, you, as the player, don’t. You may think you’ve wasted your time, but ultimately, that’s all that would have been wasted (and if you had fun, is it really wasted?).

But, in early November 2022, Palmer Luckey, the founder of the VR firm Oculus, claimed to have designed a headset that takes permadeath out of the game and into reality – a headset that kills you if you die in-game.

The headset is fitted with three explosives. Luckey wired these to detect certain shades of red displayed at a specific frequency. So, when your character dies in-game and the VR headset displays that shade of red, the explosives detonate and the player’s brain is destroyed. The system is still in its developmental stages, with the headset currently acting as a piece of office art. However, Luckey has stated that he wants to explore its potential further, eventually ensuring that it’s tamperproof and cannot be removed by external parties. In effect, he wants to prevent someone from helping the player remove the headset if they change their mind after starting the game. A compatible game would also need to be created – specifically, one that avoids using the triggering shade and frequency of red before the character, and consequently the player, meets their end.
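To make the detection step concrete, here is a minimal, purely hypothetical sketch of the kind of trigger logic described above: sample the display’s color output and report whether a target shade of red is flashing at roughly a target frequency. Every function name, threshold, and interface here is invented for illustration and bears no relation to the actual hardware.

```python
# Hypothetical sketch of a screen-color trigger: given color samples taken from
# a display, decide whether a target shade is flashing at (roughly) a target
# frequency. Names, thresholds, and the sampling interface are all invented.

def matches_target(rgb, target=(255, 0, 0), tolerance=10):
    """True if a sampled color is within `tolerance` of the target shade on every channel."""
    return all(abs(c - t) <= tolerance for c, t in zip(rgb, target))

def flashing_at_frequency(samples, sample_rate_hz, target_hz, rel_tol=0.15):
    """samples: list of (r, g, b) tuples captured at sample_rate_hz.
    Returns True if the target color switches on at roughly target_hz."""
    flags = [matches_target(rgb) for rgb in samples]
    # Each off-to-on transition marks the start of one flash cycle.
    cycles = sum(1 for prev, cur in zip(flags, flags[1:]) if cur and not prev)
    duration_s = len(samples) / sample_rate_hz
    if duration_s == 0 or cycles == 0:
        return False
    observed_hz = cycles / duration_s
    return abs(observed_hz - target_hz) <= rel_tol * target_hz

# Two seconds of samples at 60 Hz, with the target shade flashing at about 5 Hz.
frames = [(255, 0, 0) if (i // 6) % 2 == 0 else (0, 0, 0) for i in range(120)]
print(flashing_at_frequency(frames, sample_rate_hz=60, target_hz=5))  # True
```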

The prospect of someone using such a headset raises numerous questions. These include whether someone could genuinely consent to use the headset and whether Luckey would be a murderer if/when someone died while using it to play a game.

We may return to these in another article. For now, however, I want to focus on why Palmer Luckey created this maniacal contraption.

Luckey says he got the idea from the manga and anime series Sword Art Online. It features a VR headset called the NerveGear, which allows total immersion in a virtual world. The headset is released alongside the titular Sword Art Online game. Ten thousand players sign in when the game launches but soon discover they cannot sign out and are trapped within the game. The game’s designer then appears to the players and tells them they must beat all 100 floors of the game’s monster-infested mega-castle if they want to escape. At this point, he also reveals that death in the game results in death in real life. The idea of an immersive virtual world captured Luckey’s imagination, as he writes in his blog:

The idea of tying your real life to your virtual avatar has always fascinated me – you instantly raise the stakes to the maximum level and force people to fundamentally rethink how they interact with the virtual world and the players inside it. Pumped up graphics might make a game look more real, but only the threat of serious consequences can make a game feel real to you and every other person in the game. This is an area of videogame mechanics that has never been explored, despite the long history of real-world sports revolving around similar stakes.

At first, this prospect might strike many as patently absurd. It seems that few, if any, would sign up to play a game that could result in death. Games are usually a form of escapism from real life’s woes, and a game that includes as a mechanic one of life’s (arguably) most significant downsides – mortality – seems to run entirely counter to this goal.

But, with some consideration, Luckey’s perspective on risk’s relationship with gaming seems to hold at least some value, specifically concerning gaming’s attempts at raising the stakes. Games have little material value in and of themselves – it’s what makes them games. This is one of the reasons gaming, in its various forms (including sports), is closely tied to gambling.

Gambling raises the stakes of what is happening in the game and gives it real-world value and impact. For example, you’re much more likely to care about who wins a round of Super Smash Bros if you have money riding on the outcome.

The in-game risk is given real-world form, and the greater the value bet, the greater one’s emotional and cognitive investment is; you care more when there’s more on the line. When it comes to putting things on the line, there’s nothing more valuable than your life.

Also, while it might seem madness to design a game that kills the player if they fail to perform, countless people already undertake recreational activities that involve the prospect of death if mistakes are made. Skydiving is an obvious one.

Plummeting out of a plane and reaching terminal velocity, with only a couple of layers of fabric preventing you from dying upon impact with the earth, is a risk most of us don’t have to take. But the prospect of death in this context doesn’t have people up in arms demanding that skydiving be stopped.

On the contrary, the activity’s value is, in some measure, derived from the inseparable risk of immeasurable harm. It’s arguably what makes diving out of a plane different from indoor skydiving; despite all the measures put in place, you’re aware that death is a potential outcome.

So, provided safeguards are put in place to prevent system errors, and the games offer players an overwhelmingly good chance of survival, is it such an obscene prospect?

Having said that, if offered the chance to play an almost unlosable game on Luckey’s murderous headset, you can be sure I’d say no.

Risk, Regret, and Sport

photograph of two soccer players competing in air for ball

The legendary soccer player Denis Law recently announced that he has been suffering with dementia for several years. Law attributes his dementia to heading soccer balls. We’ve known for decades – in 2002 Jeff Astle’s death from dementia was linked to heading – that there is a link between heading and brain damage.

Other sports face similar issues. American football’s problem with Chronic Traumatic Encephalopathy (CTE) is well documented. CTE can lead to, amongst other things, aggression, depression, and paranoia that can arise in people in their 20s; it can also bring memory loss, dementia, and eventually death. Other sports, like rugby and hockey, have their own links to CTE and their own problems with brain damage.

Broadly, people who partake in sports that involve collisions (including things like headers) are at risk of brain injury. This is true especially when playing at higher levels of competition (as opposed to playing occasional pickup games), where impacts are bigger and players spend more time playing their sport.

How should players think about this risk? Last year, Jamie Carragher, a former top-level player for Liverpool FC and current pundit, said: “If I suffer from dementia in my old age and research suggests that is because of my football career, I will have no regrets.” Carragher recognizes that we are now better informed about the risks and need to make changes to minimize the risks (here is one: fewer headers in training), but he thinks the risks are still worthwhile, and that we must keep some of the risky elements in football: players should still be able to challenge each other in ways that risk sickening head-clashes.

I think Carragher’s thoughts are widely shared. Playing soccer, or rugby, or football is worth the risk of dementia later in life, so much so that players won’t regret playing their sport. But I think this line of thought rests on some troubling assumptions.

The first is the temptation to make a false comparison between the ordinary risks of sport and brain damage. We should obviously grant that some injuries are acceptable risks. I played rugby for over a decade, and I spent several months with sprained ankles and bad shoulders. It’s no surprise that I now occasionally get the odd ache. Almost every sport carries some risk of injury, and if we grant (as I think we should) that playing sports can be a meaningful part of our lives, these risks should not get in the way of us playing. When Carragher says that “there was a danger of injury every time I played,” he is right, but he misses the point. These brain injuries are not the same as (to take his example) a broken leg. They are highly damaging – far more long-term and life-changing than a broken leg usually is.

This leads to a deeper point. Living with dementia can involve a loss of awareness, a loss of memory, and confusion; CTE can lead to personality changes. We might reasonably think of these as transformative experiences. L. A. Paul developed the notion of a transformative experience. To take one of her examples, it’s impossible to know what it is like to be a parent – what it is to love your offspring, what it is to have such a particular duty of care – before becoming a parent. We can only know what it is like to be a parent by becoming a parent. But that means that choosing to become (or not become) a parent is always shrouded in ignorance. (Her other major example is becoming a vampire: we can’t tell what it will be like to be immortal creatures of the night.)

Perhaps the decision to play a sport that might lead to a serious brain injury involves some element of a transformative experience: you can’t know what your life would be like if you had CTE or dementia – confused, with a ruined memory and a changed personality – so perhaps you shouldn’t be so keen to declare that you won’t regret it. You might not feel that way when dementia takes its grip.

Here is another problem. Carragher’s line of thought also assumes that regret lines up with justification. That is to say, if you won’t regret something, then you were justified in taking that risk – you were right to do it. But, as R. Jay Wallace has argued, this isn’t always the case. In Wallace’s example, a young girl decided to have a child. She was far too young, and both she and her child would have had a better time of it had she waited several more years. Her decision to have a child was unjustified. Yet she surely cannot regret her decision: after all, she loves this child.

It isn’t surprising that people who have dedicated decades to their sports – sports that make their lives meaningful – won’t regret what they have done. But that doesn’t mean they made the right choice. There are plenty of other meaningful options out there: like taking up sculpting, squash, or chess.

Yet thinking about regret and justification also brings up something in favor of taking these risks: some people will have nothing to regret at all because brain damage is far from guaranteed, even in football. Bernard Williams argued that we might sometimes take a risk and that risk will be justified by the results. If you abandon your wife and children to set off on a career as a painter, you might have made a grave error if you fail in your career – but perhaps it will all have been worth it if you succeed. Likewise, Carragher, if he avoids dementia, might have been perfectly justified in playing soccer. Others might not be so lucky.

Sports play a meaningful role in many of our lives, and we are all happy to live with some level of risk. But we shouldn’t just say: “I won’t regret playing, even if I get dementia.” Reasoning that you won’t regret dementia because you don’t regret the odd broken leg is to compare chalk and cheese; we don’t really know what our lives would be like with dementia, so we shouldn’t be confident in such assertions; and even if we end up with no regrets, that doesn’t mean we did the right thing. This discussion requires serious conversations about risk management and the meaningfulness of sport – it shouldn’t be conducted at the level of glib sayings.

Should College Football Be Canceled?

photograph of football next to the 50-yard line

On August 11, the Big Ten conference announced it would be postponing its fall sports season to spring 2021. The decision shocked many, as the Big Ten was the first Power Five conference to postpone its fall football season. After the announcement, Vice President Mike Pence took to Twitter to voice his disapproval and declare that “America needs college football,” while President Donald Trump simply tweeted, “Play College Football!” Trump and Pence weren’t the only politicians to express this belief, though they are certainly the highest-ranking members of government to take a moral position in favor of continuing the college sports season amidst the pandemic.

Questions surrounding America’s 2020 college football season make up a few of the many ethical dilemmas surrounding higher education during the pandemic. Canceling this season means further economic loss and the potential suppression of a labor movement, while playing ball could have dire consequences for the safety of players and associated colleges. Navigating this dilemma requires asking several questions about both the economic importance and cultural significance of college football.

Do schools have an ethical duty to cancel their football season? What values do athletic programs hold in college education? And what is at stake for the players, the schools, and Americans at large?

Is a sports season, in and of itself, dangerous to attempt during the pandemic? The official CDC guidance on playing sports advises that participants should wear masks, keep a six-foot distance, and bring their own equipment. It also ranks sports activities from low to high risk, with the lowest risk being skill-building drills at home and the highest being competition against teams from different areas. While the CDC does not necessarily advocate against the continuation of athletic programs during the pandemic, can the same be said of medical professionals more broadly? After VP Mike Pence’s tweet, several prominent health professionals “clapped back” on Twitter, pushing back against the need for football and suggesting that continuing fall sports should be among the lowest of priorities during the pandemic. Some have even ranked football as one of the most dangerous sports for COVID-19 transmission.

Despite the physical dangers, cancelling football season has serious economic consequences for colleges. It is estimated that at least $4 billion is at stake if college football is cancelled. While losing one year’s worth of sports revenue might not seem like a big deal, many colleges rely on athletic revenue to cover the costs of student scholarships and coaching contracts. In fact, a 2018 study by the NCAA found that, overall, Division I athletic programs were operating at a deficit, and their revenues were helping them scrape by. Without revenues this season, thousands of professors and staff members could face job loss, as colleges would lack the money to cover their athletic investments. Small businesses that see large profits from the influx of fans during football season would face a huge decrease in revenue. Even sports bars and restaurants, which draw in customers by airing current games, would face significant economic losses.

Additionally, college sports serve as a primary form of entertainment for millions of people. In 2019 alone, over 47 million spectators attended college football games and an average of over 7 million people watched games on TV. College football clearly holds large cultural value in American society. During a time which is already financially, emotionally, and mentally troubling, losing one’s hobby, or ties to a community of like-minded people, might worsen the growing mental health crisis spurred by the pandemic.

The question of whether or not college football season should continue is further complicated by the existing ethical debates within the sport itself. NCAA football teams have a wide-ranging history of corruption, from academic violations to embezzlement schemes. Even more disturbing are the several sexual abuse scandals that have rocked major college football programs in recent years, involving both athletes and athletic officials. The clear racial divide between the makeup of players and that of athletic officials has stirred debates about the haunting similarities between college football and slavery.

Over the past decade, there has been a growing movement in favor of instituting labor rights for college athletes. Several lawsuits against the NCAA, primarily on behalf of football players, have argued that the widespread lack of compensation violates labor laws. Movements to unionize college football have become even more common during COVID, with some arguing that recent league debates about canceling the football season are more about controlling players’ ability to organize than about players’ health and safety. In an op-ed in The Guardian, Johanna Mellis, Derek Silva, and Nathan Kalman-Lamb argue that the decision to cancel the college football season was motivated by fear of the growing movement demanding widespread reform in the NCAA. They assert that if colleges really cared about the health and safety of their players, they would not have “compelled thousands of players back on to campus for workouts over the spring and summer, exposing them to the threat of Covid-19.” The argument is especially strong when one considers that a growing number of athletes, using the hashtags #WeAreUnited and #WeWantToPlay, have threatened to refuse to play without the ability to unionize.

Despite these potentially ulterior motives for cancelling the college football season, cancellation might still arguably be the most ethical decision. Nearly a dozen college football players have already suffered life-threatening conditions as a result of the spread of COVID. The continuation of a fall sports season will endanger athletes, athletic officials, spectators, and non-athlete students. Even if in-person spectators are prohibited, the continuation of fall sports requires cross-state team competition, which the CDC ranks as the highest-risk sports activity. Several outbreaks have already occurred during fall training at colleges across the nation. Outbreaks on teams have the potential to harm not only athletes but also students at the universities they attend.

While two Division I conferences have canceled their seasons, others appear unwavering in their desire to play football. Fortunately, the NCAA has developed a set of regulations aimed at protecting players from retaliation if they choose not to play. With human lives, the economic survival of colleges, and a labor organizing movement at stake, America’s 2020 college football season is set to be the most ethically confounding in history.

The Moral Pitfalls of Color-Coded Coronavirus Warning Systems

Color-coded chart showing the risk of COVID-19 in the UK

As states around the country ease lockdown restrictions, some are putting into place systems advising people about threat levels. In some states, these are color-coded systems that strongly resemble the Homeland Security Advisory system, put into place by George W. Bush to inform people about the risk of threats from terrorism after the September 11th terrorist attacks. 

Utah, for example, has a four-tiered system: a red designation indicates high risk, an orange designation indicates moderate risk, a yellow designation indicates low risk, and a green designation indicates “new normal.” The color-coded systems of other states and some other countries largely follow this same model.

It’s important to remember that there were lots of serious problems with the Homeland Security Advisory System, and it was eventually abandoned and replaced. Many of the problems had to do with the fact that color-coded systems are vague by their very nature. People have a sense that red means “stop” and green means “go.” Very few people will investigate the situation further. As a result, these systems are easily manipulated for political purposes. Color-coded systems don’t encourage careful, responsible thinking about risk. They encourage behavior motivated by sentiment rather than reason, and sentiment is easily coerced. Politicians tend to be excellent at cultivating certain kinds of common sentiments that drive political behavior and the Homeland Security Advisory System roused both fear and xenophobia. These are powerful forces and invoking them caused people to make voting decisions that they might not otherwise have made, to support wars they might otherwise have found unjustified, and to accept unprecedented privacy violations on the understanding that they were being protected from imminent harm.

Color-coded responses to coronavirus operate according to similar principles. If people want to know the current level of danger posed by coronavirus, they should be paying attention to the relevant data. How many new cases is a state discovering each day? What are the hospitalization rates? How many people are dying? These color-coded systems are not responsive to these important considerations. For example, there was recently a major outbreak of coronavirus at the JBS meatpacking plant in Hyrum, a city in Northern Utah. The outbreak was the biggest hotspot yet discovered in the state. At this point, 287 workers at the plant have tested positive for COVID-19. This meat packing plant refuses to shut down or to give employees meaningful time off to heal. 

Hyrum is in Cache County, and despite the unknown extent of the spread, Cache County remains in the yellow “low risk” zone. In fact, even in light of the outbreak, the Cache County Council voted to request that the county be moved to the green designation. As one councilperson put it, “I’m in the age group that’s most likely to die, but I’ve had a good life and I say let’s get on with it. That may sound like I’m being pretty casual about it, but that’s the way I feel.” If an area like Cache County requests a green designation on the basis of these kinds of considerations, the system is not responsive enough to actual data.
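For contrast, a data-responsive system would derive its designation directly from indicators like the ones mentioned above. The sketch below is purely illustrative; the metrics, thresholds, and tier names are invented and are not Utah’s (or any state’s) actual criteria.

```python
# Purely illustrative: derive a risk tier from public-health indicators rather
# than from a negotiated color label. All metrics and thresholds are invented.

def risk_tier(new_cases_per_100k, test_positivity, hospitalizations_per_100k):
    """Return a tier name based on the worst of several hypothetical indicators."""
    if new_cases_per_100k >= 25 or test_positivity >= 0.10 or hospitalizations_per_100k >= 10:
        return "high risk"
    if new_cases_per_100k >= 10 or test_positivity >= 0.05 or hospitalizations_per_100k >= 5:
        return "moderate risk"
    if new_cases_per_100k >= 1 or test_positivity >= 0.02:
        return "low risk"
    return "minimal risk"

# A county with a growing outbreak gets flagged regardless of how many
# hospital beds happen to be free or what a county council votes to request.
print(risk_tier(new_cases_per_100k=30, test_positivity=0.12, hospitalizations_per_100k=4))
# -> high risk
```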

The Cache County example also illustrates the point that these vague, color-coded systems track not a set of facts, but a set of values. Many states have decided that thriving economies are more important than the lives of the vulnerable, but they haven’t exactly made this value judgment explicit so that people can evaluate it and respond accordingly. Instead, the values are obscured by color designations that look for all the world like they are based on public health considerations.

Instead of motivating people with fear, coronavirus color-coding systems encourage a different form of cognitive bias—wishful thinking. People across the country are sick of lockdown. They are exhibiting quarantine fatigue. They are sick of travel restrictions and of being prevented from engaging in their favorite consumer activities, especially during the summer. The fact that coronavirus cases have declined dramatically in places like New York is causing the national curve to flatten, but this doesn’t tell us anything encouraging about what is happening across the rest of the country. People have self-interested reasons to interpret the numbers favorably, even though there is no evidence-based justification for doing so. These warning systems also undercut good critical thinking practices in another way—they encourage people to disregard the advice of experts on infectious disease. The best available evidence suggests that people should wash their hands regularly, maintain social distance from others, and wear a mask in areas where social distancing is difficult. It’s hard enough to get people to engage in these practices regularly, and it is even more difficult to convince them that they should be doing so when their county is in a yellow or even a green risk designation.

At least in Utah, these systems do track some data, but not the data people might assume they track. They are not tracking information relevant to whether people are actually safe to participate in social and consumer activities again. Instead, decisions are being made on the basis of how many hospital beds are available in a given area. The concern is not whether people will contract the disease, but whether health care systems will be overwhelmed if and when they do. This isn’t a metric we would stand for in other cases. Consider the following analogy. City officials are aware that the water at the local beach is infested with dangerous man-eating sharks. They are tasked with making recommendations about the safety of getting in the water. Tourism to the beach generates a lot of revenue every year, so officials have a strong interest in declaring the beach safe. They determine that the health care system is well equipped to treat people for shark bites, so they advise people that it is safe to swim in the water. Presumably, residents would think this an unconscionable decision, and there would rightly be an erosion of trust in public officials so callous with people’s lives.

The Homeland Security Advisory System was eventually replaced with the National Terrorism Advisory System, which was designed to “more effectively communicate information about terrorist threats by providing timely, detailed information to the American public.” Both terrorism and public health are high-information issues about which it is difficult for the public to be fully informed. Nevertheless, we should encourage people to engage with actual data rather than with colors that lull them into a false sense of security.

The Reasoning Behind the $417 Million Baby Powder Lawsuit

Photos of Johnson's baby powder.

Last month, according to the Los Angeles Times, a court ordered Johnson & Johnson, purveyor of several household health and beauty products, to pay $417 million in damages to Eva Echeverria, a 63-year-old Los Angeles resident who claims the company failed to warn her and other consumers about the cancer risk of their talc-based products, such as their baby powder.


Alcohol Legislation in Utah: Drunk with Power?

The United States has long struggled with a set of deeply divided attitudes toward alcohol.  To be sure, alcohol can be quite dangerous, so it is certainly reasonable to be cautious and concerned about its use in certain contexts.  On the other hand, one of the clear lessons taught by our experiment with Prohibition is that individuals feel that restrictive alcohol policies constitute unwarranted violations of their autonomy.
