
Unions and Worker Agency

photograph of workers standing together, arms crossed

The past few years have seen a resurgence of organized labor in the United States, with especially intense activity in just the past few months. This includes high-profile union drives at Starbucks, Amazon, the media conglomerate Condé Nast, and even MIT.

Parallel to this resurgence is the so-called “Great Resignation.” As the frenetic early days of the pandemic receded into the distance, workers began quitting at elevated rates. According to the Pew Research Center, the three main reasons for quitting were low pay, a lack of opportunity for advancement, and feeling disrespected. Former U.S. Secretary of Labor Robert Reich even analogized it to a general strike, in which workers across multiple industries stop work simultaneously.

Undoubtedly, the core cause of both the Great Resignation and growing organized labor is the same – dissatisfaction with working conditions – but the two are also importantly different. The aim of quitting is to leave the workplace; the aim of unions and strikes is to change it. They do this by trying to shift the balance of power in the workplace and give workers more voice and agency.

Workplaces are often highly hierarchical, with orders and direction coming down from the top, controlling everything from mouse clicks to uniforms. This has even led some philosophers, like the noted political philosopher Elizabeth Anderson, to refer to workplaces as dictatorships. She contends that the workplace is a blind spot in the American love for democracy, with the American public confusing free markets with free workers, despite the often autocratic nature of the workplace. Managers may hold almost all the power in the workplace, even in cases where the actual working conditions themselves are good.

Advocates of greater workplace democracy emphasize “non-domination,” or that at the very least workers should be free from arbitrary exercises of managerial power in the workplace. While legal workplace regulations provide some checks on managerial power, the fact remains that not everything can or should be governmentally regulated. Here, worker organizations like unions can step in. This is especially important in cases where, for whatever reasons, workers cannot easily quit.

Conversations about unionization generally focus on wages and benefits. Unions themselves refer to the advantage of unionization as the “union difference,” and emphasize the increases in pay, healthcare, sick leave, and other benefits compared to non-unionized workplaces. But what causes this difference? Through allowing workers to bargain a contract with management, unions enable workers to be part of a typically management-side discussion about workplace priorities. Employer representatives and union representatives must sit at the same table and come to some kind of agreement about wages, benefits, and working conditions. That is, for good or for ill, unions at least partially democratize the workplace – although it is far from full workplace democracy, in which workers would democratically exercise managerial control.

Few would hold that, all things being equal, workers should not have more agency in the workplace. More likely, their concern is either that worker collectives like unions come at the cost of broader economic interests, or that unions specifically do not secure worker agency but in fact saddle workers with even more restrictions.

The overall economic effect of unions is contentious, but there is little evidence that they hobble otherwise productive industries. A 2019 survey of hundreds of studies on unionization found that while unionization did lower company profits, it did not negatively impact company productivity and decreased overall societal inequality.

More generally, two assumptions must be avoided. The first is that the interests of the workers are necessarily separate from the interests of the company. No doubt company interests do sometimes diverge from union interests, but at a minimum unionized workers still need the company to stay in business. This argument does not apply to public sector unions (government workers), but even there, unions can arguably lead to more invested workers and stronger recruitment.

The second assumption to avoid is that management interests are necessarily company interests. Just as workers may sometimes pursue their personal interests over broader company interests, so too can management. This concern is especially acute when investment groups, like hedge funds, buy a company. Their incentive is to turn a profit on their investment, whether that is best achieved by the long-term health of the company or by selling it for parts. Stock options were historically proposed as a strategy to tie the personal compensation of management to the broader performance of a company. This strategy is limited, however, as what it more precisely does is tie management compensation to the value of the stock, which can be manipulated in various ways, such as through stock buybacks.

Beyond these economic considerations, a worker may also question whether their individual agency in the workplace is best represented by a union. Every organization is going to bring some strictures with it, and this can include union regulations and red tape. The core argument on behalf of unions as a tool for workplace agency is that due to asymmetries of power in the workplace, the best way for workers to have agency is collective agency. This is especially effective for goals that are shared widely among workers, such as better pay. Hypothetically, something like a fully democratic workplace (or having each individual worker well positioned to be part of company decision making) would be better for worker agency than unions. The question of whether these alternatives would work is more practical than ethical.

There can be other tensions between individual and collective agency. In America specifically, union membership has been treated as highly optional. The most potent union relationship is a “closed shop,” in which a union and company agree to only hire union workers; this arrangement was outlawed by the 1947 Taft–Hartley Act, which restricted the power of unions in several ways. Slightly less restrictive is a “union shop,” under which all new workers must join the union; it remains legal under federal law but can be banned by individual states. State-level “right to work” laws do just that, forbidding unions from negotiating contracts that automatically deduct union representation fees from employees. The argument is one of personal freedom – that if someone is not in the union they should not have to pay for it. The challenge is that the union still has to represent this individual, who benefits from the union they are not paying for. This invites broader questions about the value of individual freedoms, and how they must be calibrated with respect to the collective good.


The author is a member of Indiana Graduate Workers Coalition – United Electrical Workers, which is currently involved in a labor dispute at Indiana University Bloomington.

“Severance,” Identity and Work

split image of woman worrying

The following piece discusses the series Severance. I avoid specific plot details. But if you want to go into the show blind, stop reading now.

Severance follows a group of employees at Lumon Industries, a biotech company of unspecified purpose. The main characters have all undergone a surgery before starting this job. Referred to as the “severance” procedure, this surgery causes a split in the patient’s personality. After surgery, patients awaken to find that while they have factual memories, they have no autobiographical memories – one character cannot remember her name or the color of her mother’s eyes but remembers that Delaware is a state.

However, the severance procedure does not cause irreversible amnesia. Rather, it creates two distinct aspects of one’s personality. One, called the outie, is the individual who was hired by Lumon and agreed to the procedure. However, when she goes to work, the outie loses consciousness and another aspect, the innie, awakens. The innie has no shared memories with the outie. She comes to awareness at the start of each shift, the last thing she remembers being walking to the exit the previous day. Her life is an uninterrupted sequence of days at the office and nothing else.

Before analyzing the severance procedure closer, let us take a few moments to consider some trends about work. As of 2017, 2.6 million people in the U.S. worked on-call, stopping and starting at a moment’s notice. Our smartphones leave us constantly vulnerable to emails or phone calls that pull us out of our personal lives. The pandemic and the corresponding need for remote, at-home work only accelerated the blurring of lines between our personal lives and spaces, and our work lives. For instance, as workplaces have gone digital, people have begun creating “Zoom corners.” Although seemingly innocuous, practices like these involve ceding control of some of our personal space to be more appealing to our employers and co-workers.

Concerns like these lead Elizabeth Anderson to argue in Private Government that workplaces have become governments. Corporate policies control our behavior when on the clock, and our personal activities, which can be easily tracked online, may be subject to the scrutiny of our employers. Unlike with public, democratic institutions, where we can shape policy by voting, the vast majority of workers have no say in how their workplace is run. Hence this control is totalitarian. Further, “low skilled” and low-wage workers – because they are deemed more replaceable – are even more subject to their employer’s whims. This increased vulnerability to corporate governance carries with it many negative consequences, on top of those already associated with low income.

Some consequences may be due to a phenomenon Karl Marx called alienation. When working you give yourself up to others. You are told what to produce and how to produce it. You hand control of yourself over to someone or something else. Further, what you do while on the clock significantly affects what you want to do for leisure; even if you loved gardening, surely you would do something else to relax if your job was landscaping. When our work increasingly bleeds into our personal lives, our lives cease to be our own.

So, we can see why the severance procedure would have appeal. It promises more than just balance between work and life; it makes it impossible for work to interfere with your personal life. Your boss cannot email you questions about your work on the weekend, and you cannot be asked to take a project home, if you literally have no recollection of your time in the office. To ensure that you will always leave your work at the door may sound like a dream to many.

Further, one might argue that the severance procedure is just an exercise of autonomy. The person agreeing to work at Lumon agrees to get the procedure done and we should not interfere with this choice. At best, it’s like wearing a uniform or following a code of conduct; it’s just a condition of employment which one can reject by quitting. At worst, it’s comparable to our reactions to “elective disability”; we see someone choosing a medical procedure that makes us uncomfortable, but our discomfort does not imply someone should not have the choice. We must not interfere with people’s ability to make choices that only affect themselves, and the severance procedure is such a choice.

Yet the show itself presents the severance procedure as morally dubious. Background TV programs show talking heads debating it, activists known as the “Whole Mind Collective” are campaigning to outlaw severance, and when others learn that the main character, Mark, is severed, they are visibly uncomfortable and uncertain what to say. So, what is the argument against it?

To explain what is objectionable about the severance procedure, we need to consider what makes us who we are. This is an issue referred to in philosophy as “personal identity.” In some sense, the innie and the outie are two parts of the same whole. No new person is born because of the surgery and the two exist within the same human organism; they share the same body and the same brain.

However, it is not immediately obvious that people are simply organisms. A common view is that a significant portion, if not all, of our identity deals with psychological factors like our memories. To demonstrate this, consider a case that Derek Parfit presented in Reasons and Persons. He refers to this case as the Psychological Spectrum. It goes roughly as follows:

Imagine that a nefarious surgeon installed a microchip in my brain. This microchip is connected to several buttons. As the surgeon presses each button, a portion of my memories changes to Napoleon Bonaparte’s memories. When the surgeon pushes the last button, I would have all of, and only, Napoleon’s memories.

What can we say about this case? It seems that, after the surgeon presses the last button, Nick no longer exists. It’s unclear exactly when I stopped existing – after a few buttons, there seems to be a kind of weird Nick–Napoleon hybrid, who gradually goes full Napoleon. Nonetheless, even though Nick the organism survives, Nick the person does not.

And this allows us to see the full scope of the objection to the severance procedure. The choice is not just self-regarding. When one gets severed, they are arguably creating a new person. A person whose life is spent utterly alienated. The innie spends her days performing the tasks demanded of her by management. Her entire life is her work. And what’s more troubling is that this is the only way she can exist – any attempts to leave will merely result in the outie taking over, having no idea what happened at work.

This reveals the true horror of what Severance presents to us. The protagonists have an escape from increasing corporate intrusion into their personal lives. But this release comes at a price. They must wholly sacrifice a third of their lives. For eight hours a day, they no longer exist. And in that time, a different person lives a life under the thumb of a totalitarian government she has no bargaining power against.

The world of Severance is one without a good move for the worker. She is personally subject to private government which threatens to consume her whole life, or she severs her work and personal selves. Either way, her employer wins.

Hybrid Workplaces and Epistemic Injustice

photograph of blurred motion in the office

The pandemic has, among other things, been a massive experiment in the nature of work. The percentage of people who worked from home either part- or full-time jumped massively over the past year, not by design but by necessity. We are, however, nearing a time in which people may be able to return to working conditions that existed pre-pandemic, and there have thus been a lot of questions about what work will look like going forward. Recent studies have indicated that while many people want to continue working from home at least some of the time, many also miss face-to-face interactions with coworkers, not to mention having a reason to get out of the house every once in a while. Businesses may also have financial incentives to have their employees working from home more often in the future: having already invested in the infrastructure needed to have people work from home over the past year, businesses could save money by not having to pay for the space for all their employees to work in-person at once. Instead of having everyone return to the office, many businesses are thus contemplating a “hybrid” model, with employees splitting their time between the office and home.

While a hybrid workplace may sound like the best of both worlds, some have expressed concerns with such an arrangement. Here’s a big one: those who are able to go into the office more frequently will be more visible, and thus may be presented with more opportunities for advancement than those who spend most of their working hours at home. There are many reasons why one might want or need to work from home more frequently, but one significant reason is that one has obligations to care for children or other family members. This may result in greater gender inequalities in the workplace, as women who take on childcare responsibilities will especially be at a disadvantage in comparison to single men who are able to put in a full workweek in the office.

Hybrid workplaces thus risk creating injustices, in which some employees will be unfairly disadvantaged, even if it is not the explicit intention of the employer. While these potential disadvantages have been portrayed in terms of opportunities for advancement, here I want to discuss another potential form of disadvantage which could result in injustices of a different sort, namely epistemic injustices.

Epistemic injustices are ones that affect people in terms of their capacities as knowers. For instance, if you know something but are unfairly treated as if you don’t, or are not taken as seriously as you should be, then you may be experiencing an epistemic injustice. Or, you might be prevented from gaining or sharing knowledge, not because you don’t have anything interesting to contribute, but because you’re unfairly being left out of the conversation. While anyone can experience epistemic injustice, marginalized groups that are negatively stereotyped and underrepresented in positions of power are especially prone to be treated as lacking knowledge when they possess it, and to be left out of opportunities to gain knowledge and share the knowledge they possess.

We can see, then, how hybrid workplaces may contribute to a disparity not only in terms of opportunities for advancement, but also in terms of epistemic opportunities. These are not necessarily unrelated phenomena: for instance, if those who are able to put in more hours in the office are more likely to be promoted, then they will also have more opportunities to gain and share knowledge pertinent to the workplace. There may also be more subtle ways in which those working from home can be left out of the conversation. For instance, while one can still be in communication with their fellow employees from home (via virtual meetings, chats, etc.), they will miss out on the more organic interactions that occur when people are working face-to-face. It tends to be easier to just walk over to a coworker if you have a question than to schedule a Zoom call, a convenience that can result in some people being asked for their input much more frequently than others.

Of course, those working in hybrid environments do not need to have any malicious intent to contribute to epistemic injustices. Again, consider a situation in which you and a colleague are able to go back to the office on a full-time basis. You are likely to acquire a lot more information from the colleague with whom you can have quick and easy conversations than from the person working from home whose schedule you need to work around. You might not think that one colleague is better than the other; it’s just easier to talk to the person who’s right over there. What ends up happening, however, is that those who need to work from home more often are gradually left out of the conversation, which prevents them from contributing in the same way as those working in the office.

These problems are not necessarily insurmountable. Writing in Wired, Sid Sijbrandij, CEO of GitLab, writes that, “Unquestionably sticking to systems and processes that made an office-based model successful will doom any remote model to fail,” and mentions a number of measures that his company has taken to attempt to help remote workers communicate with one another, including “coffee chats” and “all-remote talent shows.” While I cannot in good conscience condone remote talent shows, it is clear that if businesses are going to have concerns of epistemic justice in mind, then making sure that there are more opportunities for there to be open lines of communication, including the possibility for informal conversations with remote workers, will be crucial.

Workers’ Well-Being and Employers’ Duties of Care

photograph of amazon warehouse

If you’ve been working from home during the pandemic then there’s a good chance your employer has sent you an email expressing their concern about your well-being and general level of happiness. Perhaps they’ve suggested some activities you could perform from the comfort of your own home working space, or offered Zoom classes or workshops on things like meditation, exercise, and mindfulness. While most likely well-intentioned, these kinds of emails have become notorious for being out of touch with the scale of the stresses that workers face. It is understandable why: it is, after all, unlikely that a half-hour mindfulness webinar is going to make a dent in the stress accumulated while living in a pandemic over the last year.

It goes without saying that the pandemic has taken a toll on many people’s physical and mental health. And while employers certainly have obligations towards their employees, do they have any specific duties to try to restore the well-being their employees have lost during the pandemic?

In one sense, employers clearly do have some obligations towards the happiness and well-being of their employees. Consider, for instance, a recent scandal involving Amazon: the company baldly denied a statement that some Amazon workers were under so much pressure at their jobs that they were unable to take bathroom breaks, and were forced to urinate in bottles instead. Great quantities of evidence were then quickly accumulated that such practices were, in fact, taking place, and Amazon was forced to issue a weak conciliatory reply. It is reasonable in this case to say that Amazon has put their workers in situations in which their well-being is compromised, and they have an obligation to treat them better.

“Don’t make your workers pee in bottles” is an extremely low bar to clear, and it is an indictment of our times that it has to be said at all. People working from home offices, however, are typically not in the same circumstances: while they likely have access to washrooms, their stressors will instead be those that stem from isolation, uncertainty, and many potential additional burdens in the form of needing to care for themselves and others. So, as long as an employer is allowing its employees to meet a certain minimal standard of comfort, and assuming that those working from home during the pandemic meet this standard, do they have any additional obligations to care for employees’ happiness and well-being?

One might think that the answer to this question is “no.” One reason why we might think this is that we typically regard one’s own happiness as being one’s own responsibility. Indeed, much of the recent narrative on happiness and well-being emphasizes the extent to which we have control over these aspects of our lives. For example, consider a passage from a recent Wall Street Journal article, entitled “Forget What You Think Happiness Is,” that considers how the pandemic has impacted how we conceive of happiness:

“Mary Pipher, clinical psychologist and author of ‘Women Rowing North’ and ‘Reviving Ophelia,’ says the pandemic underscored what she long believed: that happiness is a choice and a skill. This past Christmas, she and her husband spent the day alone in their Lincoln, Neb., home, without family and friends, for the first time since their now adult children were born. ‘I thought, ‘What are we going to do?’ We went out for a walk on the prairie and saw buffalo. I ended up that day feeling really happy.’”

If happiness is a choice then it is not a choice that I can make for you; if happiness is a skill then it’s something you have to learn on your own. Perhaps I can help you out – I can help you learn about happiness activities like gratitude exercises, meditation, and mindfulness – but the rest is then up to you. If this is all we’re able to do for someone else, then perhaps the mindfulness webinars really are all we are entitled to expect from our employers.

There are a couple of worries here. First, to say that “happiness is a choice and a skill” is clearly a gross oversimplification: while serendipitous buffalo sightings will no doubt lift the spirits of many, happiness may not be so easily chosen for those who suffer from depression and anxiety. Second, while there is a lot of hype around the “skills” involved in acquiring happiness, empirical studies of gratitude interventions (as well as the notion of “gratitude” itself), meditation, and mindfulness (especially mindfulness) have had mixed results, with researchers expressing concerns over vague concepts and a general lack of efficacy, especially when it comes to those who are, again, suffering from depression and anxiety. Of course, such studies concern averages across many individuals, meaning that any or all of these activities may work for some while failing to work for others. If you find yourself a member of the former group, then that’s great. A concern, however, is that claims that there are simple skills that can increase happiness are still very much up for debate within the psychological community.

Of course, those working from home will likely have much more practical roots of their decreased happiness; a guided meditation session over Zoom will not, for instance, ameliorate one’s childcare needs. Here, then, is the second worry: there are potentially much more practical measures that employers could take to help increase the happiness and well-being of employees.

For comparison, consider a current debate occurring in my home province of Ontario, Canada: while the federal government has made certain benefits available to those who are forced to miss work due to illness or need to quarantine, many have called on the provincial government to create a separate fund for paid sick days. The idea is that since the former is a prolonged process – taking weeks or months for workers to receive money – it disincentivizes workers from taking days off when they may need to. This can result in more people going into work while sick, which is clearly something that should be minimized. The point, then, is that while recommendations for how you can exercise at your desk may be popular among employers, it seems that it would be much more effective to offer more practical solutions to problems of employee well-being, e.g., allowing for more time off.

The question of what an employer owes its employees is, of course, a complex one. While there are clear cases in which corporations fail to meet even the most basic standard of appropriate treatment of their employees – e.g., the recent Amazon debacle – it is up for debate just how much is owed to those with comparatively much more comfortable jobs working from home. Part of the frustration, however, no doubt stems from the fact that if employers are, in fact, concerned about employee well-being, then there are probably better ways of increasing it than offering yet another mindfulness webinar.

The Short- and Long-Term Ethical Issues of Working from Home

photograph of an empty office looking out over city

The COVID-19 pandemic has resulted in a shift of working habits, as almost half of the U.S. workforce is now able to work from home. This has led many to ask whether it might be the beginning of a more permanent change in working habits. Such a move carries significant ethically-salient benefits and drawbacks regarding urban development, climate change, mental health, and more.

Despite the apparent novelty of the idea of having many people permanently working from home, this was the norm for most people for most of human civilization. It was the Industrial Revolution, along with 18th- and 19th-century advances in transportation, that created our need for a separate place of work. Two hundred years ago most people in North America lived and worked on farms, and artisans made textiles and other goods largely from home. The steam engine made possible centralized locations for efficient mass production. Even early industrial production still relied on the “putting-out system,” in which centralized factories would make goods and then subcontract the finishing work on the item to people who worked from home. In other words, the concept of “going to work” every day is a relatively recent invention in human history.

This change had many far-reaching effects. The need to be close to work resulted in urbanization; in the United States, the share of the population living in urban areas jumped from 6% in 1800 to 40% in 1900. Urban development and infrastructure followed suit. Artisans who once worked for a price for their goods now worked for a wage for their time, and work that was once governed by sunlight became governed by the clock. Our political and social norms all changed as a result, in ways that affect us today. It’s no surprise, for instance, that the first labor unions were formed during this time, as employees began working together in common spaces. Returning to the working habits of our ancestors could have similarly profound effects that are difficult to imagine today; however, there are several morally salient factors that we can identify in a 21st-century context.

There are several moral advantages to having more people work from home rather than going to work every day. Working from home during COVID is obviously a move directed at minimizing the spread of the virus. However, permanently working from home also permanently reduces the risk of spreading other infections in the workplace, particularly if it involves less long-distance travel. Approximately 14 million workers in the United States are employed in occupations where exposure to disease or infection occurs at least once per week. Reducing physical interaction in the workplace, and thereby minimizing infections within it, can improve productivity.

In addition, fewer people going to work means less commuting. 135 million Americans commute to work, and avoiding the commute could save an employee up to thousands of dollars per year. The shift has secondary effects as well: less commuting means less wear and tear on public infrastructure like roads and highways, and less congestion in urban areas. This is helpful because new infrastructure projects are having a hard time keeping up with increases in traffic congestion. Such changes may also help with tackling climate change, since transportation accounts for about 30% of U.S. greenhouse gas emissions.

On the other hand, it’s possible that working from home could be more harmful to the climate. Research from WSP UK shows that remote working in the UK may only be helpful in the summer. They found that environmental impacts could be higher in the winter due to the need to heat individual buildings instead of a single office building which can be more efficient. In other words, the effect on climate change may not be one-sided.

Working from home can also be less healthy. For example, the concept of the sick day is heavily intertwined with the idea of going to a workplace. The temptation may be to abolish the sick day altogether, the reasoning being that its whole point is to stay home and avoid making co-workers sick. However, even if one can work from home, our bodies need rest. Workplace experts have found that those who work from home tend to continue working while sick, and this may lengthen recovery time, lead to burnout, and ultimately lead to less productivity. It can also be unhealthy to develop an “always on” mentality in which the line between work and home becomes blurred. According to a recent Monster survey, 51% of Americans admitted to experiencing burnout while working from home, as the place of rest and relaxation merges with the place of work. This may increase mental health problems among workers while leaving them more physically isolated from their fellow workers.

Another potential downside centers on the employer-employee relationship. For example, working from home permanently allows employees to reside in areas where the cost of living is cheaper. This may mean salary reductions: with a larger pool of potential employees to choose from, a business can offer lower, but still locally competitive, salaries. Facebook has already made moves in this direction. Job searches will also become more competitive, which could drive salaries down even further. At the same time, large offices will no longer be needed, and larger urban areas may see decreased economic activity and a drop in the value of office buildings.

The shift also means that an employer is able to infringe on the privacy of home life. Employers are now tracking employees at home to ensure productivity, with software able to track every word typed, GPS location, and even to use a computer’s camera. In some cases, these features can be enabled without an employee even knowing they are being monitored. This will only exacerbate a long-standing ethical concern over privacy in the 21st century.

Finally, it is morally important to recognize that a large-scale shift to working from home could have disproportionate effects on different communities and job sectors. The service sector may struggle in areas that no longer see heavy workplace traffic. And because plumbers, electricians, and other tradespeople cannot work from home, certain industries simply cannot make the move. Service industries are often segregated by race and gender, ensuring that the opportunities afforded by working from home will not be equitably shared. It also means that disruptions in these industries caused by the shifting work habits of others could be disproportionately felt.

A permanent shift towards remote working carries certain specific moral concerns that will need to be worked out. Whether it leads to more or less productivity, a greater or smaller carbon footprint, and so on, will depend on the specific means used and on the new work habits we adopt over time as new laws, policies, and regulations are formulated, tested, and reformed. In the long term, however, the most significant ethical impacts could be the radical social changes it may cause. The shift from working at home to working at a centralized workplace dramatically changed society in less than a century, and the shift back may do the same.

Stories of Vulnerability: COVID-19 in Slaughterhouses

photograph of conveyor line at meat-packing plant

Cases of famous people who have contracted COVID-19 have made headlines. Tom Hanks and Rita Wilson tested positive and later recovered. U.K. Prime Minister Boris Johnson wound up in intensive care. Many professional athletes have contracted the disease. More often than not, however, when we zoom in on coronavirus hotspots, we find that stories about vulnerability come into focus. Many of these stories go unheard unless they cause hardship or inconvenience for groups with more power.

One such case has to do with the production and slaughter of animals that people consume for food. Across the country, there are meat shortages caused by coronavirus. For example, nearly 1 in 5 Wendy’s restaurants has run out of beef, and at many locations other meat products such as pork and chicken are unavailable as well. Supermarkets are also facing shortages. The reason is that the conditions in slaughterhouses are particularly conducive to the spread of coronavirus. Hot spots are popping up at many such sites. 700 employees at a Tyson factory in Perry, Iowa tested positive. At a Tyson plant in Indiana, 900 employees tested positive. According to a CDC report, across 19 states there have been 4,913 cases of coronavirus among slaughterhouse employees. So far, there have been 20 deaths.

Slaughterhouses, also known as meat packing plants, are the next stop for most farm animals after their time in factory farms. When animals like pigs and chickens arrive, they are put on conveyor belts, stunned, then killed. Their bodies are then sent to a series of “stations” where people carve them up for packaging and, later, consumption.

Work in a slaughterhouse is both physically and psychologically strenuous. Carving flesh and bone requires real effort, and many employees sweat profusely while doing it. The sheer volume of animals that need to be carved up to satisfy the American appetite for meat ensures that employees work together, standing shoulder to shoulder, in spaces that are often poorly ventilated.

This kind of work is not highly sought after for obvious reasons. It is unpleasant. As is so often the case in the United States, unpleasant work is done by those who struggle to find employment—often undocumented immigrants and people living in low-income communities. This complicates the problems with coronavirus spread in several ways. First, employees often do not speak English fluently, so conveying critical information about the virus is difficult. Second, it is common for members of these communities to live in large families or groups. Third, low-income communities are frequently places that are densely populated. All of these factors contribute to more rapid spread of the virus.

In response to the meat shortage, President Trump signed an executive order declaring that meat processing plants are critical infrastructure in the United States. There is disagreement among legal experts about what this means. Some argue that the president doesn’t have the authority to require that slaughterhouses remain open when their continued operation puts employees’ health in jeopardy. One interpretation is that the order simply exempts slaughterhouses from shutdown orders issued by governors. Despite the executive order, plenty of slaughterhouses have closed because they simply don’t have the healthy staff required to carry on.

Those who are supportive of the order are pleased that it provides support to companies that sell meat. Many Americans also approve because it appears that they can continue to put meat on their plates to feed their families and to satisfy their own gustatory preferences. Others approve of the order because they are concerned about the well-being of animal agriculture more broadly. Factory farms raise astonishing numbers of animals every year. The owners of these facilities are not breeding and raising them because they love animals and want thousands of pigs for pets. In these facilities, animals are treated as products to be bought and sold. During the pandemic, new animals are being born and there is no place to put them. The response, in many cases, has been to kill the older animals en masse. For example, Iowa politicians sent a letter to the Trump administration asking for assistance with the disposal of the 700,000 pigs that must now be euthanized each week across the country. The same problem exists for all species of farm animals. People are concerned that this might mean devastation for animal agriculture.

On the other side, many say “good riddance!” Animal agriculture is a cruel and inhumane industry. The pandemic has few silver linings, but one of them is that it brings injustices that might previously have been hidden into the public eye. Our system of animal agriculture could not exist without exploitation of the most vulnerable members of our communities. Slaughterhouses employ vulnerable workers in unsafe working conditions. Factory farms and slaughterhouses abuse and kill animals that cannot defend themselves. Maybe it is finally time for all of this cruelty and suffering to end. In his executive order, President Trump identified slaughterhouses as critical infrastructure. This means that such places are essential, necessary for the proper functioning of our communities. Since consuming the bodies of slaughtered animals is not necessary for human survival, this designation doesn’t seem appropriate.

What’s more, the conditions present in factory farms are exactly the kind that lead to the spread of zoonotic diseases. It appears that the coronavirus jumped from pangolin to human in a wet market in Wuhan. On other occasions, however, diseases spread in factory farms and slaughterhouses—diseases like the swine flu and mad cow disease. Other flus, like the avian flu, are believed to have originated in wet markets in China, but involved animals, chickens and ducks, that we regularly farm for food in the United States. One way that we can help to prevent the transmission and spread of zoonotic diseases is to stop consuming meat.

For those who love the taste of meat, there are alternatives. Beyond Meat and Impossible, plant-based products engineered to closely resemble meat in taste, texture, and appearance, are thriving in general and doing exceedingly well during the pandemic in particular. In vitro meat, a cultured product grown from a biopsy taken from an animal, is produced in laboratory conditions rather than slaughterhouse conditions and is, therefore, likely to be much safer.

The pandemic shines a light on some of the ways in which our systems of food production exploit the vulnerable—both employees at risk for disease and the animals people put on their plates. Rather than issuing executive orders protecting this industry, perhaps it’s time to dismantle it altogether.

Incentive, Risk, and Oversight in the Pork Industry

photograph of butcher instruction manual with images of different cuts of meat of pig

On September 17th, the U.S. Department of Agriculture announced an updated rule set for pork industry regulators; in addition to removing restrictions on production line speed limits, the Food Safety and Inspection Service (FSIS) will soon allow swine slaughterhouses to hire their own process control inspectors to maintain food safety and humane handling standards instead of relying on government monitors. Critics argue that this move is an unconstitutional abuse of power that will likely lead to less secure operations, thereby increasing the risk to animals, workers, and consumers.

Under the current system, hog slaughterhouses are allowed to slaughter a maximum of 1,106 animals per hour (roughly one pig every 3.3 seconds) and must operate under the watch of multiple FSIS employees. These inspectors review each animal at several points in the killing and disassembly process, ensuring their proper handling and removing animals or carcasses from the line that appear to be sickly or otherwise problematic. Notably, these monitors have the authority to both slow down and stop the production line in the interest of preserving sanitary conditions.

But under the New Swine Slaughter Inspection System (NSIS), the limit on per-hour animal slaughter will be removed and pork producers will be allowed to hire employees of their own to replace FSIS inspectors, thereby allowing the FSIS to reassign its monitors elsewhere. Proponents of the move suggest that this deregulation will promote efficiency without increasing overall risk. As Casey Gallimore, a director with the North American Meat Institute (a trade organization supporting pork and other meat producers) explains, the industry’s new hires will be highly trained and FSIS inspectors will still have a presence inside farming operations; whereas a plant might have once had seven government monitors on its production line, “There’s still going to be three on-line [FSIS] inspectors there all of the time.”

Overall, industry groups estimate that, under these new rules, as much as 40% of the federal workforce dedicated to watching over the pork industry will be replaced by pork industry employees. Given that a 2013 audit of FSIS policies indicated that their current implementation was already failing to meet expectations for worker safety and food sanitation, it is unclear how reducing the number of FSIS employees will improve this poor record.

For critics, removing speed limits drastically increases the risk to slaughterhouse employees and introducing corporate loyalty into the monitoring equation further threatens to dilute the effectiveness of already-flimsy federal regulations on slaughterhouse management. Because industry employees will remain beholden to their corporate bosses (at the very least, to the degree that those bosses sign their paychecks), they will have fewer incentives to make decisions that could feasibly impact profitability – particularly slowing or stopping the production line. 

According to Marc Perrone, president of the United Food and Commercial Workers International Union (which represents at least 30,000 employees of the pork industry), “Increasing pork-plant line speeds is a reckless corporate giveaway that would put thousands of workers in harm’s way as they are forced to meet impossible demands.” The FSIS argues that available public data suggests that faster line speeds don’t threaten worker safety; currently, though, there is no national database specifically designed to track packing house injuries and accidents.

It might be the case that industry officials will be able to consistently promote the safety and security of the employees under their care, but a concern reflected by Socrates gives us cause to be skeptical. In Book III of The Republic, Plato has Socrates discuss the nature of the ruling guardian class in his idealized city, often called the “philosopher-kings.” Socrates insists that, because the guardians are naturally inclined to be virtuous and have been carefully trained within a structured society designed to promote their inborn goodness, they do not, themselves, need guardians of their own – indeed, one of Socrates’ interlocutors even jokes “that a guardian should require another guardian to take care of him is ridiculous indeed.” Centuries before Juvenal asked “But who is to guard the guards themselves?,” Plato argued that the best guards would not actually need guarding at all.

Later philosophers would lack Plato’s optimism; ethicists would construct normative systems with plenty of rules to advise the less virtuous, constitution writers would build layers of checks and balances into divided branches of government, and policy makers would indeed insist on impartiality as a necessary condition for truly effective monitoring. Unless the pork industry can provide us some reason to think that the NSIS inspectors they’ll soon be hiring have been “framed differently by God…in the composition of these [God] has mingled gold” (who have, furthermore, cultivated that virtue over a lifetime of study and practice), we have good reason to be skeptical that they do not, themselves, need watching.

For what it’s worth, Socrates also thought that the guardians should not be allowed to own private property, but that might really be asking too much of the pork industry.

The Problem with Uber

"Dundas Square" by Michael Gil licensed under CC BY 2.0 (via Flickr)

Uber has been taking the world of city transit by storm and has become the poster-child of the modern “gig economy.” At first glance, Uber’s service seems like a universal improvement over traditional taxi services: the app makes hailing a cab convenient for passengers, and drivers are given complete flexibility to work as little or as much as they like.

Continue reading “The Problem with Uber”

Workers’ Rights in the “Gig Economy”

Working an inflexible nine-to-five schedule is often not conducive to the demands of ordinary life. Parents find themselves missing events at their children’s schools that occur during the day. Cautious workers manage their sick days conservatively, not knowing what health challenges the year might bring. Taking a day to care for personal psychological health strikes many as an impractical luxury.

Continue reading “Workers’ Rights in the “Gig Economy””