Forbidden Fruit: Apple, the FBI and Institutional Ethics
A birthday, a pet’s name, the nostalgia of a high school jersey number: the composition of an iPhone password can seem so simple. But a recent case brought by the FBI against Apple has turned the integrity of these passwords into a flashpoint, sparking debate over privacy and security. A California court ordered Apple to produce a feature that would circumvent software preventing the FBI from accessing the phone of Syed Farook, who, along with his wife, carried out the San Bernardino terrorist attack. The couple died in a shootout following their heinous assault, and the FBI seized their electronics. They had smashed their personal cell phones and tampered with their laptop’s hard drive, but Farook’s work phone, an iPhone 5c, was found undamaged in his car.
Apple is challenging the ruling, claiming that it sets a dangerous precedent for consumer security. Apple CEO Tim Cook said of the case, “Some things are hard and some things are right. And some things are both. This is one of those things.” The conflict between Apple and the FBI seems dichotomous, with Apple standing up for privacy rights and the FBI trying to bolster its investigative powers. But in analyzing how the case has played out within the US legal system, moral dilemmas arise concerning US governance structures and the process by which novel legal developments emerge.
After a clerk mistakenly reset the iCloud password on the device, and because of strengthened encryption new to the iOS 9 platform, there is no way to access the data other than manually entering the password. Unfortunately for the FBI, the phone is also equipped with a feature that erases its contents after 10 failed password attempts, making a “brute force” effort of trying every possible combination impossible. The FBI therefore filed for a court order compelling Apple to produce software, dubbed FBiOS by technologist Dan Guido, that would circumvent this auto-erase feature. A California court approved the request, prompting Apple to challenge the ruling through legal maneuvers of its own.
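To see why the auto-erase limit defeats brute force, consider a minimal Python sketch. This is purely illustrative, not Apple’s actual implementation: the four-digit password and the 10-attempt limit stand in for the device’s real behavior, and `SECRET` is a made-up value.

```python
# Illustrative sketch (NOT Apple's implementation): a 4-digit password
# has only 10,000 possibilities, but a device that wipes itself after
# 10 failed attempts stops a brute-force search almost immediately.
SECRET = "7294"        # hypothetical password on the device
MAX_ATTEMPTS = 10      # the auto-erase limit described above

def brute_force(secret, max_attempts):
    """Try every 4-digit code in order until the device 'wipes'."""
    attempts = 0
    for guess in (f"{n:04d}" for n in range(10_000)):
        attempts += 1
        if attempts > max_attempts:
            return None, attempts - 1   # device erased; data is gone
        if guess == secret:
            return guess, attempts      # lucky early hit
    return None, attempts

found, tried = brute_force(SECRET, MAX_ATTEMPTS)
print(found, tried)   # → None 10
```

Unless the password happens to fall among the first ten guesses, the attacker exhausts the attempt budget and the data is destroyed, which is why the FBI needed Apple to disable the limit rather than simply guessing.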
Apple is taking a hard stance against any action to circumvent the password feature. Apple CEO Tim Cook has stated, “The only way to get the information – at least currently, the only way we know – would be to write a piece of software that we view as sort of the equivalent of cancer. We think it’s bad news to write. We would never write it. We have never written it – and that is what is at stake here.”
Although Apple has complied with FBI requests to unlock phones in the past, the unique nature of this request has much broader implications for cyber security and privacy issues. Apple, and other tech giants like Google and Twitter, see this as an infringement on digital security that is a threat not only to iPhone users, but to cyber security as a whole. From Apple’s perspective, this case sets a dangerous legal precedent that would grant the government incredible leverage in compelling tech companies to manipulate their technology in order to facilitate backdoor access for government intrusion and investigation in private data.
They argue that creating this loophole would open vulnerabilities in iPhone security, as the requested software could be stolen or repurposed. The FBI would also be in a position to use the software again and again, a development Apple opposes even though each use would be legal if a search warrant were granted on sufficient probable cause. Cook writes, “In the wrong hands, this software…would have the potential to unlock any iPhone in someone’s physical possession.” It could also allow other nations, notably China and Russia, to make compelling cases for even more security concessions from global technology firms, putting dissidents and everyday people around the world in danger of government intrusion.
The FBI may be using the publicity and fear of a terrorist attack to rally sentiment for more aggressive investigative techniques, while Apple capitalizes on an opportunity to, at least superficially, challenge what many of its customers see as government intrusion. Members of the FBI have accused Apple of putting “public brand marketing strategy” over the security concerns of the American people. These self-interested motivations only underscore the need for more robust dialogue and investigation into these issues.
Technology experts argue that the United States has more to gain from maximizing security than from creating any new vulnerabilities. Even a former NSA chief argues that the US benefits more from a stringent commitment to security, in the form of sound encryption and data protection, than from data seizure through these means. In this view, privacy and security both lead to the same thing: stronger, not weaker, iPhones. A government mandate for backdoor access could follow in the wake of this case, raising technical questions about the weaknesses it would introduce into encryption. Many argue that there is no middle ground in encryption technology, and that data is either secure or it isn’t. “It would be great if we could make a backdoor that only the FBI could walk through,” says Nate Cardozo, an attorney with the Electronic Frontier Foundation. “But that doesn’t exist. And literally every single mathematician, cryptographer, and computer scientist who’s looked at it has agreed.”
If the FBI is successful, the precedent could allow it to compel a company to manufacture something that destabilizes the security of products already in the hands of consumers. A submission to the court on behalf of Apple offers analogies for this development, “like compelling a pharmaceutical company against its will to produce drugs needed to carry out a lethal injection in furtherance of a lawfully issued death warrant or requiring a journalist to plant a false story in order to help lure out a fugitive.”
The FBI argues that this is a unique case, and that Apple’s unwillingness to oblige its request may be impeding access to valuable information concerning domestic terrorism. James Comey, Director of the FBI, states, “We don’t want to break anyone’s encryption or set a master key loose on the land. … Maybe the phone holds the clue to finding more terrorists. Maybe it doesn’t. But we can’t look the survivors in the eye, or ourselves in the mirror, if we don’t follow this lead.” Advocates for increased law enforcement capabilities ask, “How is not solving a murder, or not finding the message that might stop the next terrorist attack, protecting anyone?” Comey criticized Apple and its supporters for helping “bad guys,” asking, “have we become so mistrustful of government and law enforcement in particular that we are willing to let bad guys walk away, willing to leave victims in search of justice?”
Looking at the institutional atmosphere in which this conflict is unfolding reveals a lack of constitutive dialogue within elected governance structures on this imperative issue. The most recent legal authority cited in the FBI’s case dates to 1977, and its argument rests primarily on the All Writs Act of 1789. With the primary legal instrument predating Alexander Graham Bell’s invention of the telephone by 87 years, there is a clear vacuum of law bearing in any way on the technological developments and cyber security pressures of the 21st century. This is primarily due to the dearth of legislation coming out of Congress on these issues. A lack of cyber security expertise and an absence of political will have left the intersection of privacy and security without congressional leadership.
Some members of Congress have proposed a Commission on Cyber Security and Privacy, while others have suggested that the Apple vs. FBI dispute be put into legislative hands. More opinionated members of Congress are calling for legislation that would force companies like Apple to unlock the phones of criminals. Beyond these limited efforts, there is little genuine dialogue in the legislative chambers of the federal government concerning privacy and security, which stifles the mechanisms by which constituents come to understand their representatives’ positions, especially as the public and the Federal government try to cope with a rising fear of terrorism in a post-Snowden world.
Cook suggests that this ruling would “trample on civil liberties,” while Comey admits that “the larger question should not be settled in the courts” and that it extends to “who we want to be as a country, and how we want to govern ourselves.” This reveals the need for legislative processes at the federal level. Leaving the matter entirely to the courts steers the debate away from the stewardship of elected institutions. This issue is representative of the personal and national values we wish to carry into the technological future, and stagnation in Congress is limiting the ability of US government structures to develop policy based on constructive debate and constituent input.