Why You Should Side With the FBI, Not Apple, in the San Bernardino iPhone Case

The ongoing face-off between the FBI and Apple, stemming from a federal court order issued Feb. 16 that would force the company to unlock the iPhone used by one of the suspects in the San Bernardino terrorist attacks, has little to do with government surveillance powers and even less to do with imperiling the security of dissidents around the world.

That’s just what the post-Snowden cottage industry of privacy-at-all-costs advocates, and Apple, want you to believe.

“The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers,” wrote Apple CEO Tim Cook, in a Feb. 16 letter to Apple customers posted on the company’s website. “The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.”

Cook clearly bases his argument on the overused David versus Goliath model, pitting the great American technology company against the great Satan of the American surveillance state. But that approach glosses over the questions that are really at the heart of this debate: Do we want to live in a country where consumer technologies can be used to carry out horrific acts of violence and then rob victims of justice by encrypting the evidence for eternity? And should private companies like Apple and Google get to dictate the balance between our privacy, our safety, and the public interest?

To those who take a purely academic approach to these questions, the answers are “complicated,” buried deep within a tangled web of “policy implications.” But to a growing number of average Americans—who have had no choice but to stand by and watch as Silicon Valley absconded with their personal information and sold it to the highest bidder—these questions have rational answers.

This case is not about the FBI testing the limits of its surveillance powers and trying to establish precedent for strong-arming companies into creating so-called backdoors to encryption and other security protections. It’s about our ability as a society to provide for the common defense against real enemies of safety and security—not the perceived enemies that some would have you believe are sitting in the basement at NSA headquarters in Fort Meade, Md., right now trying to read your emails and text messages.

Americans look at this debate and see an increasing number of violent crimes unrelated to terrorism going unsolved or languishing in the criminal justice system because our law enforcement agencies can’t get access to commercial devices that likely hold critical evidence of wrongdoing.

“A part that gets confusing to me is when people talk like we want access to company’s servers, we want access to their source code,” said FBI Director James Comey, during testimony Feb. 9 before the Senate Select Committee on Intelligence. “What we would like is a world where people are able to comply with court orders. It’s not about us trying to get a backdoor. … I don’t want a door, I don’t want a window, I don’t want a sliding glass door. I would like people to comply with court orders.”

What’s worse, Apple and the privacy-at-all-costs community argue that changing the legal framework to help protect citizens in the U.S. from acts of terrorism and other violent crimes facilitated by these commercial devices would somehow put the future of mankind at risk by giving rise to authoritarian governments in every clime and place, from Silicon Valley to Samoa.

Protesters march against China’s censorship of the Internet at the Doo Dah Parade on Jan. 18, 2009, in Pasadena, Calif. (Photo: Shutterstock)

Such an absurd prediction ignores the reality that our digital privacy is already gone. Authoritarian regimes—even that futuristic American boogeyman that hides under our beds—already have easy access to the technological tools of political control. A world in which we give our most personal information to Facebook, Google, and Apple so they can profit from it, but are too paranoid to even consider finding a way to help the government protect us from real dangers is a world turned upside down.

There’s also plenty of hypocrisy to point out. How quickly people forget what Silicon Valley has been willing to do to gain access to markets in repressive societies. In 2006, for example, I called for a boycott of Google after the search engine giant cooperated with the Chinese government to develop a censored version of its search engine for use in China. The Chinese version of Google not only filtered out controversial topics, like democracy, but deliberately returned results full of official Chinese government propaganda. And Google is not alone in accepting the so-called “cost of doing business” in China.

The bottom line is that questions of security and privacy should not be left to the likes of Apple and Google to determine. Nor should they be left to the courts. These are matters that Americans must decide for themselves through laws and regulations passed by our elected representatives in Congress. And if 86 percent of 18- to 25-year-old technology students believe curing cancer or Alzheimer’s disease is more important than personal privacy (as MeriTalk found in a recent national survey), Silicon Valley and the privacy-at-all-costs industry are just going to have to suck it up.
