Government Gets a Back Door into Your iPhone; Apple Vows to Fight Back

A federal magistrate judge in California yesterday ordered Apple to create a “back door” into encrypted iPhones in order to assist the FBI’s efforts to break into a phone owned by Syed Rizwan Farook, one of the terrorists who killed 14 people in San Bernardino. Apple has vowed to fight back and is framing the fight as one to protect its users’ privacy. (For those of us who might otherwise be smug in our use of Androids, the exact same issue applies to Google’s OS. This case just happens to involve Apple.)

In fact, the Government’s application to the court – which mirrors a similar application pending in a Brooklyn federal court – relies on a very aggressive interpretation of the All Writs Act, which allows courts to “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.”  It also ignores the fact that this same Department of Justice explicitly considered asking Congress to force Apple, Google and others to create a back door to their technologies and rejected the option.  Apparently, rather than test this important policy question in Congress, DOJ has decided to go straight to the courts.

It would be one thing if Apple already possessed the technology necessary to give the FBI access and was refusing to employ it to assist law enforcement’s execution of a valid warrant.  But all parties acknowledge that Apple would have to create the “back door” technology if the court’s order stands (apparently the Government lacks the expertise to do it itself).  If that’s the case, where does the Government’s power end?  Apple CEO Tim Cook put it well:

“If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.”

I share his concerns.  As a former federal prosecutor, I have seen first-hand that law enforcement often pursues its goals with an impressive and admirable dedication, but also with a blind spot for important societal implications.  While it’s hard to argue with getting all the information you can from a phone that belonged to a terrorist and mass murderer like Farook, it is precisely in difficult cases like this one that bad law can be made or important civil rights can be protected.  Judges need to understand that the easy decision is not always the right one.

There’s one more, very practical issue at play here.  Any “back door” that Apple creates and shares with the Government would inevitably leak to hackers, giving criminals a new tool to victimize millions of people around the globe.  As a former federal employee whose personal information was hacked while stored on government servers (though the government did not tell us until months later), I have little faith in the government’s ability to keep anything secure (WikiLeaks, anyone?).  Are we willing to create a new class of victims in the course of investigating a particularly heinous crime?  And who gets to decide which investigations warrant that kind of trade-off?

The Government’s singular focus on terrorism has already drained resources from other types of important criminal investigations and impacted our daily lives in untold ways.  This case presents an appropriate vehicle for the three branches of government to consider just how far we want to stray from our values in the name of fighting terror.