Apple vs DOJ Doesn’t Really Matter

Anyone remember the Crypto Wars of the 1990s? Back in the early 1990s, the U.S. placed strict regulations on the export of cryptography and even put encryption technologies on the munitions list as auxiliary military equipment. This restriction was a real burden on software firms like Lotus, Microsoft, and Novell, which wanted to offer data confidentiality and integrity features to PC users. Eventually, the NSA offered a compromise by approving weak 40-bit encryption for export.
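
Just how weak is 40-bit encryption? A quick back-of-the-envelope sketch in Python makes the point; the trial rate below is an assumption, not a benchmark, but it shows the keyspace falls to brute force in days rather than centuries.

```python
# Rough estimate of how long exhausting a 40-bit keyspace takes.
# The trial rate is a hypothetical figure chosen for illustration.
keyspace = 2 ** 40                    # ~1.1 trillion possible keys
trials_per_second = 10_000_000        # assumption: 10 million keys tried/sec
seconds = keyspace / trials_per_second
print(f"{keyspace:,} keys -> ~{seconds / 86_400:.1f} days to exhaust")
# 1,099,511,627,776 keys -> ~1.3 days to exhaust
```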

This was far from the end of the battle, however. In 1993, the U.S. government (via the NSA) developed and promoted a technology called the Clipper chip, capable of encrypting telecommunications while providing a “back door” for government surveillance. Meanwhile, civil libertarians and privacy advocates fought in court to loosen government encryption restrictions.

Rise of PGP

While the government continued to fight for control of encryption on legal and technical fronts, a grassroots effort made all the other cryptographic battles moot. In 1991, a relatively unknown software engineer named Phil Zimmermann developed encryption software he named “Pretty Good Privacy” (PGP), the first widely available software program using public-key cryptography. Zimmermann sent PGP around to like-minded friends, and it was soon available for download on FTP servers all over the world. Suddenly, strong cryptography was pervasive.
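
PGP’s core trick, hybrid encryption, still underpins secure messaging today: encrypt the message with a fast symmetric session key, then encrypt that session key with the recipient’s public key. Here is a minimal sketch of the pattern using Python’s cryptography package rather than PGP itself; the key size and message are illustrative.

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Recipient generates a key pair; the public half can be published openly.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Sender: encrypt the message with a fresh symmetric session key...
session_key = Fernet.generate_key()
ciphertext = Fernet(session_key).encrypt(b"meet me at the usual place")

# ...then wrap the session key with the recipient's public key.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(session_key, oaep)

# Recipient: recover the session key with the private key, then decrypt.
recovered_key = private_key.decrypt(wrapped_key, oaep)
print(Fernet(recovered_key).decrypt(ciphertext))  # b'meet me at the usual place'
```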

So between public outcry, industry pressure, legal challenges, international competition, and open source projects, the U.S. government realized it couldn’t control cryptographic technology after all. Ultimately, it abandoned the Clipper chip fight in 1996 and loosened export restrictions in the late 1990s, culminating in simplified Department of Commerce rules in 2000.

What History Teaches Us

This history is worth reviewing when considering the current Apple vs. DOJ battle. My thoughts:

  • The first issue in play is relatively myopic: the DOJ wants Apple to hack into a specific iPhone that belonged to one of the San Bernardino terrorists. My assumption is that this wouldn’t be too difficult for Apple from a technology standpoint, as it could simply go into the iOS source code, find an exposed service or software vulnerability, and develop an exploit to break into the phone. Okay, but this process would violate Apple’s stated privacy policy and its commitment to customers.
  • Aside from its reputation, Apple is really concerned about the possible precedent here. If Apple is willing to break into one phone, why not 10, 100, 1000, or more? The Edward Snowden episode demonstrated that the NSA has an insatiable appetite for private data. Apple is worried that its cooperation with the FBI/DOJ would end up sucking the company into a greater government surveillance vortex with no exit. Apple supporters like Facebook, Google, and Microsoft are also concerned that if Apple gives in, others will be forced to follow.
  • And while the DOJ tries to focus the current brouhaha on a single phone, the bigger question is clearly the tension between privacy and national security. Political candidates, law enforcement officials, and intelligence agencies are positioning Apple vs. DOJ as the tip of the spear in a greater push for “back doors” into encryption technology so that the U.S. good guys can catch criminal and terrorist bad guys.

Yup, this issue may be a great one for soundbites and political speeches, but history should teach us that we are really wasting our time. Organized criminals and terrorists already understand their outlaw status. To maintain anonymity and hide their data, they’ve adopted stealth tactics, including the use of commercial and open source encryption technology.

Will Back Doors Work?

So if the NSA had access to a “back door” into my iPhone, Android device, or Windows PC, or if these companies were willing to break into devices on the government’s behalf, I could simply install layers of add-on hardware- and software-based cryptographic technologies developed all over the world. Heck, I could even write my own algorithm that no one has ever seen before. The NSA could throw all the Cray computers it has at Ft. Meade at this and never break the code.
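
To make that layering concrete, here is a minimal Python sketch stacking two independently keyed symmetric layers (again using the cryptography package; the keys and message are illustrative). Compromising one key, via a back door or otherwise, still leaves the other layer intact.

```python
from cryptography.fernet import Fernet

# Two independently keyed layers. A back door that surrenders one key still
# leaves the eavesdropper staring at the other layer's ciphertext.
inner_key, outer_key = Fernet.generate_key(), Fernet.generate_key()

plaintext = b"my private note"
layered = Fernet(outer_key).encrypt(Fernet(inner_key).encrypt(plaintext))

# Decryption peels the layers off in reverse order.
recovered = Fernet(inner_key).decrypt(Fernet(outer_key).decrypt(layered))
assert recovered == plaintext
```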

In summary, my point is pretty simple: the encryption genie has been out of the bottle for over 20 years, and there’s no way to put it back in. People wanted technologies to protect their privacy and received them, and these technologies are readily available to model citizens and dirtbags alike. “Back doors” in U.S.-developed technologies won’t work and will only make the U.S. look like a police state. Apple wants no part of this plan.

Jon Oltsik is a senior principal analyst at ESG, an integrated IT research, analyst, strategy and validation firm. Read more ESG blogs here.