We rarely pause to reflect upon how encryption entered the public domain. Prior to the late ’70s, encryption was a domain for the military, with very little utility for the average person. Computers were expensive hobbies for cashed-up geeks, ARPANET was the Internet, and mobile phones weren’t even a dream.
What I have written below is taken from Steven Levy’s “Crypto”, a book that should sit on every digital privacy enthusiast’s bookshelf. The following is a very high-level overview of events in the US, many of which coincided with and influenced each other. Everything is part of the decades-long “crypto war”. As I have written previously, this crypto war re-emerges approximately every decade; however, the arguments against encryption are beginning to change.
In the ’70s, one could buy puzzle books and books about substitution ciphers; however, precious little was available at the intersection of cryptography and the digital realm. There was no public key cryptography, no digital key exchange — heck, no software to generate keys — no symmetric ciphers for devices, and no devices even powerful enough to handle the complex computations for encryption. Today, apps such as Signal abstract these key ideas away from the average user, and mobile devices are easily powerful enough to perform encryption operations.
Without freedom of speech victories in the ’70s, ’80s, and ’90s, digital privacy — public key cryptography, key generation/exchange, symmetric encryption, etc. — wouldn’t be sitting in your pocket today.
The NSA vs. Academics
In 1975, the NSA targeted the National Science Foundation, an independent government agency for scientific research. This scientific research included cryptography research grants to private mathematicians and scientists, including cryptographers such as Whit Diffie and Martin Hellman.
The NSA attempted to flex its muscle by telling the NSF that only the NSA conducted cryptography research. Fred Weingarten, the man responsible for monitoring grants, had this to say:
NSA is in a bureaucratic bind. In the past the only communications with heavy security demands were military and diplomatic. Now, with the marriage of computer applications with telecommunications . . . the need for highly secure digital processing has hit the civilian sector. NSA is worried, of course, that public domain security research will compromise some of their work. However, even further, they seem to want to maintain their control and corner a bureaucratic expertise in this field. . . .
It seems clear that turning such a huge domestic responsibility, potentially involving such organizations as banking, the U.S. mail, and cable televisions, to an organization such as NSA should be done only after the most serious debate at higher levels of government than represented by peanuts like me.
The NSA would eventually back down: the First Amendment protects private research, so the NSA could not legally prevent individuals from conducting and publishing cryptography research.
ITAR vs. the First Amendment
ITAR, the International Traffic in Arms Regulations, is a set of US regulations restricting the export of military-related technology. In 1977, the IEEE received a letter from an NSA employee named Joseph Meyer. The letter claimed that the IEEE was in breach of ITAR by exporting academic papers.
In 1976, Whit Diffie and Martin Hellman published “New Directions in Cryptography”, quite possibly the most important paper in cryptography research history and the original key exchange proposal. After their proposal, a flurry of cryptography research papers came from the US, with many researchers and governments around the world requesting the papers. After its failure with the NSF, the NSA was on the warpath to ensure that cryptography research stayed within the bounds of government.
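The core of the Diffie-Hellman idea is that two parties can agree on a shared secret over a public channel without ever transmitting that secret. A toy sketch in Python, using tiny textbook numbers (real implementations use vetted, much larger groups; the private keys here are hard-coded purely for illustration):

```python
# Toy Diffie-Hellman key exchange. The prime and generator are textbook
# demonstration values -- far too small for any real use.
p, g = 23, 5                   # public parameters: prime modulus and generator

a = 6                          # Alice's private key (random in practice)
b = 15                         # Bob's private key (random in practice)

A = pow(g, a, p)               # Alice sends A = g^a mod p over the open channel
B = pow(g, b, p)               # Bob sends B = g^b mod p over the open channel

alice_secret = pow(B, a, p)    # Alice computes (g^b)^a mod p
bob_secret = pow(A, b, p)      # Bob computes (g^a)^b mod p

# Both arrive at g^(ab) mod p; an eavesdropper sees only p, g, A, and B.
assert alice_secret == bob_secret
print(alice_secret)            # prints 2
```

An eavesdropper would need to recover `a` or `b` from `A` or `B` (the discrete logarithm problem), which is believed to be computationally infeasible for well-chosen large parameters.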
In 1978, the Justice Department’s Office of Legal Counsel issued an opinion on ITAR vs. cryptography research:
It is our view that the existing provisions of the ITAR are unconstitutional insofar as they establish a prior restraint on disclosure of cryptographic ideas and information developed by scientists and mathematicians in the private sector.
Again, the First Amendment protected cryptography researchers’ speech, and by now the cat was well and truly out of the bag. The NSA was forced to admit that the ITAR exemption for “technical publications” effectively neutered the NSA’s attempts to block the export of cryptography research.
Phil Zimmermann vs. the Justice Department
Phil Zimmermann is another legend in the digital privacy world. The creator of PGP, Zimmermann was accused of exporting “dangerous munitions” when re-entering the US. The “dangerous munitions”? PGP software.
Under ITAR, encryption software could only be exported if it used keys no longer than 40 bits, because the NSA had the ability to break keys of that length. Zimmermann had published PGP on the Internet with 128-bit keys. Never one to back down, Zimmermann responded to legal action against him by publishing the source code in a book, published by MIT Press, which was bought and shared worldwide.
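The gap between a 40-bit and a 128-bit key is easy to understate. A back-of-envelope comparison (the attack rate below is a hypothetical figure chosen purely for scale, not a claim about any real hardware):

```python
# Exhaustive-search cost of a 40-bit vs. a 128-bit keyspace.
keyspace_40 = 2 ** 40          # keys an exporter was limited to under ITAR
keyspace_128 = 2 ** 128        # keys PGP actually used

rate = 10 ** 9                 # hypothetical: one billion key trials per second

seconds_40 = keyspace_40 / rate
years_128 = keyspace_128 / rate / (60 * 60 * 24 * 365)

print(f"40-bit:  ~{seconds_40 / 60:.0f} minutes to search exhaustively")
print(f"128-bit: ~{years_128:.1e} years to search exhaustively")
```

At that rate the entire 40-bit keyspace falls in under 20 minutes, while the 128-bit keyspace takes on the order of 10^22 years, which is why the 40-bit export ceiling amounted to a requirement that exported encryption be breakable.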
Given that publications are protected by the First Amendment, the US government was powerless to stop the book. Eventually the case was dropped and hence never tested in court. However, the distribution of PGP would not have been possible without the First Amendment.
Bernstein vs. United States Department of State
Similar to Zimmermann, Daniel Bernstein, a professor at the University of Illinois, challenged ITAR. Bernstein wanted to export software containing an encryption algorithm.
Again, the First Amendment came to the rescue: the court held that source code is protected speech.
Digital privacy advocates should remember and maintain the culture of free speech that enabled cryptography to move from the military to the private sector. Without the US First Amendment, we may never have reached where we are today: cryptography research in the private and university sectors, source code for algorithms and software freely available, and the uninhibited creation and distribution of ideas — all necessary prerequisites for messaging apps such as Signal.
This is an important issue outside the US, too. For example, can software developers be compelled to put backdoors in their code? Under the First Amendment, such compulsion under the law is not possible. In countries such as Australia, such a law would be possible.
It’s important to remember the philosophy upon which much of digital privacy was founded. Without the philosophy — without the underpinning ideas — we are always at risk of losing what we have today. Restrictions on speech should be fought by digital privacy advocates.