# Crypto Wars

A T-shirt printed with export-restricted RSA encryption source code was itself an export-restricted munition, and was worn as a freedom-of-speech protest against US encryption export restrictions. (The shirt's back shows relevant clauses of the United States Bill of Rights under a 'VOID' stamp.)[1] Changes in the export law mean that it is no longer illegal to export this T-shirt from the US, or for US citizens to show it to foreigners.

The Crypto Wars is an unofficial name for the attempts by the U.S. and allied governments to limit the public's and foreign nations' access to cryptography strong enough to resist decryption by national intelligence agencies, especially the U.S. National Security Agency (NSA).[2]

## Export of cryptography from the United States

### Cold War era

In the early days of the Cold War, the U.S. and its allies developed an elaborate series of export control regulations designed to prevent a wide range of Western technology from falling into the hands of others, particularly the Eastern bloc. All export of technology classed as 'critical' required a license. CoCom was organized to coordinate Western export controls.

Two types of technology were protected: technology associated only with weapons of war ("munitions") and dual-use technology, which also had commercial applications. In the U.S., dual-use technology export was controlled by the Department of Commerce, while munitions were controlled by the State Department. Since in the immediate post-WWII period the market for cryptography was almost entirely military, encryption technology (techniques as well as equipment and, after computers became important, crypto software) was included as a Category XIII item on the United States Munitions List. The multinational control of the export of cryptography on the Western side of the Cold War divide was done via the mechanisms of CoCom.

By the 1960s, however, financial organizations were beginning to require strong commercial encryption in the rapidly growing field of wire money transfer. The U.S. Government's introduction of the Data Encryption Standard in 1975 meant that commercial uses of high-quality encryption would become common, and serious problems of export control began to arise. Generally these were dealt with through case-by-case export license request proceedings brought by computer manufacturers, such as IBM, and by their large corporate customers.

### PC era

Encryption export controls became a matter of public concern with the introduction of the personal computer. Phil Zimmermann's PGP cryptosystem and its distribution on the Internet in 1991 was the first major 'individual level' challenge to controls on export of cryptography. The growth of electronic commerce in the 1990s created additional pressure for reduced restrictions.[3] Shortly afterward, Netscape's SSL technology was widely adopted as a method for protecting credit card transactions using public key cryptography.

SSL-encrypted messages used the RC4 cipher with 128-bit keys. U.S. government export regulations would not permit crypto systems using 128-bit keys to be exported.[4] At this stage Western governments had, in practice, a split personality when it came to encryption: policy was made by the military cryptanalysts, who were solely concerned with preventing their 'enemies' from acquiring secrets, but that policy was then communicated to commerce by officials whose job was to support industry.

The longest key size allowed for export without individual license proceedings was 40 bits, so Netscape developed two versions of its web browser. The "U.S. edition" had the full 128-bit strength. The "International Edition" had its effective key length reduced to 40 bits by revealing 88 bits of the key in the SSL protocol. Acquiring the 'U.S. domestic' version turned out to be sufficient hassle that most computer users, even in the U.S., ended up with the 'International' version,[5] whose weak 40-bit encryption could be broken in a matter of days using a single personal computer. A similar situation occurred with Lotus Notes for the same reasons.[6]
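The arithmetic behind the "International Edition" weakening can be made concrete. The sketch below (the keys-per-second figure is an illustrative assumption, not a historical benchmark) shows why revealing 88 of the 128 key bits brought exhaustive search within reach of a single machine:

```python
# Brute-force cost of the "International Edition" scheme: a 128-bit
# RC4 key with 88 bits revealed in the SSL protocol leaves only
# 2**40 candidate keys to search.
full_keyspace = 2 ** 128            # "U.S. edition": all 128 bits secret
export_keyspace = 2 ** (128 - 88)   # "International Edition": 88 bits revealed
assert export_keyspace == 2 ** 40

# Illustrative, assumed trial rate for a 1990s personal computer:
keys_per_second = 1_000_000
worst_case_days = export_keyspace / keys_per_second / 86_400
# roughly two weeks at this assumed rate -- "a matter of days"
```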

Legal challenges by Peter Junger and other civil libertarians and privacy advocates, the widespread availability of encryption software outside the U.S., and the perception by many companies that adverse publicity about weak encryption was limiting their sales and the growth of e-commerce, led to a series of relaxations in US export controls, culminating in 1996 in President Bill Clinton signing Executive Order 13026,[7] which transferred commercial encryption from the Munitions List to the Commerce Control List. Furthermore, the order stated that "the software shall not be considered or treated as 'technology'" in the sense of the Export Administration Regulations. This order permitted the United States Department of Commerce to implement rules that greatly simplified the export of proprietary and open-source software containing cryptography, which it did in 2000.[8]

### Current status

As of 2009, non-military cryptography exports from the U.S. are controlled by the Department of Commerce's Bureau of Industry and Security.[9] Some restrictions still exist, even for mass market products, particularly with regard to export to "rogue states" and terrorist organizations. Militarized encryption equipment, TEMPEST-approved electronics, custom cryptographic software, and even cryptographic consulting services still require an export license[9] (pp. 6–7). Furthermore, encryption registration with the BIS is required for the export of "mass market encryption commodities, software and components with encryption exceeding 64 bits" (75 FR 36494). In addition, other items require a one-time review by or notification to BIS prior to export to most countries.[9] For instance, the BIS must be notified before open-source cryptographic software is made publicly available on the Internet, though no review is required.[10] Export regulations have been relaxed from pre-1996 standards, but are still complex.[9] Other countries, notably those participating in the Wassenaar Arrangement,[11] have similar restrictions.[12]

## Export of cryptography from the UK

Until 1996, the UK government withheld export licenses from exporters unless they used weak ciphers or short keys, and generally discouraged practical public cryptography.[13] A debate about cryptography for the NHS brought this out in the open.[13]

## Mobile phone signals

### Clipper Chip

RSA Security campaigned against the Clipper Chip backdoor with a memorable poster that became an icon of that debate.

The Clipper chip was a chipset for mobile phones developed by the NSA in the 1990s which implemented encryption with a backdoor for the US government.[3] The US government tried to get phone manufacturers to adopt the chipset, but without success, and the program was defunct by 1996.

### A5/1 (GSM encryption)

A5/1 is a stream cipher used to provide over-the-air communication privacy in the GSM cellular telephone standard.
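A stream cipher generates a pseudorandom keystream from a secret key and XORs it with the plaintext; applying the same XOR again decrypts. The toy sketch below illustrates only this general principle, using Python's `random` module as a stand-in generator. It is not the actual A5/1 construction (which combines three irregularly clocked LFSRs) and is not secure:

```python
# Toy illustration of the stream-cipher principle behind A5/1.
import random

def keystream(key: int, n: int) -> bytes:
    # Stand-in keystream generator -- NOT A5/1's LFSR design.
    rng = random.Random(key)
    return bytes(rng.randrange(256) for _ in range(n))

def xor_cipher(key: int, data: bytes) -> bytes:
    # XOR the data with the keystream; the same call decrypts.
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

msg = b"over-the-air frame"
ct = xor_cipher(0x1234, msg)
assert xor_cipher(0x1234, ct) == msg  # same operation decrypts
```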

Security researcher Ross Anderson reported in 1994 that "there was a terrific row between the NATO signal intelligence agencies in the mid-1980s over whether GSM encryption should be strong or not. The Germans said it should be, as they shared a long border with the Warsaw Pact; but the other countries didn't feel this way, and the algorithm as now fielded is a French design."[14]

According to Professor Jan Arild Audestad, at the standardization process which started in 1982, A5/1 was originally proposed to have a key length of 128 bits. At that time, 128 bits was projected to be secure for at least 15 years. It is now estimated that 128 bits would in fact also still be secure as of 2014. Audestad, Peter van der Arend, and Thomas Haug say that the British insisted on weaker encryption, with Haug saying he was told by the British delegate that this was to allow the British secret service to eavesdrop more easily. The British proposed a key length of 48 bits, while the West Germans wanted stronger encryption to protect against East German spying, so the compromise became a key length of 56 bits.[15] In general, a 56-bit key is $2^{128-56} = 2^{72} \approx 4.7 \times 10^{21}$ times easier to break than a 128-bit key.
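The factor quoted above follows directly from the fact that each additional key bit doubles the brute-force search space:

```python
# Ratio of brute-force work between a 128-bit and a 56-bit key:
# each extra key bit doubles the search space.
ratio = 2 ** (128 - 56)
assert ratio == 2 ** 72            # = 4,722,366,482,869,645,213,696
print(f"{ratio:.1e}")              # about 4.7e21, the figure above
```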

## DES Challenges

The widely used DES encryption algorithm was originally planned by IBM to have a key size of 128 bits;[16] the NSA lobbied for a key size of 48 bits. The compromise was a key size of 64 bits, 8 of which were parity bits, for an effective key security parameter of 56 bits.[17] DES was considered insecure as early as 1977,[18] and documents leaked in the 2013 Snowden disclosures show that it was in fact easily crackable by the NSA, but was still recommended by NIST. The DES Challenges were a series of brute force attack contests created by RSA Security to highlight the lack of security provided by the Data Encryption Standard. As part of the successful cracking of the DES-encoded messages, the EFF constructed a specialized DES-cracking computer nicknamed Deep Crack.
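The gap between the 64-bit key size and the 56-bit security parameter comes from the parity bits: in the DES key format, the least significant bit of each of the 8 key bytes is an odd-parity check on the other seven bits, so it carries no independent key material. A minimal sketch of that convention:

```python
# DES keys are 64 bits, but the low bit of each byte is an odd-parity
# bit derived from the other seven, so only 56 bits are free to vary.
def set_des_parity(key: bytes) -> bytes:
    out = bytearray()
    for b in key:
        seven = b & 0xFE                 # keep the 7 key bits
        ones = bin(seven).count("1")
        # force the total popcount of the byte to be odd
        out.append(seven | (0 if ones % 2 else 1))
    return bytes(out)

free_bits = 64 - 8
assert free_bits == 56
assert 2 ** free_bits == 72_057_594_037_927_936  # effective keyspace
```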

The successful cracking of DES likely helped gather both political and technical support for more advanced encryption in the hands of ordinary citizens.[19] In 1997, NIST began a competition to select a replacement for DES, resulting in the publication in 2000 of the Advanced Encryption Standard (AES).[20] AES is still considered secure as of 2019, and NSA considers AES strong enough to protect information classified at the Top Secret level.[21]

## Snowden and NSA's Bullrun program

Fearing widespread adoption of encryption, the NSA set out to stealthily influence and weaken encryption standards and obtain master keys—either by agreement, by force of law, or by computer network exploitation (hacking).[3][22]

According to the New York Times: "But by 2006, an N.S.A. document notes, the agency had broken into communications for three foreign airlines, one travel reservation system, one foreign government’s nuclear department and another’s Internet service by cracking the virtual private networks that protected them. By 2010, the Edgehill program, the British counterencryption effort, was unscrambling VPN traffic for 30 targets and had set a goal of an additional 300."[22]

As part of Bullrun, NSA has also been actively working to "Insert vulnerabilities into commercial encryption systems, IT systems, networks, and endpoint communications devices used by targets".[23] The New York Times has reported that the random number generator Dual_EC_DRBG contains a back door from the NSA, which would allow the NSA to break encryption relying on that random number generator.[24]

Even though Dual_EC_DRBG was known to be an insecure and slow random number generator soon after the standard was published, the potential NSA backdoor was found in 2007, and alternative random number generators without these flaws were certified and widely available, RSA Security continued using Dual_EC_DRBG in the company's BSAFE toolkit and Data Protection Manager until September 2013. While RSA Security has denied knowingly inserting a backdoor into BSAFE, it has not yet given an explanation for the continued usage of Dual_EC_DRBG after its flaws became apparent in 2006 and 2007.[25] It was reported on December 20, 2013, that RSA had accepted a payment of $10 million from the NSA to set the random number generator as the default.[26][27] Leaked NSA documents state that their effort was "a challenge in finesse" and that "Eventually, N.S.A. became the sole editor" of the standard.

By 2010, the NSA had developed "groundbreaking capabilities" against encrypted Internet traffic. A GCHQ document warned, however, that "These capabilities are among the Sigint community's most fragile, and the inadvertent disclosure of the simple 'fact of' could alert the adversary and result in immediate loss of the capability."[22] Another internal document stated that "there will be NO 'need to know.'"[22]

Several experts, including Bruce Schneier and Christopher Soghoian, have speculated that a successful attack against RC4, a 1987 encryption algorithm still used in at least 50 percent of all SSL/TLS traffic, is a plausible avenue, given several publicly known weaknesses of RC4.[28] Others have speculated that NSA has gained the ability to crack 1024-bit RSA and Diffie–Hellman public keys.[29] A team of researchers has pointed out that a few non-ephemeral 1024-bit primes are widely reused in Diffie–Hellman implementations, and that the NSA having done precomputation against those primes in order to break encryption using them in real time is very plausibly what NSA's "groundbreaking capabilities" refer to.[30]

The Bullrun program is controversial, in that it is believed that NSA deliberately inserts or keeps secret vulnerabilities which affect both law-abiding US citizens and NSA's targets, under its NOBUS policy.[31] In theory, NSA has two jobs: prevent vulnerabilities that affect the US, and find vulnerabilities that can be used against US targets; but as argued by Bruce Schneier, NSA seems to prioritize finding (or even creating) and keeping vulnerabilities secret. Schneier has called for the NSA to be broken up so that the group charged with strengthening cryptography is not subservient to the groups that want to break the cryptography of its targets.[32]

## Encryption of smartphone storage

As part of the Snowden leaks, it became widely known that intelligence agencies could bypass encryption of data stored on Android and iOS smartphones by legally ordering Google and Apple to bypass the encryption on specific phones.
Around 2014, as a reaction to this, Google and Apple redesigned their encryption so that they did not have the technical ability to bypass it, and phones could only be unlocked by knowing the user's password.[33][34]

Various law enforcement officials, including the Obama administration's Attorney General Eric Holder,[35] responded with strong condemnation, calling it unacceptable that the state could not access alleged criminals' data even with a warrant. One of the more iconic responses came from the chief of detectives for Chicago's police department, who stated that "Apple will become the phone of choice for the pedophile".[36] The Washington Post published an editorial insisting that "smartphone users must accept that they cannot be above the law if there is a valid search warrant", and, after agreeing that backdoors would be undesirable, suggested implementing a "golden key" backdoor which would unlock the data with a warrant.[37][38]

FBI Director James Comey cited a number of cases to support the need to decrypt smartphones. However, in none of the presumably carefully handpicked cases did the smartphone have anything to do with the identification or capture of the culprits, and the FBI seems to have been unable to find any strong cases supporting the need for smartphone decryption.[39]

Bruce Schneier has labelled the smartphone encryption debate Crypto Wars II,[40] while Cory Doctorow has called it Crypto Wars redux.[41]

Legislators in the US states of California[42] and New York[43] have proposed bills to outlaw the sale of smartphones with unbreakable encryption. As of February 2016, no bills have been passed.

In February 2016 the FBI obtained a court order demanding that Apple create and electronically sign new software which would enable the FBI to unlock an iPhone 5c it recovered from one of the shooters in the 2015 terrorist attack in San Bernardino, California. Apple challenged the order. In the end, the FBI hired a third party to crack the phone.
See FBI–Apple encryption dispute.

In April 2016, Dianne Feinstein and Richard Burr sponsored an overly vague bill that would be likely to criminalise all forms of strong encryption.[44][45][46]

## Messengers with end-to-end encryption and responsible encryption

In October 2017, Deputy Attorney General Rod Rosenstein called for "responsible encryption"[47] as a solution to the ongoing problem of "going dark".[48] This refers to wiretapping court orders and police measures becoming ineffective as strong end-to-end encryption is increasingly added to widespread messenger products. Responsible encryption means that companies would introduce key escrow, allowing them to provide their customers with a way to recover their encrypted data if they forget their password, so that it is not lost forever. According to Rosenstein's reasoning, it would be irresponsible to leave the user helpless in such a case. As a side effect, this would allow a judge to issue a search warrant instructing the company to decrypt the data, with which the company would then be able to comply. In contrast to previous proposals, the decentralized storage of key-recovery material by companies instead of government agencies would be an additional safeguard.

## Front doors

In 2015, the head of the NSA, Admiral Michael S. Rogers, suggested further decentralizing the key escrow by introducing "front doors" instead of back doors into encryption.[49] This way, the key would be split into two halves, with one half kept by government authorities and the other by the company responsible for the encryption product. The government would thus still have to get a search warrant to obtain the other half of the key from the company, and the company would be unable to abuse the key escrow to access the user's data, since it would lack the half of the key kept by the government.
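The split-key idea can be sketched with simple XOR secret sharing, in which neither share alone reveals anything about the key. This is only a hypothetical illustration of the concept; the actual mechanism Rogers envisioned was never specified:

```python
# Hypothetical sketch of the "front door" split-key escrow idea,
# using XOR secret sharing: each share alone is a uniformly random
# string, so both shares are needed to reconstruct the key.
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    gov_share = secrets.token_bytes(len(key))   # random mask (one half)
    company_share = bytes(a ^ b for a, b in zip(key, gov_share))
    return gov_share, company_share

def reconstruct(share1: bytes, share2: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share1, share2))

key = secrets.token_bytes(16)
gov, com = split_key(key)
assert reconstruct(gov, com) == key  # both halves together recover it
```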
Experts were not impressed.[50][49]

## Lightweight encryption

In 2018, the NSA promoted the use of "lightweight encryption", in particular its ciphers Simon and Speck, for Internet of Things devices.[51] However, the attempt to have those ciphers standardized by ISO failed because of severe criticism from the board of cryptography experts, which raised fears that the NSA had non-public knowledge of how to break them.[52]

## 2015 UK call for outlawing non-backdoored cryptography

Following the 2015 Charlie Hebdo shooting, a terrorist attack, former UK Prime Minister David Cameron called for outlawing non-backdoored cryptography, saying that there should be no "means of communication" which "we cannot read".[53][54] US President Barack Obama sided with Cameron on this.[55] This call for action does not seem to have resulted in any legislation or changes in the status quo of non-backdoored cryptography being legal and available.

## See also

## References

1. ^ "Munitions T-shirt". cypherspace.org.
2. ^ "The Crypto Wars: Governments Working to Undermine Encryption". Electronic Frontier Foundation.
3. ^ a b c Ranger, Steve (24 March 2015). "The undercover war on your internet secrets: How online surveillance cracked our trust in the web". TechRepublic. Archived from the original on 2016-06-12. Retrieved 2016-06-12.
4. ^ "SSL by Symantec - Learn How SSL Works - Symantec". verisign.com.
5. ^ "Archived copy". Archived from the original on 1999-09-16. Retrieved 2017-03-30.
6. ^ Levy, Steven (2001). Crypto: How the Code Rebels Beat the Government—Saving Privacy in the Digital Age. Penguin.
7. ^ "Administration of Export Controls on Encryption Products" (PDF). Federalregister.gov. Retrieved 2016-06-11.
8. ^ "Revised U.S. Encryption Export Control Regulations (January 2000)". Electronic Privacy Information Center. US Department of Commerce. January 2000. Retrieved 2014-01-06.
9. ^ a b c d Robin Gross. "Regulations" (PDF). gpo.gov. Archived from the original (PDF) on 2010-12-03. Retrieved 2014-10-24.
10. ^ "U.S. Bureau of Industry and Security - Notification Requirements for "Publicly Available" Encryption Source Code". Bis.doc.gov. 2004-12-09. Archived from the original on 2002-09-21. Retrieved 2009-11-08.
11. ^ "Participating States - The Wassenaar Arrangement". Wassenaar.org. Archived from the original on 27 May 2012. Retrieved 11 June 2016.
12. ^ "Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies: Guidelines & Procedures, including the Initial Elements" (PDF). Wassenaar.org. December 2009. Archived from the original (PDF) on 2014-10-14. Retrieved 2016-06-11.
13. ^ a b https://youtube.com/watch/LWwaVe1RF0c?t=1210
14. ^
15. ^
16. ^ Stallings, W. (2006). Cryptography and Network Security: Principles and Practice. Prentice Hall. p. 73.
17. ^ Stanford Magazine. "Keeping Secrets". Medium.
18. ^ Diffie, Whitfield; Hellman, Martin E. (June 1977). "Exhaustive Cryptanalysis of the NBS Data Encryption Standard" (PDF). Computer. 10 (6): 74–84. doi:10.1109/C-M.1977.217750. Archived from the original (PDF) on 2014-02-26.
19. ^ "Brute Force". google.com.
20. ^ "Commerce Department Announces Winner of Global Information Security Competition". Nist.gov (1997-09-12). Retrieved 2014-05-11.
21. ^ Lynn Hathaway (June 2003). "National Policy on the Use of the Advanced Encryption Standard (AES) to Protect National Security Systems and National Security Information" (PDF). Retrieved 2011-02-15.
22. ^ a b c d "N.S.A. Able to Foil Basic Safeguards of Privacy on Web". The New York Times. 6 September 2013. Retrieved 11 June 2016.
23. ^ "Secret Documents Reveal N.S.A. Campaign Against Encryption". The New York Times.
24. ^
25. ^ Matthew Green. "RSA warns developers not to use RSA products".
26. ^ Menn, Joseph (December 20, 2013). "Exclusive: Secret contract tied NSA and security industry pioneer". San Francisco: Reuters. Retrieved December 20, 2013.
27. ^ Reuters in San Francisco (2013-12-20). "$10m NSA contract with security firm RSA led to encryption 'back door'". theguardian.com. Retrieved 2014-01-23.
28. ^ "That earth-shattering NSA crypto-cracking: Have spooks smashed RC4?". theregister.co.uk.
29. ^ Lucian Constantin (19 November 2013). "Google strengthens its SSL configuration against possible attacks". PCWorld.
30. ^ Adrian, David; Bhargavan, Karthikeyan; Durumeric, Zakir; Gaudry, Pierrick; Green, Matthew; Halderman, J. Alex; Heninger, Nadia; Springall, Drew; Thomé, Emmanuel; Valenta, Luke; VanderSloot, Benjamin; Wustrow, Eric; Zanella-Béguelin, Santiago; Zimmermann, Paul (October 2015). "Imperfect Forward Secrecy: How Diffie-Hellman Fails in Practice" (PDF).
31. ^ Meyer, David. "Dear NSA, Thanks for Making Us All Insecure". Bloomberg.com. Retrieved 11 June 2016.
32. ^ "Schneier on Security". Schneier.com. Retrieved 2016-06-11.
33. ^ Matthew Green. "A Few Thoughts on Cryptographic Engineering". cryptographyengineering.com.
34. ^ "Keeping the Government Out of Your Smartphone". American Civil Liberties Union.
35. ^
36. ^ "FBI blasts Apple, Google for locking police out of phones". Washington Post.
37. ^ "Compromise needed on smartphone encryption". Washington Post.
38. ^
39. ^
40. ^ "Schneier on Security". schneier.com.
41. ^ Cory Doctorow (October 9, 2014). "Crypto wars redux: why the FBI's desire to unlock your private life must be resisted". the Guardian.
42. ^ Farivar, Cyrus (2016-01-21). "Yet another bill seeks to weaken encryption-by-default on smartphones". Ars Technica. Retrieved 2016-06-11.
43. ^ Farivar, Cyrus (2016-01-14). "Bill aims to thwart strong crypto, demands smartphone makers be able to decrypt". Ars Technica. Retrieved 2016-06-11.
44. ^ Dustin Volz and Mark Hosenball (April 8, 2016). "Leak of Senate encryption bill prompts swift backlash". Reuters.
45. ^ "Senate bill effectively bans strong encryption". The Daily Dot.
46. ^