WASHINGTON — Years ago, back when computer users were dialing up the Internet, civilian government scientists already were expressing concerns about the National Security Agency’s role in developing global communication standards, according to documents reviewed by The Associated Press. The records mirror disclosures that surfaced 24 years later, based on classified files, that the NSA sought to deliberately weaken Internet encryption in its effort to gather and analyze digital intelligence.
This week, the government’s National Institute of Standards and Technology sought to shore up confidence in the important behind-the-scenes role it plays in setting standards that are used by consumers to make purchases online, access their bank accounts, digitally sign legal documents or file their income taxes electronically. The agency said it “would not deliberately weaken a cryptographic standard” and would continue to work with experts “to create the strongest possible encryption standards for the U.S. government and industry at large.”
It also noted that, under federal law, it was required to consult with the NSA on its computer standards.
Meanwhile, the Office of the Director of National Intelligence said that “it should hardly be surprising that our intelligence agencies seek ways to counteract our adversaries’ use of encryption.” And that office criticized recent disclosures — based on classified records revealed by NSA leaker Edward Snowden — that the NSA for years has used computing power, legal instruments and its role as adviser to NIST to undermine encryption technologies that protect consumers but also could make digital surveillance more difficult for the U.S. government.
Historical NIST records released under the U.S. Freedom of Information Act more than two decades ago show that tensions over security software arose in the early 1990s between the NSA and other scientists in the government who had been working together since 1989 to develop the Digital Signature Standard, a way to electronically sign documents and guarantee their authenticity. It became a Federal Information Processing Standard in 1994 and was most recently updated in July.
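The sign-then-verify flow that the standard formalizes can be sketched with toy numbers. (This sketch uses textbook RSA with tiny primes purely for brevity; the actual DSS specifies the DSA algorithm with keys of 1024 bits and up, and nothing this small is secure.)

```python
import hashlib

# Toy RSA parameters -- illustration only, never use keys this small.
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (kept secret)

def sign(message: bytes) -> int:
    # Hash the document, then transform the digest with the private key.
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: bytes, signature: int) -> bool:
    # Anyone holding the public key can check that the signature
    # matches the document's hash -- guaranteeing authenticity.
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest

msg = b"I agree to the contract."
sig = sign(msg)
assert verify(msg, sig)                                   # genuine document
assert not verify(b"I agree to different terms.", sig)    # tampered document
```

The key property is the one the 1989 working group was trying to standardize: a signature made with a private key can be checked by anyone with the matching public key, so altering the signed document invalidates the signature.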
“It’s increasingly evident that it is difficult, if not impossible, to reconcile the concerns and requirements of NSA, NIST and the general public using this approach,” the government experts, who included NSA representatives, wrote in a January 1990 memorandum.
The civilian experts noted that the debate included unspecified “technical and national security equity issues” requiring higher-level officials to intervene. But further explanations weren’t provided because the documents — initially marked “SECRET” — were heavily censored by the government, partly over national security concerns.
Then, in 1992, Stanford University Professor Martin Hellman wrote in an industry journal that the then-proposed standard, eventually embraced by NIST and the NSA, had such serious weaknesses that it undermined NIST’s credibility among civilian cryptography experts.
Additionally, the NSA wanted details of its decisions kept secret. Even the NSA’s reasoning for selecting an algorithm was closely held, stamped “classified” and accessible only to officials with a top secret security clearance. The documents the AP reviewed had been turned over to David Sobel, now senior counsel to the Electronic Frontier Foundation, a civil liberties group.
“This was really the first opportunity the public had to learn of the dominant role that the NSA played in the realm of civilian cryptography and security,” Sobel told the AP. The recent disclosures by Snowden, he said, “demonstrate that dynamic has not changed over the past 20 years — and, if anything, NSA might have become more dominant since those original disclosures were made.”
On one hand, the NSA is responsible for being the Internet’s chief digital locksmith, helping the U.S. government devise standards that have for years protected e-commerce, sensitive documents and citizens’ privacy. On the other, the agency is charged with being cyberspace’s chief electronic lock-picker, capable of stealing the world’s most closely guarded secrets.
“If you wanted to put it in exaggerated terms, the fox is in charge of the henhouse,” said Richard Aldrich, whose study of Britain’s signals intelligence agency GCHQ details how it and the NSA worked together to weaken the quality of the encryption used by international diplomats in the 1960s, ’70s and beyond.
But even if the NSA’s campaign to loosen the world’s digital locks has a long pedigree, experts say the fact that encryption has moved out of embassy cypher rooms and into the mainstream means there’s much more at stake. Cryptographers say that the weaknesses left by the NSA might one day be used by America’s rivals in Moscow or Beijing — or even savvy cybercriminals, if the loopholes aren’t being used already.
“What one person can discover, another person can discover. In the end, somebody will figure it out,” said Ben Laurie, a core developer behind OpenSSL, an open-source toolkit implementing the encryption protocols that help protect a big chunk of the world’s Internet users from fraudulent websites, credit-card scams and identity theft. “If you deliberately weaken stuff, it will come back to bite you.”
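Laurie’s point can be made concrete with a hypothetical weakened primitive (the flaw here is invented for illustration and does not describe any real system): a key generator whose randomness is secretly reduced can be brute-forced by anyone who discovers the flaw, not just the agency that planted it.

```python
import hashlib

def weak_keygen(seed: int) -> bytes:
    # Hypothetical back door: the 128-bit key is secretly derived from
    # only a 16-bit seed, so just 65,536 distinct keys are possible.
    return hashlib.sha256(seed.to_bytes(2, "big")).digest()[:16]

def brute_force(target_key: bytes) -> int:
    # Anyone who learns of the flaw -- a rival state, a criminal --
    # can recover the seed by exhaustive search. No special access needed.
    for guess in range(2 ** 16):
        if weak_keygen(guess) == target_key:
            return guess
    raise ValueError("key was not produced by the weak generator")

key = weak_keygen(12345)
assert brute_force(key) == 12345   # the "secret" falls to simple search
```

This is the asymmetry cryptographers object to: the weakness offers no way to restrict who exploits it once its existence becomes known.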
Cryptographers generally accept that the NSA devotes an enormous amount of time and money to cracking enemies’ and others’ codes. But the tactics exposed in the newly revealed classified documents — deliberately weakening or installing hidden “back doors” in widely used encryption protocols — have distressed academics and practitioners alike.
“We’re both surprised and disappointed in the way that they’re doing it,” said Matthew Green, a professor of cryptography at Johns Hopkins University.
Many cryptography experts are groping for a way forward — torn between the need to ferret out back doors the NSA may have hidden over the past decade and the monumental difficulty of updating a security infrastructure deeply embedded in the very fabric of the Internet.
“We’re now faced with the job of cleaning up the mess,” Green wrote in a blog post after Snowden’s revelations. “The question to me — as an American and as someone who cares about the integrity of speech — is how we restore faith in our technology.”