Surveillance or Security?: The Risks Posed by New Wiretapping Technologies
Susan Landau
In the summer of 2009 officials became concerned about an Afghan-born,
Pakistani-raised resident of Aurora, Colorado, Najibullah Zazi. Zazi and
others were purchasing large quantities of beauty products containing
acetone and hydrogen peroxide, two of the three ingredients in triacetone
triperoxide (TATP), the explosive used in the London transit bombings.81
Zazi had done web searches for locations selling muriatic acid, a diluted
version of hydrochloric acid, the third component of TATP.82 On August
28 and September 6 and 7, he rented a hotel room in Aurora. It was after
his second stay, when he "attempted to communicate on multiple occasions-each communication more urgent in tone than the last-seeking
to correct mixtures of ingredients to make explosives"83 that the FBI tested
the oven vent for explosives and chemical residue and found the presence
of acetone. Bomb making required that the chemicals be highly concentrated, and one way to accomplish this is through heating them.
Zazi rented a car and left for New York on September 9; he was stopped
the next day while crossing the George Washington Bridge into Manhattan. He became suspicious that he was being investigated, a fact later
confirmed by an acquaintance,84 and he returned to Colorado. He was
arrested several days later. Evidence against him included bomb-making
instructions on his laptop (searched while Zazi had been in New York), the
chemical purchases, the communications about the mixture balance, and
the acetone found on the Aurora hotel stove vent. In February 2010 Zazi
pled guilty to conspiracy to conduct terrorist activities and to providing
support for Al-Qaeda.85 His intent was to blow up New York subway cars
using TATP. Others have also been charged in the case. The government
has not said how it came to home in on Zazi, but there are clear indications that wiretaps provided the crucial initial evidence that something
nefarious was afoot.86
Zazi was not the only U.S. case in which such electronic surveillance
played a role. Abu-Jihaad was investigated by the FBI after a computer disk
found during a search of a London apartment revealed classified information about Abu-Jihaad's naval convoy.87 The subsequent investigation led
government agents to Derrick Shareef. Many terrorism cases proceed using
such a set of links. It is often the case that intermediate points lead to more
significant targets.
When considering the efficacy of wiretapping in national-security investigations, it is important to keep in mind that a number of other tools have
been brought to bear in domestic terrorism cases. One is to "follow the money."
In 2001 under the Bush administration, counterterrorism officials began
examining financial records from a large international database, SWIFT, which runs an international financial messaging service;88 this program,
which had not undergone any public scrutiny prior to being implemented,
netted some useful information.89 In the United States, Uzair Paracha, who
sought to aid an Al-Qaeda member's entry into the United States, was
identified through U.S. surveillance of banking transactions.90 The tool has
also been useful outside the country: Riduan Isamuddin,91 accused of
helping to plan the bombings in Bali in 2002 that killed 202 people as well
as numerous other terrorist attacks, was located and captured in Thailand
as a result of studying a money trail. Given that radical Islamic terrorist
efforts within the U.S. have focused on recruitment and funding, tracking
money may be a particularly useful investigative strategy.
The government's read of the value of the President's Surveillance
Program (PSP) was decidedly mixed. While FBI management, including
FBI Director Robert Mueller, called the program "of value," the DoJ's
Office of Inspector General's report described the tool as "one of many"
and concluded it played a limited role in the FBI's counterterrorism efforts.92
The CIA and personnel in the Office of the Director of National Intelligence93 called the PSP one tool of many. The strongest statement about the
value of this highly controversial program was that "there are several cases
identified by IC [intelligence community] officials ... where PSP may have
contributed to a counterterrorism success."94 One investigator said that
the PSP yielded little new information, but that in some cases it established
that certain people under suspicion were of no concern. Being able
to focus investigations on the serious suspects is, of course, of great value.
The DoJ characterization of the PSP as "one of many" is not surprising;
intelligence investigations are a matter of filling in a confusing picture, and
each tool clarifies only a small part. The full picture may be understood
even while there is some fog obscuring the whole, and if a tool such as
wiretapping clears up even one aspect, that may be crucial in developing
a true understanding of the situation. Nonetheless, the less-than-ringing
endorsement by field investigators of the necessity of the PSP is worth
noting. There was strong political pressure in a highly politicized administration to conclude otherwise; the investigators did not.
5.4 A History Lesson
Effectiveness needs to be measured in terms of costs. An ambulance that
races to a hospital causing accidents along the way is not effective. Similarly, communications interception decisions that end up compromising communications security may impose more cost than benefit. What
happens if, in designing for communications intelligence, the result is
communications insecurity? It is worth looking at the Crypto Wars of the
1980s and 1990s to discover what lessons can be learned in hindsight. In
particular, it is worth examining fielded communications products and
their security characteristics.
In 1982 Europe began work on a cell phone system. The Europeans
wanted a mobile network that could be used across the continent. They
wanted the system security to be at least as good as the security of the
wired-line network. So those developing the standards had to solve two
problems: protecting the privacy of the communications between the
handset and the radio tower and protecting the privacy of the phone so
that it could not be identified, except by the cell tower (the latter is really
about providing anonymity for the user). This called for cryptography,
which created complications.
Cryptography is, of course, a dual-use technology with both civilian and
military applications. During the Cold War, the United States and its allies
controlled the export of equipment that could be used for military purposes through the Coordinating Committee for Multilateral Export Controls (CoCOM), an organization whose membership included the United
States, Canada, Australia, New Zealand, Japan, and most western European
countries. Export between CoCOM countries was also regulated, which
meant that the development of the European cell phone system, GSM
(originally standing for Groupe Spécial Mobile, and now for Global System
for Mobile Communications), fell under the CoCOM regulations.
This was challenging. "It was the first time that [telephone engineers]
had actually developed a universal cryptographic product that went into
the hands of the users," recalled Charles Brookson, who had been chair
of the GSM Algorithmic Experts Group. There were justifiable concerns
about the cryptography, which, because of CoCOM rules, had not been
made public and was of limited strength.
The cryptographic algorithm leaked, and in a series of papers between
1999 and 2003, academic cryptographers found ways to break the GSM
system. These did not appear to result in actual attacks. The bigger security
gap in GSM, however, was a known flaw that no one had expected to be
much of a problem. To simplify the authentication of the user to the system,
the GSM protocol did not require the cell tower to authenticate itself to the
cell phone. This allowed fake towers to be set up to spoof phones in the
network. These problems were corrected in the next-generation system, 3G,
which is rapidly replacing GSM.
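To make the flaw concrete, here is a minimal sketch in Python of GSM's one-way challenge-response, with an HMAC standing in for the operator-chosen A3 algorithm (real networks used ciphers such as COMP128; all names here are illustrative, not GSM internals). Because the SIM answers any challenge and the tower is never required to prove itself, a fake tower succeeds:

    import hmac, hashlib, os

    # Illustrative stand-in for the operator-chosen A3 algorithm;
    # NOT the real GSM cipher.
    def a3_sres(ki: bytes, rand: bytes) -> bytes:
        return hmac.new(ki, rand, hashlib.sha256).digest()[:4]  # 32-bit SRES

    class Sim:
        def __init__(self, ki: bytes):
            self.ki = ki  # secret shared with the home network
        def answer_challenge(self, rand: bytes) -> bytes:
            # The SIM answers ANY challenge; it has no way to check
            # that the challenge came from a legitimate tower.
            return a3_sres(self.ki, rand)

    class Tower:
        # Legitimate or fake: GSM never asks a tower to prove itself.
        def authenticate(self, sim: Sim) -> bytes:
            rand = os.urandom(16)              # 128-bit challenge
            return sim.answer_challenge(rand)  # phone attaches either way

    sim = Sim(ki=os.urandom(16))
    fake_tower = Tower()          # indistinguishable from a real one
    fake_tower.authenticate(sim)  # succeeds: the gap 3G's mutual authentication closed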
The U.S. government did not stop controlling export of military-grade
equipment with the end of the Cold War. Cryptography's dual-use status
meant that export from the United States of products containing cryptographic algorithms was determined by the Office of Munitions Control
with the advice of the NSA.95 In the 1990s the U.S. government constrained
deployment within the country of strong cryptography by controlling the
export of products containing that type of cryptography.96 In 1992 the NSA
and an industry association reached an agreement that "mass-market"
software using 40-bit keys could be freely exported. Even at that time, 40
bits was not very impressive; 2⁴⁰ is about a trillion operations. In 1992, a
workstation could do a brute-force search of the entire key space in about
an hour.
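The arithmetic behind that claim is easy to check. A short Python sketch (the keys-per-second rate is an assumption, chosen only to be consistent with the roughly-an-hour figure above, not a measurement):

    # A 40-bit keyspace holds 2**40 (about 1.1 trillion) keys.
    keyspace = 2 ** 40

    # Assumed trial rate for an early-1990s workstation doing a
    # brute-force search; illustrative only.
    keys_per_second = 3e8
    hours = keyspace / keys_per_second / 3600
    print(f"Exhaustive search: about {hours:.1f} hours")  # ~1.0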
Still, this was a first step in loosening export controls, an important
issue to the computer industry. U.S. companies were loath to produce two
types of products: one with strong cryptography for domestic markets and
one with weaker cryptography (shorter key lengths) for export. Thus export
controls constrained the domestic market as well.
One effect was on secure telephones. In the fall of 1992 AT&T was
poised to change telephony. The company announced a mass-market
device for secure telephones: the Telephone Security Device Model 3600
(TSD 3600). The idea was not new. Other companies, including Gretag and
Crypto AG in Switzerland, and Cylink, Datotek, and TCC in the United
States, had sold voice-encryption systems. AT&T had figured out how to
do this using a single processor for the digital signal processing. Planning
on selling the devices for an initial price of $1295, AT&T envisioned a vast
market of businesspeople who would want secure communications as they
traveled.
Then the U.S. government stepped in. The proposal sounded benign:
the NSA expressed interest in using the TSD 3600 for certain government
applications, but there was an export problem. What if AT&T made two
sets of phones, one with AT&T's own encryption chip, which used 56-bit
DES, and one with a new cryptographic algorithm developed by the
agency? The latter would not have export-control issues and AT&T could
market them abroad.
The NSA's new algorithm, Skipjack, was embedded within a key-management scheme in which the 80-bit key was split into two components, each of which was to be stored at a secure facility operated by an
executive-branch federal agency. Keys would not be released except under
"proper legal authorization."97 The program was officially known as the
Escrowed Encryption Standard (EES) and unofficially as "Clipper," after the tamper-resistant chip implementing the 80-bit encryption algorithm. The
system was stronger than 56-bit DES, which in 1993 was finally coming to
the end of its lifetime. But storing the keys indefinitely meant that a communication was never actually safe; there is always the danger that stored
keys may be revealed.
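The two-component split itself is simple; an XOR split, sketched below in Python, is the standard construction for dividing a key this way (a simplification for illustration, not the full EES key hierarchy):

    import os

    def split_key(key: bytes) -> tuple[bytes, bytes]:
        # Split a key into two shares whose XOR reconstructs it;
        # either share alone reveals nothing about the key.
        share1 = os.urandom(len(key))
        share2 = bytes(a ^ b for a, b in zip(key, share1))
        return share1, share2

    def recover_key(share1: bytes, share2: bytes) -> bytes:
        return bytes(a ^ b for a, b in zip(share1, share2))

    unit_key = os.urandom(10)  # an 80-bit key, as in EES
    escrow_share_1, escrow_share_2 = split_key(unit_key)
    assert recover_key(escrow_share_1, escrow_share_2) == unit_key
    # The risk described above: both shares persist indefinitely, so the
    # key (and all traffic ever protected under it) stays recoverable.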
As a group of security experts observed, "The ability to recover traffic
continues to exist long after the original communication has occurred....
The relevant keys [are] stored instead of destroyed, so that later government requests for the plaintext can succeed. If the keys are stored, they
can be compromised; if they are destroyed, the threat of compromise ceases
at that moment."98 Who would ensure that the communications traffic
would not be read by a rogue insider? There were objections to the idea of
a storage facility that held everyone's keys; that was simply too big a target.
The system's complexity would create too high a risk of compromise. Users
and governments abroad were not impressed with the idea of U.S. government storage of keys. Despite strong government inducements for industry
to develop products using escrowed encryption, there was no interest in
Clipper or Clipper-type efforts, and the U.S. government was unable to
interest other nations in multilateral key-sharing agreements.
AT&T built multiple models of its TSD 3600 device, some of which ran
Clipper and some of which ran algorithms developed by Datotek and by
Gretag AG. Some of the phones were interoperable, some were not. The
large market that AT&T envisioned did not happen, and the program was
not a success. The TSDs did not jump off the shelves at Radio Shack,
a chain of electronics retail shops found in U.S. shopping malls; after two
years, sales of the device totaled about seventeen thousand.99 A majority of
those went to the FBI, which was attempting to get a market rolling.
Clipper not only killed off AT&T's effort at a mass-market telephone security device, but other efforts as well.
The export controls had exceptions, most notably for the use of cryptography for authentication (determining that the sender is who he claims
to be) and integrity checking (protecting a communication so it is received
in untampered form). Export of cryptography for use for confidentiality
was generally not permitted. This was not a clear-cut line; many of the
technologies used for authenticity purposes and integrity checking could
also be used for ensuring confidentiality of communications. The result
was a confusing government process that delayed the deployment of computer security that network architects, security researchers, industry, and
the government agreed was essential for the safe operation of the network.
One such instance occurred with DNSSEC.
In 1997 Hugh Daniel submitted a classification request to the U.S.
Bureau of Export Administration (BXA) for a DNSSEC implementation.
This implementation, Integrated DNSSEC, used an encryption toolkit,
RSAREF, for performing the authentication; Daniel's export application
clearly stated this (as well as observing that the encryption software was
used only for authentication).100 BXA determined that because Integrated
DNSSEC was "publicly available," the product did not require an export
license for distribution outside the United States and Canada.
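The blurriness of the line BXA had to draw is easy to demonstrate. In the toy Python sketch below (textbook RSA with deliberately insecure toy parameters, for illustration only), the very same keypair and the same modular exponentiation provide signing, the exportable authentication use, and encryption, the restricted confidentiality use:

    import hashlib

    p, q, e = 61, 53, 17                # toy parameters; wildly insecure
    n = p * q
    d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

    def sign(message: bytes) -> int:    # authentication / integrity
        digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
        return pow(digest, d, n)

    def verify(message: bytes, sig: int) -> bool:
        digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
        return pow(sig, e, n) == digest

    def encrypt(m: int) -> int:         # confidentiality: same primitive
        return pow(m, e, n)

    def decrypt(c: int) -> int:
        return pow(c, d, n)

    msg = b"signed DNS record"
    assert verify(msg, sign(msg))       # the use export rules permitted
    assert decrypt(encrypt(42)) == 42   # the restricted use, same keys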