
Several months later, a problem arose. Philip Karn, an engineer, wanted
BXA to classify the export status of a computer disk that included RSAREF
for confidentiality purposes. Karn was informed that to ship the disk
outside the United States and Canada he would need an export license.
Karn wrote BXA, pointing out that Daniel's DNSSEC product had not
needed an export license even though it used the same software.101 BXA's
response was to reclassify Daniel's DNSSEC product so that a license was
required for distribution of the product outside the United States and
Canada.102 Lee Tien, Daniel's lawyer, objected, observing that "authentication programs are reviewed on the basis of what they do as published,
rather than on what they might do if someone rewrites them."103 Of course.
But the BXA's reclassification of the DNSSEC implementation as an export-controlled item lasted until the U.S. government loosened cryptographic
export-control regulations several years later.

In the world of Web 2.0 and cloud collaboration software, it is easy to
forget that such tools are relatively new. Online collaboration was far more
difficult in the 1980s and 1990s when the desktop was king and the sharing
was essentially email, net news, and file transfer. Lotus Notes was a set of
software applications for coordinating business processes (e.g., tracking a
bug, a help-desk request) and included a discussion tool, email, and a
document database. The working model was personal computers connected via a local-area network with one of the PCs acting as a dedicated
server. That server would be set up to communicate with servers of other
local-area networks; the concurrency control managed by the servers kept
everything synchronized. In so doing, Lotus Notes put branch
offices on par with the home office, and it was enormously successful
software. It was also software that demanded encryption; collaboration
across networks simply could not be secure without it.

Lotus Notes worked hard to make that happen. The first version of Lotus
Notes used 64-bit cryptography, sort of. The software had two implementations for communications: 64-bit for U.S. communications, 32-bit for
international ones.104 This was in 1989, when a message encoded with a 32-bit key would take about two days, and four billion computations, on
a supercomputer to break. It was a time when few customers were thinking
about security, and so no one queried Lotus's decision regarding the weaker
international algorithm.105
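As a rough check of those figures (the per-key search rate below is derived from the text's two-day estimate, not stated in the source), a brute-force attack on a 32-bit key requires at most

\[
  2^{32} = 4{,}294{,}967{,}296 \approx 4.3 \times 10^{9}\ \text{trial decryptions},
\]

which over two days works out to

\[
  \frac{2^{32}\ \text{keys}}{2 \times 86{,}400\ \text{s}} \approx 2.5 \times 10^{4}\ \text{keys per second}.
\]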

A few years later, this weak security solution no longer pleased Lotus's
international customers. Lotus went to 40-bit cryptography in the international version (it continued using 64-bit cryptography for U.S. customers). This solution pleased no one but the NSA; by the mid-1990s, the 40-bit
cryptography was too easy to break.

Then Lotus built a product with exportable 64-bit cryptography. This
was a neat trick given that export rules at the time limited mass-market
cryptography to 40 bits. Ray Ozzie, the Lotus Notes developer, had worked
out a deal with the NSA: systems using 64-bit cryptography could be
exported because the NSA knew 24 bits of the key.106 Although Ozzie had
explained this key-escrow solution quite publicly in 1996, not everyone
had heard of or understood its import. In particular, members of the
Swedish ministry of defense were quite disturbed when they learned that
the security of their communications was 40 bits, one sixteen-millionth of
the 64-bit cryptographic strength they had thought they were getting.107
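A quick sketch of the arithmetic behind that reaction: with 24 of the 64 key bits escrowed, an adversary holding the escrowed bits needs to search only the remaining 40 bits, so

\[
  2^{64-24} = 2^{40}\ \text{keys to try, rather than}\ 2^{64},
\]

a work factor of

\[
  \frac{2^{40}}{2^{64}} = 2^{-24} = \frac{1}{16{,}777{,}216},
\]

roughly one sixteen-millionth of what the customers believed they were buying.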

The effect of these cases (and many others) was much larger than the
disapproval of the individual export licenses. The cases were like the ripples
after a stone has been dropped into a pond. An export license denial creates
fear, uncertainty, and doubt108 in other developers, engineers, and companies, and creates a strong hesitancy to spend time and money on a project
that might not be marketable (in the 1990s half of U.S. hardware
and software sales were abroad). The upshot of a single export license
denial was not a single product stopped but rather a large set of products
neither built nor sold. The effectiveness of wiretapping, which was the
reason for preventing the widespread deployment of strong cryptography,
must be measured against the set of problems that resulted from deploying
insecure systems.

5.5 Export Control Changes

After years of slowing the deployment of products containing strong cryptography, in 2000 the U.S. government abruptly changed policy. Key
length would no longer determine exportability; instead the determination
would be made on the basis of what the product did. If it was for retail
purposes, meaning a high-volume, noncustomized item that was widely
sold and not for use in communications infrastructure, then the product would largely be free of export controls. Depending on the customer,
nonretail items remained under export controls, with more stringent controls applying if the customer was a government.109 While these new
controls had limitations and some lack of clarity, for the most part the
difficulties that industry had been facing for the preceding two decades
went away.

The FBI was not pleased, for the previous policy had worked well for
the bureau. By preventing the export of U.S. products containing strong
cryptography, the U.S. government had slowed the deployment of strong
cryptography within the United States. The NSA, however, supported the
changes (otherwise they would not have occurred). There were several
reasons underlying the NSA's policy change.

The primary one was that the NSA was already losing the cryptography
wars. The computer industry had been extremely unhappy with the controls, and over a period of several years, their allies in Congress had introduced bills to liberalize cryptographic export controls. By the summer of
1999 the Security and Freedom through Encryption Act had passed the five
committees with jurisdiction and was moving to the House floor when the
change in export regulations was announced.110 From the NSA's perspective, it was safer to have the White House control the change in regulations
than to allow Congress to do so.

The shift in NSA policy also allowed the agency to strike a bargain with
Congress. The agency was beginning a program in exploiting enemy networks and information systems for information. In loosening cryptographic export controls, the agency was giving something up, and it could
gain something in return. Funding was low for NSA's new project in
network exploitation,111 and the agency received a substantial appropriation for the new effort.

There was another, very important, issue for the agency. This new effort
would be crucial in the changing communications world of the twenty-first
century and the agency needed to understand the new communications
products it was facing. As always, such products were easiest to examine
if they were manufactured in the United States, but export controls were
driving manufacture of secure systems overseas. Loosening export controls
would give NSA better insight into network exploitation possibilities.

Since about 1996, the U.S. government's effort to press key escrow (a
necessary component of the export controls on strong cryptography) had
been counterproductive for the NSA. The agency was becoming the enemy
of the computer industry while gaining very little. Export controls hampered use of strong cryptography by criminals, which advantaged the FBI, but these efforts had little effect on most collection of foreign intelligence,
the NSA's focus.

There was also an issue that was a whisper in 2000 and has become
much more important in the years since. The U.S. Department of
Defense was unable to obtain sufficient communications security in
commercial-off-the-shelf (COTS) equipment. After the second Iraq War,
that need was critical. The U.S. military had to communicate securely with
Iraqi forces, but the high-grade government off-the-shelf112 equipment
would take years to develop. So the soldiers turned to commercial VPNs
and IPSec implementations and, using interoperable COTS equipment,
were able to communicate securely.

CALEA added a different set of problems. Telecommunications and
computer companies expressed concern over the threat to innovation
posed by the expansion of CALEA.113 Both these risks are important and
real, although in fact, it is difficult to point to specific products that were
not developed. That is unsurprising when one carefully considers innovation in this arena. Two phenomena are at play here. The first is that because
CALEA adds cost, deployment of new products is slowed; such product
delay occurs in multiple ways and is unlikely to explicitly appear as "slowed
by CALEA." The second issue is more subtle. Telephone companies traditionally have a history of incremental innovation, with more radical
change coming from outside the industry. It is no surprise, for example,
that the Internet or, to pick a more recent example, Skype was developed
from outside the telephone industry. CALEA's real effect is to prevent
innovation from being proposed in the first place. Neither of these effects
is easy to delineate.

What we do know is that the controversy over CALEA's standards contributed to a long delay114 in the law's implementation.115 Disagreements
over interception capacity, the necessity of capturing post-cut-through
digits, and location information contributed to fear, uncertainty, and
doubt as to what was actually required. One reaction, naturally enough,
has been that some companies have created greater surveillance capabilities than the law requires.116

5.6 NSA Scrutinizes Traffic on Public Networks

In September 2007, a Baltimore Sun reporter, Siobhan Gorman, reported
that the NSA was planning to monitor government and private civilian
networks to prevent unauthorized intrusion and protect the networks from
attacks by hackers and terrorists.117 The spy agency had always played a role in protecting government communications, but giving the NSA a
major role in securing unclassified networks represented a significant
change. How do you protect civilian networks without observing the traffic
on them? The government was not forthcoming, not even after President
Bush signed National Security Presidential Directive 54118 in January 2008.
NSPD54 called for the formation of the Comprehensive National Cybersecurity Initiative (CNCI), whose purpose was to secure U.S. government
networks and keep the U.S. government's sensitive information safe.

Details were sparse, but a few aspects of the CNCI were made public.
These, however, turned out to be cybersecurity programs that had already
begun a few years earlier:

• Trusted Internet Connections: A 2007 Office of Management and Budget
program to limit the number of federal government connections to the
Internet to under 100

• Einstein: A 2004 Department of Homeland Security program to examine,
collect, correlate, and analyze electronic traffic into and out of civilian
federal agencies.119

The original Einstein project collected, correlated, and analyzed the
electronic traffic in and out of federal agencies, seeking to understand the
nature of the threats. Data collected included the source and destination
IP address, packet length, source and destination port, time stamp, and
underlying protocol.120 Agency participation was voluntary.
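To make the shape of that data concrete, the following is a minimal sketch in Python of a per-packet metadata record of the kind just described; the class and field names are illustrative rather than taken from Einstein itself.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class FlowRecord:
        """One packet's worth of the metadata the original Einstein program
        is described as collecting: addressing, size, and timing, but no
        packet payload."""
        src_ip: str          # source IP address
        dst_ip: str          # destination IP address
        src_port: int        # source port
        dst_port: int        # destination port
        packet_length: int   # packet length in bytes
        timestamp: datetime  # time the packet was observed
        protocol: str        # underlying protocol, e.g. "TCP" or "UDP"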

A new version of Einstein, Einstein 2, appeared in 2008, and participation by federal agencies was now required. Deployment has been slow.
This version of Einstein was active, protecting federal systems through an
intrusion-detection system that recognized potential threats by comparing the characteristics of the incoming traffic with known malware and
known malicious actions (e.g., a DDoS attack). Monitoring was done at
each participating agency's Internet access point (per the Trusted Internet
Connections program).
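Einstein 2's actual signatures and detection logic are not public; the sketch below, which reuses the FlowRecord type from the earlier sketch and made-up indicator values, only illustrates the general idea of comparing incoming traffic against known-bad characteristics.

    from collections import Counter
    from typing import Iterable, List

    # Hypothetical indicators; the real signature sets are not public.
    KNOWN_BAD_SOURCES = {"203.0.113.7", "198.51.100.23"}  # documentation-range IPs
    DDOS_PACKET_THRESHOLD = 10_000                        # packets from one source

    def flag_suspect_traffic(records: Iterable[FlowRecord]) -> List[str]:
        """Compare traffic records against known-bad sources and a crude
        per-source volume heuristic for DDoS-like behavior."""
        alerts: List[str] = []
        packets_per_source: Counter = Counter()
        for record in records:
            packets_per_source[record.src_ip] += 1
            if record.src_ip in KNOWN_BAD_SOURCES:
                alerts.append(f"known malicious source {record.src_ip} -> {record.dst_ip}")
        for source, count in packets_per_source.items():
            if count >= DDOS_PACKET_THRESHOLD:
                alerts.append(f"possible DDoS: {count} packets from {source}")
        return alerts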

Einstein was located on the federal network, and was monitoring traffic
intended for federal systems. In that sense it was not a privacy threat, since
users knew they were connecting to a federal system. Yet for some users
(for example, people accessing their records at the Veterans Administration
or sensitive information at a public website at the Department of Health
and Human Services), the idea that some personally identifiable data, such
as IP addresses, were being retained was highly disturbing. The Department of Homeland Security (DHS) concluded there was no privacy threat
in the government's use of Einstein 2.121 DoJ's legal analysis found that so long as there was a banner warning users that their traffic was being monitored, private citizens had no expectation of privacy in accessing government sites.122, 123

Government workers had no legitimate expectation of privacy while on
a government network, even if their actions involved conducting private
business on a private site, for example, reading email on their private
Gmail or Yahoo account.124

In this, government employees were no different from employees
elsewhere. In the United States, employers are allowed to monitor their
employees' communications if the workers are using the employer's communications system and they have been warned that such monitoring may
occur. What is new to the situation, however, is that the person with
whom the employee is communicating is unlikely to be aware that such
monitoring is occurring. They are communicating with their friend or colleague via a private account, but the government may be scanning the
communication because the government employee is accessing the communication through a government-owned system. (As this book went to
press, the Supreme Court ruled in the 2010 case City of Ontario v. Quon,125
that auditing text messages sent by a policeman on a department-supplied
pager did not violate the policeman's expectation of privacy. But the court's
ruling was narrow. The auditing had been ordered because monthly text
limits were being greatly exceeded; the Ontario Police Department wanted
to know whether this was due to professional use of the pager. Given the
circumstances, the Court viewed the search as "reasonable," making clear
that the decision was based on the specifics of the case and that a generalization was inappropriate.126)
