NEG – Encryption
A2 INHERENCY
--xt encryption legal
Encryption for IT companies legal -- CALEA
Cindy Cohn 10/17/14-- Cohn is the Executive Director of the Electronic Frontier Foundation and graduate of the University of Michigan Law
School. The National Law Journal named Ms. Cohn one of 100 most influential lawyers in America in 2013. Served as the outside lead attorney
in Bernstein v. Dept. of Justice, a First Amendment challenge to the U.S. export restrictions on cryptography. In 2007 the National Law Journal named her
one of the 50 most influential women lawyers in America. Specializes in NSA surveillance law and serves as counsel in various surveillance
cases before the Supreme Court. (Cohn, “EFF Response to FBI Director Comey's Speech on Encryption”, Electronic Frontier Foundation.
https://www.eff.org/deeplinks/2014/10/eff-response-fbi-director-comeys-speech-encryption)//ET
Here's the relevant part of CALEA that Comey wants to effectively undo: "47 USC 1002(b)(3): A telecommunications carrier shall
not be
responsible for decrypting, or ensuring the government’s ability to decrypt, any communication encrypted by a subscriber or
customer, unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt the
communication." Also from the CALEA legislative history: "Finally, telecommunications carriers have no responsibility to decrypt
encrypted communications that are the subject of court-ordered wiretaps, unless the carrier provided the encryption and can decrypt it. This
obligation is consistent with the obligation to furnish all necessary assistance under 18 U.S.C. Section 2518(4). Nothing in this paragraph would prohibit
a carrier from deploying an encryption service for which it does not retain the ability to decrypt communications for law enforcement
access ... Nothing in the bill is intended to limit or otherwise prevent the use of any type of encryption within the United States. Nor does the
Committee intend this bill to be in any way a precursor to any kind of ban or limitation on encryption technology. To the contrary, section 2602 protects the
right to use encryption." H/T Chris Soghoian: http://paranoia.dubfire.net/2010/09/calea-and-encryption.html
--xt no backdoors
Backdoor installation not happening – 3 reasons
Soghoian et al 15 (Christopher Soghoian, researcher at Harvard and Yale, Kevin Bankston, Policy Director of New America’s Open Technology
Institute, Fred Cate, C. Ben Dutton Professor of Law at Indiana University Maurer School of Law, Chris Hoofnagle, Co-Director, Berkeley Center for Law &
Technology, Marcia Hofmann, senior staff attorney at the Electronic Frontier Foundation, Rob Faris, Research Director of the Berkman Center for Internet and
Society at Harvard University, Albert Gidari, partner of Perkins Coie in Privacy & Security, Jennifer Granick, Director of Civil Liberties for the Center for Internet
and Society at Stanford Law School, Orin Kerr, professor of law at the George Washington University , Susan Landau, Professor of Social Science and Policy
Studies at Worcester Polytechnic Institute, Paul Ohm, Professor of Law at the Georgetown University Law Center, Nicole Azer, Technology & Civil Liberties
Policy Director in ACLU California, John Palfrey, previous executive director of Harvard's Berkman Center for Internet & Society, Marc Rotenberg, President and
Executive Director of the Electronic Privacy Information Center, Adam Schostack, expert in security, Ryan Singel, journalist of technology at WIRED, Adam
Thierer, senior research fellow with the Technology Policy Program at the Mercatus Center at George Mason University, Jonathan Zittrain, professor of Internet
law and the George Bemis Professor of International Law at Harvard Law School, “Privacy And Law Enforcement: Caught In The Cloud: Privacy, Encryption, And
Government Back Doors In The Web 2.0 Era”, 12/16/13,
http://www.researchgate.net/publication/228365094_Privacy_And_Law_Enforcement_Caught_In_The_Cloud_Privacy_Encryption_And_Government_Back_Door
s_In_The_Web_2.0_Era, page 417-419)//EM
Traditional Software is Pretty Hard to Covertly Back Door One of the defining features of the Internet era is the ability of technology firms to
later fix problems in their products, to release new features after the date of initial sale, and in some cases, to even remove useful features.198
A fix that would in years past have required a costly and slow product recall can now be deployed to all customers with a mere software
update. This ability to release products half-finished, rushing them to the market confident in the knowledge that remaining issues can be fixed with a later
patch has led to a situation that some experts call a state of perpetual beta.199 In many cases, these updates must be manually downloaded
and installed by the user. When this is the case, adoption rates can be extremely low.200 This can lead to problems for
government agencies that wish to compel a traditional software company, such as an operating system vendor, into creating and
deploying a back door. If users cannot be convinced to download and install critical security updates that might protect
them from hackers, how can they be convinced to download and install government back doors that will pilfer their
private files? Another problem associated with the insertion of back doors in traditional software products is the fact that most vendors do not
know their customers’ identities. Many copies of Microsoft Windows and other software suites are bundled with new computers, negotiated as part of site
licenses for companies and universities. Unless the user registers their software installation, the software supplier simply will not
know which individual is associated with any particular computer . The widespread problem of software piracy makes this even worse,
since these users are even less likely to register their illicit installations under their own names. This inability to tie an identifiable customer to a
particular software installation poses a serious barrier to the government’s ability to compel most traditional software providers
into rolling out covert back doors, even if the customer can be convinced to install it. Sure, the company can opt to supply the
sneaky update to all customers based on the assumption that the government’s suspect will be one of the impacted users. However, this approach is
likely to draw the attention of security researchers and hackers who routinely reverse engineer software updates
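The card above argues that a covert back door delivered as a software update would likely be caught by researchers who reverse engineer updates. As a rough illustration of that detection logic (an editor's sketch, not part of the evidence; filenames and digests are hypothetical), the snippet below compares the hash of the update one user received against the hashes other users report for the same update.

import hashlib

def sha256_of(path: str) -> str:
    # Hash the update file in chunks so large binaries need not fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def looks_targeted(my_update_path: str, digests_seen_by_others: set) -> bool:
    # If nobody else received an update with this digest, it may be a targeted build.
    return sha256_of(my_update_path) not in digests_seen_by_others

# Hypothetical usage:
# others = {"9f2c...", "a41b..."}  # digests crowdsourced from other installs
# if looks_targeted("os_update_v2.bin", others):
#     print("This update differs from what everyone else received -- investigate.")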
NSA moving away from back doors now
Joel Hruska 4/13/15—IT reviewer and journalist for Extreme Tech covering the impacts of policy on IT. (Hruska, “The NSA
wants ‘front door’ access to your encrypted data”, Extreme Tech. http://www.extremetech.com/extreme/203275-the-nsa-wantsfront-door-access-to-your-encrypted-data)//ET
Last December, I had the opportunity to travel to the Netherlands to meet with multiple European tech companies, web hosts, and other infrastructure providers.
The topic of intelligence agency backdoors and US corporate involvement with such policies came up more than once, often in not-entirely-friendly ways. It’s therefore refreshing to see the head of the NSA, Admiral Michael S. Rogers, state up front that the NSA isn’t interested in a
backdoor solution to digital surveillance. Instead, he wants a so-called “front-door” solution — which could be even worse. Instead of handing
the NSA a unilateral window into encrypted communications taking place at Google or Apple, Rogers suggested a future in which the encryption
keys to access such information would be divided between at least two groups — possibly more. In the simplest example, Google
would retain half the key, while the NSA held the other half. Thus, the agency wouldn’t be able to unilaterally snoop inside
anyone’s files — it would need Google’s support. “I don’t want a back door,” Rogers, the director of the nation’s top electronic spy agency,
said during a speech at Princeton University, according to the Washington Post. “I want a front door. And I want the front door to have multiple
locks. Big locks.”
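Rogers' "front door with multiple locks" amounts to splitting a decryption key between at least two holders. Below is a minimal sketch of that split-key idea (an editor's illustration using simple XOR secret sharing, not the NSA's actual design); the key sizes and holders are hypothetical.

import os

def split_key(key: bytes):
    # Two-of-two secret sharing: either share alone is statistically random.
    share_a = os.urandom(len(key))                        # e.g., held by the provider
    share_b = bytes(k ^ a for k, a in zip(key, share_a))  # e.g., held by the agency
    return share_a, share_b

def recombine(share_a: bytes, share_b: bytes) -> bytes:
    # Only both shares together reconstruct the original key.
    return bytes(a ^ b for a, b in zip(share_a, share_b))

content_key = os.urandom(32)           # hypothetical 256-bit content key
a, b = split_key(content_key)
assert recombine(a, b) == content_key  # neither holder can decrypt unilaterally

The cryptographers' objection quoted later in this file ("Keys Under Doormats") is that any such recombination mechanism itself becomes an exploitable point of access, whoever holds the shares.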
--xt no mandate
There’s no backdoor mandate now, and none is coming
Ackerman 15 (Spencer, national security editor for Guardian US. A former senior writer for Wired, 2012 National Magazine
Award for Digital Reporting, “FBI chief wants 'backdoor access' to encrypted communications to fight Isis”, 7/8/15,
http://www.theguardian.com/technology/2015/jul/08/fbi-chief-backdoor-access-encryption-isis) WZ
The director of the Federal Bureau of Investigation has warned US senators that the threat from the Islamic State merits a “debate” about limiting
commercial encryption – the linchpin of digital security – despite a growing chorus of technical experts who say that undermining encryption
would prove an enormous boon for hackers, cybercriminals, foreign spies and terrorists. In a twin pair of appearances before the Senate’s judiciary
and intelligence committees on Wednesday, James Comey testified that Isis’s use of end-to-end encryption, whereby the messaging service being used to send
information does not have access to the decryption keys of those who receive it, helped the group place a “devil” on the shoulders of potential recruits “saying kill,
kill, kill, kill”. Comey said that while the FBI is thus far disrupting Isis plots, “I cannot see me stopping these indefinitely”. He added: “I am not trying to scare folks.”
Since October, following Apple’s decision to bolster its mobile-device security, Comey has called for a “debate” about inserting “back doors” – or
“front doors”, as he prefers to call them – into encryption software, warning that “encryption threatens to lead us all to a very, very dark place”. But Comey
and deputy attorney general Sally Quillian Yates testified that they do not at the moment envision proposing legislation to mandate surreptitious
or backdoor access to law enforcement. Both said they did not wish the government to itself hold user encryption keys and preferred to “engage”
communications providers for access, though technicians have stated that what Comey and Yates seek is fundamentally incompatible with end-to-end
encryption. Comey, who is not a software engineer, said his response to that was: “Really?” He framed himself as an advocate of commercial encryption to
protect personal data who believed that the finest minds of Silicon Valley can invent new modes of encryption that can work for US law enforcement and
intelligence agencies without inevitably introducing security flaws. While the FBI director did not specifically cite which encrypted messaging apps Isis uses, the
Guardian reported in December that its grand mufti used WhatsApp to communicate with his former mentor. WhatsApp adopted end-to-end encryption last year.
“I think we need to provide a court-ordered process for obtaining that data,” said Dianne Feinstein, the California Democrat and former intelligence committee
chair who represents Silicon Valley. But Comey’s campaign against encryption has run into a wall of opposition from digital security experts and engineers. Their
response is that there is no technical way to insert a back door into security systems for governments that does not leave the door
ajar for anyone – hackers, criminals, foreign intelligence services – to exploit and gain access to enormous treasure troves of user data, including medical
records, financial information and much more. The cybersecurity expert Susan Landau, writing on the prominent blog Lawfare, called Comey’s vision of a security
flaw only the US government could exploit “magical thinking”. Comey is aided in his fight against encryption by two allies, one natural and the other accidental.
The natural ally is the National Security Agency director, Michael Rogers, who in February sparred with Yahoo’s chief of information security when the Yahoo
official likened the anti-crypto push to “drilling a hole in the windshield”, saying: “I just believe that this is achievable. We’ll have to work our way through it.” The
Guardian, thanks to Edward Snowden’s disclosures, revealed in September 2013 that the NSA already undermines encryption. The less obvious ally is
China, whom the FBI blamed last month for stealing a massive hoard of federal personnel data. In May, China unveiled a national security law calling for “secure
and controllable” technologies, something US and foreign companies fear is a prelude to a demand for backdoor entry into companies’ encryption software or
outright provision of encryption keys. Without ever mentioning his own FBI director’s and NSA director’s similar demands, Barack Obama castigated China’s anti-encryption push in March. Obama has also declined to criticize efforts in the UK, the US’s premier foreign ally, to undermine encryption. Prime minister David
Cameron is proposing to introduce legislation in the autumn to force companies such as Apple, Google and Microsoft to provide access to encrypted data. Under
questioning from some skeptical senators, Comey made a number of concessions. When Ron Wyden, an Oregon Democrat, asked if foreign countries would
attempt to mandate similar access, Comey replied, “I think they might.” The director acknowledged that foreign companies, exempt from any hypothetical US
mandate, would be free to market encryption software. In advance of Comey’s testimony, several of the world’s leading cryptographers, alarmed by
the return of a battle they thought won during the 1990s “Crypto Wars”, rejected the effort as pernicious from a security perspective and
technologically illiterate. A paper they released on Tuesday, called “Keys Under Doormats”, said the transatlantic effort to insert backdoors into
encryption was “unworkable in practice, raise[s] enormous legal and ethical questions, and would undo progress on security at a time
when internet vulnerabilities are causing extreme economic harm”. Asked by Feinstein if the experts had a point, Comey said: “Maybe. If that’s the case, I guess
we’re stuck.” Kevin Bankston of the New America Foundation called into question the necessity of Comey’s warnings that encryption would lead to law
enforcement “going dark” against threats. Bankston, in a Tuesday blogpost, noted that the government’s latest wiretap disclosure found that state and federal
governments could not access four encrypted conversations out of 3,554 wiretapped in 2014. Yet Yates said both that the Justice
Department was “increasingly” facing the encryption challenge and that she lacked the data quantifying how serious the challenge was. Yates told the Senate
judiciary committee that law enforcement declined to seek warrants in cases of encrypted communications and did not say how often it made such a decision.
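The testimony above turns on what end-to-end encryption means technically: the provider relays messages but never holds the decryption keys. The toy sketch below (an editor's illustration with deliberately small parameters; real services use curves such as X25519, not these numbers) shows why a warrant served on the provider produces nothing useful.

import secrets

P = 4294967291   # a small prime, for illustration only
G = 5            # public generator

alice_secret = secrets.randbelow(P - 2) + 1   # never leaves Alice's device
bob_secret = secrets.randbelow(P - 2) + 1     # never leaves Bob's device

alice_public = pow(G, alice_secret, P)        # the only values the provider relays
bob_public = pow(G, bob_secret, P)

# Each endpoint derives the same shared key locally; the provider never sees it,
# so it has nothing to hand over in response to a court order.
alice_key = pow(bob_public, alice_secret, P)
bob_key = pow(alice_public, bob_secret, P)
assert alice_key == bob_key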
A2 SOLVENCY – GENERAL
--xt plan fails
Backdoors inevitable – plan fails
Lewis 13
James Lewis(James Andrew Lewis is a senior fellow and program director at the Center for Strategic and International Studies (CSIS).
Before joining CSIS, he worked at the Departments of State and Commerce as a Foreign Service officer and as a member of the Senior
Executive Service. His government experience includes work on Asian politico-military issues, as a negotiator on conventional arms and
technology transfers, and on military and intelligence-related technologies. Lewis led the U.S. delegation to the Wassenaar
Arrangement Experts Group on advanced civil and military technologies and was the rapporteur for the UN Group of Government
Experts on Information Security for their successful 2010 and 2013 sessions. He was assigned to U.S. Southern Command for Operation
Just Cause, U.S. Central Command for Operation Desert Shield, and to the U.S. Central American Task Force]
http://csis.org/publication/backdoors-and-encryption)
There is a general myth that the “geeks” defeated the Feds in the “crypto wars” of the 1990s, blocking efforts to prevent the
sale and export of advanced encryption products. This is an article of faith with some people, particularly on the West Coast, and
if you interview them you will get this story presented as an accurate account of what happened. An article in the New York Times
hinted at a more accurate picture. The geeks did not win the crypto war. They were deluded into thinking they had done so,
producing a false sense of security. Now, wounded that their cherished myth has been punctured like a balloon, they claim
that the NSA coerced IT companies to build back doors into encryption products and this is what let it defeat
encryption. This is wishful thinking. Wishful because the backdoor argument points to what is one central myth of the
internet – that it is possible to use technology to make it secure. If only there weren’t back doors put in by coercion, then
we could be safe. Sorry, but while security measures can make it harder to steal data, there are perhaps half a dozen intelligence
agencies in the world with the resources and skills to defeat any internet security measure without the need for backdoors. The
internet can be made more secure, but it will never be fully secure. The notion of back doors leads immediately to bad policy, and this
was the one point that gave me pause in writing this piece. Should I tell them that their proposed fix is useless? If the capabilities
that let an intelligence agency defeat encryption do not rely on back doors, switching to foreign products will not
make you any safer, although it may provide a degree of comfort rather like an umbrella in a hurricane.
A2 SOLVENCY – SECURE DATA ACT
--xt sda fails
SDA is insufficient – fails to close backdoors
Newman 14 (Lily Hay Newman, Future Tense, a partnership of Slate, New America, and Arizona State University, “Senator
Proposes Bill to Prohibit Government-Mandated Backdoors in Smartphones,” 12-5-2014,
http://www.slate.com/blogs/future_tense/2014/12/05/senator_wyden_proposes_secure_data_act_to_keep_government_agencie
s_from.html)
It's worth noting, though, that the Secure Data Act doesn't actually prohibit backdoors—it just prohibits agencies from mandating them.
There are a lot of other types of pressure government groups could still use to influence the creation of backdoors, even if they
couldn't flat-out demand them. Here's the wording in the bill: "No agency may mandate that a manufacturer, developer, or seller of covered products
design or alter the security functions in its product or service to allow the surveillance of any user of such product or service, or to allow the physical search of
such product, by any agency."
A2 SOLVENCY – WARRANTS
--xt sq solves
Warrant required for back door surveillance—U.S. code
U.S. Code 2010- most recent amendment. (“50 U.S. Code Chapter 36, Subchapter I - ELECTRONIC SURVEILLANCE”.
https://www.law.cornell.edu/uscode/text/50/chapter-36/subchapter-I)//ET
(f) “Electronic surveillance” means— (1) the acquisition by an electronic, mechanical, or other surveillance device of the contents of any wire or radio
communication sent by or intended to be received by a particular, known United States person who is in the United States, if the contents are acquired by
intentionally targeting that United States person, under circumstances in which a person has a reasonable expectation of privacy and a warrant would be
required for law enforcement purposes; (2) the acquisition by an electronic, mechanical, or other surveillance device of the contents of any wire communication to
or from a person in the United States, without the consent of any party thereto, if such acquisition occurs in the United States, but does not include the acquisition
of those communications of computer trespassers that would be permissible under section 2511 (2)(i) of title 18; (3) the intentional acquisition by an electronic,
mechanical, or other surveillance device of the contents of any radio communication, under circumstances in which a person has a reasonable expectation of
privacy and a warrant would be required for law enforcement purposes, and if both the sender and all intended recipients are located within the United States; or
(4) the installation or use of an electronic, mechanical, or other surveillance device in the United States for monitoring to acquire
information, other than from a wire or radio communication, under circumstances in which a person has a reasonable expectation of privacy
and a warrant would be required for law enforcement purposes.
A2 SOLVENCY – FISC
--xt sq solves
FISC oversight is the status quo – checks back door surveillance
James B. Comey and Sally Quillian Yates 7/8/15---Yates is Deputy Attorney General; received her J.D. from the
University of Georgia. Comey is Director of the Federal Bureau of Investigation; law degree from UChicago. (Comey, “Joint Statement
with Deputy Attorney General Sally Quillian Yates Before the Senate Judiciary Committee”. Federal Bureau of Investigation.
https://www.fbi.gov/news/testimony/going-dark-encryption-technology-and-the-balances-between-public-safety-and-privacy)//ET
In recent years, new methods of electronic communication have transformed our society, most visibly by enabling ubiquitous digital communications and
facilitating broad e-commerce. As such, it is important for our global economy and our national security to have strong encryption
standards. The development and robust adoption of strong encryption is a key tool to secure commerce and trade, safeguard private
information, promote free expression and association, and strengthen cyber security. The Department is on the frontlines of the fight against cyber
crime, and we know first-hand the damage that can be caused by those who exploit vulnerable and insecure systems. We support and encourage the use of
secure networks to prevent cyber threats to our critical national infrastructure, our intellectual property, and our data so as to promote our overall safety.
American citizens care deeply about privacy, and rightly so. Many companies have been responding to a market demand for products and services that protect
the privacy and security of their customers. This has generated positive innovation that has been crucial to the digital economy. We, too, care about these
important principles. Indeed, it is our obligation to uphold civil liberties, including the right to privacy. We have always respected the fundamental right of people to
engage in private communications, regardless of the medium or technology. Whether it is instant messages, texts, or old-fashioned letters, citizens have the right
to communicate with one another in private without unauthorized government surveillance—not simply because the Constitution demands it, but because the free
flow of information is vital to a thriving democracy. The benefits of our increasingly digital lives, however, have been accompanied by new dangers, and we have
been forced to consider how criminals and terrorists might use advances in technology to their advantage. For example, malicious actors can take advantage of
the Internet to covertly plot violent robberies, murders, and kidnappings; sex offenders can establish virtual communities to buy, sell, and encourage the creation
of new depictions of horrific sexual abuse of children; and individuals, organized criminal networks, and nation-states can exploit weaknesses in our cyberdefenses to steal our sensitive, personal information. Investigating and prosecuting these offenders is a core responsibility and priority of the Department of
Justice. As national security and criminal threats continue to evolve, the Department has worked hard to stay ahead of changing threats and changing
technology. We must ensure both the fundamental right of people to engage in private communications as well as the protection of the public. One of the bedrock
principles upon which we rely to guide us is the principle of judicial authorization: that if an independent judge finds reason to believe that certain private
communications contain evidence of a crime, then the government can conduct a limited search for that evidence. For example, by having a neutral arbiter—the
judge—evaluate whether the government’s evidence satisfies the appropriate standard, we have been able to protect the public and safeguard citizens’
Constitutional rights. The Department of Justice has been and will always be committed to protecting the liberty and security of those whom we serve. In recent
months, however, we have on a new scale seen mainstream products and services designed in a way that gives users sole control over access to their data. As
a result, law enforcement is sometimes unable to recover the content of electronic communications from the technology provider even in response to a court
order or duly-authorized warrant issued by a federal judge. For example, many communications services now encrypt certain communications by default, with the
key necessary to decrypt the communications solely in the hands of the end user. This applies both when the data is “in motion” over electronic networks, or “at
rest” on an electronic device. If the communications provider is served with a warrant seeking those communications, the provider cannot provide the data
because it has designed the technology such that it cannot be accessed by any third party. Threats: The more we as a society rely on electronic devices to
communicate and store information, the more likely it is that information that was once found in filing cabinets, letters, and photo albums will now be stored only
in electronic form. We have seen case after case—from homicides and kidnappings, to drug trafficking, financial fraud, and child exploitation—where critical
evidence came from smart phones, computers, and online communications. When changes in technology hinder law enforcement’s ability to exercise
investigative tools and follow critical leads, we may not be able to identify and stop terrorists who are using social media to recruit, plan, and execute an attack in
our country. We may not be able to root out the child predators hiding in the shadows of the Internet, or find and arrest violent criminals who are targeting our
neighborhoods. We may not be able to recover critical information from a device that belongs to a victim who cannot provide us with the password, especially
when time is of the essence. These are not just theoretical concerns. We continue to identify individuals who seek to join the ranks of foreign fighters traveling in
support of the Islamic State of Iraq and the Levant, commonly known as ISIL, and also homegrown violent extremists who may aspire to attack the United States
from within. These threats remain among the highest priorities for the Department of Justice, including the FBI, and the United States government as a whole. Of
course, encryption is not the only technology terrorists and criminals use to further their ends. Terrorist groups, such as ISIL, use the Internet to great effect.
With the widespread horizontal distribution of social media, terrorists can spot, assess, recruit, and radicalize vulnerable individuals of all ages in the United
States either to travel or to conduct a homeland attack. As a result, foreign terrorist organizations now have direct access into the United States like never before.
For example, in recent arrests, a group of individuals was contacted by a known ISIL supporter who had already successfully traveled to Syria and encouraged
them to do the same. Some of these conversations occur in publicly accessed social networking sites, but others take place via private messaging platforms.
These encrypted direct messaging platforms are tremendously problematic when used by terrorist plotters. Outside of the terrorism arena we see countless
examples of the impact changing technology is having on our ability to affect our court authorized investigative tools. For example, last December a long-haul
trucker kidnapped his girlfriend, held her in his truck, drove her from state to state and repeatedly sexually assaulted her. She eventually escaped and pressed
charges for sexual assault and kidnapping. The trucker claimed that the woman he had kidnapped engaged in consensual sex. The trucker in this case
happened to record his assault on video using a smartphone, and law enforcement was able to access the content stored on that phone pursuant to a search
warrant, retrieving video that revealed that the sex was not consensual. A jury subsequently convicted the trucker. In a world where users have sole control over
access to their devices and communications, and so can easily block all lawfully authorized access to their data, the jury would not have been able to consider
that evidence, unless the truck driver, against his own interest, provided the data. And the theoretical availability of other types of evidence, irrelevant to the case,
would have made no difference. In that world, the grim likelihood that he would go free is a cost that we must forthrightly acknowledge and consider. We are
seeing more and more cases where we believe significant evidence resides on a phone, a tablet, or a laptop—evidence that may be the difference between an
offender being convicted or acquitted. If we cannot access this evidence, it will have ongoing, significant impacts on our ability to identify, stop, and prosecute
these offenders. We would like to emphasize that the Going Dark problem is, at base, one of technological choices and capability. We are not asking to expand
the government’s surveillance authority, but rather we are asking to ensure that we can continue to obtain electronic information and evidence pursuant to the
legal authority that Congress has provided to us to keep America safe. The rules for the collection of the content of communications in order to protect
public safety have been worked out by Congress and the courts over decades. Our country is justifiably proud of the strong privacy protections
established by the Constitution and by Congress, and the Department of Justice fully complies with those protections. The core question is this: Once all of the
requirements and safeguards of the laws and the Constitution have been met, are we comfortable with technical design decisions that result in barriers to
obtaining evidence of a crime? We would like to describe briefly the law and the extensive checks, balances, and safeguards that it contains.
In addition to the Constitution, two statutes are particularly relevant to the Going Dark problem. Generally speaking, in order for the government
to conduct real-time—i.e., data in motion—electronic surveillance of the content of a suspect’s communications, it must meet the standards
set forth in either the amended versions of Title III of the Omnibus Crime Control and Safe Streets Act of 1968 (often referred to as “Title III” or the “Wiretap Act”)
or the Foreign Intelligence Surveillance Act of 1978 (or “FISA”). Title III authorizes the government to obtain a court order to conduct
surveillance of wire, oral, or electronic communications when it is investigating federal felonies. Generally speaking, FISA similarly relies upon
judicial authorization, through the Foreign Intelligence Surveillance Court (FISC), to approve surveillance directed at foreign intelligence and international
terrorism threats. Regardless of which statute governs, however, the standards for the real-time electronic surveillance of United States persons’ communications
are demanding. For instance, if federal law enforcement seeks the authority to intercept phone calls in a criminal case using the Wiretap Act, a federal district
court judge must find: That there is probable cause to believe the person whose communications are targeted for interception is committing, has committed, or is
about to commit, a felony offense; That alternative investigative procedures have failed, are unlikely to succeed, or are too dangerous; and That there is probable
cause to believe that evidence of the felony will be obtained through the surveillance. The law also requires that before an application is even brought to a
court, it must be approved by a high-ranking Department of Justice official. In addition, court orders allowing wiretap authority expire after 30 days;
if the government seeks to extend surveillance beyond this period, it must submit another application with a fresh showing of probable cause and investigative
necessity. And the government is required to minimize to the extent possible its electronic interceptions to exclude non-pertinent and
privileged communications. All of these requirements are approved by a federal court. The statutory requirements for electronic surveillance of U.S.
persons under FISA are also demanding. To approve that surveillance, the FISC, must, among other things, find probable cause to believe: That the
target of the surveillance is a foreign power or agent of a foreign power; and That each of the facilities or places at which the electronic surveillance is directed is
being used or is about to be used by a foreign power or an agent of a foreign power. Similarly, when law enforcement investigators seek access to electronic
information stored—i.e., data at rest—on a device, such as a smartphone, they are likewise bound by the mandates of the Fourth Amendment, which typically
require them to demonstrate probable cause to a neutral judge, who independently decides whether to issue a search warrant for that data. Collectively, these
statutes reflect a concerted Congressional effort, overseen by an independent judiciary, to validate the principles enshrined in our Constitution and balance
several sometimes competing, yet equally legitimate social interests: privacy, public safety, national security, and effective justice. The evolution and operation of
technology today has led to recent trends that threaten this time-honored approach. In short, the same ingenuity that has improved our lives in so many ways has
also resulted in the proliferation of products and services where providers can no longer assist law enforcement in executing warrants.
--xt fisc fails
FISC oversight fails—NSA lies to FISC on encryption
Carol M. Bast and Cynthia A. Brown 12/16/13---Bast is Associate Professor of Legal Studies, Department of Legal Studies. Brown, J.D.,
Ph.D., is an attorney in private practice with the law firm of Brown and Associates, PLLC. (Bast and Brown, “GUILTY BY ASSOCIATION:
SMALL-WORLD PROBLEM EMPHASIZES CRITICAL NEED FOR BUSINESS STRATEGIES IN RESPONSE TO THE FOREIGN
INTELLIGENCE SURVEILLANCE ACT”, Michigan State Law Review, p. 1086-1088.
http://digitalcommons.law.msu.edu/cgi/viewcontent.cgi?article=1097&context=lr)//ET
In September 2013, The Guardian, The New York Times, and ProPublica disclosed that the NSA used a multi-pronged approach to evading the
encryption of much information traveling on the Internet, such as emails, banking, and medical data.379 One approach is to infiltrate target
computers prior to data being encrypted; a second approach is to break encryption codes; a third approach is to induce
technology companies to allow “back doors” into technology products or to take advantage of security flaws in technology
products; and a final approach is to insert weaknesses into encryption standards.380 In addition, the NSA maintains a library of
encryption keys and is permitted to store encrypted data as long as necessary to decipher it; however, there is some encryption that
NSA has not succeeded in breaking.381 In October 2013, The Washington Post disclosed that the NSA had been attempting to identify Tor
users and their locations.382 Tor, which “originally stood for The Onion Router,” is a network of servers scattered across the globe, together with software to
communicate with the network, providing anonymity to a user to communicate and browse the Web.383 Although the NSA apparently was unsuccessful in
conducting surveillance on communication traveling on the Tor network, NSA was successful in learning the identity of a small number of Tor
users by sending malware to a Tor user’s browser.384 In the wake of the Snowden disclosures, FISC released several opinions concerning NSA
mass surveillance. One opinion, dated October 3, 2011, and authored by Judge John D. Bates concerned “‘upstream collection’ of Internet communications,”
which “refers to NSA’s interception of Internet communications as they transit [redacted], rather than to acquisitions directly from Internet service providers.”385
Judge Bates held that certain NSA targeting and minimization procedures were unconstitutional and that the minimization procedures did not comply with FISA.386 The Court now understands that each year, NSA’s upstream collection likely results in the acquisition of roughly two to ten thousand discrete wholly domestic communications that are neither to, from, nor about a targeted selector, as well as tens of thousands of other communications that are to or from a United States person or a person in the United States but that are neither to, from, nor about a targeted selector. Id. at 72. The opinion and accompanying order were released on August 21, 2013. Bill Chappell, Secret Court: NSA Surveillance Program Was Unconstitutional, NPR.
“NSA’s collection of MCTs [multiple communications] results in the acquisition of a very large number of Fourth Amendment-protected communications that have
no direct connection to any targeted facility and thus do not serve the national security needs underlying the Section 702 collection as a whole.”387 Judge Bates
recognized that the NSA had been collecting Internet data since at least 2008, but that the NSA had delayed until 2011 in bringing this
collection information to the court’s attention.388 The judge pointed out that this was not the first time that the government had
misrepresented its surveillance activities.389 “The Court is troubled that the government’s revelations regarding NSA’s acquisition
of Internet transactions mark the third instance in less than three years in which the government has disclosed a substantial
misrepresentation regarding the scope of a major collection program.”390 Judge Bates referenced earlier NSA activities: “Contrary to the government’s
repeated assurances, NSA had been routinely running queries of the metadata using querying terms that did not meet the required standard for querying.”391
Judge Bates quoted from an earlier opinion of FISC concerning the query standard, which “had been ‘so frequently and systemically violated that it can fairly be
said that this critical element of the overall . . . regime never functioned effectively.’
Oversight fails---XO 12333 prevents it and NSA classifies key info
David Greene and Katitza Rodriguez, 5/29/14---Greene is an EFF Senior Staff Attorney, 2013 California Lawyer Attorney of the Year, and recognized by
SPJ Northern California as the recipient of its 2007 James Madison Freedom of Information Award for Legal Counsel. Professor at the
University of San Francisco School of Law. Law degree from Duke University. Rodriguez is EFF International Rights Director. Worked for the
UN Internet Governance Forum and is a member of the Advisory Board of Privacy International. Katitza holds a Bachelor of Law degree
from the University of Lima, Peru. (Greene and Rodriguez, “NSA Mass
Surveillance Programs Unnecessary and Disproportionate”, Electronic Frontier Foundation. p.8-20.
https://www.eff.org/files/2014/05/29/unnecessary_and_disproportionate.pdf)//ET
Executive Order (EO) 12333 • Executive Order 12333 authorizes surveillance conducted primarily outside the United States, although there are indications that
the government maintains that some amount of US-based surveillance can also occur under this authority.12 President Ronald Reagan issued EO 12333 in
December 1981 to extend the powers and responsibilities of the various US intelligence agencies that existed under previous executive orders. The
organizational structure established by EO 12333 was revised by executive orders in 2004 and 2008, the latter of which consolidated power under the
President’s Director of National Intelligence. The US government asserts that programs conducted under the authority of EO 12333 do not require judicial
approval or non-executive oversight of any type.13 The following is a small subset of publicly-known activities operated under the purported
authority of EO 12333: 12 Executive Order (EO) 12333 was amended on January 23, 2003 by Executive Order 13284, on August 27, 2004 by Executive
Order 13355, and further amended on July 30, 2008 by Executive Order 13470. The resulting text of Executive Order 12333, following the 2008 amendment, is
available here http://www.fas.org/irp/offdocs/eo/eo-12333-2008.pdf 13 http://www.washingtonpost.com/world/national-security/nsa-collects-millions-of-e-mailaddress-booksglobally/2013/10/14/8e58b5be-34f9-11e3-80c6-7e6dd8d22d8f_print.html MYSTIC • Under
this operation, the NSA has built a surveillance system capable of recording “100 percent” of a foreign country’s telephone calls, enabling the agency to rewind
and review conversations as long as a month after they take place.14 MYSTIC has been used against one nation, according to recent leaks, and may have been
subsequently used in other countries. MUSCULAR • This operation, which began in 2009, infiltrates links between global data centers of technology companies,
such as Google and Yahoo!, not on US soil. These two companies responded to the revelation of MUSCULAR by encrypting those exchanges. XKEYSCORE •
XKEYSCORE appears to be the name of the software interface through which NSA analysts search vast databases of information—collected under various other
operations—containing emails, online chats, and the browsing histories of millions of individuals anywhere in the world. The XKEYSCORE data has been shared
with other secret services including Australia's Defence Signals Directorate and New Zealand's Government Communications Security Bureau. BULLRUN • Not
in and of itself a surveillance program, BULLRUN is an operation by which the NSA undermines the security tools relied upon by users, targets and
non-targets, and US persons and non-US persons alike. The specific activities include dramatic and unprecedented efforts to attack security
tools, including: • Inserting vulnerabilities into commercial encryption systems, IT systems, networks, and endpoint communications
devices used by targets; • Actively engaging US and foreign IT industries to covertly influence and/or overtly leverage their commercial
products' designs; • Shaping the worldwide commercial cryptography marketplace to make it more vulnerable to the NSA’s surveillance
capabilities; • Secretly inserting design changes in systems to make them more vulnerable to NSA surveillance, and • Influencing policies,
international standards, and specifications for commercial public key technologies. DISHFIRE • The Dishfire operation is the worldwide mass collection of
records including location data, contact retrievals, credit card details, missed call alerts, roaming alerts (which indicate border crossings), electronic business
cards, credit card payment notifications, travel itinerary alerts, meeting information, text messages, and more. Communications from US phones were allegedly
minimized, although not necessarily purged, from this database. The messages and associated data from non-US persons were retained and analyzed. COTRAVELER • Under this operation, the US collects location information from global cell tower, WiFi, and GPS hubs. This information is collected and analyzed
over time, in part, in order to determine the traveling companions of targets. In addition to these programs, the NSA also surveilled messaging conducted through
“leaky” mobile applications, monitored the mobile phone communications of 35 world leaders, and monitored, for example, approximately 70 million phone calls
per month originating in France and 60 million per month originating in Spain. Also, the NSA collected financial records—180 million in 2011—from SWIFT, the
network used by worldwide financial institutions to securely transmit interbank messages and transactions. US Legal Challenges to NSA Surveillance The US
Government has asserted that its current communications spying operations are fully in compliance with international law, primarily by claiming that its practices
are conducted according to domestic US law. However, there are several ongoing legal challenges in US courts to NSA surveillance, including several in which
EFF serves as counsel.15 These lawsuits challenge the programs as being both unconstitutional—under the 4th Amendment, 1st Amendment, and in some
places the 5th Amendment of the United States Constitution—and illegal under the statutes used to justify them. There have thus far been no legal challenges in
US courts to any of the US actions under the purported authority of EO 12333 and no challenges directly regarding the rights of non-US persons. Challenges to
“Upstream” Internet Surveillance The following lawsuits are challenges to the collection of Internet data through the installation of fiber optic splitters at
transmission hubs: 15 EFF’s statements and positions here are not those of its clients in the litigations where EFF is counsel and nothing said here shall be
construed as a statement or admission by any of those plaintiffs. • Jewel v. NSA (an action by AT&T
customers in a federal court in California);16 • Shubert v. Obama (a class action on behalf of all Americans against the NSA's domestic dragnet surveillance); •
Criminal prosecutions: Section 702 surveillance is being challenged in several cases in which the government has brought criminal charges, largely terrorismrelated. The defendants, many of whom only recently received notice of their prosecution despite being charged long ago, are mounting challenges to the
evidence used against them on the grounds that it was illegally and unconstitutionally collected and used. Challenges to Section 215 Telephone Call Detail
Records Collection The following lawsuits challenge the mass collection of telephone call detail records from US persons: • First Unitarian Church of Los Angeles
v. NSA (an action by 22 organizations in a federal court in California);17 • Jewel v. NSA (see above); • ACLU v. Clapper (an action by the ACLU and its New York
chapter in a federal court in New York; the trial judge dismissed the lawsuit, that dismissal is currently on appeal); • Klayman v. United States (a class action in
the federal court in the District of Columbia; the trial judge found the call detail records surveillance unconstitutional on 4th Amendment grounds; that decision
has been appealed); • Smith v. Obama (an action by an individual filed in a federal court in Idaho); • Paul v. Obama (a class action filed in federal court in the
District of Columbia); • Perez v. Clapper (an action by two individuals filed in a federal court in Texas). These lawsuits all address the legality of the program with
respect to US persons. These lawsuits do not raise the non-discrimination rights of non-US persons under the ICCPR and European law, or the Inter-American
system. 16 Jewel vs. NSA, https://www.eff.org/cases/jewel 17 First Unitarian Church of Los Angeles v. NSA, https://www.eff.org/cases/first-unitarian-church-losangeles-v-nsa Application of the Principles to US Surveillance The US surveillance programs plainly
violate international human rights law, especially when compared to the Necessary and Proportionate Principles; the gaps between US surveillance programs
and the standards for human rights are readily apparent. The Necessary and Proportionate Principles are based upon the existence of a fundamental human
right—the right to privacy—as recognized under international human rights law.18 The right to privacy is not only a fundamental right in and of itself, it bolsters
other fundamental rights as well—including freedom of expression, freedom of information, and freedom of association.19 Definitions “Metadata”/”Content”
Distinction The Principles define “protected information” to include “all information that includes, reflects, arises from or is about a person’s communications and
that is not readily available and easily accessible to the general public.” The definition is aimed at protecting both privacy and freedom of expression, which in
many cases flourishes only with assurances that communications and associations can remain free from governmental tracking. The Principles recognize that
individuals, who believe that the government is gaining access to records containing information that reveals, for example, to whom they are speaking, when they
are speaking, and for how long, especially over time, they are speaking, will be less willing to communicate about sensitive or political topics. In doing so, the
Principles expressly recognize that the old distinctions between content and “non-content” or “metadata” are “no longer appropriate for measuring the degree of
intrusion that communications surveillance makes into individuals’ private lives and associations.” Indeed, “metadata” is information-rich; this information may
reveal a person’s identity, behavior, political and social associations, medical conditions, race, or sexual orientation. The information may enable the mapping of
an individual’s movements and interactions over time, revealing whether the individual was present at a political demonstration, for example. Because of this, the President’s Review Group cited the Principles in noting that the distinction between content and non-content was increasingly untenable.20 18 Universal Declaration of Human Rights Article 12, United Nations Convention on Migrant Workers Article 14, UN Convention of the Protection of the Child Article 16, International Covenant on Civil and Political Rights Article 17; regional conventions including Article 10 of the African Charter on the Rights and Welfare of the Child, Article 11 of the American Convention on Human Rights, Article 4 of the African Union Principles on Freedom of Expression, Article 5 of the American Declaration of the Rights and Duties of Man, Article 21 of the Arab Charter on Human Rights, and Article 8 of the European Convention for the Protection of Human Rights and Fundamental Freedoms; Johannesburg Principles on National Security, Free Expression and Access to Information, Camden Principles on Freedom of Expression and Equality. 19 The freedom of association and freedom of speech are inherently linked. The freedom of association recognizes that individuals may have a stronger and more influential voice in public discussions by joining with other like-minded persons and advocating as a group. The right to privacy bolsters this right by allowing such groups to form and communicate while permitting the individual associates to remain anonymous. This ability to remain anonymous is especially important where the group’s views are unpopular, dissenting, or involve deeply personal private information—situations in which one might choose not to speak at all if the fact of her association with the group were to become known. Useful explanations about how using metadata can reveal intimate and private information
about people are contained in a declaration—filed by Princeton professor, Edward Felten—in support of one of the lawsuits challenging the telephone records
collection and recent research by a team from Stanford University, which notes how intimate details of a persons’ life can be discerned from a relatively small
amount of metadata. 21 The Principles also instruct that “[w]hen adopting a new communications surveillance technique or expanding the scope of an existing
technique, the State should ascertain whether the information likely to be procured falls within the ambit of ‘protected information’ before seeking it, and should
submit to the scrutiny of the judiciary or other democratic oversight mechanism.” The US, particularly in justifying the Section 215 mass collection of call detail
records, has relied on this distinction between “content” and “metadata,” citing Supreme Court authority from over 40 years ago. 22 The US has argued that there
are no privacy interests in non-content information protected by the 4th Amendment. This position is inconsistent with the Principles and inconsistent with the
need to protect privacy and freedom of expression in the digital age. Metadata Matters IP addresses collected by a web service can reveal whether two people
spent the night in the same place. • This is because an IP address at a particular point in time will usually be unique to a single residence. • If two people both
logged in to services from the same IP address late at night and early in the morning, they probably spent the night together in the place distinguished by that IP
address. Stanford researchers found (experimentally) that information about who people call can be used to infer extraordinarily sensitive facts about them,
including the fact that they sought and received treatment for a particular medical condition, that they had an abortion, or that they purchased firearms, among
other things.23 20 “Liberty and Security in a Changing World; Report and Recommendations of The President’s Review Group on Intelligence and
Communications Technologies.” 12 Dec. 2013. http://www.whitehouse.gov/sites/default/files/docs/2013-12-12_rg_final_report.pdf 21 Felton, Edward W. “Case
1:13-cv-03994-WHP Document 27,” filed August 26, 2013. https://www.documentcloud.org/documents/781486-declaration-felten.html 22 Smith v. Maryland, 442
U.S. 735 (1979). http://caselaw.lp.findlaw.com/scripts/getcase.pl?court=US&invol=735&vol=442 23 Mayer, Jonathan and Patrick Mutchler. “MetaPhone: The
Sensitivity of Telephone Metadata.” 12 March 2014. http://webpolicy.org/2014/03/12/metaphone-the-sensitivity-of-telephone-metadata/ Retail stores now have the ability to track individuals' physical whereabouts by observing data packets transmitted from
smartphones and other mobile devices. • They can recognize when people return to a store (and how often), see which part of the store visitors spend their time
in, and figure out how long people wait in lines. • Some entities are in a position to associate this information with a person's name because the entities observe
mobile device identifiers together with other identifying information. Law enforcement and intelligence agencies are using technology to track individuals'
whereabouts—on a massive scale, twenty-four hours a day—whether by directly observing the signals transmitted from phones or by demanding that mobile
carriers turn over information about users' locations. • Information about where people go reveals sensitive religious, medical, sexual, and political information
about them, including the kinds of medical specialists, religious services, or political meetings a person meets with or attends. • Information about the proximity or
lack of proximity of multiple people to one another can reveal individuals who attended a protest, the beginning or end of a romantic relationship, or a person's
marital infidelity. • Information from telephone companies has been repeatedly sought and used to identify the sources who gave information to journalists. First
Look Media's publication, The Intercept, reported that the United States is using telecommunications metadata as a means of targeting lethal drone strikes aimed
at the cellular phones of individual people, recognized by wireless signals that they transmit. In the Ukraine, cell tower dumps were used to determine who had
participated in the Maidan protests against the previous regime, and then to let them know that the government was watching. • The ability to automatically get a
complete list of who attended a protest is an extremely serious threat to the freedom of expression and association if people believe that there is a potential for
future backlash (or violence!) from being identified as a participant. Bulk and Persistent Surveillance According to the Principles, in determining whether
surveillance will sweep up “protected information,” the form, scope, and duration of the surveillance must be considered: “Because pervasive or systematic
monitoring has the capacity to reveal private information far in excess of its constituent parts, it can elevate surveillance of non-protected information to a level of
invasiveness that demands strong protection.”24 24 "Moreover, public information can fall within the scope of private life where it is systematically collected and
stored in files held by the authorities. That is all the truer where such information concerns a person's distant past…In the Court's opinion, such information, when
systematically collected and stored in a file held by agents of the State, falls within the scope of 'private life' for the purposes of Article 8(1) of the Convention."
(Rotaru v. Romania, [2000] ECHR 28341/95, paras. 43-44). The Section 215 program and significant
kinds of collection under Section 702 and EO 12333 involve bulk or mass collection of communications data over an extended period of time on a continuous or
nearly continuous basis. For the Section 215 program, at any point in time, the NSA is likely to have five years' worth of call detail records about an individual.
“Collection” = “Surveillance” = Interference with Privacy Much of the expansive NSA surveillance revealed in the past year has been defended by the United
States on the basis that the mere collection of communications data, even in troves, is not “surveillance” because a human eye never looks at it. Indeed, under
this definition, the NSA also does not surveil a person’s data by subjecting it to computerized analysis, again up until the point a human being lays eyes on it. The
Principles, reflecting the human right to privacy, define "surveillance" to include the monitoring, interception, collection, analysis, use, preservation, and retention of, interference with, or access to information that includes, reflects, arises from, or is about a person's communications in the past, present, or future. States should not
be able to bypass privacy protections on the basis of arbitrary definitions. Applying the Principles The Legality Principle The first of the Necessary and
Proportionate Principles is “Legality.” Any limitation to the right to privacy must be prescribed by law. The State must not adopt or implement a measure that
interferes with the right to privacy in the absence of an existing publicly reviewable legislative act, which meets a standard of clarity and precision that is sufficient
to ensure that individuals have advance notice of and can foresee its application. As the European Court of Human Rights has explained, “Firstly, the law must
be adequately accessible: the citizen must be able to have an indication that is adequate in the circumstances of the legal rules applicable to a given case.
Secondly, a norm cannot be regarded as a ‘law’ unless it is formulated with sufficient precision to enable the citizen to regulate his conduct: he must be able—if
need be with appropriate advice—to foresee, to a degree that is reasonable in the circumstances, the consequences which a given action may entail.”25 Thus
the Legality principle requires that laws be non-secret and subject to oversight and that they not vest governmental officials with excessive discretion.26 25 Judgment in The Sunday Times v. The United Kingdom, Application no. 6538/74, Judgment of 26 April 1979, para. 49. 26 Silver v. the United Kingdom; Petra v. Romania, 1998. The Human Rights Committee takes the very same approach. General Comment No. 34, CCPR/C/GC/34, 12 September 2011, paras. 24-26. http://tbinternet.ohchr.org/_layouts/treatybodyexternal/Download.aspx?symbolno=CCPR%2fC%2fGC%2f34&Lang=en The Legality principle is not a mere reference to domestic law. It is therefore not sufficient for the US to contend that its surveillance
programs are sanctioned by US laws (even if that lawfulness were not subject to ongoing litigation). The Legality principle is violated by the fact that the US
surveillance programs are almost all conducted in secret, and are largely governed by a body of secret law developed by a secret court—the FISC—which
selectively publishes its legal interpretations of the law. Many, if not most, of the FISC’s rulings are not subject to public review or oversight; individuals are thus
uninformed as to what their rights are vis-à-vis the US surveillance programs. Moreover, many of the programs, especially under EO 12333 as described above,
are not subject to any judicial oversight, and lack any defined standards of implementation. This position has been recently confirmed by the UN Human Rights
Committee in its concluding observations on the United States' compliance with the ICCPR. Necessity and Proportionality in Pursuit of a
Legitimate Aim The principle of “Necessity” reflects the requirement under International law that restrictions on fundamental rights, such as the right of privacy,
must be strictly and demonstrably necessary to achieve a legitimate aim. Each of these factors—necessity, legitimate aim, adequacy, and proportionality—is
included in the Principles. As stated in the Principles, the State must establish “that (1) other available less invasive investigative techniques have been
considered, (2) information accessed will be confined to what is reasonably relevant and any excess information collected will be promptly destroyed or returned
to the impacted individual, and (3) information is accessed only by the specified authority and used for the purpose for which the authorization was given.” The
US mass surveillance programs under Section 215 and 702 and EO 12333 fail to meet these requirements in that the dragnet collection of information about
nonsuspicious individuals is a far too inclusive, and thus disproportionate, method. The US government is accumulating a tremendous amount of data and, as the
US concedes, the vast majority of it will ultimately prove to be wholly unrelated to international terrorism. Moreover, the US legal system fails to require a threshold showing for collection of any communications or communications records, or an individualized suspicion for targeting non-US persons. As Martin
Scheinin, the former United Nations special rapporteur on human rights and counterterrorism, has noted, mass surveillance is inherently a disproportionate
measure.27 The collection of all data is seldom, perhaps never, a "necessary" measure, by any definition of the word "necessary." 27 Joergensen, Rikke Frank. "Can human rights law bend mass surveillance?" 27 Feb. 2014. http://policyreview.info/articles/analysis/can-human-rights-law-bend-mass-surveillance Mass surveillance will inevitably and unavoidably sweep up masses of private information that will be of no
use or relevance in antiterrorism investigations. This lack of necessity has been borne out, at least as to the Section 215 surveillance programs, by the reports of
two committees hand-picked by the President: the President's Review Group and the Privacy and Civil Liberties Oversight Board. Each received classified information about the necessity and efficacy of the program, and each concluded that it had not resulted in the prevention of any terrorist attacks and had not been more than marginally useful in a terrorism investigation. Facts: The US is "sitting on the wire," that is, much of the global Internet traffic travels through wires
on US territory. The NSA accesses this traffic to illegitimately track who visits online pornography websites, and uses this information to discredit those it deems dangerous.28 The FISA surveillance law was originally intended to be used only in certain specific, authorized national security investigations. But information-sharing rules implemented after 9/11 allow the NSA to hand over information to traditional domestic law-enforcement agencies, without any connection to
terrorism or national security investigations.29 As the NSA scoops up phone records and other forms of electronic evidence while investigating national security
and terrorism leads, they have turned over "tips" to a division of the Drug Enforcement Administration, a use that falls outside the specific Legitimate Aim
identified.30 The telephone records program, at least, has now been evaluated by two hand-picked Presidential panels to be unnecessary, since it has not had a
significant impact in preventing terrorist attacks or been more than marginally useful to terrorism investigations in the United States.31 Competent Judicial
Authority The Principles require that “determinations related to communications surveillance must be made by competent judicial authority that is impartial and
independent. This judicial authority must be: 1) separate from the authorities conducting communications surveillance; 2) conversant in issues related to and competent to make judicial decisions about the legality of communications surveillance, the technologies used and human rights; and 3) have adequate resources in exercising the functions assigned to them." 28 Opsahl, Kurt. "The NSA is Tracking Online Porn Viewing to Discredit 'Radicalizers.'" 27 Nov. 2013. https://www.eff.org/deeplinks/2013/11/nsa-tracking-online-porn-viewing-discredit-radicalizers 29 Fakhoury, Hanni. "DEA and NSA Team Up to Share Intelligence, Leading to Secret Use of Surveillance in Ordinary Investigations." 6 Aug. 2013. https://www.eff.org/deeplinks/2013/08/dea-and-nsa-team-intelligence-laundering 30 Id. 31 Report and Recommendations of The President's Review Group on Intelligence and Communications Technologies. 12 Dec. 2013. http://www.whitehouse.gov/sites/default/files/docs/2013-12-12_rg_final_report.pdf See EFF's "Statement on President's Review Group's NSA Report." 18 Dec. 2013. https://www.eff.org/deeplinks/2013/12/eff-statement-presidents-review-groups-nsa-report See "President's Review Group Puzzler: Why is Massively Overbroad Surveillance Wrong under 215 but OK under Section 702?" 10 Jan. 2014. https://www.eff.org/deeplinks/2014/01/presidents-review-group-puzzlerwhy-mass-surveillance-wrong-under-215-ok-under Significant doubts exist as to whether
the mass surveillance operations are reviewed by “competent” judicial authority. With regard to surveillance under Patriot Act section 215 or FISA Amendments
Act section 702, there are serious questions about whether the FISC has a sufficient understanding of the technologies used, or has sufficient resources to
conduct the oversight required of it. The Chief Judge of the FISC, Judge Walton, has recognized that the court is limited in its ability to scrutinize the NSA's
abuses: “The FISC is forced to rely upon the accuracy of the information that is provided to the Court…The FISC does not have the capacity to investigate issues
of noncompliance."32 And as discussed above, there is no judicial oversight at all for NSA surveillance justified under EO 12333. Facts: EO 12333
programs, consisting mainly of foreign collection, are conducted without any judicial involvement.33 Oversight of domestic collection programs is conducted by a
secret court, the Foreign Intelligence Surveillance Court. The FISC is fully dependent on the authorities conducting the surveillance to provide it with information
about their activities. Due Process The Principles require that every individual seeking a determination about whether or not her human rights are being infringed
upon have access to “a fair and public hearing within a reasonable time by an independent, competent and impartial tribunal established by law.” NSA
surveillance violates this principle in that those whose information is gathered are given neither notice nor any opportunity to contest the practice. The FISA and
the FISA Amendments Act specifically limit judicial access to the FISC to the third-party entities from which the information is sought. Those to whom the
information pertains have no opportunity to contest the demand made to the third party. Moreover, the US has stated that no telecommunication service provider
who has been required to produce records under Sections 215 or 702 has ever contested those demands in the FISC. As a result, the FISC proceedings have
been non-adversarial within a traditionally adversarial judicial system—with the government presenting its case, but with no one representing the case against such surveillance practices. 32 Leonnig, Carol D. "Court: Ability to police U.S. spying program limited." 15 Aug. 2013. http://www.washingtonpost.com/politics/courtability-to-police-us-spying-program-limited/2013/08/15/4a8c8c44-05cd-11e3-a07f-49ddc7417125_print.html 33 Jaycox, Mark M. "Three Leaks, Three Weeks, and What We've Learned About the US Government's Other Spying Authority: Executive Order 12333." 5 Nov. 2013. https://www.eff.org/deeplinks/2013/10/three-leaks-three-weeks-and-what-weve-learned-aboutgovernments-other-spying While the litigation described above is attempting to bring at least some process to bear on the surveillance, the US government's position is that all
such challenges should be dismissed without a substantive review of its activities. Facts: NSA surveillance violates due process since, at least as the government
currently maintains, those subject to it have no right to learn about it, much less challenge it. The New York Times reports that communications between an
American law firm and its foreign client may have been among the information the Australian Signals Directorate shared with the NSA. Surveillance of attorney-client communications is anathema to the fundamental system of justice.34 User Notification The Principles, with certain exceptions, require that individuals be
notified of decisions authorizing surveillance of their communications with enough time and information to appeal the decision or seek other forms of remedial
relief. However, with few exceptions, the Section 215 and 702 programs are conducted in secret and individuals are never notified that the NSA is collecting their
communications data. Surveillance under EO 12333 is similarly conducted without notice. Moreover, those telecommunications service providers that do receive
demands for business records, under Section 215, or any materials as described in National Security Letters, are forbidden from notifying anyone of the
demands. These gags are perpetual. Facts: NSA surveillance prevents those surveilled from being notified about it, much less notified in time to either challenge it
beforehand or seek some remedial relief afterwards. The purported governing legal authority fails to require the NSA to provide notice, and requires that
permanent gag orders be placed on service providers who were ordered to disclose their customers’ data. Transparency and Public Oversight The Principles
require that States be transparent about their use and scope of communications surveillance techniques and powers, and that they publish enough information to
enable the public "to fully comprehend the scope, nature and application of the laws permitting communication surveillance." Service providers must be able to publish the procedures they apply when addressing surveillance, adhere to those procedures, and publish records of surveillance. 34 Kayyali, Nadia. "The Tepid NSA-American Bar Association "Dialogue" Around Spying on Lawyers." 21 March 2014. https://www.eff.org/deeplinks/2014/03/tepidnsa-american-bar-association-dialogue-around-spying-lawyers The Principles further require that, "States should establish
independent oversight mechanisms to ensure transparency and accountability of communications surveillance.” The Principles require independent oversight
mechanisms in addition to any oversight provided through another branch of the government. NSA surveillance does not meet these requirements. The NSA
surveillance programs operate almost entirely in secret. Indeed, much of what we know now about the programs was provided to the public by various
whistleblowers. The US government, until very recently, has steadfastly worked to make sure that the public does not “fully comprehend the scope, nature and
application of the laws permitting communications surveillance.” Moreover, service providers receiving demands for customer information are typically gagged
from reporting even the fact of the demand. First, many of the NSA surveillance programs are subject to no external oversight at all, such as
those under EO 12333. Second, even the programs subject to Congressional and judicial review face problems with transparency and accountability.35
Although the programs run under the FISA are subject to FISC review—which has not been completely toothless; the FISC shut down the phone records
collection for 9 months in 2009 because of the government’s failure to comply with minimization procedures—there is no oversight provided by an
external entity, as required by the Principles. Moreover, because it lacks technical expertise in anti-terrorism, the FISC is often forced to defer to the
judgments made by the NSA regarding the effectiveness and necessity of the surveillance operations. The Senate Intelligence Committee, which provides
Congressional oversight of the NSA, relies on the information provided by the NSA. Many members of Congress have complained of a lack of candor and a
failure to provide sufficient information to allow them to conduct genuine oversight.36 Facts: Members of US Congress confirm that they were repeatedly
misled about the mass surveillance or denied reasonable access to information necessary to conduct oversight.37 35 Cohn, Cindy
and Mark M. Jaycox. “NSA Spying: The Three Pillars of Government Trust Have Fallen.” 15 Aug. 2013. https://www.eff.org/deeplinks/2013/08/nsa-spying-threepillars-government-trust-have-fallen 36 Timm, Trevor. ”A Guide to the Deceptions, Misinformation, and Word Games Officials Use to Mislead the Public About
NSA Surveillance.” 14 Aug. 2013. https://www.eff.org/deeplinks/2013/08/guide-deceptions-word-games-obfuscations-officials-use-mislead-publicabout-nsa 37
Electronic Frontier Foundation. “The Government's Word Games When Talking About NSA Domestic Spying.” https://www.eff.org/nsaspying/wordgames
Similarly, the Chief Judge of the FISC has confirmed that the court cannot conduct
broad oversight of the NSA.38 Recently the government has allowed service providers to release very general information about
requests for information by the NSA, but those disclosures are still grossly insufficient.
NSA illegally bypasses FISC oversight on encryption
Shaina Kalanges spring 2014--- a second-year law student at the Northern Illinois University College of Law with a Bachelor of Arts from
the University of Illinois Urbana-Champaign. She is an Assistant Editor of the Northern Illinois University Law Review. (Kalenges, “Modern
Private Data Collection and National Security Agency Surveillance: A Comprehensive Package of Solutions Addressing Domestic Surveillance
Concerns” p.667. NIU law review. http://law.niu.edu/law/organizations/law_review/pdfs/full_issues/34_3/Kalanges_FINAL%206.pdf)//ET
Edward Snowden’s leak of classified information informed the public about the scope of NSA surveillance techniques.203 Snowden reported
accusations of the NSA continuously
violating federal laws and FISC rulings, hacking into "communications links of major data centers"
worldwide to gain access to user information, breaking online “encryption systems,” and blatantly lying about collecting information on U.S.
citizens to Congress.204 Now Snowden faces criminal charges of espionage and theft.205 Russia granted Snowden political asylum in August 2013 to
escape prosecution in the United States.206 Snowden fled the U.S. in fear of receiving an unfair trial following his leak to the press.207 However, there is some
disagreement as to whether or not Snowden did anything wrong.20
NSA bypasses FISC oversight on encryption—2011 case set precedent
Julian Sanchez Nov/Dec 13 - Sanchez is a research fellow at the Cato Institute, where he studies issues at the busy intersection of technology and privacy. Wired named him one of the top 15 government and security resources. (Sanchez, "Decoding the
Summer of Snowden”. CATO institute. http://www.cato.org/policy-report/novemberdecember-2013/decoding-summersnowden)//ET
Once communications are seized, even if they’re domestic, they can be retained for further analysis if encrypted, or if they contain
evidence of a crime unrelated to terrorism. Perhaps most disturbingly, analysts can search through NSA’s huge trove of intercepted communications for
“selectors” associated with particular Americans. Initially, the rules prohibited such “backdoor searches,” but government lawyers were able
to get that restriction lifted in 2011. The blanket surveillance orders issued under the FAA resemble nothing so much as a modern
version of the “general warrants” — or “writs of assistance” — that outraged the American colonists and inspired the Fourth Amendment. They may
“target” information about foreigners, but they give the NSA — not neutral judges — the discretion to determine which particular “places” and digital “papers” will
be searched or seized. Gripped by the fear of terrorism, Americans have allowed the resurrection of the very practice that once sparked a revolution. Since its
inception, breaking enemy codes and ciphers has been one of the primary missions of the NSA. In recent years, however, the agency has taken that a step
further: Now it seeks to ensure that the encryption software relied on by millions of ordinary Internet users — from businesspeople engaged in
sensitive professional communications to dissidents in repressive regimes — comes out of the box pre-broken. The idea that the government should
have backdoor access to encrypted communications was proposed and hotly debated in the late 1990s — and, wisely, defeated thanks to strong opposition from
both privacy advocates and security experts. Having lost the public debate, NSA pressured software developers to include those backdoors
secretly. Under a program known as BULLRUN, the agency has sought, in its own words, to “insert vulnerabilities into commercial encryption
systems, IT systems, networks and endpoint communications devices used by targets,” and to “covertly influence” the design of commercial software —
potentially infiltrating companies when necessary — in order to “make it more tractable to advanced cryptanalytic capabilities being developed by
NSA.”
A2 ADV – CYBERSECURITY
--xt sq solves
SQUO solves, private companies are already increasing encryption standards
Kharpal 15 – News Assistant for CNBC in London (Arjun, “iPhone encryption 'petrified’ NSA: Greenwald”, 3/18/15,
http://www.cnbc.com/2015/03/18/iphone-encryption-petrified-nsa-greenwald.html)
Stronger encryption in Apple's iPhones and on websites like Facebook has "petrified" the U.S. government because it has made
it harder to spy on communications, Glenn Greenwald, the writer who first reported on Edward Snowden's stolen files, told
CNBC. Former National Security Agency (NSA) contractor Edward Snowden caused major shockwaves around the world in
2013 when he unveiled the surveillance body's wide-ranging spying practices, which included regularly attempting to snoop on
data held by major technology companies. Glenn Greenwald, the man who helped Snowden publish the documents, said that
Silicon Valley companies have bolstered the encryption on their products, thereby making it harder for governments to
eavesdrop. "They (Apple) are now starting to put serious encryption technologies in their new iPhones in their new releases and
this has really petrified governments around the world," Greenwald told CNBC in an interview at tech fair CeBIT in Germany.
Apple, Google, Facebook and Yahoo are some of the major companies that have been in the spotlight after Snowden's
revelations. Information from the Snowden documents released earlier this month detailed how the CIA had been trying for a
decade to crack the security in Apple's products. And last year, Yahoo revealed that it was threatened with a $250,000 per day
fine if it didn't hand over data to the NSA. The tech giants have been taking major steps to make sure their communications are
safe from spying, a move Greenwald – who won a Pulitzer prize for his reporting on the topic – said was motivated by the fear of
losing customers rather than care for data privacy. "I don't…(think) they suddenly care about privacy," Greenwald said.
"If…you're a Facebook executive or an Apple executive, you're extremely worried that the next generation of users…are going to
be vulnerable to the pitch from Brazilian, and Korean and German social media companies where they advertise and say don't
use Facebook and Google because they'll give your data to the NSA." Snowden is due to address CeBIT later today.
1nc cybersec – investigations turn
They’ve got it backwards – vulnerabilities are inevitable and backdoors are key to cybersecurity
Hess, Executive Assistant Director of the FBI, 15 (Amy Hess, Executive Assistant Director of the FBI,
“ENCRYPTION TECHNOLOGY POLICY ISSUES”, 4/29/15,
HTTP://congressional.proquest.com.proxy.lib.umich.edu/congressional/docview/t39.d40.04293003.d94?accountid=14667)//EM
The reality is that cyber adversaries will exploit any vulnerability they find. But security risks are better addressed by developing
solutions during the design phase of a specific product or service, rather than resorting to a patchwork solution when law enforcement presents
the company with a court order after the product or service has been deployed. To be clear, we in the FBI support and encourage the use of secure
networks and sophisticated encryption to prevent cyber threats to our critical national infrastructure, our intellectual property, and our data. We have been on the
front lines of the fight against cybercrime and economic espionage and we recognize that absolute security does not exist in either the physical or
digital world. Any lawful intercept or access solution should not lower the overall security. But without a solution that enables law
enforcement to access critical evidence, many investigations could be at a dead end. The same is true for cyber security
investigations; if there is no way to access encrypted systems and data, we may not be able to identify those who seek to
steal our technology, our state secrets, our intellectual property, and our trade secrets.
1nc cybersec – tracking turn
Turn – encrypted software doesn't allow for malware tracking, increasing the number of successful hacks
Aggarwal 15 (Varun, Principal Correspondent at The Economic Times, “Here's how data encryption is making companies
less secure”, 14 April 2015, http://articles.economictimes.indiatimes.com/2015-04-14/news/61142361_1_malware-credit-carddata-theft-encryption)
While security experts have been advising companies to encrypt all their sensitive data to secure themselves post the NSA
snooping scandal, a new study by Dell reveals that encryption could be doing just the opposite. "Although there are many benefits to
using more Internet encryption, we are seeing a less positive trend emerge as hackers exploit this encryption as a way of "hiding"
malware from corporate firewalls," Amit Singh, country manager, Dell SonicWALL, India, told ET. In early 2014, hackers successfully
distributed malware to about 27,000 Europeans per hour over the course of four days, simply by infecting a group of banner
advertisements on Yahoo's news site. "Since Yahoo's site was encrypted, this malware was able to tunnel through users'
firewalls unseen," Dell's annual security threat report said. Dell saw an increase in the volume of HTTPS web connections from 182 billion in January 2014 to
382 billion in January 2015, and this number continues to grow. As of March 2015, the number was 437 billion. " More companies were exposed to
attackers hiding in plain sight as a result of SSL/TLS encrypted traffic. For many years, financial institutions and other companies
that deal with sensitive information have opted for the secure HTTPS protocol that encrypts information being shared. Now other
sites like Google, Facebook, and Twitter are adopting this practice as well in response to a growing demand for user privacy and
security," the report said. "The only way to manage this threat is by using new age firewalls that provide SSL inspection, thereby telling you if there is any
malicious code in the encrypted traffic," Singh said. The report also highlighted a spike in attacks on retail point of sale systems. In US, Home Depot,
Target, Michaels, and Staples all became targets of credit card data theft, with each breach exposing millions of consumers to
potential fraudulent purchases and/or identity theft. Target's was considered the largest breach in the history of U.S. retail, with
40 million card numbers stolen, until Home Depot's breach compromised 56 million card numbers just a few months later . In the
case of Home Depot and Michaels, the attacks took place over several months before they were detected. Dell SonicWALL saw 13 new types of POS malware in
2014, compared with three in 2013 - a 333% increase in the number of new POS malware countermeasures developed and deployed. The majority of these POS
hits targeted the U.S. retail industry. However, Singh said that with the growth of retail in India, Indian customers could possibly face similar attacks.
--a2 private sector solves
Private sector cannot self-regulate—need federal regulations
Nakashima 10 (Ellen Nakashima, staff writer for the Washington Post, "Federal regulation urged on
Cybersecurity” 02/24/10 http://www.washingtonpost.com/wp-dyn/content/article/2010/02/23/AR2010022305033.html)
The federal government must become more aggressive in getting industry to protect computer networks because self-regulation is
not working, leading cybersecurity experts told Congress on Tuesday. The private sector has pushed back, arguing that it can protect itself,
Senate Commerce Committee Chairman John D. Rockefeller IV (D-W.Va.) said at a hearing on protecting critical industry systems. Rockefeller is preparing
legislation with Sen. Olympia J. Snowe (R-Maine) to strengthen cybersecurity. "Many people will say we should let the market fix it," said James A. Lewis,
a technology expert at the Center for Strategic and International Studies. "The government needs to give the market a kick." He noted that cars were not made safe until government pressure changed automakers' behavior. Former director of national intelligence Mike McConnell, now
an executive at Booz Allen Hamilton, a large federal cyber contractor, said that any fix must be mandatory "because industry is not going to
embrace it unless they're forced to do it." "There may be a role for regulation in terms of identity management" to safeguard the telecommunications
networks, said James "Jamie" Barnett Jr., a senior Federal Communications Commission official. " You can't have piecemeal answers. A regulatory
framework may help."
--xt no attacks
Cyberwar isn’t happening, hasn’t happened, and won’t happen
Rid 2013 (Thomas Rid, reader in the Department of War Studies at King's College London and the author of "Cyber War Will Not Take Place", New Scientist issue 2933, "Why a Cyberwar Won't Happen," 07/07/2013
https://www.newscientist.com/article/mg21929334.800-why-a-cyberwar-wont-happen/)NF
Leaks revealed last week that the US government spends a staggering $4.3 billion a year on cyber operations. In 2011,
American intelligence agencies reportedly mounted 231 offensive operations. The US, it seems, is gearing up for cyber combat. What
would an act of cyberwar look like? History suggests three features. To count as an armed attack, a computer breach would need to be violent. If it can't hurt or kill, it can't be war. An act of cyberwar would also need to be instrumental. In a military
confrontation, one party generally uses force to compel the other party to do something they would otherwise not do. Finally, it would need
to be political, in the sense that one opponent says, “If you don’t do X, we’ll strike you.” That’s the gist of two centuries of
strategic thought. No past cyberattack meets these criteria. Very few meet even a single one. Never has a human been
injured or hurt as an immediate consequence of a cyberattack. Never did a state coerce another state by cyberattack.
Very rarely did state-sponsored offenders take credit for an attack. So if we’re talking about war – the real thing, not a metaphor, as in the “war
on drugs” – then cyberwar has never happened in the past, is not taking place at present, and seems unlikely in the
future. That is not to say that
cyberattacks do not happen. In 2010, the US and Israel attacked Iran’s nuclear enrichment programme with a computer worm called
Stuxnet. A computer breach could cause an electricity blackout or interrupt a city’s water supply, although that also has never happened. If that
isn’t war, what is it? Such attacks are better understood as either sabotage, espionage or subversion. Code-borne sabotage is a
real risk. Industrial control systems run all sorts of things that move fast and can burn: trains, gas pipelines, civilian aircraft, refineries, even
elevators and medical devices. Many of these are highly susceptible to breaches, and information about system vulnerabilities is easily available.
Even so, the number of violent computer-sabotage attacks against Western targets is zero. Why? Because causing havoc
through weaponised code is harder than it looks. Target intelligence is needed. Control systems are often configured
for specific tasks, limiting the possibility of generic attacks. Even if they happened, such attacks may not constitute a
use of force. The second threat is cyber espionage. Data breaches are not just a risk, but a real bleeding wound for the US, Europe and other
advanced economies. But espionage is not war, and cyber espionage is not cyberwar. Finally, there is subversion – using social media
and other internet services to undermine established authority. It is not a surprise that subversives, from Anonymous and Occupy to Arab
protesters, use new technologies. Twitter and Facebook have made organising non-violent protest easier than ever, often in the service of liberty
and freedom. But again, subversion is not war, and cyber subversion is not cyberwar . There are other problems with the concept of
cyberwar. First, it is misleading. Closer examination of the facts reveals that what is happening is the opposite of war:
computer breaches are less violent than old-style attacks. Violent sabotage is harder if it is done through computers, while nonviolent sabotage is now easier and is happening more often: crashing websites, deleting files and so on. The same goes for espionage: infiltrating
software and opening remote back doors is much less risky than sending in human agents and clandestinely bugging embassy walls.
--a2 impact – cyber attacks
There are tons of cyberattacks now- should have triggered their impact
Bennett 15 (Cory Bennett, Cybersecurity reporter at The Hill, degree from Columbia University, Assistant editor of Warren
Communications News, Inc, The Hill, “Cyberattacks on federal government hit record high” 03/04/15,
http://thehill.com/policy/cybersecurity/234601-cyberattacks-on-government-hit-record-high)
Federal network cybersecurity incidents were up 15 percent in fiscal 2014 from the previous year, according to a recent government report. An
annual Office of Management and Budget (OMB) report details information security practices across the government. ADVERTISEMENT A “cybersecurity
incident” doesn’t necessarily mean a network was breached, but it does mean hackers were trying. Those efforts hit record highs in FY 2014, up to
70,000. Nearly half of these incidents “were related to or could have been prevented by strong authentication,” the report said. “Although some agencies are
making significant progress, this report underscores the troubling reality that cyber attacks and intrusions continue to occur at an increasing
rate, and agencies need to be better prepared,” said Senate Homeland Security and Governmental Affairs Ranking Member Tom Carper (D-Del.), in a
Wednesday statement. The OMB cautioned that some of the increased activity noted in the report is the result of better monitoring efforts. Indeed, researchers
gave agencies credit for improving their network monitoring and improved user authentication systems. Nearly all agencies, 92 percent, now have some sort of
continuous monitoring program in place, up from 81 percent last year. This equips agencies with “tools and practices to better manage cyber vulnerabilities when
they arise,” the report said. Additionally, nearly three-quarters of agencies now use secure log-in methods in some capacity, up from about two-thirds last year.
This means more agencies are making staffers use some sort of unique personal identification card to log in, instead of a generic or transferable method like a
password and username. In civilian agencies, however, only 41 percent of user accounts use strong authentication methods, “well below” target, the OMB said.
“Now more than ever, the federal government needs to fully implement meaningful security programs that can withstand the serious cyber challenges our
nation faces today and will face for the foreseeable future,” Carper said. Congress must help government agencies achieve this goal, he added.
Lawmakers were able to pass a rare cybersecurity bill late last year that updated the 12-year-old Federal Information Security Management Act. The bill gave
greater authority to OMB and the Department of Homeland Security in creating and implementing security strategies for federal agencies. “This report makes it
clear that we cannot rest on our laurels,” Carper said.
--a2 impact – cyber-terror
Cyberattacks aren’t enough for terrorists- they don’t care
Lewis 02 (James A. Lewis, Director and Senior Fellow of the technology and public policy program at the center for strategic
and international studies at Washington, D.C., Former member of U.S. Foreign Service and Senior Executive Service, Ph.D.
from University of Chicago, Center for Strategic and International Studies “Assessing the Risks of Cyber Terrorism, Cyber War,
and Other Cyber Threats," December 2002, pages 9-10, http://www.steptoe.com/publications/231a.pdf)NF
Much of the early work on the ‘cyber threat’ depicted hackers, terrorists, foreign spies and criminal gangs who, by typing a few commands
into a computer, can take over or disrupt the critical infrastructure of entire nations. This frightening scenario is not supported by
any evidence. Terrorist groups like Al Qaeda do make significant use of the Internet, but as a tool for intra-group communications, fund-raising and public
relations. Cyber terrorist could also take advantage of the Internet to steal credit card numbers or valuable data to provide financial support for their operations.
Cyber-terrorism has attracted considerable attention, but to date, it has meant little more than propaganda, intelligence collection
or the digital equivalent of graffiti, with groups defacing each other’s websites. No critical infrastructures have been shut down by
cyber attacks. Terrorists seek to make a political statement and to inflict psychological and physical damage on their targets. If
terrorism is an act of violence to achieve political objects, how useful will terrorists find an economic weapon whose effects are
gradual and cumulative? One of Al Qaeda’s training manuals, “Military Studies in the Jihad Against the Tyrants” notes that explosives are the
preferred weapon of terrorists because "explosives strike the enemy with sheer terror and fright." Explosions are dramatic, strike fear into the hearts of
opponents and do lasting damage. Cyber attacks would not have the same dramatic and political effect that terrorists seek . A cyber attack,
which might not even be noticed by its victims, or attributed to routine delays or outages, will not be their preferred weapon. If terrorism is an act of violence to
create shock and achieve political objects, how useful will terrorists find an economic tool whose effects are at best gradual and cumulative?
--a2 impact – econ
Cyberattack on Sony was the largest on American soil, and not even that caused economic damage
Ando 2015 (Ritsuko Ando, previous Reuters bureau chief in Finland and current staff writer, Reuters Magazine, "Sony CEO sees no major financial impact from cyberattack," 01/06/2015, http://www.reuters.com/article/2015/01/07/us-sony-cybersecurityidUSKBN0KF1ZW20150107)NF
Sony Corp Chief Executive Kazuo Hirai on Tuesday said he does not expect the November cyber attack
on the company's film studio to have a significant financial impact, two weeks after the studio rolled out the movie at
the heart of the attack. The studio, Sony Pictures Entertainment, said separately that the film, "The Interview," has generated
revenue of $36 million. Hirai told reporters at the Consumer Electronics Show in Las Vegas that he had signed off on
all
major decisions by the company in response to the attack, which the U.S. government has blamed on
North Korea. Sony's network was crippled by hackers as the company prepared to release "The Interview," a comedy
about a fictional plot to assassinate North Korean leader Kim Jong Un. The attack was followed by online leaks of unreleased
movies and emails that caused embarrassment to executives. "We are still reviewing the effects of the cyber attack,"
Hirai told reporters. "However, I do not see it as something that will cause a material upheaval on Sony
Pictures business operations, basically, in terms of results for the current fiscal year." Sony Pictures said
"The Interview," which cost $44 million to make, has brought in $31 million in online, cable and satellite
sales and was downloaded 4.3 million times between Dec. 24 and Jan. 4. It has earned another $5 million
at 580 independent theaters showing the movie in North America. Sony's unprecedented simultaneous release in
cinemas and online came together after it canceled the planned Christmas Day wide release of "The Interview" because major
movie theater chains refused to screen it following threats of violence from hackers opposed to the film. That decision drew pointed
criticism, including from President Barack Obama, that Sony had caved to hackers. Within a week, the studio struck deals with
small movie theaters that said they were keen to defend free expression and with technology giants like Google Inc for a
simultaneous online release on sites like YouTube Movies. Apple's iStore came aboard a few days later, as did major pay TV
providers. It is still unclear if Sony Pictures will recoup the costs of the film, starring Seth Rogen and James Franco, including an
estimated $30 million to $40 million marketing bill. On Monday, Hirai praised employees and partners of the Hollywood
movie studio for standing up to "extortionist efforts" of hackers, his first public comments on the attack launched on
Nov 21. Sony Pictures may need several more weeks to rebuild its computer network after what has been
deemed as the most destructive cyber attack on a company on U.S. soil. North Korea has denied it is behind the
attack.
A2 ADV – ECON
--xt sq solves tech
The Tech Industry is booming now – 2.4% increase in revenue this year
Cassagnol 7/15 [Danielle, Writer for the CEA, “New Tech to Drive CE Industry Growth in 2015, Projects CEA’s Midyear
Sales and Forecasts Report”, BusinessWire, July 15, 2015, http://www.businesswire.com/news/home/20150715006129/en/TechDrive-CE-Industry-Growth-2015-Projects#.VanNIflViko, July 18, 2015] KL
Consumer demand for emerging technology is redefining the consumer electronics (CE) landscape . According to the U.S. Consumer
Electronics Sales and Forecasts, the semi-annual industry report released today by the Consumer Electronics Association (CEA)®, retail revenues for the
consumer electronics (CE) industry are now projected to grow 2.4 percent in 2015 to reach $285 billion, led by 101 percent year-over-year growth in emerging product categories. CEA's consensus forecast reflects U.S. factory sales to dealers and covers more than 100 CE
products. The bi-annual report serves as a benchmark for the CE industry, charting the size and growth of underlying categories. Overall Revenue Growth The
July U.S. Consumer Electronics Sales and Forecasts report projects that total industry revenue will reach a high of $285 billion,
accounting for retail markup, or $222.7 billion wholesale in 2015, a steady, 2.4 percent increase from $217.6 billion in sales in
2014. This midyear update is a slight downward adjustment from CEA’s projection in January, following slow economic growth in the first half of the year.
Looking ahead to 2016, CEA expects industry sales to grow by 2.7 percent, with industry revenues reaching an all-time high of
$228.8 billion. “Consumer technology is about constant and continuous innovation and that is what we are seeing in 2015,” said CEA President and CEO,
Gary Shapiro. “As the technology industry naturally ebbs and flows, a new class of tech is generating lots of enthusiasm among consumers. Emerging categories
such as 4K Ultra HD, smart home and health and fitness technology, are the breakout stars driving the industry onward and upward.” Emerging Categories
CEA’s forecast projects that revenues from emerging product categories will grow by 101 percent year-over-year in 2015. These developing technology
categories include 3D printers, 4K Ultra High-Definition (UHD) televisions, connected home technologies, unmanned aerial vehicles (drones), health and fitness
technology, home robots, smart eyewear and smart watches. While the emerging product categories represent less than five percent of the
entire CE industry revenue forecast, they are expected to contribute roughly $10 billion to overall CE revenue in 2015. Without
these categories, overall industry revenue would not sustain any growth in 2015. A few of the stand out products include: Health and fitness technology: Led by
the popularity of activity tracking devices, health and fitness devices will lead unit sales among all wearables in 2015 with a projected 20.3 million units (a 21
percent increase from last year), with revenue reaching $1.8 billion in 2015 (an 18 percent increase year-over-year). Connected Home Technologies: Including
smart thermostats, smart smoke detectors, IP cameras, smart home systems, smart locks, connected switches, dimmers and outlets, the booming connected
home technology industry is expected to reach $967 million in revenue in 2015, jumping 32 percent over last year. Drones: CEA market research expects 2015 to
be a defining year for drones, with the category ideally positioned for steady growth. According to CEA projections, the U.S. market will approach $105 million in
revenue in 2015 (increasing by more than 52 percent from 2014) with unit sales expected to approach 700,000, an increase of 63 percent. “The back half of
the year should give way to improving financial conditions that will drive consumer spending, setting up a stronger second half for
consumer tech,” said Shawn DuBravac, Ph.D., chief economist of CEA and author of the New York Times best-seller Digital Destiny: How the New Age of
Data Will Transform the Way We Work, Live, and Communicate. “The test that remains for 2015 is if the impressive growth driven by nascent, emerging
categories, as well as subsector growth, can offset some declines in mature categories and drive the tech industry towards sustained growth in 2015.”
--xt sq solves econ
Status quo solves – USA economy high right now
Mutikani 7/16 (Lucia, Journalist and Correspondent for Thomson Reuters, “U.S. jobless claims, housing data point to firming
economy”, http://www.reuters.com/article/2015/07/16/us-usa-economy-jobs-idUSKCN0PQ1AJ20150716)
The number of Americans filing new applications for unemployment benefits fell more than expected last week and confidence
among homebuilders held at a more than 9-1/2-year high in July, indicating underlying momentum in the economy. The solid
labor market and firming housing sector, underscored by Thursday's reports, suggest the economy likely is strong enough to support an
interest rate hike this year. Manufacturing, however, continues to struggle. "Overall, the economy continues to move in the right direction. We
look for the Federal Reserve to hike rates twice before the end of the year beginning in September," said John Ryding, chief economist at RDQ Economics in
New York.Fed Chair Janet Yellen told lawmakers on Wednesday that the U.S. central bank remained on course to tighten monetary policy "at some point this
year." Initial claims for state unemployment benefits fell 15,000 to a seasonally adjusted 281,000 for the week ended July 11 , the
Labor Department reported. The decline reversed the prior week's rise and ended three straight weeks of increases. Economists had forecast claims
falling to 285,000 last week. While claims tend to be volatile during the summer when automakers normally shut assembly plants for annual retooling, a
Labor Department official said there was nothing unusual in the state-level data. Some automakers keep production running, which can throw off a model the
government uses to smooth the data for seasonal fluctuations. The four-week moving average of claims, considered a better measure of
labor market trends as it irons out week-to-week volatility, increased 3,250 to 282,500 last week. It was the 16th straight week
that the four-week moving average of claims held below 300,000, a threshold normally associated with a firming labor market .
The dollar rose against a basket of currencies, while prices for short-dated U.S. government debt fell. Stocks on Wall Street were
higher after Greece's Parliament voted to approve a new bailout program and Citigroup (C.N) reported its highest quarterly profit in eight years.
--xt no econ impact
No chance of war from economic decline---best and most recent data
Drezner 12 [Daniel W., Professor, The Fletcher School of Law and Diplomacy, Tufts University, “The Irony of Global
Economic Governance: The System Worked,” International Institutions and Global Governance Program, October 2012,
http://www.cfr.org/international-organizations-and-alliances/irony-global-economic-governance-system-worked/p29101, July 17,
2015] KL
The final outcome addresses a dog that hasn’t barked: the effect of the Great Recession on cross-border conflict and violence. During the
initial stages of the crisis, multiple analysts asserted that the financial crisis would lead states to increase their use of force as a tool
for staying in power.37 Whether through greater internal repression, diversionary wars, arms races, or a ratcheting up of great power
conflict, there were genuine concerns that the global economic downturn would lead to an increase in conflict. Violence in the Middle East, border disputes in
the South China Sea, and even the disruptions of the Occupy movement fuel impressions of a surge in global public disorder. The aggregate data suggests otherwise, however. The Institute for Economics and Peace has constructed a "Global Peace Index" annually since 2007. A key conclusion
they draw from the 2012 report is that “The average level of peacefulness in 2012 is approximately the same as it was in 2007.”38
Interstate violence in particular has declined since the start of the financial crisis – as have military expenditures in most sampled countries.
Other studies confirm that the Great Recession has not triggered any increase in violent conflict; the secular decline in violence
that started with the end of the Cold War has not been reversed.39 Rogers Brubaker concludes, “the crisis has not to date generated the surge in
protectionist nationalism or ethnic exclusion that might have been expected.”40 None of these data suggest that the global economy is operating swimmingly.
Growth remains unbalanced and fragile, and has clearly slowed in 2012. Transnational capital flows remain depressed compared to pre-crisis levels, primarily
due to a drying up of cross-border interbank lending in Europe. Currency volatility remains an ongoing concern. Compared to the aftermath of other postwar
recessions, growth in output, investment, and employment in the developed world have all lagged behind. But the Great Recession is not like other postwar
recessions in either scope or kind; expecting a standard “V”-shaped recovery was unreasonable. One financial analyst characterized the post-2008 global
economy as in a state of “contained depression.”41 The key word is “contained,” however. Given the severity, reach and depth of the 2008
financial crisis, the proper comparison is with the Great Depression. And by that standard, the outcome variables look impressive. As Carmen Reinhart and Kenneth Rogoff concluded in This Time is Different: "that its macroeconomic outcome has been only the most severe
global recession since World War II – and not even worse – must be regarded as fortunate.”42
Economic decline doesn't lead to conflict
Zakaria 09 (Fareed, was the managing editor of Foreign Affairs, Ph.D. in political science from Harvard, “The Secrets of
Stability”, Newsweek, 12/11/09, http://www.thedailybeast.com/newsweek/2009/12/11/the-secrets-of-stability.html)
Others predicted that these economic shocks would lead to political instability and violence in the worst-hit countries. At his confirmation
hearing in February, the new U.S. director of national intelligence, Adm. Dennis Blair, cautioned the Senate that "the financial crisis and global recession are
likely to produce a wave of economic crises in emerging-market nations over the next year." Hillary Clinton endorsed this grim view. And she was hardly alone.
Foreign Policy ran a cover story predicting serious unrest in several emerging markets. Of one thing everyone was sure: nothing would ever be
the same again. Not the financial industry, not capitalism, not globalization. One year later, how much has the world really
changed? Well, Wall Street is home to two fewer investment banks (three, if you count Merrill Lynch). Some regional banks have gone bust. There was some
turmoil in Moldova and (entirely unrelated to the financial crisis) in Iran. Severe problems remain, like high unemployment in the West, and we face new problems
caused by responses to the crisis—soaring debt and fears of inflation. But overall, things look nothing like they did in the 1930s. The predictions
of economic and political collapse have not materialized at all. A key measure of fear and fragility is the ability of poor and
unstable countries to borrow money on the debt markets. So consider this: the sovereign bonds of tottering Pakistan have
returned 168 percent so far this year. All this doesn't add up to a recovery yet, but it does reflect a return to some level of normalcy. And that rebound
has been so rapid that even the shrewdest observers remain puzzled. "The question I have at the back of my head is 'Is that it?' " says Charles Kaye, the co-head of Warburg Pincus. "We had this huge crisis, and now we're back to business as usual?"
No economy impact – empirics disprove
Barnett 9 – Thomas P.M. Barnett (senior managing director of Enterra Solutions LLC and a contributing editor/online columnist for Esquire magazine)
August 2009 “The New Rules: Security Remains Stable Amid Financial Crisis” http://www.aprodex.com/the-new-rules--security-remains-stable-amid-financialcrisis-398-bl.aspx
When the global financial crisis struck roughly a year ago, the blogosphere was ablaze with all sorts of scary predictions of, and
commentary regarding, ensuing conflict and wars -- a rerun of the Great Depression leading to world war, as it were. Now, as
global economic news brightens and recovery -- surprisingly led by China and emerging markets -- is the talk of the day, it's interesting to look
back over the past year and realize how globalization's first truly worldwide recession has had virtually no impact whatsoever on
the international security landscape. None of the more than three-dozen ongoing conflicts listed by GlobalSecurity.org can be
clearly attributed to the global recession. Indeed, the last new entry (civil conflict between Hamas and Fatah in the Palestine) predates the
economic crisis by a year, and three quarters of the chronic struggles began in the last century . Ditto for the 15 low-intensity conflicts
listed by Wikipedia (where the latest entry is the Mexican "drug war" begun in 2006). Certainly, the Russia-Georgia conflict last August was specifically timed, but
by most accounts the opening ceremony of the Beijing Olympics was the most important external trigger (followed by the U.S. presidential campaign) for that
sudden spike in an almost two-decade long struggle between Georgia and its two breakaway regions. Looking over the various databases, then, we see a most
familiar picture: the usual mix of civil conflicts, insurgencies, and liberation-themed terrorist movements. Besides the recent Russia-Georgia dust-up,
the only two potential state-on-state wars (North v. South Korea, Israel v. Iran) are both tied to one side acquiring a nuclear
weapon capacity -- a process wholly unrelated to global economic trends. And with the United States effectively tied down by its two ongoing
major interventions (Iraq and Afghanistan-bleeding-into-Pakistan), our involvement elsewhere around the planet has been quite modest, both leading up to and
following the onset of the economic crisis: e.g., the usual counter-drug efforts in Latin America, the usual military exercises with allies across Asia, mixing it up
with pirates off Somalia's coast). Everywhere else we find serious instability we pretty much let it burn, occasionally pressing the Chinese -- unsuccessfully -- to
do something. Our new Africa Command, for example, hasn't led us to anything beyond advising and training local forces. So, to sum up: * No significant uptick
in mass violence or unrest (remember the smattering of urban riots last year in places like Greece, Moldova and Latvia?); * The usual frequency maintained in
civil conflicts (in all the usual places); * Not a single state-on-state war directly caused (and no great-power-on-great-power crises even
triggered); * No great improvement or disruption in great-power cooperation regarding the emergence of new nuclear powers
(despite all that diplomacy); * A modest scaling back of international policing efforts by the system's acknowledged Leviathan
power (inevitable given the strain); and * No serious efforts by any rising great power to challenge that Leviathan or supplant its
role. (The worst things we can cite are Moscow's occasional deployments of strategic assets to the Western hemisphere and its weak efforts to outbid the
United States on basing rights in Kyrgyzstan; but the best include China and India stepping up their aid and investments in Afghanistan and Iraq.) Sure, we've
finally seen global defense spending surpass the previous world record set in the late 1980s, but even that's likely to wane given the stress on public budgets
created by all this unprecedented "stimulus" spending. If anything, the friendly cooperation on such stimulus packaging was the most notable
great-power dynamic caused by the crisis. Can we say that the world has suffered a distinct shift to political radicalism as a result
of the economic crisis? Indeed, no. The world's major economies remain governed by center-left or center-right political factions that remain decidedly
friendly to both markets and trade. In the short run, there were attempts across the board to insulate economies from immediate damage (in effect, as much
protectionism as allowed under current trade rules), but there was no great slide into "trade wars." Instead, the World Trade Organization is functioning as it was
designed to function, and regional efforts toward free-trade agreements have not slowed. Can we say Islamic radicalism was inflamed by the economic crisis? If
it was, that shift was clearly overwhelmed by the Islamic world's growing disenchantment with the brutality displayed by violent extremist groups such as al-Qaida. And looking forward, austere economic times are just as likely to breed connecting evangelicalism as disconnecting
fundamentalism. At the end of the day, the economic crisis did not prove to be sufficiently frightening to provoke major
economies into establishing global regulatory schemes, even as it has sparked a spirited -- and much needed, as I argued last week -- discussion
of the continuing viability of the U.S. dollar as the world's primary reserve currency. Naturally, plenty of experts and pundits have attached great significance to
this debate, seeing in it the beginning of "economic warfare" and the like between "fading" America and "rising" China. And yet, in a world of globally
integrated production chains and interconnected financial markets, such "diverging interests" hardly constitute signposts for wars
up ahead. Frankly, I don't welcome a world in which America's fiscal profligacy goes undisciplined, so bring it on -- please! Add it all up and it's fair to say that
this global financial crisis has proven the great resilience of America's post-World War II international liberal trade order. Do I
expect to read any analyses along those lines in the blogosphere any time soon? Absolutely not. I expect the fantastic fear-mongering to proceed apace. That's
what the Internet is for.
The Second Great Depression was a threat, not a warning – bad actors
Baker 13 – Co director of CEPR, Dean Baker is an American macroeconomist, Swarthmore College (B.A., 1981), the
University of Denver (M.A., 1983), and the University of Michigan (Ph.D., 1988). (“The Financial Crisis and the Second Great
Depression Myth”, Dean Baker, Huffington Post, September 13th, 2013, http://www.huffingtonpost.com/dean-baker/the-financial-crisis-and_b_3897014.html)//chiragjain
All knowledgeable D.C. types know that the
TARP and Fed bailout of Wall Street banks five years ago
saved us from a second Great Depression. Like most things known by knowledgeable Washington types, this is
not true. Just to remind folks, the Wall Street banks were on life support at that time. Bear Stearns, one of the five major investment
banks, would have collapsed in March of 2008 if the Fed had not been able to arrange a rescue by offering guarantees on almost $30
billion in assets to J.P. Morgan. Fannie Mae and Freddie Mac both went belly up in September. The next week Lehman, another of
the five major investment banks did go under. AIG, the country's largest insurer was about to follow suit when the Fed and Treasury
jury-rigged a rescue. Without massive government assistance, it was a virtual certainty that the remaining three investment banks,
Goldman Sachs, Morgan Stanley, and Merrill Lynch, were toast. Bank of America and Citigroup also were headed rapidly for the
dustbin of history. It is certainly possible, if not likely, that the other two giant banks, Wells Fargo and J.P. Morgan, would have been
sucked down in the maelstrom. In short, if we allowed the magic of the market to do its work, we would have seen an end to Wall
Street as we know it. The major banks would be in receivership. Instead of proffering economic advice to the president, the top
executives of these banks would be left walking the streets and dodging indictments and lawsuits. This was when they turned
socialist on us. We got the TARP and infinite money and guarantees from the Fed, FDIC, and Treasury to keep the Wall Street crew
in their expensive suits. All
the politicians told us how painful it was for them to hand out this
money to the wealthy, but the alternative was a Second Great Depression. It's not clear
what these people think they mean, but let's work it through. Suppose that we did see a full meltdown.
The commercial banks that handle checking and saving accounts and are responsible for most personal and business transactions
would then be under control of the FDIC. The FDIC takes banks over all the time. This would be more roadkill than it was
accustomed to, but there is little reason to think that after a few days most of us would not be able to get to most of the money in our
accounts and carry through normal transactions. Credit conditions would likely be uncertain for business loans for some time, as in
fact was the case even with the bailouts. Mortgage credit would have been provided by Fannie Mae and Freddie Mac, as has been the
case since September of 2008. One
item deserving special attention in this respect is the
commercial paper market. This is the market that most major businesses rely upon to
meet regular payments like payroll and electric bills. When he was lobbying Congress for the TARP,
Federal Reserve Board Chair Ben Bernanke said that this market was shutting down, which would in fact
be disastrous for the economy. What Bernanke neglected to mention was that he unilaterally had the
ability to support the commercial paper market through the Fed. In fact he announced a special
lending facility for exactly this purpose, the weekend after Congress approved the TARP. It is also worth ridiculing people who say
the government made a profit on its bailout loans. It's true that most loans were repaid with interest. However these loans were
made to favored borrowers (Wall Street banks) at far below the market interest rates at the time. The Congressional Oversight Panel
commissioned a study on the subsidies involved in just the first round of TARP loans. The study put the subsidies at a bit more than
30 percent of the money lent out, implying bank subsidies of almost $80 billion from just this small segment of the bailout. Adding
in other loans and various implicit and explicit guarantees would certainly increase this number considerably. But suppose we hadn't
opened the government's wallet and instead let the banks drown in their own greed. Would we have faced a decade of double digit
unemployment? From an economic standpoint there would be no reason for concern. We
know from the last Great
Depression, the key to recovery from a period of weak demand is to have the
government spend lots of money. We eventually got out of the Great Depression by spending huge amounts of money
on World War II. To get the economy jump-started this time we could have had massive
spending on education, child care, rebuilding the infrastructure and making the
economy more energy efficient. As Paul Krugman has repeatedly joked, if we need a political rationale for this
spending we can say it is necessary to protect the United States from a Martian invasion. Of course as a political
matter, such massive spending could prove a tough sell given the present day politics.
But that is a political argument, not an economic one. Since we would be in uncharted water following
this sort of collapse, no one can with a straight face claim they know how the politics would play out. We can separate out three
camps. First we have the folks who would like the government to spend enough to restore full employment, but argue the political
opposition would be too great. These
people have a coherent second Great Depression story, but
based on politics, not economics. The bad guys would have forced us to endure a decade
of double digit unemployment if we didn't rescue Wall Street. Then we have the people who don't like
government spending and would oppose efforts to boost the economy back to full employment. These people are saying that we
would have faced a second Great Depression if we didn't rescue Wall Street because they
would have insisted upon it. Finally, there are Washington Very Serious People types like the Washington Post
editorial page, who would go along with restarting the economy but only if accompanied by
sharp cuts to programs like Social Security and Medicare. These people are hostage takers who are
saying that if the country didn't bailout Wall Street, they would force it to endure a second
Great Depression, unless it eviscerated essential programs that working people need. So
the long and short is that we only need to have worried about a Second Great
Depression if the bad guys got their way. And most of the people who warn about a
Second Great Depression were on the list of bad guys. The prospect of a second Great
Depression was not a warning, it was a threat. Next week I will explain why this downturn has been so long-lasting. The reason is actually far too simple for most economists to understand. As a result, there continues to be
widespread confusion about the nature of the downturn.
1nc investment turn
Fear of surveillance is driving encryption investment – funding increasing now
JIJI 14 <Press news journal covering business and economics, 3/29/14, “The ‘golden age’ of encryption?”, The Japan Times,
http://www.japantimes.co.jp/news/2014/03/29/business/tech/the-golden-age-of-encryption/#.VamBOMZVikp>//wx
NEW YORK – Investors are pumping millions of dollars into encryption as unease about data security drives a rising need for ways to
keep unwanted eyes away from personal and corporate information. Major data breaches at Target and other retailers have made data security a
boardroom issue at companies large and small. And stunning revelations of widespread snooping by U.S. intelligence agencies have also
rattled companies and the public. For venture capital, that has opened up a new area of growth in the tech business. In February, Google
Ventures led a $25.5 million round of venture funding for Atlanta-based Ionic Security, a 3-year-old company that works on encryption —
the scrambling of data before it is shipped or stored. Other encryption companies, including Toronto-based PerspecSys and San Jose, California-based
CipherCloud, have announced major fundings. The funding rush could hearken a “golden age” of encryption, as one expert puts it. But the industry also
faces barriers to a tool that until recently was not a hot commodity. Concerns about encryption range from practical challenges, such as the difficulty people have
when searching for something in their encoded data, to government opposition toward privacy technology. “People are afraid of it because they don’t understand
it,” said John Kindervag, a vice president and principal analyst at Forrester Research. But he called the wider use of encryption “inevitable, because there’s no
other way to solve the problem.” Kindervag said the industry is between one and two years away from “some big revolutions” in the field. “It just needs to
happen,” he added. But Venky Ganesan, a managing director with venture capital firm Menlo Ventures, believes major advances are further off. “Encryption
slows down,” Ganesan said. “Just imagine if every room in your house was locked and you had to open and close it every time you go in. You would be
frustrated.”
Lack of incentive for investment kills startups – liquidated too quickly
Primack 15 <Dan, senior editor at Fortune, cites top VC and investor Bill Gurley, 1/22/15, “Top VC: A lot of tech startup failure coming in
2015”, http://fortune.com/2015/01/22/vc-tech-startup-failure-2015/>//wx
Bill Gurley thinks that some highly-valued tech startups are heading for a reckoning. Bill Gurley is no stranger to unicorns, the tech
industry’s name for startups that have been valued at $1 billion or more by venture capitalists. His VC firm, Benchmark, has put money into such
companies as Uber, DropBox, SnapChat and WeWork. Not to mention some that recently went public, like Hortonworks and New Relic. But he
believes that many of these unicorns, of which there are more than 80, will go down in flames after flying too close to the sun. “I think you’re
going to see a lot of failure in 2015,” Gurley said in Fortune‘s February cover story on the unicorn trend. Some companies will collapse under their
own overvalued weight, others because there could be a macro financial pullback that filters down into the private markets. Here is Gurley’s primary concern:
Privately-held companies that raise lots of funding at higher and higher valuations eventually build up tons of liquidation
preferences. For the jargon-challenged, liquidation preferences are inserted into venture funding deals to ensure that the VC gets paid first
(and how much) if the company is sold (i.e., generates liquidity). If a company is sold at a massive valuation increase, then it’s largely academic, as everyone
gets rich. But if the company is sold at a price below where it last raised money, it could leave a bunch of people out in the cold,
including employees who were largely compensated with stock options. “The cap chart begins to calcify a bit, which eventually can be problematic,”
Gurley explains. “Hiring new employees, particularly senior management, becomes tough because they worry about getting stuck beneath a huge liquidation
preference stack. Some of these deals have so many [anti-dilution terms] that the cap table becomes almost concrete. If the
valuation goes down
significantly, it will sink them.” This is, of course, different than what happens when a publicly-traded company suffers a major valuation hit. In those
cases, the company can still offer stock options to employees at “market” prices, without liquidation preferences getting in the way. Moreover, far too few of these
startups are actually profitable, meaning that they are virtually required to keep tacking on new liquidation preferences in subsequent funding rounds. Part of this
is because so many entrepreneurs have taken their cue from unprofitable tech giants like Amazon and Salesforce, but the truth is that not everyone is Jeff Bezos
or Marc Benioff. “All of these companies raising so much money at higher and higher valuations is becoming a burgeoning anchor,” Gurley says. “I think it’s
almost tautologically easier to execute a company that loses money than one that’s profitable.”
1nc investment da (off-case w/impact)
Fear of surveillance is driving encryption investment – funding increasing now
JIJI 14 <Press news journal covering business and economics, 3/29/14, “The ‘golden age’ of encryption?”, The Japan Times,
http://www.japantimes.co.jp/news/2014/03/29/business/tech/the-golden-age-of-encryption/#.VamBOMZVikp>//wx
NEW YORK – Investors are pumping millions of dollars into encryption as unease about data security drives a rising need for ways to
keep unwanted eyes away from personal and corporate information. Major data breaches at Target and other retailers have made data security a
boardroom issue at companies large and small. And stunning revelations of widespread snooping by U.S. intelligence agencies have also
rattled companies and the public. For venture capital, that has opened up a new area of growth in the tech business. In February, Google
Ventures led a $25.5 million round of venture funding for Atlanta-based Ionic Security, a 3-year-old company that works on encryption —
the scrambling of data before it is shipped or stored. Other encryption companies, including Toronto-based PerspecSys and San Jose, California-based
CipherCloud, have announced major fundings. The funding rush could hearken a “golden age” of encryption, as one expert puts it. But the industry also
faces barriers to a tool that until recently was not a hot commodity. Concerns about encryption range from practical challenges, such as the difficulty people have
when searching for something in their encoded data, to government opposition toward privacy technology. “People are afraid of it because they don’t understand
it,” said John Kindervag, a vice president and principal analyst at Forrester Research. But he called the wider use of encryption “inevitable, because there’s no
other way to solve the problem.” Kindervag said the industry is between one and two years away from “some big revolutions” in the field. “It just needs to
happen,” he added. But Venky Ganesan, a managing director with venture capital firm Menlo Ventures, believes major advances are further off. “Encryption
slows down,” Ganesan said. “Just imagine if every room in your house was locked and you had to open and close it every time you go in. You would be
frustrated.”
Lack of incentive for investment kills startups – liquidated too quickly
Primack 15 <Dan, senior editor at Fortune, cites top VC and investor Bill Gurley, 1/22/15, “Top VC: A lot of tech startup failure coming in
2015”, http://fortune.com/2015/01/22/vc-tech-startup-failure-2015/>//wx
Bill Gurley thinks that some highly-valued tech startups are heading for a reckoning. Bill Gurley is no stranger to unicorns, the tech
industry’s name for startups that have been valued at $1 billion or more by venture capitalists. His VC firm, Benchmark, has put money into such
companies as Uber, DropBox, SnapChat and WeWork. Not to mention some that recently went public, like Hortonworks and New Relic. But he
believes that many of these unicorns, of which there are more than 80, will go down in flames after flying too close to the sun. “I think you’re
going to see a lot of failure in 2015,” Gurley said in Fortune‘s February cover story on the unicorn trend. Some companies will collapse under their
own overvalued weight, others because there could be a macro financial pullback that filters down into the private markets. Here is Gurley’s primary concern:
Privately-held companies that raise lots of funding at higher and higher valuations eventually build up tons of liquidation
preferences. For the jargon-challenged, liquidation preferences are inserted into venture funding deals to ensure that the VC gets paid first
(and how much) if the company is sold (i.e., generates liquidity). If a company is sold at a massive valuation increase, then it’s largely academic, as everyone
gets rich. But if the company is sold at a price below where it last raised money, it could leave a bunch of people out in the cold,
including employees who were largely compensated with stock options. “The cap chart begins to calcify a bit, which eventually can be problematic,”
Gurley explains. “Hiring new employees, particularly senior management, becomes tough because they worry about getting stuck beneath a huge liquidation
preference stack. Some of these deals have so many [anti-dilution terms] that the cap table becomes almost concrete. If the
valuation goes down
significantly, it will sink them.” This is, of course, different than what happens when a publicly-traded company suffers a major valuation hit. In those
cases, the company can still offer stock options to employees at “market” prices, without liquidation preferences getting in the way. Moreover, far too few of these
startups are actually profitable, meaning that they are virtually required to keep tacking on new liquidation preferences in subsequent funding rounds. Part of this
is because so many entrepreneurs have taken their cue from unprofitable tech giants like Amazon and Salesforce, but the truth is that not everyone is Jeff Bezos
or Marc Benioff. “All of these companies raising so much money at higher and higher valuations is becoming a burgeoning anchor,” Gurley says. “I think it’s
almost tautologically easier to execute a company that loses money than one that’s profitable.”
U.S. tech industry key to economy
Grisham 15: Senior Manager Public Policy Communications – CompTIA, Stonewall Strategies; served as the Manager of U.S. Public Affairs at
the Public Affairs Council and Communications Director for Congressman Mike Turner of Ohio; BA in Public Relations at University of South
Carolina-Columbia: (“UNITED STATES TECH INDUSTRY EMPLOYS 6.5 MILLION IN 2014”, Preston Grisham, CompTIA, February 10, 2015,
https://www.comptia.org/about-us/newsroom/press-releases/2015/02/10/united-states-tech-industry-employs-6.5-million-in-2014)//chiragjain//wx
Washington, D.C., February 10, 2015 – The U.S. tech industry added 129,600 net jobs between 2013 and 2014, for a total of
nearly 6.5 million jobs in the U.S., according to Cyberstates 2015: The Definitive State-by-State Analysis of the U.S. Tech Industry
published by CompTIA. The report represents a comprehensive look at tech employment, wages, and other key economic factors nationally and
state-by-state, covering all 50 states, the District of Columbia, and Puerto Rico. This year’s edition shows that tech industry jobs account
for 5.7 percent of the entire private sector workforce. Tech industry employment grew at the same rate as the overall
private sector, 2 percent, between 2013-2014. Growth was led by the IT services sector which added 63,300 jobs
between 2013 and 2014 and the R&D, testing, and engineering services sector that added 50,700 jobs. “The U.S. tech
industry continues to make significant contributions to our economy,” said Todd Thibodeaux, president and CEO, CompTIA.
“The tech industry accounts for 7.1 percent of the overall U.S. GDP and 11.4 percent of the total U.S. private sector
payroll. With annual average wages that are more than double that of the private sector, we should be doing all we can to
encourage the growth and vitality of our nation’s tech industry.” An examination of tech job postings for the nation shows a
year-over-year jump of more than 11 percent for technology occupations, with over 650,000 job openings in fourth
quarter of 2014. At the state level, Cyberstates shows that 38 states had an overall net increase of tech industry employment in
2014. The largest gains were in California (+32,900), Texas (+20,100), Florida (+12,500), Massachusetts (+8,700), and Michigan (+8,100). The
states with the highest concentration of workers were Massachusetts (9.8% of private sector employment), Virginia (9.4%), Colorado (9.2%),
Maryland (8.6%), and Washington (8.4%). The largest states by tech industry employment continues to be California, Texas, and New York.
Economic decline causes nuclear war
Kemp 10 – Director of Regional Strategic Programs at The Nixon Center, served in the White House under Ronald Reagan,
special assistant to the president for national security affairs and senior director for Near East and South Asian affairs on the
National Security Council Staff, Former Director, Middle East Arms Control Project at the Carnegie Endowment for International
Peace
(Geoffrey, “The East Moves West: India, China, and Asia’s Growing Presence in the Middle East,” p. 233-4)//BB
The second scenario, called Mayhem and Chaos, is the opposite of the first scenario; everything that can go wrong does go
wrong. The world economic situation weakens rather than strengthens, and India, China, and Japan suffer a major reduction in
their growth rates, further weakening the global economy. As a result, energy demand falls and the price of fossil fuels plummets,
leading to a financial crisis for the energy-producing states, which are forced to cut back dramatically on expansion programs
and social welfare. That in turn leads to political unrest and nurtures different radical groups, including, but not limited to, Islamic
extremists. The internal stability of some countries is challenged, and there are more “failed states.” Most serious is the collapse
of the democratic government in Pakistan and its takeover by Muslim extremists, who then take possession of a large number of
nuclear weapons. The danger of war between India and Pakistan increases significantly. Iran, always worried about an
extremist Pakistan, expands and weaponizes its nuclear program. That further enhances nuclear proliferation in the Middle East,
with Saudi Arabia, Turkey, and Egypt joining Israel and Iran as nuclear states. Under these circumstances, the potential for
nuclear terrorism increases, and the possibility of a nuclear terrorist attack in either the Western world or in the oil-producing
states may lead to a further devastating collapse of the world economic market, with a tsunami-like impact on stability. In this
scenario, major disruptions can be expected, with dire consequences for two-thirds of the planet’s population.
--xt link
NSA surveillance helps the tech industry – increases the demand for better encryption
Chayka 13 – Kyle Chayka is a writer for publications including Newsweek, The New Republic and The New Yorker.
He is a weekly columnist for Pacific Standard and the author of an ebook, The Printed Gun (Kyle, “In the Future, Only
Rich People Can Afford to Keep Their Emails Secret”, 10/3/13, http://www.newrepublic.com/article/114979/data-encryption-swiss-banking-will-be-expensive-and-overseas)
In the PRISM era, truly secure data is increasingly rare. New companies like Lavabit are emerging to take advantage of the
growing demand for privacy and creating a new market in the process. Like opening a Swiss bank account to keep your holdings
safe and undetectable, those individuals and corporations with enough capital can now buy their way to security. The question is,
can these new services actually guarantee your data's safety? Computer security expert Jon Callas founded Silent Circle in 2012
as a “secure information service for people who travel and live abroad… so that they can communicate securely with people
back home,” he explained. The company launched an email service as well as a smartphone messaging and voice app that uses
peer-to-peer encryption, which keeps information safe by encoding it when it leaves the sender and decoding when the data
arrives at the recipient. Since it doesn’t store any of its users’ activities, there’s nothing to give up when the government
inevitably comes knocking, as was the case with Lavabit. “There are no keys on a server. There is no metadata we collect,”
Callas wrote in an email. The NSA inadvertently caused a boom in Callas’s business. Since the leaks, “we have seen our
revenues quadruple,” he noted. At $9.95 a month, the Silent Circle phone and text package is accessible for mainstream
consumers (though it only works with other Silent Circle users), and the company offers a larger, more expensive system to
businesses wanting to keep their communications private. But their products aren’t perfect. In August, the company shut down
the Silent Mail client, fearing that it wasn’t as secure as intended. The encryption keys to decode the email were stored online,
leaving them vulnerable. In fact, many common encryption techniques, like the open-source, decentralized Tor network, won’t
keep users safe any more. Snowden revealed that the NSA spent $250 million ensuring that products created by U.S. and
foreign IT businesses contain built-in exploits that help the government access data. This includes everything from messaging
services (a flaw was intentionally engineered into Microsoft's omnipresent Outlook email client) to chipmakers, compromising
even the hardware our computers run on. The NSA worked to weaken encryption standards and collected keys for commercial
encryption products. This means that developers hoping to capture the security market have to look for even stronger strategies.
Least Authority File System (LAFS) is a new, open-source cloud-storage technology that promises faultless security by
encrypting files before they go into the cloud, making them unreadable until they’re decoded by the user, who keeps the only key
on their personal machine. The LAFS source code is totally public, and has been vetted by security experts. Security engineer
Zooko Wilcox-O'Hearn helped code LAFS and launched his own company, Least Authority, in 2011 as a secure storage
provider. The service costs $50 a month for 350 gigabytes of cloud space, but it doesn’t include email or messaging, and since
it’s more of a developer toolkit than a turnkey product, it’s meant to sell to the security departments of corporations rather than
consumers. There’s a “bigger business potential in selling to companies than individuals because businesses already spend a
tremendous amount of money on data and also have a more specific motivating need to protect that data—economic espionage
and regulatory requirements,” Wilcox-O’Hearn said. If you’re seeking a service that’s super secure, all-encompassing, and easy
to use, the best choice would probably be to go outside the U.S., where legal measures could make it more difficult to access
data. But you’ll have to pay a hefty price for it.
--xt link booster – cybersecurity
Cyber-attacks are key to increased encryption – fear of attack
Messmer 10: Senior Editor Network World, Palma Sola Group, Blogging, Marketing Communications, Media Relations,
Hunter College, (“Encryption adoption driven by PCI, fear of cyberattacks”, Ellen Messmer, November 16, 2010, Network World,
http://www.networkworld.com/article/2194499/compliance/encryption-adoption-driven-by-pci--fear-of-cyberattacks.html)//chiragjain
A survey of more than 900 IT managers shows that adoption of encryption in their
organizations is being driven by two main factors, anxiety about possible cyberattacks
and the need to meet the payment-card industry (PCI) data security standards. According to the Ponemon Institute's "2010 Annual
Study: U.S. Enterprise Encryption Trends," 69% of the 964 IT managers responding to the survey said the need to meet
regulatory compliance was the driving force behind deployment of encryption in their organizations. And
the most important regulatory factor to them was the need to meet encryption requirements of the PCI data security standard. It was
the first time that respondents to the annual study listed regulatory compliance as "the main reason, for using encryption,"
according to the report, which was sponsored by Symantec. In the past, the need to simply protect data at rest was often the reason
stated. "Interestingly, PCI requirements have seen the greatest increase in influence by far over the past years, rising 49 points from
15% in 2007 to 64% this year," the report notes. Other regulations, such as state data-privacy laws in California and Massachusetts,
for instance, as well as the Health Information Portability and Accountability Act (HIPAA) and Sarbanes-Oxley still count, but have
far less impact overall than PCI. "PCI is becoming one of the most important drivers to action because failure to comply means
organizations can't do credit-card transactions, which holds organizations to a much higher level of accountability," the 2010
Ponemon study on encryption says. Another important factor spurring organizations to
adopt encryption is fear related to cyberattacks. Some 88% of organizations in the survey
acknowledged at least one data breach, up three points from 2009. And of those, "23% had only
one breach and 40% had two to five breaches." These numbers were consistent with last year's results, but those experiencing more
than five data breaches a year was up 3% from 2009. For the first time since the annual study has been done, "Nearly
all
respondents (97%) list cyber attacks as the most severe threat to their ability to
successfully carry out their mission." The annual study has been done for six years now with five years of
comparable data and methodology, says Institute director Larry Ponemon. "Encryption is a tool here to stay," he
notes. He points out that this year's study shows growing adoption and some preference for whole-disk encryption. One reason is
that research shows "end users manipulate file encryption" by sometimes turning it off when they think it's slowing down computer
use, Ponemon says. "Foolishly, they're trying to do an end run around it." When the survey asked about budgeting for data security,
the answers suggest that 89% were earmarking for perimeter security controls, including intrusion detection and prevention, with
anti-malware, and identity access and management also prominent. Sixty-nine percent planned for at least one type of
encryption deployment. "Earmarks for encryption were up 9 points from 2009 and 12 points from 2008," the report says.
Nevertheless, the report reaches the conclusion that "data protection is not a high priority in most organizations," because "59% of
this year's respondents spent only 5% to 10% of the IT budgets on data protection activities." Encryption got a much smaller
percentage of that, with 37% of respondents, for example, spending less than 5% of their IT budgets on it.
A2 ADV – CLOUD COMPUTING
--xt cloud growth now
Cloud computing is on the rise – cheaper costs
Soghoian et al 15 (Christopher Soghoian, researcher at Harvard and Yale, Kevin Bankston, Policy Director of New America’s Open Technology
Institute, Fred Cate, C. Ben Dutton Professor of Law at Indiana University Maurer School of Law, Chris Hoofnagle, Co-Director, Berkeley Center for Law &
Technology, Marcia Hofmann, senior staff attorney at the Electronic Frontier Foundation, Rob Faris, Research Director of the Berkman Center for Internet and
Society at Harvard University, Albert Gidari, partner of Perkins Coie in Privacy & Security, Jennifer Granick, Director of Civil Liberties for the Center for Internet
and Society at Stanford Law School, Orin Kerr, professor of law at the George Washington University , Susan Landau, Professor of Social Science and Policy
Studies at Worcester Polytechnic Institute, Paul Ohm, Professor of Law at the Georgetown University Law Center, Nicole Azer, Technology & Civil Liberties
Policy Director in ACLU California, John Palfrey, previous executive director of Harvard's Berkman Center for Internet & Society, Marc Rotenberg, President and
Executive Director of the Electronic Privacy Information Center, Adam Schostack, expert in security, Ryan Singel, journalist of technology at WIRED, Adam
Thierer, senior research fellow with the Technology Policy Program at the Mercatus Center at George Mason University, Jonathan Zittrain, professor of Internet
law and the George Bemis Professor of International Law at Harvard Law School, “Privacy And Law Enforcement: Caught In The Cloud: Privacy, Encryption, And
Government Back Doors In The Web 2.0 Era”, 12/16/13,
http://www.researchgate.net/publication/228365094_Privacy_And_Law_Enforcement_Caught_In_The_Cloud_Privacy_Encryption_And_Government_Back_Doors_In_The_Web_2.0_Era)//EM
Cloud Creep and the Rise of Cloud Services as the Pre-installed Default While some users may choose to switch to cloud-based services, others
are not as fortunate and often
this decision is made without their knowledge. Due to the significant reductions in licensing
and support costs, many corporate and government IT managers are making the switch. Compared to the $500 list price
for the full version of Microsoft Office Professional,23 Google’s $50-per-year price tag is a bargain—especially given that it includes telephone, email and web support.24 Corporate enterprise managers are able to re-brand the Google Apps products with their own companies’
logos. The services also plug directly into existing IT infrastructure. For example, corporate Google Mail customers can configure the service to
use their own Internet domain names, making the switch oblivious to outsiders and customers who might otherwise recognize the telltale
‘gmail.com’ email addresses. Incoming students at thousands of universities are now issued Google accounts on their first day, enabling
them to write term papers and access their official school email inboxes that are hosted on Google’s servers.25 University students are not alone in this switch—
before he was tapped to become the Federal Chief Information Officer, Vivek Kundra switched 38,000 Washington DC employees from
Microsoft Office to Google Docs. 26 Google claims that nearly 2 million businesses use Google Apps, with thousands more signing up
each day.27 While some students and employees realize that they are using cloud-based services, many others may not, particularly when the services have
been rebranded and heavily stripped of Google’s logos.28 At the consumer level, cloud services are also making inroads through the use of preinstalled desktop icons on new PCs, particularly in low end devices. Over the past year, sub $400 “netbook” portable computers have taken the
computing industry by storm. The manufacturers of these devices operate with extremely low profit margins, which they hope to make up in volume.29 As a
result, the netbook makers are trying many possible ways to lower their own costs. One of the main ways they have done this is to abandon
Microsoft’s operating system and Office suite. In addition to pre-installing these computers with the Linux operating system, several manufacturers
also ship their netbook products with prominent icons for Google’s Docs and Spreadsheets tools.30 In addition to the general industry trends that are pushing
many towards cloud-based services, new technologies make such transitions less obvious to end-users. Two of these are now highlighted: single
site browsers, and offline content.
A2 ADV – HUMINT
--xt sq solves
HUMINT funding and recruitment are increasing
Henley-Putnam University 11
Henley-Putnam University (is the only accredited university that specializes exclusively in intelligence, counterterrorism and
protection and offers over 100 courses on topics such as covert actions, counterterrorism and intelligence team management.
By completing a degree here you will further differentiate yourself as a specialist among your peers. Pursue a Bachelors or
Masters Degree within your chosen specialty, or a Doctorate Degree in Strategic Security.
http://www.henleyputnam.edu/intelligencedegrees/humanintelligence.aspx)
Since the terrorist attacks on September 11, 2001, there has been renewed focus on the important role that HUMINT
plays in collecting information on terrorist networks and other hostile non-state actors. Consequently, calls for building up
a new, robust cadre of HUMINT professionals has led to increased investment in funding and recruiting efforts,
especially for foreign language specialists and those who have traveled extensively overseas. At Henley-Putnam University,
we offer a nationally accredited (DETC) program that focuses exclusively on intelligence through our Bachelor and
Master of Science Degrees in Intelligence Management. Our 100% online courses are taught by expert faculty with
real world experience from the government and intelligence community. Graduates of our highly-focused program will
be able to: Manage a team of intelligence professionals from different disciplines, conduct operations that include clandestine
or covert activities and present finished intelligence in a manner appropriate to the consumer, to name but a few. Classes cover
topics like Advanced Intelligence Operations, Counterespionage, and Intelligence Team Management. Embark on an exciting
education path with Henley-Putnam by calling us today
--xt humint fails
HUMINT fails: can’t gather on terrorist groups and new groups pop up
Best, 2 (Richard A. Best Jr., Specialist in National Defense Foreign Affairs, Defense, and Trade Division,
CRS report for congress, Wrote The National Intelligence Council: Issues and Options for Congress and
published in Library of Congress. Pages 9-11 February 21, 2002.)\\mwang
Terrorists do not usually appear on the diplomatic cocktail circuit nor in gatherings of local businessmen. In many
cases they are also involved in various types of criminal activities on the margins of society. Terrorist groups may
be composed almost wholly of members of one ethnic or religious group. They may routinely engage in criminal
activities or human rights abuses. Developing contacts with such groups is obviously a difficult challenge for U.S.
intelligence agencies; it requires long-lead time preparation and a willingness to do business with unsavory
individuals. It cannot in many cases be undertaken by intelligence agents serving under official cover as diplomats or military attaches. It may
require an in-depth knowledge of local dialects and customs. Furthermore, the list of groups around the world that might at some
point in the future be involved in terrorist activities is not short; making determinations of where to seek agents whose reporting will
only be important under future eventualities is a difficult challenge with the risk of needlessly involving the U.S. with corrupt and ruthless
individuals. Critics of the current U.S. humint collection effort point to these and other institutional problems. One report quotes a former CIA
official: The CIA probably doesn’t have a single truly qualified Arabic-speaking officer of Middle Eastern background
who can play a believable Muslim fundamentalist who would volunteer to spend years of his life ... in the
mountains of Afghanistan. ... The need is for greater numbers of foreign language-capable intelligence personnel, with increased fluency in
specific and multiple languages. The Committee has heard repeatedly from both military and civilian intelligence producers and consumers that
this is the single greatest limitation in intelligence agency personnel expertise and that it is a deficiency throughout the
Intelligence Community.”
It is administratively difficult to develop resources throughout the world over a long period of time and costs are
higher than adding intelligence staff to embassies. Few observers could have predicted the intense U.S. concern
with
Somalia, Kosovo, or Afghanistan that eventually developed. Ten years from now there may be a whole set of
challenges from groups that no one today is even aware of. In short, reorienting humint collection to give significantly
greater attention to terrorist or potentially terrorist groups would have important administrative implications for the
Intelligence Community. While budgetary increases would not necessarily be dramatic given the size of the existing intelligence budget
(even paying hundreds of human agents would be far less costly than deploying a satellite), the infrastructure needed to train and
support numerous agents serving under non-official cover would grow significantly. Extensive redundancy would
be required to cover terrorist groups that may never pose significant threats to U.S. interests.
--xt humint not key
HUMINT is not key to intelligence gathering – other methods fill in
Sundaraj-Keun 12 (Simon S. Sundaraj-Keun. Think Tank Consultancy (Freelance), Sole-Proprietor Research, Malaysian
Dutch Descendents Project (MDDP) formerly at Dalat International School, HarvardX, Specialize background in providing risk
assessment Human Resources & Relations expert with wide proficiency in researching, analyzing, and amending organizational
frameworks.) "Human Intelligence: Past, Present, and Future”. <https://simonsundarajkeun.wordpress.com/2012/11/22/human-intelligence-past-present-and-future/> November 22, 2012.) \\mwang
There are methods of obtaining intelligence like Open Source Intelligence (OSINT) which is the gathering of
information from open sources. Signals Intelligence (SIGINT) and Communications Intelligence (COMINT) is the
gathering of sources from interception of signals, and Electronic Intelligence (ELINT) is the gathering of Intel from
non-communications electronic emissions. For the most part the methods used in obtaining intelligence does not
place the human component in a high risk situation. The risks of losing an operative is as real as it can be because the
targeted nation can deport, imprison, or even execute a spy. Some nations will deploy counter agents in order to
roll up several clandestine networks and even hire agents to infiltrate into other nation’s agency. Sometimes it is not
easy to imprison a spy because he or she is given diplomatic immunity by the home nation. The targeted nation can declare persona non grata
in order to evict the spy from its borders. It takes a long time to train a HUMINT agent than it takes to replace a broke
computer or replace a spy satellite. For cost effectiveness and risk free the United States has placed its emphasis on the
development of technologies in order to have command and control in the field of communication. One has to
understand that the advancement of technology brought the means and methods of espionage to a new level.
The days of agents running around a nation-state has evolved to an era of surveillance of all electronic
transmissions including cell phone logs, voice mail, email, packet sniffing, trace routing and wireless transmissions.
In reality the prioritization of most intelligence agencies today has been to control and monitor financial transactions, the information corridor
(internet and communication lines), and the spread of technological advance weaponry (weapons of mass destruction). In the post-Cold War
world numerous agencies have been data mining the world’s stock exchanges and this program was formalized on October 26, 2001 in the form
of the Patriot Act. This helps track the financing of people who might be laundering money and continues to be done without any warrants. It is
important for any nation to gather the political and economic information that might be of advantage to its Strategic Intelligence. The United
States is no exception to the rule and monitoring of foreign communications is essential in maintaining its national interest. In 2002, new
programs of satellite surveillance and unmanned low level drones armed with missiles made it possible not only to perform surveillance in real
time, but to respond with force. Thus proving that unmanned drones could be used for elimination operations without the lost of Allied forces.
One has to understand that out of the advancement of technology, chronic problems begin to emerge in the shadows, which would in turn
create an intelligence blind spot that would leave a state’s national security in the dark. This blind spot is the lack of balance between the use of
HUMINT and technology as a symbiosis component in force multiplication. The lack of investment in HUMINT could lead to disaster as
experience during World War II and currently on the War on Terrorism demonstrates. There is a serious problem faced by HUMINT,
which is the lack of time and effort to learn the multiple languages that are spoken by the various ethnic groups
within the nations around the globe. Languages transcend beyond the physical boundaries of nation-states and reflect the unique
multiracial heritage of a nation. There a firm belief that appreciation of languages should take priority in order to address the intelligence
problem, it will help to promote one’s national security, which is unfortunately lacking in today’s global arena (especially in the United States
Intelligence community).
1nc relations turn
HUMINT expansion decks relations – prefer empirics
Turner 05 (Michael A. Turner, teaches at both San Diego State University and the University of San Diego. He is also a
consultant to the United States Government on national security matters. Until 2006, Dr. Turner was the Director of International
Relations Program at Alliant International University in San Diego, CA. Before joining Alliant, Dr. Turner was a senior CIA officer,
attached both to the analytical directorate as well as to elements supporting the Director of Central Intelligence) Why Secret
Intelligence Fails Page 90-92. January 1, 2005. \\mwang
HUMINT's contribution to the intelligence process over the years has been uneven. HUMINT assets provided valuable
information on many of America's foreign policy crises, such as the Sino-Soviet split in 1962, but it may also have contributed to some
of America's foreign policy debacles, such as China's occupation of Tibet in the early 1950s. In addition, the
competition between technical collection disciplines and HUMINT in the 1970s served to damage the number and
quality of HUMINT assets in critical areas of the globe. For example, DCI Turner fired many of DO's case officers in the
late 1970s as a cost-saving measure, arguing that technical collection methods would take up the slack. In doing so,
he probably eliminated America's eyes and ears on the ground in such places as the Middle East, possibly contributing to the Iran hostage
situation and the failure to forecast the Soviet invasion of Afghanistan, both in 1979. More recently, HUMINT assets had to take
second place in Bosnia, Kosovo, and Iraq to the more sophisticated and compelling techniques of imagery and
signals intelligence. HUMINT's main advantage is that it is labor intensive and therefore, compared to national technical means, cheap.
HUMINT also has the advantage of being capable of providing relevant and timely information on the motivations and intentions of foreign
political leaders. On the other hand, HUMINT's disadvantages probably outweigh its advantages. One, American case
officers may not have sufficient training and know-how to perform their jobs well. According to one analyst, CIA
operatives are not particularly well prepared; they seldom speak foreign languages well and almost never know a
line of business or a technical field. Two, the process of recruiting spies is time consuming and lengthy, which
often brings into question the benefits of such an activity in relation to its cost. Three, HUMINT information is
highly perishable and therefore has a low threshold of utility. Four, HUMINT is often vulnerable to deception and
double-agent operations. Five, spying is illegal everywhere, and case officers who have been caught in the process
of recruitment have embarrassed the U.S. government and damaged relations with both unfriendly and friendly
governments. Six, espionage is risky to the lives of intelligence agents and their assets. Seven, because HUMINT
assets are often employed in covert actions, espionage operations sometimes become enmeshed in political
controversies at home. Eight, many people believe that spying is ethically wrong, an activity that diminishes the
moral standing of the United States around the globe.
--xt hurts relations
HUMINT endangers foreign relations – secretive.
Best, 2 (Richard A. Best Jr., Specialist in National Defense Foreign Affairs, Defense, and Trade Division,
CRS report for congress, Wrote The National Intelligence Council: Issues and Options for Congress and
published in Library of Congress. Page 9 February 21, 2002.)\\mwang
Although humint is not in itself an expensive discipline, it requires large amounts of support and an awareness by senior
officials of possible negative consequences. Potential complications, including imprisonment of U.S. agents in foreign
countries and loss of friendly lives, have to be given careful consideration. Major diplomatic embarrassment to the United
States can result from revelations of covert efforts, especially those that go awry; such embarrassment can jeopardize
relationships that have been developed over many years. Collecting humint to support the counterterrorism
effort will require significant changes in the work of intelligence agencies.
Espionage on allies ruins foreign relations and perception
Lister 13 (Tim Lister, covered international news for 25 years as a producer and reporter for the BBC and CNN. He has lived
and worked in the Middle East, and has also worked in Afghanistan and Pakistan. In 2004, he produced the award-winning
documentary “Between Hope and Fear: Journeys in the New Iraq” for CNN. He is now an independent writer and producer.
“Europe falls out of love with Obama over NSA spying claims.” <http://www.cnn.com/2013/10/24/world/europe/europe-us-surveillance/> October 25, 2013)\\mwang
On July 24, 2008, then-presidential candidate Barack Obama addressed tens of thousands of Germans on the avenue that leads from the
Brandenburg Gate in Berlin. In a pointed reference to the outgoing administration of President George W. Bush, he promised a new
era of "allies who will listen to each other, who will learn from each other, who will, above all, trust each other."
One German present among the hugely enthusiastic crowd said the occasion reminded him of Berlin's famous "Love Parade." No U.S. politician
since John F. Kennedy had so captured Europeans' imagination. Five years on, in the words of the song, it's a case of "After the Love Has Gone."
The U.S. ambassador in Berlin has been summoned to the foreign ministry over reports in Der Spiegel that the U.S. National Security
Administration (NSA) monitored Chancellor Angela Merkel's official cellphone. His counterpart in Paris received a similar summons earlier this
week after revelations in Le Monde. Both Der Spiegel and Le Monde used
documents provided by former NSA contractor Edward Snowden. Merkel's spokesman, Steffen Seibert, lamented a "grave breach
of trust." One of Chancellor Merkel's closest allies, Defense Minister Thomas de Maiziere told broadcaster ARD
there would be consequences. "We can't simply turn the page," he warned. Der Spiegel reported Thursday that Thomas
Oppermann, who leads the parliamentary committee that scrutinizes Germany's intelligence services, complained that "the NSA's monitoring
activities have gotten completely out of hand, and take place beyond all democratic controls." In an article for the forthcoming edition of
Foreign Affairs magazine, Henry Farrell and Martha Finnemore argue that it's the disclosure of such practices rather than their existence that is
damaging. "When these deeds turn out to clash with the government's public rhetoric, as they so often do, it
becomes harder for U.S. allies to overlook Washington's covert behavior and easier for U.S. adversaries to justify
their own," they write. "The U.S. government, its friends, and its foes can no longer plausibly deny the dark side of U.S. foreign policy and will
have to address it head-on," they argue. Among the Twitterati, #merkelphone has gained some traction, with the famous Obama motif "Yes We
Can" finding a new interpretation. And the European media has begun to debate whether the revelations provided by Edward Snowden to The
Guardian and other newspapers will do to Obama's image on the continent what the Iraq war did to that of President George W. Bush.
Hyperbole perhaps, but the Obama administration is on the defensive, caught between fuller disclosure of just what the NSA has been up to
and the need to protect intelligence-gathering methods. The president himself received what German officials describe as an angry call from
Merkel Wednesday demanding assurances that there is no American eavesdropping on her conversations. The language out of the White
House has been less than forthright, with spokesman Jay Carney saying that "the president assured the chancellor that the United States is not
monitoring, and will not monitor, the communications of the chancellor." His careful avoidance of the past tense has heightened suspicions in
Europe that only the Snowden disclosures have forced a change of practice. Even pro-U.S. newspapers like the Frankfurter Allgemeine Zeitung
are in full throttle, writing that: "The government in Washington has apparently not yet understood the level of damage
that continues to be caused by the activities of American intelligence agencies in Europe."
A2 ADV – IHRL
alt cause – u.s.
Multiple U.S. violations thump
Capdevila 15, (Gustavo, Writer for the Inter Press Service, Inter Press Service News Agency, “UN Failings Exposed in UN
Human Rights Review”, http://www.ipsnews.net/author/gustavo-capdevila/, AL)
Without the emperor’s clothes, like in the Hans Christian Andersen story, the United States was forced to submit its human rights record to the
scrutiny of the other 192 members of the United Nations on Monday. Washington attended the country’s second universal periodic review (UPR) in
the Geneva-based U.N. Human Rights Council, which reviews each U.N. member country’s compliance with international human rights standards. “So today was
a demonstration of the no confidence vote that world opinion has made of the United States as a country that considers itself a human rights champion,” said
Jamil Dakwar, director of the Human Rights Program (HRP) of the American Civil Liberties Union, a non-profit organisation that has worked to defend individual
rights and liberties since 1920. “I think that there was a clear message from today’s review that the United States needs to do much more to protect
human rights and to bring its laws and policies in line with human rights standards,” he told IPS. Although the UPR has come in for criticism
because its conclusions are negotiated among governments, it is recognised for starkly revealing the abuses that states commit against their own citizens and
those of other countries – and the Monday May 11 session was no exception. One of the demands set forth by the 117 states taking part in the debate was for
Washington to take measures to prevent acts of torture in areas outside the national territory under its effective control and prosecute perpetrators, and to ensure
that victims of torture were afforded redress and assistance. With respect to torture, among the positive achievements mentioned was the release of a report on
abuses committed as part of the Central Intelligence Agency’s (CIA) interrogation practices. The head of the 20-member U.S. delegation that flew over from
Washington, acting legal adviser in the State Department Mary McLeod, gave an indication that the negotiations for the visit by Juan Méndez, U.N. special
rapporteur on torture and other cruel, inhuman or degrading treatment or punishment, to the U.S. Naval Base at Guantanamo Bay were not
closed. In March, Méndez, an Argentine lawyer who lives in the United States, complained that Washington did not intend to give him access during his visit to
the more than 100 inmates in Guantanamo. The country’s closest ally, the United Kingdom, congratulated the United States on its commitment
to close Guantanamo, announced by President Barack Obama before his first term began in January 2009. But the British delegate said they
would like to see it actually happen. “The problem with Guantanamo is that it created a system of indefinite detention that we would
like to see shut down with the facility,” Dakwar said. “It also created a flawed system of military commissions that provide a second system of justice. This system
should also be shut down.” Ejim Dike, executive director of the U.S. Human Rights Network, said the concerns brought to the attention of the U.S. delegation
revolved around issues of poverty, criminalisation and violence. “In the United States we have more money today than we ever had. We have the highest child
poverty rate of any industrialised country. However, for the UPR no one from the government mentioned poverty,” Dike commented to IPS. The Cuban delegates
addressed the issue, urging the United States to guarantee the right of all residents to decent housing, food, healthcare and education, in order to reduce the
poverty that affects 48 million of the country’s 319 million people. A number of countries asked the United States to ratify the International Covenant on
Economic, Social and Cultural Rights, in effect since 1976 and considered one of the pillars of the U.N. human rights system. Related IPS Articles U.N. Member
States Accused of Cherry-Picking Human Rights Release of Senate Torture Report Insufficient, Say Rights Groups They also pointed out that the United
States has not ratified the Convention on the Rights of the Child. Nor has it ratified the Convention on the Elimination of All
Forms of Discrimination against Women and the Convention on the Rights of Persons with Disabilities. Furthermore, it has not
recognised the International Convention on the Protection of the Rights of All Migrant Workers and Members of Their Families, or
the International Labour Organisation’s (ILO) conventions on forced labour, minimum age for admission to employment, domestic workers, and
discrimination in respect of employment and occupation. McLeod also said that her country is not currently considering the ratification of the Rome Statute, which
created the International Criminal Court. Dakwar said the debate in the UPR highlighted “the issue of the lack of a fair criminal justice system that is being
demonstrated through the stops and frisks, racial profiling, racial studies in the death penalty. You see it in the police violence and killing of unarmed African-Americans with no accountability. "Its inhuman and unfair system of immigration needs to again be brought in line with human rights…That means…no detention
of migrants, and ending migrants’ family detention,” he added. Another of the main recommendations to the United States is that it desist from targeted killings
through drones. “The United States continues to violate human rights in the name of national security and it needs to roll back these policies and bring them in
line with the U.S. constitution and international law,” Dakwar argued. “Also in the domestic system we have surveillance of Muslim communities.
There is a guidance by the Department of Justice that they allow the use of informants within communities, particularly Muslim
and Middle Eastern communities,” he added.
Particularly the War on Terror
Nazarova 1, (Inna, Writer for Fordham International Law Journal, Berkeley Press, Alienating “Human” from “Right”: U.S. and
UK Non-Compliance with Asylum Obligations Under International Human Rights Law,
http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=1849&context=ilj, AL)
"Let the terrorists among us be warned," said the U.S. Attorney GeneralJohn D. Ashcroft in a speech at the U.S. Conference of Mayors on October 25,
2001.1 "If you overstay your visas even by one day, we will arrest you. If you violate a local law, we will work to make sure that you
are put in jail and kept in custody as long as possible." 2 The U.S Attorney General's statement conflates terrorist status with
immigrant status.' It reflects the blurring of meanings of "terrorist" and "asylum seeker" occurring in the United States and the United
Kingdom ("UK") 5 today. 6 Additionally, the statement underscores the dubious effects the recent U.S. and UK anti-terrorism measures
have on asylum seekers' rights.7 These developments are a far cry from the humanitarian ideals,' which accompanied the establishment
of the Office of the United Nations9 High Commissioner for Refugees ("UNHCR").° Nor do the new measures echo the promises the United
States and the UK made when they ratified the Universal Declaration on Human Rights ("UDHR") The new anti-terrorism laws implemented
in the two countries threaten to circumvent the right to seek asylum and the nonrefoulement principle in unprecedented ways. Scholars also warn that
violations of asylum seekers' rights may become a pretext for civil rights violations of regular citizens. A multilateral policy is the only
solution that can ensure that the U.S. and the UK domestic considerations do not topple the global structure of refugee
protection.
alt cause – other countries
Other countries thump
Crowe 15 (Anna, Clinical Advocacy Fellow at the Human Rights Program, Graduate from Harvard Law, Joint Publication
Released on Encryption, Online Anonymity and Human Rights, 6/17/15, http://hrp.law.harvard.edu/staff/joint-publicationreleased-on-encryption-online-anonymity-and-human-rights/, AL)
The International Human Rights Clinic and Privacy International released a publication today that examines the vital role that
encryption and anonymity tools and services play in safeguarding human rights. The 30-page publication, “Securing Safe Spaces Online:
encryption, online anonymity, and human rights,” complements a landmark report by the Special Rapporteur on the promotion and
protection of the right to freedom of opinion and expression, David Kaye. Kaye’s report, which he will present to the United Nations Human
Rights Council in Geneva today, calls on states to ensure security and privacy online by providing “comprehensive protection”
through encryption and anonymity tools. The clinic’s joint publication explores measures that restrict online encryption and
anonymity in four particular countries – Morocco, Pakistan, South Korea, and the United Kingdom. In all four countries,
these restrictions impede private and secure online communication and inhibit free expression. The publication also points to
opportunities for governments, the corporate sector, and civil society to eliminate or minimize obstacles to use of encryption and
online anonymity. The Clinic’s collaboration with Privacy International dates back to last fall, when we supported a coalition of NGOs calling for the creation
of a new Special Rapporteur on the Right to Privacy. In March 2015, the Human Rights Council established this new Special Rapporteur. The Clinic began work
on the encryption and anonymity publication this past spring. Clinical students Sarah Lee, JD ’16, and Mark Verstraete, JD ’16, worked on the publication
throughout the semester and participated in a meeting of Privacy International’s global partners in April.
A2 ADV – PRIVACY
--xt sq solves
Surveillance doesn’t violate privacy – litany of checks on law enforcement
Yates and Comey 7/8 <Sally Quillian Yates, Deputy Attorney General, and James B. Comey, Director of the FBI, 7/8/2015,
“Going Dark: Encryption, Technology, and the Balance Between Public Safety and Privacy”, p.5-7,
http://www.judiciary.senate.gov/imo/media/doc/07-08-15%20Yates%20and%20Comey%20Joint%20Testimony1.pdf>//wx
The rules for the collection of the content of communications in order to protect public safety have been worked out by Congress and
the courts over decades. Our country is justifiably proud of the strong privacy protections established by the Constitution and by
Congress, and the Department of Justice fully complies with those protections. The core question is this: once all of the requirements and
safeguards of the laws and the Constitution have been met, are we comfortable with technical design decisions that result in barriers to obtaining evidence of a
crime? We would like to describe briefly the law and the extensive checks, balances, and safeguards that it contains. In addition to the Constitution,
two statutes are particularly relevant to the Going Dark problem. Generally speaking, in order for the Government to conduct real-time—i.e.,
data in motion—electronic surveillance of the content of a suspect’s communications, it must meet the standards set forth in either the amended
versions of Title III of the Omnibus Crime Control and Safe Streets Act of 1968 (often referred to as “Title III” or the “Wiretap Act”) or the Foreign Intelligence
Surveillance Act of 1978 (or “FISA”). Title III authorizes the Government to obtain a court order to conduct surveillance of wire, oral, or
electronic communications when it is investigating Federal felonies. Generally speaking, FISA similarly relies upon judicial authorization,
through the Foreign Intelligence Surveillance Court (FISC), to approve surveillance directed at foreign intelligence and international terrorism threats.
Regardless of which statute governs, however, the standards for the real-time electronic surveillance of United States persons’
communications are demanding. For instance, if Federal law enforcement seeks the authority to intercept phone calls in a criminal case using the
Wiretap Act, a Federal district court judge must find: that there is probable cause to believe the person whose communications are targeted for
interception is committing, has committed, or is about to commit a felony offense; that alternative investigative procedures have failed, are
unlikely to succeed, or are too dangerous; and that there is probable cause to believe that evidence of the felony will be obtained through
the surveillance. The law also requires that before an application is even brought to a court, it must be approved by a high-ranking Department of Justice official. In addition, court orders allowing wiretap authority expire after 30 days; if the Government seeks to
extend surveillance beyond this period it must submit another application with a fresh showing of probable cause and investigative necessity. And the
Government is required to minimize to the extent possible its electronic interceptions to exclude non-pertinent and privileged communications. All of these
requirements are approved by a Federal court. The statutory requirements for electronic surveillance of U.S. persons under FISA are
also demanding. To approve that surveillance, the FISC must, among other things, find probable cause to believe: that the target of the
surveillance is a foreign power or an agent of a foreign power; and that the facilities or places at which the electronic surveillance is
directed are being used or are about to be used by a foreign power or an agent of a foreign power. Similarly, when law enforcement investigators
seek access to electronic information stored—i.e., data at rest—on a device, such as a smartphone, they are likewise bound by the mandates
of the Fourth Amendment, which typically require them to demonstrate probable cause to a neutral judge, who independently decides whether to issue a
search warrant for that data. Collectively, these statutes reflect a concerted Congressional effort, overseen by an independent judiciary,
to validate the principles enshrined in our Constitution and balance several sometimes-competing, yet equally-legitimate social interests:
privacy, public safety, national security, and effective justice. The evolution and operation of technology today has led to recent trends that threaten
this time-honored approach. In short, the same ingenuity that has improved our lives in so many ways has also resulted in the proliferation of products and
services where providers can no longer assist law enforcement in executing warrants. Provider Assistance Both Title III and FISA include provisions mandating
technical assistance so that the Government will be able to carry out activities authorized by the court. For example, Title III specifies that a “service
provider, landlord…or other person shall furnish [the Government]…forthwith all…technical assistance necessary to accomplish the interception.” As the
communications environment has grown in volume and complexity, technical assistance has proven to be essential for interception to occur. These provisions
alone, however, have not historically been sufficient to enable the Government to conduct electronic surveillance in a timely and effective manner.
--no i/l
There is no right to never reveal encrypted data
Hess 15 (Amy Hess, Executive Assistant Director Federal Bureau of Investigation, Before the
Subcommittee on Information Technology Oversight and Government Reform U.S. House of
Representatives Concerning Encryption and Cybersecurity for Mobile Electronic Communication Devices,
page 7-8, April 29, 2015.)\\mwang
Civil Liberties and the Rule of Law Just as we have an obligation to address threats to our national security and our public safety, we also
have an obligation to consider the potential impact of our investigations on civil liberties, including the right to privacy.
Intelligence and technology are key tools we use to stay ahead of those who would do us harm. Yet, as we evolve and adapt our investigative
techniques and our use of technology to keep pace with today’s complex threat environment, we must always act within the confines
of the rule of law and the safeguards guaranteed by the Constitution. The people of the FBI are sworn to protect both security
and liberty. We care deeply about protecting liberty – including an individual’s right to privacy through due process of law –
while simultaneously protecting this country and safeguarding the citizens we serve. The rule of law is our true north; it is the
guiding principle for all that we do. The world around us continues to change, but within the FBI, our values must
never change. Every FBI employee takes an oath promising to uphold the United States Constitution. It is not enough
to catch the criminals; we must do so while upholding civil rights. It is not enough to stop the terrorists; we must do so while
maintaining civil liberties. It is not enough to prevent foreign nations from stealing our secrets; we must do so
while upholding the rule of law. Following the rule of law and upholding civil liberties and civil rights are not burdens. They
are what make all of us safer and stronger. In the end, we in the FBI will be judged not only by our ability to keep Americans
safe from crime and terrorism, but also by whether we safeguard the liberties for which we are fighting and maintain the trust of the
American people. And with the rule of law as our guiding principle, we also believe that no one in this country should be beyond the
law. We must follow the letter of the law, whether examining the contents of a suspected individual’s closet or the contents of her smart
phone.
But the notion that the closet could never be opened – or that the phone could never be unlocked or
unencrypted – even with a properly obtained court order, is troubling. Are we as a society comfortable knowing
that certain information is no longer available to law enforcement under any circumstances? Is there no way to
reconcile personal privacy and public safety? It is time to have open and honest debates about these issues.
DA – CIRCUMVENTION
1nc circumvention
NSA circumvents – multiple methods
Perlroth, et al 13 (Nicole, technology and cybersecurity reporter for The New York Times, guest lecturer at Stanford’s
graduate schools of business and communications, former deputy editor at Forbes, winner of the Society of American Business
Editors and Writers award for best technology coverage in 2013, voted the top cybersecurity journalist by the SANS Institute in
2014, graduate of Stanford University’s Graduate School of Journalism, and Princeton University; Scott Shane, journalist for The
New York Times, reporting principally about the United States intelligence community, former Moscow correspondent for The
Baltimore Sun; “NSA Able to Foil Basic Safeguards of Privacy on Web, http://www.nytimes.com/2013/09/06/us/nsa-foils-muchinternet-encryption.html?_r=0) BJ
The National Security Agency is winning its long-running secret war on encryption, using supercomputers, technical trickery,
court orders and behind-the-scenes persuasion to undermine the major tools protecting the privacy of everyday communications in the
Internet age, according to newly disclosed documents. The agency has circumvented or cracked much of the encryption, or digital scrambling,
that guards global commerce and banking systems, protects sensitive data like trade secrets and medical records, and automatically secures the e-mails, Web
searches, Internet chats and phone calls of Americans and others around the world, the documents show.
--xt laundry list
Yes circumvention – supercomputing brute attacks, voluntary cooperation, hacking, and subterfuge
Perlroth, et al 13 (Nicole, technology and cybersecurity reporter for The New York Times, guest lecturer at Stanford’s
graduate schools of business and communications, former deputy editor at Forbes, winner of the Society of American Business
Editors and Writers award for best technology coverage in 2013, voted the top cybersecurity journalist by the SANS Institute in
2014, graduate of Stanford University’s Graduate School of Journalism, and Princeton University; Scott Shane, journalist for The
New York Times, reporting principally about the United States intelligence community, former Moscow correspondent for The
Baltimore Sun; “NSA Able to Foil Basic Safeguards of Privacy on Web, http://www.nytimes.com/2013/09/06/us/nsa-foils-muchinternet-encryption.html?_r=0) BJ
The agency, according to the documents and interviews with industry officials, deployed custom-built, superfast computers to break
codes, and began collaborating with technology companies in the United States and abroad to build entry points into their products. The
documents do not identify which companies have participated. The N.S.A. hacked into target computers to snare messages before they were
encrypted. In some cases, companies say they were coerced by the government into handing over their master encryption keys or
building in a back door. And the agency used its influence as the world’s most experienced code maker to covertly introduce
weaknesses into the encryption standards followed by hardware and software developers around the world. “For the past decade,
N.S.A. has led an aggressive, multipronged effort to break widely used Internet encryption technologies,” said a 2010 memo describing
a briefing about N.S.A. accomplishments for employees of its British counterpart, Government Communications Headquarters, or GCHQ. “Cryptanalytic
capabilities are now coming online. Vast amounts of encrypted Internet data which have up till now been discarded are now exploitable.”
--xt decryption
Yes circumvention – NSA supercomputers can decrypt
PTI 13 (The Economic Times, “NSA cracked online encryption technology”, 6 September 2013,
http://articles.economictimes.indiatimes.com/2013-09-06/news/41835321_1_encryption-nsa-gchq)
Intelligence agencies of the US and the UK have teamed up to crack the encryption technology designed to provide online privacy
that guards global commerce and banking systems, protects sensitive data like trade secrets and medical records, Internet chats
and phone calls. The National Security Agency of the United States and its British counterpart Government Communications Headquarters (GCHQ) have
cracked the encryption by using supercomputers, court orders, and some cooperation from technology companies, according to
multiple media reports in the US and UK. The classified documents leaked by whistleblower Edward Snowden show the NSA has cracked much of the
encryption that guards global commerce and banking systems, protects sensitive data like trade secrets and medical records,
Internet chats and phone calls, the reports said. News articles by The Guardian, The New York Times and ProPublica reported that these classified
documents reveal that unlike commonly presumed in the public, none of the data on internet is safe from prying eyes, including those of the
government, and the NSA wants to keep it that way. "The agency treats its recent successes in deciphering protected information
as among its most closely guarded secrets," The New York Times said. The NSA deployed custom-built, superfast computers to break
codes, and began collaborating with technology companies in the US and abroad to build entry points into their products, but the
documents do not identify which of the IT companies participated in it. The agency hacked into target computers to snare messages before they
were encrypted, the report said. In some cases, companies say they were coerced by the government into handing over their master
encryption keys or building in a back door. "And the agency used its influence as the world's most experienced code maker to covertly introduce
weaknesses into the encryption standards followed by hardware and software developers around the world," the report said. "For the past decade, NSA has led
an aggressive, multipronged effort to break widely used Internet encryption technologies," said a 2010 memo describing a briefing about NSA accomplishments
for employees of the GCHQ. The media outlets said they were asked by the intelligence officials not to publish the articles arguing that this might prompt foreign
targets to switch to new forms of encryption or communications that would be harder to collect or read. The news organisations removed some specific facts but
decided to publish the article because of the value of a public debate about government actions that weaken the most powerful privacy tools, media outlets
reported.
Encryption doesn’t solve – Random number generators
Wayner 5/5/14 – Peter Wayner is contributing editor at InfoWorld and the author of more than 16 books on diverse topics,
including open source software ("Free for All"), autonomous cars ("Future Ride"), privacy-enhanced computation ("Translucent
Databases"), digital transactions ("Digital Cash"), and steganography ("Disappearing Cryptography"). (“Peter Wayner”; 11
reasons encryption is (almost) dead; http://www.infoworld.com/article/2607386/encryption/11-reasons-encryption-is--almost-dead.html)//pk
Encryption's weak link No. 8: Backdoors aplenty Sometimes programmers make mistakes. They forget to check the size of an
input, or they skip clearing the memory before releasing it. It could be anything. Eventually, someone finds the hole and starts
exploiting it. Some of the most forward-thinking companies release a steady stream of fixes that never seems to end, and they
should be commended. But the relentless surge of security patches suggests there won't be an end anytime soon. By the time
you've finished reading this, there are probably two new patches for you to install. Any of these holes could compromise your
encryption. It could patch the file and turn the algorithm into mush. Or it could leak the key through some other path. There's no
end to the malice that can be caused by a backdoor. Encryption's weak link No. 9: Bad random-number generators Most of the
hype around encryption focuses on the strength of the encryption algorithm, but this usually blips over the fact that the key-selection algorithm is just as important. Your encryption can be superstrong, but if the eavesdropper can guess the key, it won't
matter. This is important because many encryption routines need a trustworthy source of random numbers to help pick the key.
Some attackers will simply substitute their own random-number generator and use it to undermine the key choice. The algorithm
remains strong, but the keys are easy to guess by anyone who knows the way the random-number generator was compromised.
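Analyst note: the key-selection point above can be shown with a minimal sketch (not from the card; the timestamp seed, the one-hour guessing window, and the toy XOR keystream are illustrative assumptions). Even a strong cipher is worthless if the key comes from a predictable random-number generator, because the attacker can simply regenerate the candidate keys:

```python
# Sketch: why a predictable random-number generator undermines key selection.
# Standard library only; the cipher here is a toy XOR keystream, but the same
# logic applies to any strong cipher whose key is derived from a guessable seed.
import random
import time

def keystream(seed, length):
    """Derive a keystream from a PRNG seed (stands in for key generation)."""
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(length))

def xor(data, stream):
    return bytes(a ^ b for a, b in zip(data, stream))

# The "defender" seeds key generation with the current time in seconds --
# a classic weak-RNG mistake.
message = b"wire transfer: account 4471, amount $25,000"
seed_used = int(time.time())
ciphertext = xor(message, keystream(seed_used, len(message)))

# The "attacker" knows roughly when the message was sent, so the key space
# collapses to a few thousand guesses regardless of cipher strength.
for guess in range(seed_used - 3600, seed_used + 3600):
    candidate = xor(ciphertext, keystream(guess, len(ciphertext)))
    if candidate.startswith(b"wire transfer"):  # recognizable plaintext
        print(f"recovered with seed {guess}: {candidate.decode()}")
        break
```

The same logic applies to real ciphers such as AES: the weakness is the guessable seed, not the algorithm.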
--xt hacking
Even if backdoors are closed, NSA will just move on to hacking
Kopstein 7/17/15 <Joshua, cyberculture journalist and researcher and writer for AlJazeera, 7/17/15, “The feds don’t need digital
backdoors – they can hack you”, Aljazeera, http://america.aljazeera.com/opinions/2015/7/the-feds-dont-need-digital-backdoors-they-can-hackyou.html>//wx
The massive hack of Hacking Team, a surveillance company notorious for selling spyware to repressive regimes , brought a wave of
unrestrained schadenfreude to many social media feeds last week. A mysterious hacker spilled more than 400 gigabytes of the company’s emails, internal
documents, source code and more across the Internet, allowing journalists to lay bare the inner workings of one of the most controversial players in the booming
government surveillance industry. Privacy advocates have long been fascinated and appalled by Hacking Team, and for good reason. Its flagship spyware
suite, Remote Control System, or RCS, is a flashily advertised “hacking suite for governmental interception” that allows police to quietly
take control of electronic devices — reading emails and texts, recording keystrokes, snooping on Skype calls, even eavesdropping on the device’s
microphone and webcam. Security researchers at the University of Toronto previously discovered the software targeting activists and journalists from the United
Arab Emirates, Morocco and Ethiopia, using a hidden network of servers based in 21 countries. The company’s leaked emails and documents display a
disturbing nonchalance about all of this, confirming highly questionable clients including Sudan, Ethiopia, Saudi Arabia, Uzbekistan, Bahrain, Kazakhstan and
Tunisia, among many others. The U.S. government is also a customer: The Drug Enforcement Administration, Federal Bureau of Investigation and
U.S. Army have all bought Hacking Team’s spyware, which is sold as a service with software updates and full customer support. The company also has
plans for a U.S. branch, and is currently using a front company called Cicom USA to drum up business with other North American agencies including the
U.S. Department of Homeland Security, the Bureau of Alcohol Tobacco and Firearms, the New York City Police Department and the Royal Canadian Mounted
Police. Of course, it’s ironic that none of this would have likely come to light if not for an act of hacking. But if there’s a singular lesson of the post-Snowden era,
it’s that extreme acts of transparency are sometimes the only remedy for extreme corporate and government secrecy. Armed with the knowledge that these
intrusive tools are being sold to governments around the world, we must now begin a long-overdue debate about how, where and when — not to
mention if — governments should be allowed to hack their own citizens. In the U.S., that debate could not come any sooner. Despite the fact that a lack of
security led to the hack of the Office of Personnel Management, compromising a staggering 21 million government employee records, U.S. law enforcement
agencies such as the FBI are continuing a campaign of fear against widespread encryption. They’re demanding that companies such as Apple and
Google insert backdoors into their products so they can unscramble messages from criminals and terrorists, claiming that their inability to do so
is causing investigations to “go dark.” But one important takeaway from the Hacking Team leak is that government agencies are doing just
fine without backdoors. A key feature of Hacking Team’s software, and targeted surveillance in general, is the ability to overcome encryption by
compromising individual “endpoints,” such as a computer or smartphone. But the documents show this capability is sometimes redundant. The FBI, for example,
is so fully invested in homegrown hacking tools that it only bought Hacking Team spyware as a “backup” solution, according to leaked emails. If we reject
digital backdoors — and we should — we can’t be unprepared when more unregulated hacking powers are the next thing on the
FBI’s wish list. The FBI has been in the hacking business since the 1990s, yet its use of these tools and tactics has never been sufficiently scrutinized. In a
rare public decision in 2013, a judge in Texas denied an FBI request to send spyware to an unidentified suspect’s computer, criticizing its “vague assurances”
that innocent parties wouldn’t be affected. The FBI has since argued it doesn’t need a warrant to hack servers and electronic devices, even when they belong to
targets whose identities and locations are unknown. This March, a federal rule change that Google warned was a “monumental” constitutional threat granted
judges the authority to let the FBI do just that. Amazingly, the FBI’s new authority to hack hasn’t decreased the momentum of its quest for backdoors. During a
Congressional hearing last week, FBI director James Comey invoked the bogeyman of the Islamic State in Iraq and the Levant (ISIL) to illustrate the dangers of
encryption, but once again failed to provide any actual evidence of the problem. (On the contrary, a recent government report found only 4 cases last year in
which federal and state wiretaps couldn’t circumvent encryption.) Sen. Sheldon Whitehouse (D-Rhode Island) even suggested that if commercial encryption
prevents law enforcement access, companies such as Apple and Google that deploy it should be held legally liable. At the same time, a new report (PDF) from
some of the world’s most prominent security experts authoritatively concluded that enforcing backdoors would be disastrous for security. To wit: You can’t build a
backdoor for the FBI that can’t also be found and exploited by Chinese hackers, Russian cybercriminals or any other advanced adversary that cares to look. On
Thursday, the Web’s international standards body, the World Wide Web Consortium, concurred, writing, “It is impossible to build systems that can securely
support ‘exceptional access’ capabilities without breaking the trust guarantees of the web platform.” Boiled down, the crypto debate really becomes a question of
mass surveillance versus targeted surveillance. Backdoors would remove the technical barriers preventing governments from having
unfettered access to everyone’s communications. Hacking, meanwhile, circumvents those barriers using highly invasive but much
more targeted means. Of the two options, the latter seems vastly preferable. Surveillance should be rare, and hacking forces authorities to make a cost-benefit analysis. That’s because computers are generally hacked by exploiting hidden flaws in software code; since those flaws are eventually found and
patched, it often means that the attacker needs to be really sure the target is worth it. The problem is that law enforcement wants both backdoors and
hacking powers — and we still haven’t had a debate about the latter. What kind of suspects should law enforcement be allowed to hack? What will stop
authorities from planting evidence on someone’s computer? Given the well-known problem of attribution in online crime investigations, how will they ensure
they’re hacking the right person and that no innocents will get caught up in the process? These are questions that need to be debated and answered now. If we
reject backdoors the way we did two decades ago — and we should — we can’t be unprepared when more unregulated hacking powers are the next thing on the
FBI’s wish list.
Encryption breach inevitable – mathematical algorithms
Wayner 5/5/14 – Peter Wayner is contributing editor at InfoWorld and the author of more than 16 books on diverse topics,
including open source software ("Free for All"), autonomous cars ("Future Ride"), privacy-enhanced computation ("Translucent
Databases"), digital transactions ("Digital Cash"), and steganography ("Disappearing Cryptography"). (“Peter Wayner”; 11
reasons encryption is (almost) dead; http://www.infoworld.com/article/2607386/encryption/11-reasons-encryption-is--almost-dead.html)//pk
Everyone who has studied mathematics at the movie theater knows that encryption is pretty boss. Practically every spy in every spy movie looks at an encrypted
file with fear and dread. Armies of ninjas can be fought. Bombs can be defused. Missiles can be diverted. But an encrypted file can only be cracked open with the
proper key -- and that key is always in the hands of a dangerously attractive agent hidden in a sumptuous hideout on the other side of the world. (Never in
Newark or New Haven -- who wants to film there?) Alas, this theorem of encryption security may be accepted as proven by math geniuses at Hollywood U., but
reality is a bit murkier. Encryption isn't always perfect, and even when the core algorithms are truly solid, many other links in the chain can go kablooie. There are
hundreds of steps and millions of lines of code protecting our secrets. If any one of them fails, the data can be as easy to read as the face of a five-year-old
playing Go Fish. Encryption is under assault more than ever -- and from more directions than previously thought. This doesn't mean you should forgo
securing sensitive data, but forewarned is forearmed. It's impossible to secure the entire stack and chain. Here are 11 reasons encryption is no longer all it's
cracked up to be. Encryption's weak link No. 1: No proofs -- just an algorithm arms race The math at the heart of encryption looks
impressive, with lots of superscripts and subscripts, but it doesn't come with any hard and fast proofs. One of the most famous algorithms, RSA,
is said to be secure -- as long as it's hard to factor large numbers. That sounds impressive, but it simply shifts the responsibility. Is it truly that
hard to factor large numbers? Well, there's no proof that it's hard, but no one knows how to do it right all of the time. If someone figures out a fast
algorithm, RSA could be cracked open like an egg, but that hasn't happened yet ... we think. Encryption's weak link No. 2: Disclosure is
the only means of detecting a crack Suppose you figured out how to factor large numbers and crack RSA encryption. Would you tell the
world? Perhaps. It would certainly make you famous. You might get appointed a professor at a fancy college. You might even land a cameo on "The Big Bang
Theory." But the encryption-cracking business can be shady. It isn't hard to imagine that it attracts a higher share of individuals or organizations that
might want to keep their newfound power secret and use it to make money or extract valuable information. Many of our assumptions about the security
of cryptography are based on the belief that people will share all of their knowledge of vulnerabilities -- but there is no guarantee
anyone will do this. The spy agencies, for instance, routinely keep their knowledge to themselves. And rumors circulate about an amazing
cryptographic breakthrough in 2010 that's still classified. Why should the rest of us act any differently? Encryption's weak link No. 3: The chain is
long and never perfect There are a number of excellent mathematical proofs about the security of this system or that system. They offer
plenty of insight about one particular facet, but they say little about the entire chain. People like to use phrases like "perfect forward security" to describe
a mechanism that changes the keys frequently enough to prevent leaks from spreading. But for all of its perfection, the proof covers
only one part of the chain. A failure in the algorithm or a glitch in the software can circumvent all this perfection. It takes plenty of
education to keep this straight.
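Analyst note: a minimal sketch of the factoring dependency described in the Wayner evidence (toy numbers chosen for illustration; real RSA moduli are 2048 bits or more, and the trial-division step below is exactly what is infeasible at that size). The point is only that the private key follows mechanically once the modulus is factored:

```python
# Toy RSA to show that security rests on the factoring assumption alone.
# Tiny primes are used purely for illustration; requires Python 3.8+ for
# the three-argument pow() modular inverse.
p, q = 61, 53                  # secret primes
n = p * q                      # public modulus (3233)
e = 17                         # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)            # private exponent

m = 42                         # a message encoded as an integer < n
c = pow(m, e, n)               # encryption: c = m^e mod n

def factor(modulus):
    """Trial division -- trivial for a toy modulus, hopeless for 2048-bit RSA."""
    for i in range(2, modulus):
        if modulus % i == 0:
            return i, modulus // i

# An attacker who can factor n rebuilds the private key and reads the message.
fp, fq = factor(n)
d_recovered = pow(e, -1, (fp - 1) * (fq - 1))
recovered = pow(c, d_recovered, n)
print("factors:", (fp, fq), "recovered message:", recovered)   # prints 42
assert recovered == m
```

If a fast factoring algorithm ever appears, the step above stops being the bottleneck and RSA keys of any size fall the same way, which is the card's point about the absence of hard proofs.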
Encryption doesn’t solve – hidden layers
Wayner 5/5/14 – Peter Wayner is contributing editor at InfoWorld and the author of more than 16 books on diverse topics,
including open source software ("Free for All"), autonomous cars ("Future Ride"), privacy-enhanced computation ("Translucent
Databases"), digital transactions ("Digital Cash"), and steganography ("Disappearing Cryptography"). (“Peter Wayner”; 11
reasons encryption is (almost) dead; http://www.infoworld.com/article/2607386/encryption/11-reasons-encryption-is--almost-dead.html)//pk
Encryption's weak link No. 6: Hypervisors -- the scourge of the hypervigilant You've downloaded the most secure distro, you've
applied all the updates, you've cleaned out all the cruft, and you've turned off all the weird background processes. Congratulations,
you're getting closer to having a secure server. But let's say you're still obsessed and you audit every single last line of code yourself. To be
extra careful, you even audit the code of the compiler to make sure it isn't slipping in a backdoor. It would be an impressive stunt, but it wouldn't matter much.
Once you have your superclean, completely audited pile of code running in a cloud, the hypervisor in the background could do anything it
wanted to your code or your memory -- so could the BIOS. Oh well. Encryption's weak link No. 7: Hidden layers abound The hypervisor and the
BIOS are only a few of the most obvious layers hidden away. Practically every device has firmware -- which can be remarkably porous.
It's rarely touched by outsiders, so it's rarely hardened. One research "hardware backdoor" called Rakshasa can infect the BIOS and sneak
into the firmware of PCI-based network cards and CD drivers. Even if your encryption is solid and your OS is uninfected, your
network card could be betraying you. Your network card can think for itself! It will be a bit harder for the network card to reach into the main memory, but
stranger things have happened. These hidden layers are in every machine, usually out of sight and long forgotten. But they can do
amazing things with their access.
--xt cloud
Encryption is breakable – Cloud Computing
Wayner 5/5/14 – Peter Wayner is contributing editor at InfoWorld and the author of more than 16 books on diverse topics,
including open source software ("Free for All"), autonomous cars ("Future Ride"), privacy-enhanced computation ("Translucent
Databases"), digital transactions ("Digital Cash"), and steganography ("Disappearing Cryptography"). (“Peter Wayner”; 11
reasons encryption is (almost) dead; http://www.infoworld.com/article/2607386/encryption/11-reasons-encryption-is--almost-dead.html)//pk
Encryption's weak link No. 4: Cloud computing power is cheap and massive. Some descriptions of algorithms like to make claims that
it would take "millions of hours" to try all the possible passwords. That sounds like an incredibly long time until you realize that Amazon
alone may have half a million computers for rent by the hour. Some botnets may have more than a million nodes. Big numbers aren't so impressive
these days. Encryption's weak link No. 5: Video cards bring easy parallelism to cracking The same hardware that can chew through millions of
triangles can also try millions of passwords even faster. GPUs are incredible parallel computers, and they're cheaper than ever. If you
need to rent a rack, Amazon rents them by the hour too.
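Analyst note: the "millions of hours" framing collapses under simple division, as the card says. A back-of-the-envelope sketch follows (the 56-bit keyspace, guess rates, and fleet size are illustrative assumptions, not figures from the evidence):

```python
# Rough arithmetic on brute-force time: wall-clock hours divide by the number
# of guessing units rented. All figures are illustrative assumptions.
def wallclock_hours(keyspace, guesses_per_sec_per_unit, units):
    return keyspace / (guesses_per_sec_per_unit * units) / 3600

keyspace = 2 ** 56                                      # ~7.2e16 candidate keys
one_core = wallclock_hours(keyspace, 1e6, 1)            # one CPU core, 1M guesses/sec
gpu_fleet = wallclock_hours(keyspace, 1e9, 10_000)      # 10,000 rented GPUs, 1B guesses/sec each

print(f"one core:   {one_core:,.0f} hours")             # roughly 20,000,000 hours
print(f"GPU fleet:  {gpu_fleet:.1f} hours")             # roughly 2 hours
```

Strong modern keys (128 bits and up) remain far out of reach of this kind of search; the card's warning applies to passwords and short or weakly generated keys.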
--xt voluntary
Yes circumvention – voluntary requests end-run prohibitions
Kayyali 14 (Nadia, member of EFF’s activism team. Her work focuses on surveillance, national security policy, and the
intersection of criminal justice, racial justice, and digital civil liberties issues. She has also provided legal support for
demonstrators through the National Lawyers Guild and Occupylegal. Nadia previously served as the 2012 Bill of Rights Defense
Committee Legal Fellow where they worked with grassroots groups to restrict the reach of overbroad national security policies.
Nadia earned a B.A. from UC Berkeley, with a major in Cultural Anthropology and minored in Public Policy. Nadia received a J.D.
from UC Hastings, where she served as Community Outreach Editor for the Hastings Race and Poverty Law Journal and the
Student National Vice-President for the National Lawyers Guild. During law school they interned at the ACLU of Northern
California and Bay Area Legal Aid. Nadia currently serves on the board of the National Lawyers Guild S.F. Bay Area chapter and
works with Fists Up Legal Collective to provide legal support and community education for the Black Lives Matter
actions in the Bay Area, “Security Backdoors are Bad News—But Some Lawmakers Are Taking Action to Close Them”, 12/9/14,
https://www.eff.org/deeplinks/2014/12/security-backdoors-are-bad-news-some-lawmakers-are-taking-action-close-them)ML
EFF says the Secure Data Act starts to address the problem of backdoors by prohibiting any agency from “mandat[ing] that a
manufacturer, developer, or seller of covered products design or alter the security functions in its product or service to allow the
surveillance of any user of such product or service, or to allow the physical search of such product, by any agency.” The
legislation only prohibits agencies from requiring a company to build a backdoor. The NSA can still do its best to convince
companies to do so voluntarily. The legislation also doesn’t change the Communications Assistance for Law Enforcement Act (CALEA.) CALEA, passed
in 1994, is a law that forced telephone companies to redesign their network architectures to make it easier for law enforcement to wiretap telephone calls. In
2006, the D.C. Circuit upheld the FCC's reinterpretation of CALEA to also include facilities-based broadband Internet access and VoIP service, although it
doesn't apply to cell phone manufacturers.
NSA circumvents – “encourages” collaboration from industries
Perlroth, et al 13 (Nicole, technology and cybersecurity reporter for The New York Times, guest lecturer at Stanford’s
graduate schools of business and communications, former deputy editor at Forbes, winner of the Society of American Business
Editors and Writers award for best technology coverage in 2013, voted the top cybersecurity journalist by the SANS Institute in
2014, graduate of Stanford University’s Graduate School of Journalism, and Princeton University; Scott Shane, journalist for The
New York Times, reporting principally about the United States intelligence community, former Moscow correspondent for The
Baltimore Sun; “NSA Able to Foil Basic Safeguards of Privacy on Web, http://www.nytimes.com/2013/09/06/us/nsa-foils-muchinternet-encryption.html?_r=0) BJ
Because strong encryption can be so effective, classified N.S.A. documents make clear, the agency’s success depends on working with Internet
companies — by getting their voluntary collaboration, forcing their cooperation with court orders or surreptitiously stealing their
encryption keys or altering their software or hardware. According to an intelligence budget document leaked by Mr. Snowden, the N.S.A. spends
more than $250 million a year on its Sigint Enabling Project, which “actively engages the U.S. and foreign IT industries to covertly
influence and/or overtly leverage their commercial products’ designs” to make them “exploitable.” Sigint is the acronym for signals intelligence,
the technical term for electronic eavesdropping. By this year, the Sigint Enabling Project had found ways inside some of the encryption chips that
scramble information for businesses and governments, either by working with chipmakers to insert back doors or by exploiting security
flaws, according to the documents. The agency also expected to gain full unencrypted access to an unnamed major Internet phone call
and text service; to a Middle Eastern Internet service; and to the communications of three foreign governments.
Alt methods of NSA surveillance – private companies share personal information
Toren 15 -- Attorney with Weisbrod Matteis & Copley PLLC (Peter, NSA ENCRYPTION AND BIG DATA, 6/29/15,
http://www.legalcurrent.com/nsa-encryption-and-big-data/)
The revelations by Edward Snowden about the NSA’s collection of “metadata” on every phone call that is made in the U.S. has
led to concerns about whether the government should be collecting this type of information and whether there are adequate
safeguards as to when and how the government may be permitted to use the information. Putting aside the host of legal and
security issues associated with this program, most Americans are probably still not aware that the five largest tech companies –
Google, Facebook, Apple, Amazon, and Yahoo – collect information that contains far more personal details, and is available to
the government for the asking. While the exact types of data collected differs somewhat amongst these tech giants, nearly all
collect ad clicks, browser information, email addresses, IP addresses, phone numbers, search queries and more. The companies
aren’t stealing this information but are obtaining it without cost from users, who either don’t care or haven’t taken the time to read
the privacy policies which give companies free access to this information. Perhaps what is equally disturbing about the
companies’ unfettered use of the information is the very limited legal protection given to such information. The primary and most
important federal privacy law in the United States, the Stored Communications Act (SCA), was originally enacted in 1986 to
govern the privacy of computer network communications and grants Internet users a set of statutory privacy rights that limit the
government’s power to access a person’s communications and records. However, it does not cover certain things like search
queries, for example. In other words, search records, like whether a person visited a website for alcohol or drug addiction
centers, can be disclosed to the government without even a subpoena. Moreover, while standing alone, each of the types of data
may only pose a limited threat to an individual’s privacy. But by combining them, a precise, comprehensive record of a person’s
public movements come to light, reflecting a wealth of detail about familial, political, professional, religious, and sexual
associations that can be stored and mined years into the future, not only by the companies but by the government as well. For
example, if the search query for drug addiction or alcohol treatment is combined with ad clicks and phone numbers, a much more
complete profile of the user is generated which is freely available to the government. Because the information can be acquired by
the government at little or no cost, there is no monetary restraint on the information collected by the government. This can lead to
the government having access to a substantial quantum of information about any person whom the government wishes and may
alter the relationship between citizen and government in a way that is inimical to democratic. James Madison, the principal
author of the Bill of Rights, is reported to have observed, “Since the general civilization of mankind, I believe there are more
instances of the abridgement of freedom by the people by gradual and silent encroachments by those in power than by violent
and sudden usurpations.” Indeed, this data that can be freely obtained by the government at virtually no cost is just the type of
“gradual and silent encroachment” into the very details of our lives that we as a society must be vigilant to prevent. Congress
should carefully consider whether there should be limits on whether the government can obtain this information and how the
information can be obtained. It is too important an issue for the government to decide without the knowledge and consent of the
American public.
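Analyst note: the aggregation mechanism Toren describes can be sketched in a few lines (all records below are invented for illustration). Each data type is thin on its own; joined on a shared identifier, they become the kind of profile the card says the government can obtain for the asking:

```python
# Sketch of record linkage: individually weak data types become a detailed
# profile once joined on a common identifier. All records are invented.
from collections import defaultdict

search_queries = [("user_123", "alcohol addiction treatment near me")]
ad_clicks      = [("user_123", "ad: discount rehab clinics")]
phone_numbers  = [("user_123", "+1-555-0142")]
locations      = [("user_123", "clinic parking lot, Tues 9:05am")]

profile = defaultdict(list)
for source, records in [("search", search_queries), ("ads", ad_clicks),
                        ("phone", phone_numbers), ("location", locations)]:
    for user, value in records:
        profile[user].append((source, value))   # join on the shared identifier

# Any single list says little; the joined profile is revealing.
for user, details in profile.items():
    print(user)
    for source, value in details:
        print(f"  [{source}] {value}")
```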
--xt court order
Yes circumvention – court orders under the wiretap act can compel
Soghoian et al 15 (Christopher Soghoian, researcher at Harvard and Yale, Kevin Bankston, Policy Director of New America’s Open Technology
Institute, Fred Cate, C. Ben Dutton Professor of Law at Indiana University Maurer School of Law, Chris Hoofnagle, Co-Director, Berkeley Center for Law &
Technology, Marcia Hofmann, senior staff attorney at the Electronic Frontier Foundation, Rob Faris, Research Director of the Berkman Center for Internet and
Society at Harvard University, Albert Gidari, partner of Perkins Coie in Privacy & Security, Jennifer Granick, Director of Civil Liberties for the Center for Internet
and Society at Stanford Law School, Orin Kerr, professor of law at the George Washington University , Susan Landau, Professor of Social Science and Policy
Studies at Worcester Polytechnic Institute, Paul Ohm, Professor of Law at the Georgetown University Law Center, Nicole Azer, Technology & Civil Liberties
Policy Director in ACLU California, John Palfrey, previous executive director of Harvard's Berkman Center for Internet & Society, Marc Rotenberg, President and
Executive Director of the Electronic Privacy Information Center, Adam Schostack, expert in security, Ryan Singel, journalist of technology at WIRED, Adam
Thierer, senior research fellow with the Technology Policy Program at the Mercatus Center at George Mason University, Jonathan Zittrain, professor of Internet
law and the George Bemis Professor of International Law at Harvard Law School, “Privacy And Law Enforcement: Caught In The Cloud: Privacy, Encryption, And
Government Back Doors In The Web 2.0 Era”, 12/16/13,
http://www.researchgate.net/publication/228365094_Privacy_And_Law_Enforcement_Caught_In_The_Cloud_Privacy_Encryption_And_Government_Back_Door
s_In_The_Web_2.0_Era)//EM
The Wiretap Act164 regulates the collection of actual content of wire and electronic communications. The Wiretap Act was first passed as
Title III of the Omnibus Crime Control and Safe Streets Act of 1968165 and is generally known as “Title III.” Prior to the 1986 amendment by Title I of the
Electronic Communications Privacy Act (ECPA),166 it covered only wire and oral communications. Title I of the ECPA extended that coverage to electronic
communications.167 18 U.S.C. § 2518(4) states that: An order authorizing the interception of a wire, oral, or electronic communication under
this chapter shall, upon request of the applicant, direct that a provider of wire or electronic communication service, landlord, custodian or other
person shall furnish the applicant forthwith all information, facilities, and technical assistance necessary to accomplish the interception
unobtrusively and with a minimum of interference with the services that such service provider, landlord, custodian, or person is according the person whose
communications are to be intercepted.168 18 U.S.C. § 2518(4) also states that: Any provider of wire or electronic communication service, landlord, custodian or
other person furnishing such facilities or technical assistance shall be compensated therefore by the applicant for reasonable expenses incurred in providing such
facilities or assistance.169 In the car navigation case discussed earlier in this article, the court determined that the term “other person” in 18 U.S.C. §
2518(4) also includes “an individual or entity who both provides some sort of service to the target of the surveillance and is uniquely
situated to assist in intercepting communications through its facilities or technical abilities.”170 At least based on that court’s interpretation
of the law in that case, the Wiretap Act can be used to justify forcing a
service provider to create new functionality in its products
solely for the purpose of wiretapping customers. While the technical details of the FBI’s Magic Lantern/CIPAV system have yet to be revealed,
some legal experts did discuss the possible means through which the government might be able to compel anti-virus vendors to ignore or even white list the
FBI’s spyware tool. An attorney with the Electronic Frontier Foundation told one journalist that “[t]he government would be pushing the boundaries of the law if it
attempted to obtain such an order . . . . There’s simply no precedent for this sort of thing.”171 He did, however, point to the Wiretap Act as one possible source
for this coercive power, adding that “[t]here is some breadth in that language that is of concern and that the Justice Department may attempt to exploit.”172
--xt warrant
Targeted backdoors can be applied upon warrant issuance
Soghoian et al 15 (Christopher Soghoian, researcher at Harvard and Yale, Kevin Bankston, Policy Director of New America’s Open Technology
Institute, Fred Cate, C. Ben Dutton Professor of Law at Indiana University Maurer School of Law, Chris Hoofnagle, Co-Director, Berkeley Center for Law &
Technology, Marcia Hofmann, senior staff attorney at the Electronic Frontier Foundation, Rob Faris, Research Director of the Berkman Center for Internet and
Society at Harvard University, Albert Gidari, partner of Perkins Coie in Privacy & Security, Jennifer Granick, Director of Civil Liberties for the Center for Internet
and Society at Stanford Law School, Orin Kerr, professor of law at the George Washington University , Susan Landau, Professor of Social Science and Policy
Studies at Worcester Polytechnic Institute, Paul Ohm, Professor of Law at the Georgetown University Law Center, Nicole Azer, Technology & Civil Liberties
Policy Director in ACLU California, John Palfrey, previous executive director of Harvard's Berkman Center for Internet & Society, Marc Rotenberg, President and
Executive Director of the Electronic Privacy Information Center, Adam Schostack, expert in security, Ryan Singel, journalist of technology at WIRED, Adam
Thierer, senior research fellow with the Technology Policy Program at the Mercatus Center at George Mason University, Jonathan Zittrain, professor of Internet
law and the George Bemis Professor of International Law at Harvard Law School, “Privacy And Law Enforcement: Caught In The Cloud: Privacy, Encryption, And
Government Back Doors In The Web 2.0 Era”, 12/16/13,
http://www.researchgate.net/publication/228365094_Privacy_And_Law_Enforcement_Caught_In_The_Cloud_Privacy_Encryption_And_Government_Back_Door
s_In_The_Web_2.0_Era, page 418-420)//EM
The move to cloud computing makes it far easier for the government to effectively force the deployment of covert back doors. This
is due to a few key features specific to the Web 2.0 application
model: identifiable customers, automatic, silent updates, and
the complete absence of visible product releases. Updates and the Cloud One of the most useful features of the Web 2.0 paradigm, for both
provider and customer, is that users are always running the latest version of a particular web-based application. There is simply no need to coax an
update, because it is simply impossible to run anything but the latest version. The vast majority of cloud-based software runs in a
web browser. In this model, a user visits a web page, and her browser immediately downloads the programmatic code which is used to
implement the Web page’s functionality. When the user revisits that same website the next day, her web browser requests the same content again, and
then downloads it from the company’s web server.202 If the website owner has updated the code, a new version of the application will be
downloaded, without any notification to the user that the code running on her computer today is different than the day before.203
Traditional software vendors, both application and operating system, ship software with a version number. Users can, if they know how, find out which version of
Microsoft Word, Photoshop or Quicken they are running. In fact, many applications display their current version number when starting. Contrast this to the
situation for the users of cloud-based services. Google does not provide a version number for its Gmail or Docs service. Neither does
Yahoo, Facebook, or MySpace. New features might be announced, or suddenly appear; when bugs are fixed, however, the fixes are usually made quietly, with no notification to the user. If a user of Google Docs starts up her computer, connects to the Internet and
accesses her documents, she has no way of knowing if her browser is executing different code than it ran the day before. The same user
running Firefox or Microsoft Windows would have a much better chance of knowing this, and in most cases, of declining to perform an update if one was made
available. Finally, most cloud providers know a significant amount more about their customers than traditional software companies.
Unless a customer has given a false name, email providers and social networking companies know who their customers are as
well as the names and contact information for their friends. As a result, if law enforcement agencies serve a subpoena in order to
obtain the files for a specific customer, most cloud computing providers know exactly which account to target. This shift in the
effectiveness of software updates and the ease of customer identification significantly weakens the ability of cloud providers to protect their customers’ privacy
with encryption. While Google could add encryption to its Docs application,
the company could just as easily be forced to add a
back door into the browser code which would steal the user’s key. As I have just explained, this would be automatically downloaded and executed
the next time that the user logged in, with no way for her to avoid the update, or even know that it was applied. Furthermore, because of the fact that Google
typically knows which particular user account an individual is using, it can issue the backdoor-laced update to only that user. Essentially, cloud
computing makes it far easier for companies to force out covert backdoors with surgical precision to only those persons who the government has targeted.
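The targeting mechanism described in this card can be sketched in a few lines. The following Python sketch is purely illustrative and is not drawn from the cited evidence or from any provider's actual code; the bundle names and account address are hypothetical. It shows how an application that re-serves its code on every visit could hand a modified script to one flagged account while every other user receives the normal version.

# Hypothetical sketch of per-account bundle selection in a cloud service.
# All names are invented for illustration; no real provider's code is shown.

NORMAL_BUNDLE = "app.js"                        # code every user normally receives
BACKDOORED_BUNDLE = "app_with_key_logger.js"    # variant that exfiltrates the user's key

TARGETED_ACCOUNTS = {"suspect@example.com"}     # hypothetically set after a court order

def select_bundle(account_email: str) -> str:
    """Pick which script bundle to serve for this login.

    Because the browser re-downloads the application code on every visit,
    the targeted user silently runs the modified version the next time she
    logs in; there is no version number or update prompt to warn her.
    """
    if account_email in TARGETED_ACCOUNTS:
        return BACKDOORED_BUNDLE
    return NORMAL_BUNDLE

if __name__ == "__main__":
    print(select_bundle("alice@example.com"))    # -> app.js
    print(select_bundle("suspect@example.com"))  # -> app_with_key_logger.js

Because nothing like a version number is exposed to the user, the flagged account has no way to notice the swap, which is the card's point about covert, surgically targeted delivery.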
--a2 unbreakable
Our evidence assumes unbreakable encryption – circumvention targets the human element
Simonite 13 (Tom, Editor of MIT Technology Review’s and graduate of Cambridge and Imperial London College, “The NSA
Hasn’t “Cracked” Encryption—It’s Just Reminded Us of the Ways Around It”,
http://www.technologyreview.com/news/519171/nsa-leak-leaves-crypto-math-intact-but-highlights-known-workarounds/)BJ
However, cryptography experts tell MIT Technology Review that a close reading of last week’s report suggests the NSA has not broken the underlying
mathematical operations that are used to cloak online banking or e-mail. Instead, the agency appears to rely on a variety of attacks on
the software used to deploy those cryptographic algorithms and the humans and organizations using that software. Those strategies,
revealed in documents leaked by Edward Snowden, came as no surprise to computer security researchers, given that the NSA’s mission includes the pursuit of
America’s most technologically capable enemies. “The whole leak has been an exercise in `I told you so,’ ” says Stephen Weis, CEO of server encryption
company PrivateCore. Weis previously worked on implementing cryptography at Google. “There doesn’t seem to be any kind of groundbreaking algorithmic
breakthrough,” he says, “but they are able to go after implementations and the human aspects of these systems.” Those tactics apparently
include using legal tools or hacking to get the digital keys used to encrypt data; using brute computing power to break weak encryption; and
forcing companies to help the agency get around security systems.
Humans are the weak point – circumvention inevitable
Bradley 13
Tony Bradley (Tony is principal analyst with the Bradley Strategy Group, providing analysis and insight on tech trends. He is a
prolific writer on a range of technology topics, has authored a number of books, and is a frequent speaker at industry events.
http://www.pcworld.com/article/2042542/encryption-can-t-protect-your-data-while-you-re-logged-in.html)
You carry a lot of data and sensitive information on your laptop, tablet, and smartphone. The standard method of
protecting that information from prying eyes is to encrypt it, rendering the data inaccessible. But with most encryption
software, that information becomes accessible the moment you log in to the device as a matter of convenience. Think
about what information that might be: names, postal and email addresses, and phone numbers for friends, family,
clients, and business associates; calendar events indicating where you’ll be and when you’ll be there; personal
photographs; and more. You might also have proprietary information about your company, clients, information that
companies have entrusted you under the terms of non-disclosure agreements, and other sensitive information that should be
secured. Encrypting data protects it from unauthorized access. Encryption basically scrambles the data so it’s nothing but
unusable gibberish to anyone who isn’t authorized to access or view it. And that’s great, but ask yourself this: How many
steps must you go through to decrypt your data? Encryption is designed to protect data, but it should also be
seamlessly accessible to the user—it should automatically decrypt, so you don’t have to jump through hoops to use
your own encrypted data. And that means it’s not protected at all if someone finds your laptop, smartphone, or
tablet in a state that doesn’t require a log-in password.
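Bradley's point is easy to see in miniature. The sketch below assumes the third-party Python "cryptography" package is installed; the data, key handling, and variable names are invented for illustration rather than taken from any vendor's implementation. Encryption at rest protects the ciphertext while the device is locked, but once login has released the key into memory, whoever holds the unlocked device can read everything.

# Illustrative sketch only: why device encryption stops protecting data
# once the user is logged in. Requires the third-party "cryptography"
# package; all values are hypothetical.
from cryptography.fernet import Fernet

# At setup, the device encrypts the user's data under a key that is
# unlocked by the login passcode.
disk_key = Fernet.generate_key()
encrypted_contacts = Fernet(disk_key).encrypt(b"Alice: 555-0100, Bob: 555-0199")

# While the device is locked, the ciphertext alone reveals nothing useful.
print(encrypted_contacts[:24], b"...")

# After login, the key is cached in memory so apps can read data seamlessly,
# which means anyone who picks up the logged-in device can read it too.
session_key_in_memory = disk_key
print(Fernet(session_key_in_memory).decrypt(encrypted_contacts))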
Encryption is unbreakable but you can get around it
Laskowski 15 – Senior News Writer at TechTarget, English education from Purdue University and a master’s degree in literary
nonfiction from the University of Oregon. 10 years of writing. Cites Edward Snowden – NSA whistleblower, and Bruce Schneier –
Expert in Security Technology. (“Snowden: Data encryption is good, but not good enough”, Nicole Laskowski, January 23, 2015,
TechTarget, http://searchcio.techtarget.com/opinion/Snowden-Data-encryption-is-good-but-not-good-enough)//chiragjain
One of the big revelations to come out of the National Security Agency (NSA) documents
leaked in 2013 by
Edward Snowden didn't have to do with what the NSA was doing with our data. Instead, it had to do with what
the NSA couldn't do: Namely, the agency couldn't break cryptography. "Properly
implemented encryption does work," Snowden said last week at Harvard University's fourth annual
Symposium on the Future of Computation in Science and Engineering. Snowden, the former systems administrator-turned-whistleblower who has taken refuge in Moscow, was beamed into the Cambridge auditorium via Google Hangouts, for a
conversation with security technology expert Bruce Schneier, a fellow at Harvard's Berkman Center for Internet and Society.
Schneier said it was surprising to learn that the government doesn't seem to have a secret sauce or advanced technology such as
quantum computers to break encryption. "Ten to 20 years ago, we would assume that we, in the academic world, were a decade
behind the NSA and other countries," Schneier said. "That seems like that might not be true." When
Snowden and
Schneier refer to "properly implemented encryption," they're referring to open source
encryption tools such as Tor (an anonymity network), OTR (an instant messaging tool) and PGP (data encryption software),
and not to what Snowden called "homebrewed," "boutique" or closed-source cryptography or even to
hardware implementations of cryptography, which he said have successfully been
broken. Yet despite sound encryption, data is still at risk. "When they do attack, it's
typically through some kind of weakness, some sort of shortcut that reduces the
resistance," Snowden said. He pointed to the ongoing Silk Road trial as a timely case in point. Ross Ulbricht, the
alleged mastermind behind the online drug market, used PGP to encrypt personal
documents. "He had fully, irresistibly encrypted material. Yet just yesterday in court,
[members of the prosecution] were reading out encrypted diary entries to a room full of
reporters," Snowden said. "Encryption is not fool proof." Prosecutors didn't break the
encryption; instead, they found a way around PGP, Snowden said, by pulling a key off of Ulbricht's laptop.
"The way everyone gets around cryptography is by getting around cryptography," Schneier
said. The weakness in encryption, in other words, isn't the algorithms and it isn't data in transit; it's
everything else, Schneier said. "What we really have to worry about is the rest of everything -- so the bad
implementations, the weak keys, any kind of back doors being inserted in the software," he said.
That includes weaknesses commonly found at the endpoints. Tools from surveillance manufacturers such as
Hacking Team and Remote Control System sell products to third-world countries to
perform NSA-like activities at a smaller scale. Activities include hacking into computers and
reading encrypted traffic after it's been decrypted or covertly recording passwords through keystroke logging, Schneier
said. And it includes how encryption keys are stored, Snowden said. "One of the real dangers of the current
security model at scale for defenders is the aggregation of key material," he said. "If you
have a centralized database of keys, that is a massive target." If attackers can't access that material
remotely, they could very well send someone to get hired into your organization to develop that access, Snowden said. "We've
got to focus on end points, we've got to focus on the keys [and make them] more
defensible," he said.
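Snowden's warning about aggregated key material can also be sketched directly. The example below again assumes the third-party Python "cryptography" package; the accounts and the key store are hypothetical. The encryption math is never attacked; compromising the centralized key database is enough to read every account.

# Illustrative sketch of key aggregation as a single point of failure.
# Requires the third-party "cryptography" package; names are invented.
from cryptography.fernet import Fernet

# A hypothetical provider-side key database: the "massive target."
key_store = {
    "alice": Fernet.generate_key(),
    "bob": Fernet.generate_key(),
}

# Each user's data is strongly encrypted under her own key.
ciphertexts = {
    user: Fernet(key).encrypt(f"{user}'s private documents".encode())
    for user, key in key_store.items()
}

# An attacker or insider who copies the key store never breaks the cipher;
# every account simply decrypts.
stolen_keys = dict(key_store)
for user, token in ciphertexts.items():
    print(user, Fernet(stolen_keys[user]).decrypt(token))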
Encryption is nearly useless – multiple warrants
Wayner 14 – Contributing Editor at InfoWorld; author of more than 16 books in open source software, and privacy-enhanced
communication. (“11 reasons encryption is (almost) dead”, Peter Wayner, InfoWorld, May 5, 2014,
http://www.infoworld.com/article/2607386/encryption/11-reasons-encryption-is--almost--dead.html)//chiragjain
Encryption is under assault more than ever -- and from more directions than previously
thought. This doesn't mean you should forgo securing sensitive data, but forewarned is forearmed. It's impossible to
secure the entire stack and chain. Here are 11 reasons encryption is no longer all it's cracked up to be. Encryption's
weak link No. 1: No proofs -- just an algorithm arms race The math at the heart of encryption
looks impressive, with lots of superscripts and subscripts, but it doesn't come with any
hard and fast proofs. One of the most famous algorithms, RSA, is said to be secure -- as long as it's hard to factor large
numbers. That sounds impressive, but it simply shifts the responsibility. Is it truly that hard to factor large numbers? Well, there's
no proof that it's hard, but no one knows how to do it right all of the time. If someone figures out a fast algorithm, RSA could be
cracked open like an egg, but that hasn't happened yet ... we think. Encryption's weak link No. 2: Disclosure is the only means
of detecting a crack Suppose you figured out how to factor large numbers and crack RSA encryption.
Would you tell the world? Perhaps. It would certainly make you famous. You might get appointed a
professor at a fancy college. You might even land a cameo on "The Big Bang Theory." But the encryption-cracking business can be shady. It isn't hard to imagine that it attracts a higher share of individuals or
organizations that might want to keep their newfound power secret and use it to make money or extract
valuable information. Many of our assumptions about the security of cryptography are based on the belief
that people will share all of their knowledge of vulnerabilities -- but there is no guarantee anyone will do
this. The spy agencies, for instance, routinely keep their knowledge to themselves. And rumors circulate about
an amazing cryptographic breakthrough in 2010 that's still classified. Why should the rest of us act any differently? Encryption's
weak link No. 3: The
chain is long and never perfect There are a number of excellent
mathematical proofs about the security of this system or that system. They offer plenty of insight
about one particular facet, but they say little about the entire chain. People like to use phrases like "perfect forward security" to
describe a mechanism that changes the keys frequently enough to prevent leaks from spreading. But for all of its perfection, the
proof covers only one part of the chain.
A failure in the algorithm or a glitch in the software can
circumvent all this perfection. It takes plenty of education to keep this straight. Encryption's
weak link No. 4: Cloud computing power is cheap and massive Some descriptions of
algorithms like to make claims that it would take "millions of hours" to try all the
possible passwords. That sounds like an incredibly long time until you realize that Amazon alone may have half a million
computers for rent by the hour. Some botnets may have more than a million nodes. Big numbers aren't so
impressive these days. Encryption's weak link No. 5: Video cards bring easy parallelism to
cracking The same hardware that can chew through millions of triangles can also try
millions of passwords even faster. GPUs are incredible parallel computers, and they're cheaper than ever. If you
need to rent a rack, Amazon rents them too by the hour too. Encryption's weak link No. 6: Hypervisors -- the scourge
of the hypervigilant You've downloaded the most secure distro, you've applied all the
updates, you've cleaned out all the cruft, and you've turned off all the weird background
processes. Congratulations, you're getting closer to having a secure server. But let's say you're still obsessed and you audit every
single last line of code yourself. To be extra careful, you even audit the code of the compiler to make sure it isn't slipping in a backdoor. It would be an impressive stunt, but it wouldn't matter much. Once you have your superclean, completely audited pile of code running in a cloud, the hypervisor in the background could do anything it wanted to your code or your memory -- so could the BIOS. Oh well. Encryption's weak link No. 7: Hidden layers abound The hypervisor and the BIOS are only a few of the most obvious layers hidden away. Practically every device has firmware -- which can be remarkably porous. It's rarely touched by outsiders, so it's rarely hardened. One research "hardware backdoor" called Rakshasa can infect the BIOS and sneak into the firmware of PCI-based network cards and CD drivers. Even if your encryption is solid and your OS is uninfected, your network card could be betraying you. Your network card can think for itself! It will be a bit harder for the network card to reach into the main memory, but stranger things have happened. These hidden layers are in every machine, usually out of sight and long forgotten. But they can do amazing things with their access. Encryption's weak link No. 8: Backdoors aplenty Sometimes programmers make mistakes. They forget to check the size of an input, or they skip clearing the memory before releasing it. It could be anything. Eventually, someone finds the hole and starts exploiting it. Some of the most forward-thinking companies release a steady stream of fixes that never seems to
end, and they should be commended. But the relentless surge of security patches suggests there won't be an end anytime soon. By
the time you've finished reading this, there are probably two new patches for you to install. Any of these holes could compromise
your encryption. It could patch the file and turn the algorithm into mush. Or it could leak the key through some other path. There's
no end to the malice that can be caused by a backdoor. Encryption's weak link No. 9: Bad
random-number
generators Most of the hype around encryption focuses on the strength of the encryption
algorithm, but this usually blips over the fact that the key-selection algorithm is just as important. Your
encryption can be superstrong, but if the eavesdropper can guess the key, it won't
matter. This is important because many encryption routines need a trustworthy source of random numbers to help pick the key.
Some attackers will simply substitute their own random-number generator and use it to undermine the key choice. The algorithm
remains strong, but the keys are easy to guess by anyone who knows the way the random-number generator was compromised.
Encryption's weak link No. 10: Typos
One of the beauties of open source software is that it can
uncover bugs -- maybe not all of the time but some of the time. Apple's iOS, for instance, had an extra
line in its code: goto fail. Every time the code wanted to check a certificate to make sure it was accurate, the code would hit the goto
statement and skip it all. Oops. Was it a mistake? Was it put there on purpose? We'll never know. But it sure took a long time for the
wonderful "many eyes" of the open source community to find it. Encryption's weak link No. 11: Certificates
can be faked
Let's say you go to PeteMail.com with an encrypted email connection, and to be extra
careful, you click through to check out the certificate. After a bit of scrutiny, you discover
it says it was issued by the certificate authority Alpha to PeteMail.com and it's all legit. You're clear, right?
Wrong. What if PeteMail.com got its real SSL certificate from a different certificate authority -- say, Beta. The certificate from Alpha
may also be real, but Alpha just made a certificate for PeteMail.com and gave it to the eavesdropper to make the connection easier to
bug. Man-in-the-middle attacks are easier if the man in the middle can lie about his identity. There
are hundreds of
certificate authorities, and any one of them can issue certs for SSL. This isn't a hypothetical worry.
There are hundreds of certificate authorities around the world, and some are under the control of the local governments. Will they
just create any old certificate for someone? Why don't you ask them?
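Wayner's ninth weak link, bad random-number generation, lends itself to a short worked example. The sketch below assumes the third-party Python "cryptography" package; the tiny seed space and the message are invented. The cipher is never broken; the attacker simply replays a predictable key-selection step until the ciphertext opens.

# Illustrative sketch: a strong cipher undermined by a weak key-selection
# step. Requires the third-party "cryptography" package; values invented.
import base64
import random

from cryptography.fernet import Fernet, InvalidToken

def key_from_seed(seed: int) -> bytes:
    """Derive a Fernet key from a (badly chosen) small integer seed."""
    rng = random.Random(seed)                        # deterministic PRNG
    raw = bytes(rng.randrange(256) for _ in range(32))
    return base64.urlsafe_b64encode(raw)

# The victim's key comes from a tiny seed space (say, a counter or timestamp).
secret_seed = 48731
token = Fernet(key_from_seed(secret_seed)).encrypt(b"wire the funds at 9am")

# The eavesdropper tries every plausible seed instead of attacking the math.
for guess in range(100000):
    try:
        plaintext = Fernet(key_from_seed(guess)).decrypt(token)
        print("key recovered with seed", guess, "->", plaintext)
        break
    except InvalidToken:                             # wrong key, keep guessing
        continue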
DA – TERRORISM
--xt link
Strong encryption greatly increases chance of successful terror attack
RT 15 (RT, “Apple, Google helping terrorists with encryption- Manhattan DA” 04/21/15, http://www.rt.com/usa/251469-applegoogle-encryption-terrorists/)
Allowing users to take advantage of advanced
encryption in order to keep their messages and mobile
communication out of the government’s hands will only help terrorists plot future attacks, a top New York law
enforcement official said. The new encryption services offered by Apple and Google will make it harder to protect New Yorkers,
Manhattan District Attorney Cyrus Vance Jr. told local AM970 radio host John Cats. He mentioned built-in
encryption – which Apple claims its own engineers cannot break – means that federal and local law enforcement bodies won’t be able to intercept communications between potential criminals and terrorists, even if they acquire a warrant. When Cats suggested, “terrorists are running out to buy iPhones,” Vance responded by saying he was “absolutely right.” “If individuals who are seeking to do serious harm to our citizenry know they have a device that they can use with impunity and that the contents of their messages and images on their phones cannot be accessed by law enforcement, that’s going to be the terrorists’ community device of choice,” he added, according to the Daily Dot. In addition to Apple, Google is also incorporating encryption into its mobile devices. The two tech giants’ smartphones
comprise 96 percent of the global market, the New York Post mentions. “Apple
has created a phone that is dark, that
cannot be accessed by law enforcement even when a court has authorized us to look at its contents,”
Vance said. In response, Vance wants police departments around the country to register their opposition with politicians and for
hearings on the issue to take place. On its website, Apple says that encryption is enabled “end-to-end” on its devices and that it has
“no way to decrypt iMessage and FaceTime data when it’s in transit between devices.” Additionally, the company states, “We
wouldn’t be able to comply with a wiretap order even if we wanted to.” Other features such as iCloud and Mail also
offer some encryption protections. READ MORE: FBI director lashes out at Apple, Google for encrypting smartphones Vance isn’t the
only law enforcement official to come out against widespread encryption. In October, New York Police Department Commissioner
Bill Bratton heavily criticized Apple and Google for the move, and FBI Director James Comey also blasted the development. "There
will come a day -- well it comes every day in this business -- when it will matter a great, great deal to the lives of
people of all kinds that we be able to with judicial authorization gain access to a kidnapper's or a terrorist
or a criminal's device,” Comey said. “I just want to make sure we have a good conversation in this country before that day
comes.” In a blog post at the Wall Street Journal, Amy Hess of the FBI clarified the bureau’s position on the issue, which has seen a
surge in support since former government contractor Edward Snowden revealed a massive domestic and international surveillance
operation. She said law
enforcement officials will need “some degree of access” to encrypted messages in
order to stop criminal and violent plots in the future. “No one in this country should be beyond the law,”
she wrote. “The notion that electronic devices and communications could never be unlocked or unencrypted – even when a judge
has decided that the public interest requires accessing this data to find evidence — is troubling. It may be time to ask: Is that a cost
we, as a society, are prepared to pay?”
Encryption decks counter-terror effectiveness
Hess 15 (Amy Hess, Executive Assistant Director Federal Bureau of Investigation, Before the
Subcommittee on Information Technology Oversight and Government Reform U.S. House of
Representatives Concerning Encryption and Cybersecurity for Mobile Electronic Communication Devices,
page 6-7, April 29, 2015.)\\mwang
Examples
The more we as a society rely on electronic devices to communicate and store information, the more likely it is that evidence
that was once found in filing cabinets, letters, and photo albums will now be available only in electronic storage. We
have seen case after case – from homicides and kidnappings, to drug trafficking, financial fraud, and child
exploitation – where critical evidence came from smart phones, computers, and online communications. Each of
the following examples demonstrates how important information stored on electronic devices can be to prosecuting
criminals and stopping crime. As encryption solutions become increasingly inaccessible for law enforcement, it is
cases like these that could go unsolved, and criminals like these that could go free. Another investigation in Clark
County, Nevada, centered on allegations that a woman and her boyfriend conspired together to kill the woman’s
father who died after being stabbed approximately 30 times. Text messages which had been deleted from the phone and
recovered by investigators revealed the couple’s plans in detail, clearly showing premeditation. Additionally, the
communications around the time of the killing proved that both of them were involved throughout the process and
during the entire event, resulting in both being charged with murder and conspiracy to commit murder. Following a joint investigation
conducted by the FBI and Indiana State Police, a pastor pleaded guilty in Federal court to transporting a minor
across state lines with intent to engage in illicit sexual conduct in connection with his sexual relationship with an underage girl
who was a student at the church’s high school. During this investigation, information recovered from the pastor’s smart phone
proved to be crucial in showing the actions taken by the pastor in the commission of his crimes. Using forensic software,
investigators identified Wi-Fi locations, dates, and times when the pastor traveled out of state to be with the victim.
The analysis uncovered Internet searches including, “What is the legal age of consent in Indiana”, “What is the legal age of consent in
Michigan”, and “Penalty for sexting Indiana.” In addition, image files were located which depicted him in compromising positions with the
victim. These are examples of how important evidence that resides on smart phones and other devices can be to
law enforcement – evidence that might not have been available to us had strong encryption been in place on those
devices and the user’s consent not granted. The above examples serve to show how critical electronic evidence has become
in the course of our investigations and how timely, reliable access to it is imperative to ensuring public safety. Today’s encryption
methods are increasingly more sophisticated, and pose an even greater challenge to law enforcement . We are
seeing more and more cases where we believe significant evidence resides on a phone, a tablet, or a laptop –
evidence that may be the difference between an offender being convicted or acquitted – but we cannot access it.
Previously, a company that manufactured a communications device could assist law enforcement in unlocking the
device. Today, however, upon receipt of a lawful court order, the company might only be able to provide
information that was backed up in the cloud – and there is no guarantee such a backup exists, that the data is current, or that it
would be relevant to the investigation. If this becomes the norm, it will be increasingly difficult for us to investigate and
prevent crime and terrorist threats.
Encryption is getting stronger—cloaks terrorists.
Hess 15 (Amy Hess, Executive Assistant Director Federal Bureau of Investigation, Before the
Subcommittee on Information Technology Oversight and Government Reform U.S. House of
Representatives Concerning Encryption and Cybersecurity for Mobile Electronic Communication Devices,
page 4-5, April 29, 2015.)\\mwang
Court-Ordered Access to Stored Encrypted Data Encryption of stored
data is not new, but it has become increasingly prevalent
and sophisticated. The challenge to law enforcement and national security officials has intensified with the advent
of default encryption settings and stronger encryption standards on both devices and networks. In the past, a consumer
had to decide whether to encrypt data stored on his or her device and take some action to implement that encryption. With today’s new
operating systems, however, a device and all of a user’s information on that device can be encrypted by default –
without any affirmative action by the consumer. In the past, companies had the ability to decrypt devices when the Government
obtained a search warrant and a court order. Today, companies have developed encryption technology which makes it
impossible for them to decrypt data on devices they manufacture and sell, even when lawfully ordered to do so.
Although there are strong and appropriate cybersecurity and other reasons to support these new uses of encryption, such decisions
regarding system design have a tremendous impact on law enforcement’s ability to fight crime and bring
perpetrators to justice. Evidence of criminal activity used to be found in written ledgers, boxes, drawers, and file cabinets, all of which
could be searched pursuant to a warrant. But like the general population, criminal actors are increasingly storing such
information on electronic devices. If these devices are automatically encrypted, the information they contain may
be unreadable to anyone other than the user of the device. Obtaining a search warrant for photos, videos, email,
text messages, and documents can be an exercise in futility. Terrorists and other criminals know this and will
increasingly count on these means of evading detection. Additional Considerations Some assert that although more
and more devices are encrypted, users back-up and store much of their data in “the cloud,” and law enforcement
agencies can access this data pursuant to court order. For several reasons, however, the data may not be there. First,
aside from the technical requirements and settings needed to successfully back up data to the cloud, many companies impose fees to
store information there – fees which consumers may be unwilling to pay. Second, criminals can easily avoid putting
information where it may be accessible to law enforcement. Third, data backed up to the cloud typically includes only a portion
of the data stored on a device, so key pieces of evidence may reside only on a criminal’s or terrorist’s phone, for example. And if criminals do not back up their phones routinely, or if they opt out of uploading to the cloud altogether, the data may only be found on the devices themselves – devices which are increasingly encrypted.
Strong encryption facilitates terrorist recruitments and plots
Ybarra 15 (Maggie Ybarra, military affairs and Pentagon correspondent for the Washington Times, Washington Times, “FBI
director James Comey flags dangers of encryption services”, 07/7/15, http://www.washingtontimes.com/news/2015/jul/7/fbiencryption-fosters-furtive-terrorism/)
FBI Director James B. Comey will be arguing for a robust debate on message-encryption technology Wednesday, as he takes to Capitol Hill to plead his case to lawmakers that terrorist groups such as the Islamic State could take advantage of such technology to recruit Americans into their organization. The technology, commonly
referred to as “going dark,” allows people to send messages to one another that cannot be traced by the government. Google
has reported about 80 percent of its Gmail messages to other addresses in the last month were encrypted, and Apple has said
it uses encryption on its iMessage and FaceTime tools which is so secure that even the company can’t read or decode the
communications. But for all the good encryption services provide — protecting innovation, private thoughts and other things
of value — the technology can also be used for nefarious purposes, Mr. Comey wrote in a blog posting Monday. “There
is
simply no doubt that bad people can communicate with impunity in a world of universal strong
encryption,” Mr. Comey wrote. The Senate Judiciary Committee is prepared to hear Mr. Comey’s testimony about the
technology, along with the testimony of Sally Quillian Yates, the deputy attorney general at the Department of Justice. “Today’s
hearing is intended to start a conversation in the Senate about whether recent
technological changes have upset the
balance between public safety and privacy,” Sen. Chuck Grassley, Iowa Republican and chairman of the panel, said in prepared
remarks. “In particular, Director Comey has talked about the challenges this issue presents the FBI in the national security
context. According to the Director,
ISIS is recruiting Americans on-line and then directing them to encrypted
communication platforms that are beyond the FBI’s ability to monitor, even with a court order. If this is
accurate, it obviously represents a dangerous state of affairs.” Despite the danger, a group of computer scientists and security
experts are trying to counter Mr. Comey’s message by defending the need for encrypted technology. The same day that FBI
director made a rare social media effort to flag the dangers of “going dark,” the Computer Science and Artificial Intelligence
Laboratory released a 34-page technical report that advocates against providing federal authorities access to encrypted
conversations. “We have found that the damage that could be caused by law enforcement exceptional access requirements
would be even greater today than it would have been 20 years ago,” the report states. “In the wake of the growing economic
and social cost of the fundamental insecurity of today’s Internet environment, any proposals that alter the security dynamics
online should be approached with caution.” President Obama
has been trying to ease the concerns of Mr. Comey and the
other heads of U.S. government intelligence agencies by searching for a middle ground solution that protects the privacy of
U.S. citizens while providing federal agencies with the tools they need to track down and halt potential terrorist threats. Mr.
Obama said during a joint January press conference with British Prime Minister David Cameron that his administration has
been communicating with companies about how to provide agencies with legal access to conversations that might be taking
place via technologies that are constantly evolving. “If
we get into a situation in which the technologies do not
allow us at all to track somebody that we’re confident is a terrorist, if we … have specific information, we are
confident that this individual or this network is about to activate a plot and, despite knowing that information, despite having a
phone number or despite having a social media address or an e-mail address, that we can’t penetrate that,
that’s a
problem,” he said. The solution to that problem will likely be complicated and involve consideration of legislation, regulation,
cooperation among lawmakers and with private companies, Mr. Comey said during a June 18 press conference at the
Department of Justice. “The companies that are providing communication services don’t want folks killed by people using their
platforms,” he said. “So we’re having good conversations with them. I’m sure a big part of it’s going to be international
cooperation.”
Backdoor searches protect privacy and are key to law enforcement
Vance 7/8 <Cyrus R., New York District Attorney, 7/8/15, “Going Dark: Encryption, Technology, and the Balance Between
Public Safety and Privacy”, p.2-3, http://www.judiciary.senate.gov/meetings/going-dark-encryption-technology-and-the-balancebetween-public-safety-and-privacy >//wx
As you know, the Fourth Amendment of the United States Constitution authorizes reasonable searches and seizures, providing law enforcement
agencies access to places where criminals hide evidence of their crimes – from car trunks, to storage facilities, to computers, mobile devices, and digital
networks. In order to safeguard Fourth Amendment rights, these searches are conducted pursuant to judicial warrants, issued upon a neutral
judge’s finding of probable cause. The probable cause standard represents a balance between privacy and public safety carefully
calibrated by centuries of jurisprudence, and it guides individuals and companies in developing their expectations of privacy. Through this judicial process, my
Office obtains smartphone evidence to support all types of cases – homicides, sex crimes, child abuse, fraud, assaults, robberies, cybercrime, and identity theft.
Many perpetrators, particularly those who commit sexual offenses, take photos and videos of their acts, and store them on computers and
smartphones. Between October 2014 and June 2015, 35 percent of the data extracted from all phones by my Office was collected from Apple devices; 36
percent was collected from Android devices.2 That means that when smartphone encryption is fully deployed by Apple and Google, 71
percent of all mobile devices examined—at least by my Office’s lab—may be outside the reach of a search warrant. I want to emphasize I
am testifying from a state and local perspective. I am not advocating bulk data collection or unauthorized surveillance. Instead, I am concerned about
protecting local law enforcement’s ability to conduct targeted requests for information, scrutinized by an impartial judge for his or her
evaluation as to whether probable cause has been established. Importantly, and by Apple’s own admission, governmental requests for information have
affected only .00571 percent of Apple’s customers.
Strong encryption decks law enforcement abilities – can’t obtain any data
Vance 7/8 <Cyrus R., New York District Attorney, 7/8/15, “Going Dark: Encryption, Technology, and the Balance Between
Public Safety and Privacy”, p.3-5, http://www.judiciary.senate.gov/imo/media/doc/07-08-15%20Vance%20Testimony.pdf>//wx
Last fall, Apple and Google, whose operating systems run 96 percent of smartphones worldwide, announced with some fanfare, but without notice to my
Office or other law enforcement offices I have spoken to, that they had engineered their new mobile operating systems such that they can no
longer assist law enforcement with search warrants written for passcode- protected smartphones. According to Apple’s website: On
devices running iOS 8.0 and later versions, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes
content, notes, and reminders is placed under the protection of your passcode. . . Apple will not perform iOS data extractions in response to
government search warrants because the files to be extracted are protected by an encryption key that is tied to the user’s passcode,
which Apple does not possess. [Emphasis added.]5 Apple’s announcement led to an immediate response by law enforcement officials who pointed out that
allowing a phone or tablet to be locked such that it would be beyond the reach of lawful searches and seizures was unprecedented and
posed a threat to law enforcement efforts – in effect, a boon to criminals. Unless law enforcement officials can obtain the passcode
from the user, which will be difficult or impossible in many cases, or can use “brute force” to obtain the passcode (again, difficult or impossible, and
attempts to do this would likely lead to the destruction of evidence on the iPhone), the search warrant would be of no consequence, because no one will be
able to unlock the phone, notwithstanding the court order. Law enforcement’s warnings are hardly idle. Recently, a father of six was
murdered in Evanston, Illinois. City of Evanston Police believe that prior to his murder, the victim was robbed of a large sum of money. There were no
eyewitnesses to or surveillance footage of the killing. Found alongside the body of the deceased were an iPhone 6 and a Samsung Galaxy S6
Edge running Google Android. Cook County prosecutors served Apple and Google with judicial warrants to unlock the phones, believing that
relevant evidence might be stored on them. Apple and Google replied, in substance, that they could not, because they did not know the user’s passcode.
Information that might be crucial to solving the murder, therefore, had effectively died with the victim. His homicide remains
unsolved. His killer remains at large. It is not hyperbole to say that beginning in September 2014, Americans conceded a measure of their
protection against everyday crimes to Apple and Google’s new encryption policies. Yet, I would note that, before the changes, neither company, to
our knowledge, ever suggested that their encryption keys, held by the companies, were vulnerable to hacking or theft. Fully one-quarter of our felony
cases now involve cybercrime or identity theft, so I am keenly aware of the dangers and impact of these crimes on our community (which happens to be
situated in a world financial center and is the number one target for terrorism in the world). Because of this, my Office has invested heavily in becoming highly
proficient and active in the prosecution of these crimes, and in the promotion of best cybersecurity practices for New York consumers and companies. From my
vantage point, and in my opinion, for reasons set forth later in my testimony, Apple and Google’s new encryption policies seem to increase
protection for consumers from hackers only minimally, if at all. But those policies create serious new risks for my constituents and the
millions of visitors and workers passing through Manhattan every day.
Access to smartphone data is key to law enforcement – numerous cases prove
Vance 7/8 <Cyrus R., New York District Attorney, 7/8/15, “Going Dark: Encryption, Technology, and the Balance Between
Public Safety and Privacy”, p.3-5, http://www.judiciary.senate.gov/imo/media/doc/07-08-15%20Vance%20Testimony.pdf>//wx
The Cost of Evidence Made Inaccessible Through Apple’s Encryption Although encryption has been often discussed in the context of international terrorism, the
NSA, and the CIA, the greatest cost of these new encryption policies may well be borne by local law enforcement. Smartphones are ubiquitous, and there is almost no kind of case in which prosecutors have not used evidence from smartphones. My Office (and, I expect,
every other local prosecutor’s office) has used evidence from cellphones in homicides, rape cases, human trafficking, assaults, domestic violence cases,
narcotics cases, kidnappings, larcenies, frauds, identity theft, cybercrime, and robberies. Indeed, it is the rare case in which information from a
smartphone is not useful. The following list of recent cases is representative: • Homicide: People v. Hayes, Indictment Number 4451/12:
The victim was filming a video using his iPhone when he was shot and killed by the defendant. The video captured the shooting. Because the iPhone was not
locked, the video was recovered and admitted into evidence at trial. The video corroborated eyewitness testimony. The defendant was convicted of murder and
sentenced to 35 years to life. • Sex Trafficking: People v. Brown, Indictment Numbers 865/12, 3908/12, and 3338/13: The defendant directed a sex
trafficking operation involving at least four women, using physical violence, threats of force, and psychological manipulation to coerce the women to engage in
prostitution. Evidence recovered from electronic devices lawfully seized from the defendant’s home proved crucial to his conviction at trial. In particular, the
defendant’s cellular phones contained photographs showing him posing his victims for online prostitution advertisements, and showing that he had “branded”
multiple women, with his nickname tattooed onto their bodies; text messages between him and several victims confirmed that he had engaged in acts of
violence against the testifying witness and others. The defendant was convicted of multiple counts of sex trafficking and promoting prostitution and was
sentenced to 10-20 years in prison. • Cybercrime and Identity Theft: People v. Jacas et al., Indictment Number 42/12 and People v. Brahms et al.,
Indictment Number 5151/11: This case involved the successful prosecution of a 29-member identity theft ring, which was able to be investigated and prosecuted,
in large part, because of evidence obtained early in the investigation from an iPhone, pursuant to a search warrant. An iPhone was recovered from a waiter who
was arrested for stealing more than 20 customers’ credit card numbers by surreptitiously swiping those credit cards through a card reader that stored the credit
card number and other data. When the phone was lawfully searched, law enforcement officials discovered text messages between members of the group
regarding the ring’s crimes. Investigators were able to obtain an eavesdropping warrant, and ultimately arrested 29 people, including employees of high-end
restaurants who stole credit card numbers, shoppers who made purchases using counterfeit credit cards containing the stolen credit card numbers, and
managers who oversaw the operation. The group compromised over 100 American Express credit card numbers and stole property worth over $1,000,000. All of
the defendants pleaded guilty, and more than $1,000,000 in cash and merchandise were seized and forfeited. • Sex Offenses: United States v. Juarez,
Case No. 12-CR-59: The defendant was arrested for unlawful surveillance by an NYPD officer after the officer observed the defendant using a cell phone to film
up women’s skirts. My Office obtained a search warrant for the phone. During the subsequent search of the phone’s micro SD card, forensic analysts
discovered a series of images, taken by the defendant, showing a seven-year-old girl lying down on a bed and an adult man pushing aside her underwear,
revealing her genitals. The case was referred to the United States Attorney’s Office for the Eastern District of New York, which charged the defendant with
producing child pornography. • Physical and Sexual Abuse of a Child: U.S. v. Patricia and Matthew Ayers, Case No. 5:14 CR 0117 LSC SGC:
In case after case, law enforcement has been able to discover and prosecute child abuse by using video or photographic evidence taken by the abuser. This
case is illustrative: From 2010 to 2013, the defendants abused and exploited a young child in their care who, during that period, was six to nine years old. The
couple took photographs of the child in lewd poses, as well as of each other engaged in sexual acts with the child. The defendants recorded the abuse with their
smartphones and downloaded the images to a computer. In at least one instance, one of the defendants transmitted images to another individual, indicating that
she would travel interstate with the child to the individual’s home so the individual could also have sexual relations with the child. The federal judge overseeing
the case described it as the worst case he has personally dealt with, including murders, in his 16 years on the bench. The defendants were ultimately convicted
of producing child pornography, in violation of 18 U.S.C. § 2251(a), and were sentenced to 1,590 and 750 years, respectively, in federal prison. There are
many other cases—almost too many to count—that I might have selected, but the point is clear: We would risk losing crucial evidence
in all of these cases if the contents of passcode-protected smartphones were unavailable to us, even with a warrant. The enormity of the
loss is fully appreciated by wrongdoers who use smartphones. Recently, a defendant in a serious felony case told another individual on recorded
jailhouse call that “Apple and Google came out with these softwares that can no longer be encrypted [sic: decrypted] by the police. . . . If our phones is running on
the iO[S]8 software, they can’t open my phone. That might be another gift from God.” This defendant’s appreciation of the safety that the iOS 8 operating system
afforded him, is surely shared by criminal defendants in every jurisdiction in America charged with all manner of crimes, including rape, kidnapping, robbery,
promotion of child pornography, larceny, and presumably by those interested in committing acts of terrorism. Criminal defendants across the nation are
the principal beneficiaries of iOS 8, and the safety of all American communities is imperiled by it.
Data on encrypted devices is crucial to law enforcement and counterterrorism
Yates and Comey 7/8 <Sally Quillian Yates, Deputy Attorney General, and James B. Comey, Director of the FBI, 7/8/2015,
“Going Dark: Encryption, Technology, and the Balance Between Public Safety and Privacy”, p.3-4,
http://www.judiciary.senate.gov/imo/media/doc/07-08-15%20Yates%20and%20Comey%20Joint%20Testimony1.pdf>//wx
The more we as a society rely on electronic devices to communicate and store information, the more likely it is that information that was
once found in filing cabinets, letters, and photo albums will now be stored only in electronic form. We have seen case after case – from
homicides and kidnappings, to drug trafficking, financial fraud, and child exploitation – where critical evidence came from smart phones, computers,
and online communications. When changes in technology hinder law enforcement’s ability to exercise investigative tools and follow
critical leads, we may not be able to identify and stop terrorists who are using social media to recruit, plan, and execute an attack in our country.
We may not be able to root out the child predators hiding in the shadows of the Internet, or find and arrest violent criminals who are targeting our
neighborhoods. We may not be able to recover critical information from a device that belongs to a victim who cannot provide us with the
password, especially when time is of the essence. These are not just theoretical concerns. We continue to identify individuals who seek to join the
ranks of foreign fighters traveling in support of the Islamic State of Iraq and the Levant, commonly known as ISIL, and also homegrown violent
extremists who may aspire to attack the United States from within. These threats remain among the highest priorities for the Department of Justice, including
the FBI, and the United States government as a whole. Of course, encryption is not the only technology terrorists and criminals use to further their ends.
Terrorist groups, such as ISIL, use the Internet to great effect. With the widespread horizontal distribution of social media, terrorists can spot, assess,
recruit, and radicalize vulnerable individuals of all ages in the United States either to travel or to conduct a homeland attack. As a result, foreign terrorist
organizations now have direct access into the United States like never before. For example, in recent arrests, a group of individuals was contacted by a known
ISIL supporter who had already successfully traveled to Syria and encouraged them to do the same. Some of these conversations occur in publicly accessed
social networking sites, but others take place via private messaging platforms. These encrypted direct messaging platforms are tremendously
problematic when used by terrorist plotters. Outside of the terrorism arena we see countless examples of the impact changing technology is having on
our ability to affect our court authorized investigative tools. For example, last December a long-haul trucker kidnapped his girlfriend, held her in his truck, drove
her from State to State and repeatedly sexually assaulted her. She eventually escaped and pressed charges for sexual assault and kidnapping. The trucker
claimed that the woman he had kidnapped engaged in consensual sex. The trucker in this case happened to record his assault on video using a smartphone,
and law enforcement was able to access the content stored on that phone pursuant to a search warrant, retrieving video that revealed that the sex was not
consensual. A jury subsequently convicted the trucker. In a world where users have sole control over access to their devices and communications, and so can
easily block all lawfully-authorized access to their data, the jury would not have been able to consider that evidence, unless the truck driver, against his own
interest, provided the data. And the theoretical availability of other types of evidence, irrelevant to the case, would have made no difference.
In that world, the grim likelihood that he would go free is a cost that we must forthrightly acknowledge and consider. We are seeing more and more
cases where we believe significant evidence resides on a phone, a tablet, or a laptop —evidence that may be the difference
between an offender being convicted or acquitted. If we cannot access this evidence, it will have ongoing, significant impacts on our ability to identify,
stop, and prosecute these offenders.
Encryption blocks successful investigation – Investigators locked out
Ellen Nakashima and Barton Gellman ’15 (Ellen Nakashima is a national security reporter for The Washington Post. Gellman
writes for the national staff. He has contributed to three Pulitzer Prizes for The Washington Post. He is a senior fellow at the
Century Foundation and visiting lecturer at Princeton’s Woodrow Wilson School. https://www.washingtonpost.com/world/nationalsecurity/as-encryption-spreads-us-worries-about-access-to-data-for-investigations/2015/04/10/7c1c7518-d401-11e4-a62fee745911a4ff_story.html)CK
Bitkower cited a case in Miami in December in which a long-haul trucker kidnapped his girlfriend, held her in his truck, drove her
from state to state and repeatedly sexually assaulted her. She eventually escaped and pressed charges for sexual assault and
kidnapping. His defense, Bitkower said, was that she engaged in consensual sex. As it turned out, the trucker had video-recorded his assault, and the phone did not have device encryption enabled. Law enforcement agents were able to get a warrant
and retrieve the video. It “revealed in quite disturbing fashion that this was not consensual,” Bitkower said. The jury convicted the
trucker. Officials and former agents say there will be cases in which crimes will go unsolved because the data was unattainable
because only the phone owner held the key. “I just look at the number of cases I had where, if the bad guy was using one of
these [locked] devices, we never would have caught him,” said Timothy P. Ryan, a former FBI supervisory special agent who
now leads Kroll Associates’ cyber-investigations practice.
Encryption decks efforts to combat ISIS- online recruiting
Clare Hopping 7/8/15---Freelance editor and journalist as well as editorial editor for Longneck and Thunderfoot. Cites FBI.
(Hopping, “FBI director complains encryption makes his job harder”, ITpro. http://www.itpro.co.uk/security/24943/fbi-encryptionhelps-isis-recruit-new-members.)//ET
Universal encryption will help terrorists spread their creeds through secure messaging services, according to the FBI. James Comey,
director of the agency, claimed in a blog post that worldwide encryption will help groups like ISIS ahead of his appearance at the Senate Intelligence
Committee. He wrote that secure messaging services and social media will help ISIS recruit new members online. "When the
government's ability—with appropriate predication and court oversight—to see an individual's stuff goes away, it will affect public safety," he
wrote on pro surveillance website Lawfare. "That tension is vividly illustrated by the current ISIL threat, which involves ISIL operators in Syria
recruiting and tasking dozens of troubled Americans to kill people, a process that increasingly takes part through mobile messaging
apps that are end-to-end encrypted, communications that may not be intercepted, despite judicial orders under the Fourth Amendment."
Backdoors would allow criminals to bypass encryption
Phys.org 15 (“Security experts warn against encryption 'backdoors'”, 7/7/15, http://phys.org/news/2015-07-experts-encryptionbackdoors.html)
A group of computer code experts said Tuesday that law enforcement cannot be given special access to encrypted communications without
opening the door to "malicious" actors. A research report published by the Massachusetts Institute of Technology challenges claims from US and British
authorities that such access is the policy response needed to fight crime and terrorism. Providing this kind of access "will open doors through
which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend," said the report by 13
scientists. The paper was released a day after FBI Director James Comey called for public debate on the use of encrypted communications, saying Americans
may not realize how radical groups and criminals are using the technology. Comey argued in a blog post that Islamic State militants are among those using
encryption to avoid detection. The New York Times, which reported earlier on the study, said Comey was expected to renew a call at a congressional hearing for
better access to encrypted communications to avoid "going dark." The computer scientists said, however, that any effort to build in access for law
enforcement could be exceedingly complex and lead to "unintended consequences," such as stifling innovation and creating hostility toward
new tech products. "The costs would be substantial, the damage to innovation severe, and the consequences to economic growth
difficult to predict," the report said. "The costs to developed countries' soft power and to our moral authority would also be considerable." In the 1990s, there
was a similar debate on the "clipper chip" proposal to allow "a trusted third party" to have access to encrypted messages that could be granted under a legal
process. The clipper chip idea was abandoned, but the authors said that if it had been widely adopted, " it is doubtful that companies like Facebook
and Twitter would even exist." The computer scientists said the idea of special access would create numerous technical and legal
challenges, leaving unclear who would have access and who would set standards.
--a2 metadata solves
Encryption prevents access to key communication data
Hess, Executive Assistant Director of the FBI, 15 (Amy Hess, Executive Assistant Director of the FBI,
“ENCRYPTION TECHNOLOGY POLICY ISSUES”, 4/29/15,
HTTP://congressional.proquest.com.proxy.lib.umich.edu/congressional/docview/t39.d40.04293003.d94?accountid=14667)//EM *Edited for easier flow
Law enforcement and national security investigators need to be able to access communications and information to obtain the
evidence necessary to prevent crime and bring criminals to justice in a court of law. We do so pursuant to the rule of law, with clear guidance and strict
judicial oversight. But increasingly, even armed with a court order based on probable cause, we are* [the FBI is] too often unable to access
potential evidence. The Communications Assistance for Law Enforcement Act (CALEA) requires telecommunication carriers to be
able to implement court orders for the purpose of intercepting communications. But that law wasn't designed to cover many of the new means of
communication that exist today. Currently, thousands of companies provide some form of communication service, but most do not have
the ability to isolate and deliver particular information when ordered to do so by a court. Some have argued that access to metadata about
these communications - which is not encrypted - should be sufficient for law enforcement. But metadata is incomplete information, and can be
difficult to analyze when time is of the essence. It can take days to parse metadata into readable form, and additional time to correlate
and analyze the data to obtain meaningful and actionable information.
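A rough illustration of the card's distinction between metadata and content (this sketch is not drawn from the Hess testimony; the records, numbers, and field names are hypothetical): parsing call-detail metadata shows who contacted whom, when, and for how long, but nothing about what was said.

```python
import csv
from collections import Counter
from io import StringIO

# Hypothetical call-detail records: pure metadata, no call content.
SAMPLE = """caller,callee,start,duration_sec
+15551230001,+15551230002,2015-04-10T09:15:00,240
+15551230001,+15551230003,2015-04-10T09:40:00,60
+15551230002,+15551230001,2015-04-11T22:05:00,1800
"""

records = list(csv.DictReader(StringIO(SAMPLE)))

# Correlating who talked to whom, and how often, is the slow analytic step the
# testimony describes; even once parsed, the records reveal parties, times, and
# durations -- never what was actually said.
pair_counts = Counter((r["caller"], r["callee"]) for r in records)
for (caller, callee), n in pair_counts.most_common():
    print(f"{caller} -> {callee}: {n} call(s)")
```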
--a2 cloud solves
Cloud storage fails – backdoors are key
Hess, Executive Assistant Director of the FBI, 15 (Amy Hess, Executive Assistant Director of the FBI,
“ENCRYPTION TECHNOLOGY POLICY ISSUES”, 4/29/15,
HTTP://congressional.proquest.com.proxy.lib.umich.edu/congressional/docview/t39.d40.04293003.d94?accountid=14667)//EM
Additional Considerations Some assert that although more and more devices are encrypted, users back up and store much of their data in "the
cloud," and law enforcement agencies can access this data pursuant to court order. For several reasons, however, the data may not be
there. First, aside from the technical requirements and settings needed to successfully back up data to the cloud, many companies
impose fees to store information there - fees which consumers may be unwilling to pay. Second, criminals can easily avoid
putting information where it may be accessible to law enforcement. Third, data backed up to the cloud typically includes
only a portion of the data stored on a device, so key pieces of evidence may reside only on a criminal's or terrorist's phone, for
example. And if criminals do not back up their phones routinely, or if they opt out of uploading to the cloud altogether, the data may only be found on the devices
themselves, devices which are increasingly encrypted.
--a2 hacking solves
Brute force attacks fail – backdoors are key
Hess, Executive Assistant Director of the FBI, 15 (Amy Hess, Executive Assistant Director of the FBI,
“ENCRYPTION TECHNOLOGY POLICY ISSUES”, 4/29/15,
HTTP://congressional.proquest.com.proxy.lib.umich.edu/congressional/docview/t39.d40.04293003.d94?accountid=14667)//EM
But without a solution that enables law enforcement to access critical evidence, many investigations could be at a dead end. The same is true for cyber security
investigations; if there is no way to access encrypted systems and data, we may not be able to identify those who seek to steal our technology, our state secrets,
our intellectual property, and our trade secrets. A common misperception is that we can simply break into a device using a "brute force"
attack - the idea that with enough computing resources devoted to the task, we can defeat any encryption. But the reality is that even a
supercomputer would have difficulty with today's high-level encryption standards. And some devices have a setting that erases the
encryption key if someone makes too many attempts to break the password, effectively closing all access to that data. Finally, a
reasonable person might also ask, "Can't you just compel the owner of the device to produce the information in a readable form?" Even if we could compel an
individual to provide this information, a suspected criminal would more likely choose to defy the court's order and accept a punishment for contempt rather than
risk a 30-year sentence for, say, production and distribution of child pornography. Without access to the right evidence, we fear we may not be able to identify
and stop child predators hiding in the shadows of the Internet, violent criminals who are targeting our neighborhoods, and terrorists who may be using social
media to recruit, plan, and execute an attack in our country. We may not be able to recover critical information from a device that belongs to a victim who can't
provide us with the password, especially when time is of the essence.
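A back-of-the-envelope illustration of the brute-force claim in the card above (this arithmetic is not from the testimony; the attacker guess rate is an assumed, deliberately generous figure): the keyspace math shows why exhaustive search against a modern key is infeasible regardless of hardware, and why the erase-after-failed-attempts setting matters for short passcodes.

```python
# Keyspace arithmetic behind the "even a supercomputer" claim.
# GUESSES_PER_SECOND is an assumed (very generous) attacker speed.

KEY_BITS = 256                       # AES-256, a common high-level standard
GUESSES_PER_SECOND = 1e12            # one trillion key trials per second (assumed)
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

keyspace = 2 ** KEY_BITS
expected_years = (keyspace / 2) / GUESSES_PER_SECOND / SECONDS_PER_YEAR
print(f"Expected search time: about {expected_years:.1e} years")
# Roughly 1.8e57 years -- the barrier is the key length, not the hardware.
# Guessing a short device passcode is a different problem, which is why the
# erase-after-N-failed-attempts setting described above matters: it caps the
# attacker at a handful of tries before the encryption key is destroyed.
```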
NSA can’t crack encryption—Need encryption keys to access
The Nation ’15 (The Nation is America’s oldest continuously published weekly magazine, devoted to reporting on politics and
culture. The Nation has bureaus in Washington, D.C., London, and South Africa, with departments covering architecture, art,
corporations, defense, environment, films, legal affairs, music, peace and disarmament, poetry, and the United Nations,
http://www.nationmultimedia.com/breakingnews/NSA-cant-crack-common-encryption-software-top-hack-30251390.html)CK
Publicly available encryption programmes are so tough that they can't be cracked by the experts at the US National Security
Agency (NSA), an authoritative expert has told one of the world's top hacker jamborees. The assurance, delivered by Jacob
Applebaum during this month's Chaos Communication Congress (CCC) in Hamburg, Germany, ends months of speculation that
the NSA may have found a backdoor into such privacysoftware. Services like PGP for protecting emails and OTR (off the record)
for protecting messaging are pretty safe, agreed experts at CCC, which attracts some of the globe's top hacking experts every
January. "PGP and OTR are two ways to keep spies from looking through your stuff," says US activist Applebaum. He said
communications protected end to end with these services cannot be read by the NSA. Period. Options like the SSL encryption
protocol can be surmounted though, he said. SSL is used - often by banks and internet retail - to keep prying eyes from seeing
which websites are being accessed and what's sent to them. SSH, used by system administrators to get into other computers
and run them, can also be cracked. It's not clear, though, if the NSA has actually cracked their protocols. Instead, it seems the
US electronic intelligence agency is trying to collect keys so it can crack encrypted communication by other methods. That's
according to documents released by whistleblower Edward Snowden, a former NSA contractor, which have been published by
German news magazine Der Spiegel.
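A minimal sketch of why end-to-end schemes like PGP push agencies toward collecting keys rather than breaking the math (this example is illustrative, not from the card; it uses the Python `cryptography` package, and the message text is hypothetical): anyone can encrypt to a public key, but only the holder of the matching private key can decrypt.

```python
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The recipient generates a key pair; only the private half can decrypt.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

ciphertext = public_key.encrypt(b"hypothetical message", oaep)

# An eavesdropper who intercepts the ciphertext (and even holds the public key)
# cannot read it; recovering the plaintext requires the private key, which is
# why key collection -- not cryptanalysis -- is the workaround the card describes.
assert private_key.decrypt(ciphertext, oaep) == b"hypothetical message"
```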
--a2 court order solves
Court orders can’t compel decryption – backdoors are key
Crocker, attorney at the Electronic Frontier Foundation, 14 (Andrew Crocker, Graduate of Harvard Law and
attorney at the Electronic Frontier Foundation in civil liberties, “Sifting Fact from Fiction with All Writs and Encryption: No
Backdoors”, 12/3/14, https://www.eff.org/deeplinks/2014/12/sifting-fact-fiction-all-writs-and-encryption-no-backdoors)//EM*Edited
for easier flow
Following recent reports in the Wall Street Journal and Ars Technica, there’s been new interest in the government’s use of a relatively obscure law, the All Writs
Act. According to these reports, the government has invoked the All Writs Act in order to compel the assistance of smartphone manufacturers in unlocking
devices pursuant to a search warrant. The reports are based on orders from federal magistrate judges in Oakland and New York City issued to
Apple and another unnamed manufacturer (possibly also Apple) respectively, requiring them to bypass the lock screen on seized phones and
enable law enforcement access. These reports come at an interesting time. Both Apple and Google have announced expanded encryption
in their mobile operating systems. If a device is running the latest version of iOS or Android, neither company will be able to
bypass a user’s PIN or password and unlock a phone, even if the government gets a court order asking it to do so . The
announcements by Apple and Google have in turn led to calls for “golden keys”—hypothetical backdoors in devices intended to allow only law enforcement to
access them. As we’ve explained, we think these proposals to create backdoors totally misunderstand the technology and make for terrible policy. Amid this
prospect of a second “Cryptowar” is the lurking fear that the government might force unwilling companies to include backdoors in their products, even if they’re
not required by Congress to do so. We sometimes hear from jaded developers and others who think that all it would take to force a backdoor is
one National Security Letter. While NSLs are unconstitutional, even the government admits that they* [NSLs] can only be used to obtain
limited information, which does not include forcing anyone to backdoor a product. Nevertheless, this fear is feeding some of the interest
generated by the press reports about the government’s invocation of All Writs Act in the unlocking cases.
Court orders fail – backdoors are key
Hess, Executive Assistant Director of the FBI, 15 (Amy Hess, Executive Assistant Director of the FBI,
“ENCRYPTION TECHNOLOGY POLICY ISSUES”, 4/29/15,
HTTP://congressional.proquest.com.proxy.lib.umich.edu/congressional/docview/t39.d40.04293003.d94?accountid=14667)//EM
Encryption of stored data is not new, but it has become increasingly prevalent and sophisticated. The challenge to law enforcement and
national security officials has intensified with the advent of default encryption settings and stronger encryption standards on both devices and
networks. In the past, a consumer had to decide whether to encrypt data stored on his or her device and take some action to implement that encryption. With
today`s new operating systems, however, a device and all of a user`s information on that device can be encrypted by default - without any
affirmative action by the consumer. In the past, companies had the ability to decrypt devices when the Government obtained a search warrant and a court order.
Today, companies have developed encryption technology which makes it impossible for them to decrypt data on devices they
manufacture and sell, even when lawfully ordered to do so. Although there are strong and appropriate cybersecurity and other reasons to
support these new uses of encryption, such decisions regarding system design have a tremendous impact on law enforcement`s ability to fight crime and bring
perpetrators to justice. Evidence of criminal activity used to be found in written ledgers, boxes, drawers, and file cabinets, all of which could be searched pursuant
to a warrant. But like the general population, criminal actors are increasingly storing such information on electronic devices. If these
devices are automatically encrypted, the information they contain may be unreadable to anyone other than the user of the device.
Obtaining a search warrant for photos, videos, email, text messages, and documents can be an exercise in futility. Terrorists and other criminals
know this and will increasingly count on these means of evading detection.
Data encryption eviscerates the third party doctrine – Lack of encryption permits government
access
Christopher Soghoian Ph.D 06 (Principal Technologist with the Speech, Privacy, and Technology Project at the American
Civil Liberties Union. He is also a Visiting Fellow at Yale Law School's Information Society Project. Caught in the Cloud: Privacy,
Encryption, and Government Back Doors in the Web 2.0 Era Privacy and Law Enforcement pg. 391
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1421553)CK
“The third party doctrine is the Fourth Amendment rule that scholars love to hate. It is . . . widely criticized as profoundly
misguided. Decisions applying the doctrine ‘top[] the chart of [the] most-criticized Fourth Amendment cases.’”95 However, for the
purposes of this article, it can be summarized by stating that online service providers can be compelled to reveal their customers’
private documents with a mere subpoena.96 As such, the government is not required to obtain a search warrant,97 demonstrate
probable cause98 or go before a judge. While the third party doctrine is certainly the current tool of choice for the government’s
evisceration of the Fourth Amendment, it is not completely to blame for the lack of privacy online. The real and often overlooked
threat to end-user privacy is not this legal rule, but the industry-wide practice of storing customers’ data in plain text, forgoing any
form of encryption. Simply put, if encryption were used to protect users’ stored data, the third party doctrine would for the most
part be moot.
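A minimal sketch of Soghoian's point (illustrative only, using the Python `cryptography` package; the data is hypothetical): if the user encrypts client-side before uploading, a provider served with a subpoena can hand over only ciphertext, because the key never leaves the user's device.

```python
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# The key is generated and kept only on the user's device.
key = Fernet.generate_key()
f = Fernet(key)

# Encrypt locally before anything is handed to the provider.
blob = f.encrypt(b"hypothetical private document")

# 'blob' is what the cloud provider stores, and all a subpoena to the provider
# can yield: ciphertext that is unreadable without the user's key.
assert f.decrypt(blob) == b"hypothetical private document"
```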
--a2 voluntary solves
Compelling fails – backdoors are key
Hess, Executive Assistant Director of the FBI, 15 (Amy Hess, Executive Assistant Director of the FBI,
“ENCRYPTION TECHNOLOGY POLICY ISSUES”, 4/29/15,
HTTP://congressional.proquest.com.proxy.lib.umich.edu/congressional/docview/t39.d40.04293003.d94?accountid=14667)//EM
But without a solution that enables law enforcement to access critical evidence, many investigations could be at a dead end. The same is true for cyber security
investigations; if there is no way to access encrypted systems and data, we may not be able to identify those who seek to steal our technology, our state secrets,
our intellectual property, and our trade secrets. A common misperception is that we can simply break into a device using a "brute force" attack - the idea that
with enough computing resources devoted to the task, we can defeat any encryption. But the reality is that even a supercomputer would have difficulty with
today's high-level encryption standards. And some devices have a setting that erases the encryption key if someone makes too many attempts to break the
password, effectively closing all access to that data. Finally, a reasonable person might also ask, "Can't you just compel the owner of the device to produce the
information in a readable form?" Even if we could compel an individual to provide this information, a suspected criminal would more
likely choose to defy the court's order and accept a punishment for contempt rather than risk a 30-year sentence for, say,
production and distribution of child pornography. Without access to the right evidence, we fear we may not be able to identify and stop
child predators hiding in the shadows of the Internet, violent criminals who are targeting our neighborhoods, and terrorists who
may be using social media to recruit, plan, and execute an attack in our country. We may not be able to recover critical
information from a device that belongs to a victim who can't provide us with the password, especially when time is of the essence.
DA – POLITICS
--xt link
Plan is unpopular – seems soft on crime
Christopher Soghoian Ph.D 06 (Principal Technologist with the Speech, Privacy, and Technology Project at the American
Civil Liberties Union. He is also a Visiting Fellow at Yale Law School's Information Society Project. Caught in the Cloud: Privacy,
Encryption, and Government Back Doors in the Web 2.0 Era Privacy and Law Enforcement pg. 421
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1421553)CK
The problem of compelled back doors is extremely difficult. Due to powers provided to the government by the various laws outlined earlier in this
article, consumers can never completely trust the companies who make and supply the software that they use to go about their daily business online. Any firm
can be compelled to insert a back door into its own product, no matter how committed it is to protecting the privacy of its customers. The simplest solution
to this problem would be to amend the law to prohibit this coercive behavior by government agencies . However, given the
realities of Washington DC, and the fear of being accused of being soft on terrorism or child pornography, it is unlikely that
Congress would agree to any form of legislative fix which took away this power. Thus, we focus our attention upon non-legislative solutions to
this issue.
CP – CNCI (CYBERSECURITY ADV CP)
1nc cnci cp
Text: The Department of Homeland Security should substantially expand
the Comprehensive National Cybersecurity Initiative.
CNCI expansion key to solving cybersecurity – organized response to future
incidents, strengthen partnerships, invest in technology, and all this will be
done ensuring human rights and privacy.
White House ’09: The White House Foreign Policy Cybersecurity National Initiative Page. Cites the President of the
United States Document Release. (“The Comprehensive National Cybersecurity Initiative”, the White House President Barack
Obama, May 2009, https://www.whitehouse.gov/issues/foreign-policy/cybersecurity/national-initiative)//chiragjain
President Obama
has identified cybersecurity as one of the most serious economic and
national security challenges we face as a nation, but one that we as a government or as a
country are not adequately prepared to counter. Shortly after taking office, the President therefore
ordered a thorough review of federal efforts to defend the U.S. information and
communications infrastructure and the development of a comprehensive approach to
securing America’s digital infrastructure. In May 2009, the President accepted the
recommendations of the resulting Cyberspace Policy Review, including the selection of
an Executive Branch Cybersecurity Coordinator who will have regular access to the
President. The Executive Branch was also directed to work closely with all key players in U.S.
cybersecurity, including state and local governments and the private sector, to ensure an
organized and unified response to future cyber incidents ; strengthen public/private
partnerships to find technology solutions that ensure U.S. security and prosperity;
invest in the cutting-edge research and development necessary for the innovation and
discovery to meet the digital challenges of our time; and begin a campaign to promote
cybersecurity awareness and digital literacy from our boardrooms to our classrooms and
begin to build the digital workforce of the 21st century. Finally, the President directed
that these activities be conducted in a way that is consistent with ensuring the privacy
rights and civil liberties guaranteed in the Constitution and cherished by all Americans.
The activities under way to implement the recommendations of the Cyberspace Policy Review build on the
Comprehensive National Cybersecurity Initiative (CNCI) launched by President George W.
Bush in National Security Presidential Directive 54/Homeland Security Presidential Directive 23 (NSPD-54/HSPD-23) in January 2008. President Obama determined that the CNCI and its associated activities
should evolve to become key elements of a broader, updated national U.S. cybersecurity
strategy. These CNCI initiatives will play a key role in supporting the achievement of
many of the key recommendations of President Obama’s Cyberspace Policy Review. The
CNCI consists of a number of mutually reinforcing initiatives with the following major goals designed to help secure the United
States in cyberspace: To
establish a front line of defense against today’s immediate threats by
creating or enhancing shared situational awareness of network vulnerabilities, threats, and
events within the Federal Government—and ultimately with state, local, and tribal governments and private sector partners—and the
ability to act quickly to reduce our current vulnerabilities and prevent intrusions. To
defend against the full
spectrum of threats by enhancing U.S. counterintelligence capabilities and increasing
the security of the supply chain for key information technologies. To strengthen the future
cybersecurity environment by expanding cyber education; coordinating and redirecting research and
development efforts across the Federal Government; and working to define and develop
strategies to deter hostile or malicious activity in cyberspace. In building the plans for the CNCI, it was
quickly realized that these goals could not be achieved without also strengthening certain key strategic foundational capabilities
within the Government. Therefore, the CNCI
includes funding within the federal law enforcement,
intelligence, and defense communities to enhance such key functions as criminal
investigation; intelligence collection, processing, and analysis; and information
assurance critical to enabling national cybersecurity efforts. The CNCI was developed with
great care and attention to privacy and civil liberties concerns in close consultation with
privacy experts across the government. Protecting civil liberties and privacy rights
remain fundamental objectives in the implementation of the CNCI. In accord with President
Obama’s declared intent to make transparency a touchstone of his presidency, the Cyberspace Policy Review identified enhanced
information sharing as a key component of effective cybersecurity. To improve public understanding of Federal efforts, the
Cybersecurity Coordinator has directed the release of the following summary description of the CNCI.
Plan doesn’t solve - CNCI initiatives are key to solving cybersecurity –
Roulo 12: American Forces Press Service – U.S. Department of Defense Website. (“Cybercom Chief: U.S. Unprepared for Serious
Cyber Attacks”, Claudette Roulo, July 26, 2012, United States Department of Defense,
http://www.defense.gov/news/newsarticle.aspx?id=117289)//chiragjain
ASPEN, Colo., July 26, 2012 – The
United States is not adequately prepared for a serious cyber
attack, the commander of U.S. Cyber Command told the audience at the Aspen Institute’s annual security
forum today. Army Gen. Keith Alexander, who also serves as the director of the National
Security Agency and the chief of the Central Security Service, said that, in terms of
preparation for a cyber attack on a critical part of its network infrastructure, the U.S. is
at a three on a scale of one to ten. The problem of defending the nation from a cyber attack is complicated,
Alexander said. It’s not just a question of preparing the Department of Defense or federal networks. Private industry also has to be
defended. “Industry has a variety of capabilities,” Alexander said. While networks serving the financial community are well-defended, other sectors need help.
Key to developing a strong cyber security infrastructure is educating its users, Alexander said. “We have a great program, it’s jointly run by [the National Security Agency] and [the
Department of Homeland Security] working with over 100 different colleges and universities to set up an information
assurance/cyber security portfolio,” he said. Ensuring people who didn’t grow up in the Internet age are security-aware is one of the
major challenges facing those who secure the network, Alexander said. The
number of exploits of mobile
technology has almost doubled over the past year, he said, and many people don’t
realize that phones are tied into the same digital network infrastructure as computers.
Alexander defined exploits as the means that a hacker uses to penetrate a system, including
mobile phones or tablets, to potentially steal files and credentials or jump to another computer. “The attack surfaces for adversaries
to get on the internet now include all those mobile devices,” Alexander said. And mobile security lags behind that of cyber security
for landline devices like desktop computers. Alexander said the
Department of Defense, in concert with
agencies like the Department of Homeland Security and the Federal Bureau of
Investigation, works together with industry to secure network devices. “If we identify a
problem, we jointly give that back to industry and say ‘Here’s a problem we found,’”
Alexander said. Using the nuclear model, or concentrating solely on major nation-states, to analyze the cyber threat is wrong, he
said. Several
nations are capable of serious cyber attacks, he explained, but anyone who
finds vulnerabilities in the network infrastructure could cause tremendous problems.
Industry and government must work as a team to combat these threats, Alexander said. “There are great folks in industry who have
some great insights,” he said. “That’s the only way that we can prevent those several states from mounting a real attack on this
nation’s cyber.” In addition, deterrence
theory worked for nuclear weapons in part because the
decision time was much slower than it is for cyber threats. “A piece of information can circumnavigate
the globe in about 133-134 milliseconds,” he said. “Your decision space in cyber [is] half that—60 seconds.” “My concern is...you’ve
seen disruptions like in Estonia in 2007, in Georgia, Latvia, Lithuania, Azerbaijan, Kyrgyzstan, you could go on,” he said. “We’ve
seen them here in the United States... What
I’m concerned about is the shift to destructive [attacks].
Those are the things that will hurt our nation.” Disruptive attacks, like distributed denial-of-service
attacks, are aimed at interrupting the flow of communication or finance, but aren’t designed
to cause long-term damage. In contrast, destructive attacks are designed to destroy
parts of the network infrastructure, like routers or servers, which would have to be
replaced in order to resume normal operations, Alexander said. In some cases this could take weeks or
months. Congress is considering bills that would give the Department of Homeland Security a greater role in setting performance
requirements for network industries. Alexander said this legislation is important to assist in setting network infrastructure
standards. Both parties have something to bring to the table, he said. Industry knows things that government doesn’t, and
government knows things that industry doesn’t. “If
we were to be completely candid here, the reality is
that industry is getting hacked [and] government is getting hacked,” he said. “What we need to do
is come together and form best practices.” Government-civil partnerships open up the possibility that the U.S. can accomplish things
in cyber space that no other nation has the capability to accomplish, Alexander said. “When we put together this ability for our
nation to work as a team in cyber space, what that allows us to do now is do things that other countries aren’t capable of doing in
defending the nation,” Alexander said. Because attributing the source of a cyber attack is difficult, the focus is currently on defense
rather than offense, Alexander said. “Today, the offense clearly has the advantage,” he said. “Get
cyber legislation in
there, bring industry and government together, and now we have the capability to say
‘You don’t want to attack us. We can stop it and there are other things that we can do to
really make this hurt.’” “The key is having a defensible capability that can survive that
first onslaught,” Alexander said.
--a2 deficit
CNCI creates government surveillance transparency
Schmidt 10 <Howard A., Special Assistant to the President and Cybersecurity Coordinator, 3/2/2010, “Transparent
Cybersecurity”, White House, https://www.whitehouse.gov/blog/2010/03/02/transparent-cybersecurity>//wx
Today in my keynote speech at the RSA Conference in San Francisco I discussed two themes that are vital to our nation’s cybersecurity efforts:
partnerships and transparency. These two themes go hand-in-hand. You cannot have one without the other, and they form the foundation of
nearly all of the action items outlined in the President’s Cyberspace Policy Review. Earlier this year in a memorandum on open government to all
Federal departments and agencies, President Obama said, “My Administration is committed to creating an unprecedented level of openness in
government.” Building on this statement, I am personally dedicated to ensuring that the Federal Government’s cybersecurity efforts are as
transparent as possible. For this reason, I was pleased to announce today that the Obama Administration has revised the classification guidance for the
Comprehensive National Cybersecurity Initiative (or CNCI), which began in 2008 and forms an important component of cybersecurity efforts within the federal
government. Anyone can now view or download an unclassified description of the CNCI and each of the 12 initiatives under the CNCI. Transparency is
particularly vital in areas, such as the CNCI, where there have been legitimate questions about sensitive topics like the role of the intelligence
community in cybersecurity. Transparency provides the American people with the ability to partner with government and participate
meaningfully in the discussion about how we can use the extraordinary resources and expertise of the intelligence community with proper
oversight for the protection of privacy and civil liberties. In order to be successful against today’s cybersecurity threats, we must continue to seek
out innovative new partnerships—not only within government, but also among industry, government, and the American public. Transparency improves our
collective knowledge and helps bind our partnerships together to form the most powerful cyber tools that we have. We will not defeat our cyber adversaries
because they are weakening, we will defeat them by becoming collectively stronger, through stronger technology, a stronger cadre of security professionals, and
stronger partnerships.
CP is key to solve—federal framework needs to be changed
Fischer 13 (Eric A. Fischer, Senior Specialist in Science and Technology, Congressional Research Service Report for
Congress, “Federal Laws relating to cybersecurity: Overview and Discussion of Proposed Revisions”, 06/20/13
https://www.fas.org/sgp/crs/natsec/R42114.pdf)
For more than a decade, various experts have expressed increasing concerns about cybersecurity, in light of the growing frequency,
impact, and sophistication of attacks on information systems in the United States and abroad. Consensus has also been building that the current legislative
framework for cybersecurity might need to be revised. The complex federal role in cybersecurity involves both securing federal systems
and assisting in protecting nonfederal systems. Under current law, all federal agencies have cybersecurity responsibilities relating to their own systems, and
many have sector-specific responsibilities for critical infrastructure. More than 50 statutes address various aspects of cybersecurity either directly or indirectly, but
there is no overarching framework legislation in place. While revisions to most of those laws have been proposed over the past few years, no
major cybersecurity legislation has been enacted since 2002. Recent legislative proposals, including many bills introduced in recent Congresses,
have focused largely on issues in 10 broad areas (see “Selected Issues Addressed in Proposed Legislation” for an overview of how current legislative proposals
would address issues in several of those areas): • national strategy and the role of government, • reform of the Federal Information Security Management Act
(FISMA), • protection of critical infrastructure (including the electricity grid and the chemical industry), • information sharing and cross-sector coordination, •
breaches resulting in theft or exposure of personal data such as financial information, • cybercrime, • privacy in the context of electronic commerce, • international
efforts, • research and development, and • the cybersecurity workforce. For most of those topics, at least some of the bills addressing them have proposed
changes to current laws. Several of the bills specifically focused on cybersecurity received committee or floor action in the 112th and 113th Congresses,
but none has become law. In the absence of enactment of cybersecurity legislation, the White House issued Executive Order 13636, with
provisions on protection of critical infrastructure, including information sharing and standards development. Comprehensive legislative proposals on cybersecurity
that received considerable attention in 2012 are The Cybersecurity Act of 2012 (CSA 2012, S. 2105, reintroduced in revised form as S. 3414), recommendations
from a House Republican task force, and a proposal by the Obama Administration. They differed in approach, with S. 2105 proposing the most extensive
regulatory framework and organizational changes, and the task force recommendations focusing more on incentives for improving private-sector cybersecurity.
An alternative to S. 2105 and S. 3414, S. 3342 (a refinement of S. 2151), did not include enhanced regulatory authority or new federal entities, but did include
cybercrime provisions. S. 3414 was debated in the Senate but failed two cloture votes. Several narrower House bills would address some of the issues raised
and recommendations made by the House task force. Four passed the House in 2012 but were not considered by the Senate. They were reintroduced and passed
the House again, with some amendments, in April 2013: • Cyber Intelligence Sharing and Protection Act (H.R. 624), which focuses on information sharing and
coordination, including sharing of classified information; • Cybersecurity Enhancement Act of 2013 (H.R. 756), which addresses federal cybersecurity R&D and
the development of technical standards; • Advancing America’s Networking and Information Technology Research and Development Act of 2013 (H.R. 967),
which addresses R&D in networking and information technology, including but not limited to security; and • Federal Information Security Amendments Act of 2012
(H.R. 1163), which addresses FISMA reform. One bill from the 112th Congress was ordered reported out of the full committee but did not come to the floor: •
Promoting and Enhancing Cybersecurity and Information Sharing Effectiveness Act of 2011 or PRECISE Act of 2011 (H.R. 3674), which addressed the role of
the Department of Homeland Security in cybersecurity, including protection of federal systems, personnel, R&D, information sharing, and public/private sector
collaboration in protecting critical infrastructure. Together, those House and Senate bills have addressed most of the issues listed above, although in different
ways. All include proposed revisions to some existing laws covered in this report.
--a2 links to politics
CP doesn’t link to politics—DHS funding is bipartisan
Rebecca Shabad 15--- staff writer at The Hill newspaper. B.A. from Syracuse University with a degree in political science.
(Shabad, “Senate panel approves $47.1B funding bill for Homeland Security”, 6/16/15. The Hill.
http://thehill.com/policy/finance/245116-senate-sub-panel-advances-homeland-security-funding-bill)//ET
A Senate subcommittee on Tuesday approved a $47.1 billion bill to fund the Department of Homeland Security, the Coast Guard and other
agencies in fiscal 2016. The base portion of the bill contains $40.2 billion in spending, which is $543 million more than 2015 and $1.18 billion
less than President Obama’s request. “This bill supports a wide range of critical operations at DHS to secure our nation , from mitigating the
impacts of natural disasters to securing our borders and airports,” said Sen. John Hoeven (R-N.D.), chairman of the Homeland Security Appropriations
subcommittee that oversees the bill. The legislation includes $160 million in emergency funds for the Coast Guard and $6.7 billion for a disaster relief fund
operated by the Federal Emergency Management Agency (FEMA). Customs and Border Protection, Immigration and Customs Enforcement, the Transportation
Security Administration and the Secret Service are among the agencies that would receive funding in the bill. Sen. Jeanne Shaheen (D-N.H.), ranking
member of the sub-panel, suggested the bill has the right priorities despite adhering to the sequestration budget ceilings that will return
in October. “While I strongly disapprove of the overall budget that this bill conforms to,
this is a bipartisan product that makes good use of
the funds provided. I appreciate that the legislation has not reduced important state and local grants, and am especially pleased that there’s significant
funding for local fire department grants,” she said. Under the measure, the Secret Service would receive a $258 million funding boost, which
would support activities related to the 2016 campaign as well as detail for Obama when he leaves office. The bill allocates funding that a panel recommended in
reviewing the Secret Service’s internal problems.
CP doesn’t link to politics- DHS funding is bipartisan—new oversight solves disputes
Thad Cochran 15--- Chairman of the Senate Committee on Appropriations. (Cochran, “Committee Advances FY2016 Homeland Security
Appropriations Bill for Senate Debate”. 6/18/15. The
WASHINGTON, D.C. –The Senate Committee on Appropriations today approved a $47.09 billion measure to fund border security,
immigration enforcement, cybersecurity, disaster assistance and other programs to keep Americans safe. The FY2016 Homeland Security
Appropriations Bill [external link], approved on a bipartisan 26 to 4 vote, is now available for consideration by the Senate. To continue to ensure the
security of the American people, the legislation places emphasis on border security, including the U.S. Coast Guard, the Secret Service protective mission,
cybersecurity, and hazard mitigation. The measure establishes metrics for setting goals and assessing results for the U.S. Department of Homeland Security and
its agencies. “The American people expect the protection that this bipartisan legislation would provide to confront threats from the air,
land and sea, and in cyberspace. This important bill deserves to be debated by all Senators as soon as possible to ensure the Department of Homeland Security
has the tools it needs to carry out its national security missions,” said U.S. Senator Thad Cochran (R-Miss.), chairman of the Senate Committee on
Appropriations. “This bill supports a wide range of critical operations at DHS to secure our nation, from mitigating the impacts of natural disasters to securing our
borders and airports. Further, we address the deficiencies in some of DHS’ agencies by making important investments in training, technology and improved
procedures, while also requiring new measures and performance evaluations to ensure future accountability and results,” said U.S. Senator John Hoeven (R-N.D.), chairman of the Senate Homeland Security Appropriations Subcommittee. “This legislation is the result of a close, bipartisan effort. We
have worked hard to include the input of all our committee members, and I thank them for their contributions. I especially want to thank Senator Shaheen, the
ranking member, and her staff for their leadership. Now, we will work with our colleagues on the Senate floor to help finish this important effort.” The following are
highlights of the Senate FY2016 Homeland Security Appropriations Bill, which is consistent with the FY2016 budget resolution and the 302(b) allocation for the
subcommittee: Department of Homeland Security (DHS) – The bill provides $47.09 billion for DHS, $765 million above the FY2015 enacted level
and $1.02 billion below the President’s budget request, to fund DHS missions including border security, transportation security, immigration enforcement,
and cybersecurity, among others. Customs and Border Protection (CBP) – The bill contains $11.08 billion for CBP, an increase of $385 million above the FY2015
enacted level. It fully supports 21,370 Border Patrol agents and 23,775 CBP officers and intelligence and targeting system enhancements. It includes funding for
recapitalization of Non-Intrusive Inspection equipment, replacement and maintenance of border fencing, procurement of additional mobile surveillance systems
and other situational awareness technology, two multi-role enforcement aircraft, and unmanned aerial system capabilities. Immigration & Customs Enforcement
(ICE) – The bill includes $5.81 billion for ICE, $143 million below the FY2015 enacted level. It continues support for enhancements provided in FY2015 and
includes funding for 34,000 detention beds, additional teams targeting at-large criminal aliens and those who overstay their visas, and for human trafficking and
other domestic and international investigative activities. Transportation Security Administration (TSA) – TSA is funded at $4.72 billion, $63 million below the
President’s budget request. The funding makes targeted investments in training and checkpoint security following recent testing by the DHS Inspector General.
Specifically, the bill includes an additional $13 million for screener training, $24 million for checkpoint support, and $2.5 million for Federal Flight Deck Officer and
Flight Crew Training Program. U.S. Coast Guard (USCG) – The bill contains $10.33 billion for USCG, an increase of $496 million above the FY2015 enacted
level. This funding level, which reflects an increase in operating expenses, will support timely response from fixed-wing Search and Rescue aircraft (i.e., fixed
wing “Bravo-0”), maintaining rotary wing air facilities, and providing incentive pay for hard-to-fill billets. This bill also provides necessary increases for acquisitions,
including funding for the ninth National Security Cutter and continuing activities associated with the Offshore Patrol Cutter, the new icebreaker, and multiple
sustainment offices such as the C-130J and HH-65. United States Secret Service (USSS) – USSS is funded at $1.92 billion, an increase of $258 million above
the FY2015 enacted level. This increase fully supports activities associated with the 2016 campaign, the next former presidential security detail, and restores
grant funding and support to the National Center for Missing and Exploited Children. The bill also fully allocates funding for recommendations from the Protective
Mission Panel, although Congress has deferred funding on the White House Mock Up until plans and final costs can be assessed. National Protection and
Programs Directorate (NPPD) – The bill includes $1.638 billion, an increase of $135 million above the FY2015 enacted level. This funding includes $1.45 billion
in fees for the Federal Protective Service. Cybersecurity efforts, including protection of civilian Federal networks, are fully supported at $820 million, as is $16
million in cybersecurity pay reform. Additional activities are funded, including the Office of Bombing Prevention, priority communications upgrades so that
designated calls can be placed on the most current technology during disasters and emergencies, and an upgrade to the DHS biometric identification system.
Office of Health Affairs (OHA) – The bill contains $123 million for OHA, $6 million below the FY2015 enacted level. Included in the total is $83 million for the
BioWatch program. This amount will sustain current efforts and provide for recapitalization of equipment. Federal Emergency Management Agency (FEMA) –
The bill includes $7.37 billion for the Disaster Relief Fund, of which $6.71 billion is pursuant to the Budget Control Act. The bill also provides $929 million for
FEMA salaries and expenses which includes grants systems modernization, the Emergency Management Assistance Compact, and Urban Search and Rescue
Teams. The bill includes robust support for state and local first responders and emergency management personnel, providing $2.53 billion in the following grant
programs: $467 million for State Homeland Security Grants, including $55 million for Operation Stonegarden; $600 million for Urban Area Security Initiative
grants, including an increase to $25 million for the non-profit set-aside; $100 million each for Port and Transit Security grants; $680 million for Fire and SAFER
grants; $350 million for Emergency Management Performance Grants; and $98 million for the National Domestic Preparedness Consortium. The Committee also
strongly supports hazard mitigation programs. For every $1 invested in mitigation, $4 can be saved in disaster recovery. For that reason, $165 million above the
FY2015 enacted level amounts are recommended in mitigation programs, including $190 million for Flood Hazard Mapping and Risk Assessment Program and
$100 million for the Pre-Disaster Mitigation Grant Program. U.S. Citizenship and Immigration Services (USCIS) – The bill contains $120 million in appropriations
for E-Verify, and requires an analysis of the costs and timeline necessary to make use of the system permanent for employers. The Federal Law Enforcement
Training Center (FLETC) – FLETC is funded at $246 million. This amount includes funds to complete prior year student-level increases while sustaining current
instruction. Science and Technology Directorate (S&T) – The bill includes $765 million for S&T. A total of $39 million is provided to maintain all current Centers of
Excellence within University Programs, an increase of $8 million from the request. The Domestic Nuclear Detection Office (DNDO) – The bill contains $320
million for DNDO, an increase of $12 million above the FY2015 enacted level. It supports continued research and development activities, Securing the Cities, and
additional resources for the purchase of new handheld radiation detection systems. Departmental Management and Operations (DMO) – The bill includes $1.1
billion for DMO, a $39 million increase above the FY2015 enacted level. The increase supports critical information technology security measures, information-sharing enhancements, and a $16 million increase to the Office of the Inspector General to bolster audit and oversight activities. In addition, a general provision
includes $212 million for the DHS headquarters consolidation at St. Elizabeths in Washington, D.C. Oversight and Accountability The measure also includes
extensive direction regarding metrics and performance evaluation, holding the DHS accountable for operational outcomes associated
with the included investments. Not only should this information be available to assist the Committee in its budgeting and allocation decisions, but the
American taxpayers should know what results they are getting for their investment in security. Other oversight measures include:
Requiring the Department to submit reports on biometric exit implementation and visa overstays; Requiring the Department to submit spending plans
and better details in budget justification; Requiring the Department to report conference spending to the Inspector General and limiting the use of funds for certain
conferences; and Requiring the Department to link all contracts that provide award fees to successful acquisition outcomes, and prohibiting funds to pay for
award or incentive fees for contractors with below satisfactory performance.
Bill is popular
RT 15 (RT “Cyber threat-sharing bill clears House committee, would give immunity to companies”, 03/27/15,
http://www.rt.com/usa/244449-house-approves-cybersecurity-bill/)
The House Intelligence Committee unanimously approved its cyber threat data-sharing bill on Thursday. The
measure provides liability protections for companies when sharing cyber attack information with
government agencies. The unanimous vote by the House panel on Thursday for the Protecting Cyber Networks Act marks
the first step in a top bipartisan legislative priority on how to share information when a government
agency or private company undergoes a cyber attack, according to The Hill. “We’re off to a great start,” the Committee’s
ranking member, Adam Schiff (D-Calif.), told reporters after the markup. “I think the prospect for successful passage of
cyber legislation have gone up dramatically.” The bill is one of three working their way through Congress.
Another is being proposed by the House Homeland Security Committee to allow data sharing between the Department of
Homeland Security and the private sector. The Senate Intelligence Committee approved a similar measure in a 14 – 1 vote in early
March, though it came under withering criticism from privacy advocates. It is uncertain for the moment how the two House bills and
the Senate would merge for a conclusive vote, but they share many similarities. All three bills would
authorize liability
protections for companies so they could exchange cyber threat data with government agencies.
Government agencies, as well as the retail and banking sectors, have had repeated cyber attacks exposing
the personal details of millions of Americans. The FBI announced last week that it was working with Premera Blue Cross
regarding 11 million medical records that were hacked during a breach in 2014. There has been opposition in the past to cyber info-sharing, due to concerns that it would enhance the National Security Agency’s surveillance powers. Those fears were raised again
after the House committee moved forward with its bill. "You have pretty much non-existent privacy protections, along with new
powers to spy on and monitor users…all while being provided broad immunity,” Mark Jaycox, a legislative analyst with the Electronic
Frontier Foundation who is following the House and Senate bills, told Wired. “It
creates a perfect storm for sharing personal
information with intelligence agencies.’ The House bill actually includes an amendment that would require companies to
strip out personal data about their customers before submitting the information to a government agency. This amendment is
stronger than the one in the Senate bill, though the bill is weaker in other areas. For example, unlike the Senate bill, the House bill
allows the government to gather data on threats that are not “imminent.” While the House bill states explicitly that the information
cannot be used by the government to highlight individuals for surveillance, groups like the American Civil Liberties Union say that is
not enough – especially if an agency decides to gather data under another name.
CP – HUMINT REFORM (HUMINT ADV CP)
1nc humint reform cp
The United States federal government should reform human intelligence practices as per the
Johnson evidence.
CP solves the HUMINT advantage
Loch Johnson ‘9---- Loch has a PhD, Political Science, University of California at Riverside. He is Regents Professor of Political Science in
the Department of International Affairs at the University of Georgia. Dr Johnson was Special Assistant to the Chair of the Senate Select
Committee and the House Sub-committee on Intelligence Oversight. He was a Visiting Scholar at Yale University in 2005. Johnson edits the Praeger
Security International Series Intelligence and the Quest for Security and is co-editor of the journal Intelligence and National Security. (Loch,
"Evaluating "Humint": The Role of Foreign Agents in U.S. Security", Peer reviewed conference paper. 2/15/9.
http://citation.allacademic.com//meta/p_mla_apa_research_citation/3/1/0/6/6/pages310665/p310665-14.php)//ET
Despite the many negative critiques of humint, former DCI Tenet emphasizes that intelligence is still “primarily a human endeavor.” xxvii He is
obviously not referring to the government’s intelligence budget priorities. Recall that the United States devotes only a small percentage of its annual
intelligence budget to human spying. xxviii Spy machines are costly, while human agents are relatively inexpensive to hire and sustain on an annual
stipend. One of the ironies of American intelligence is that the vast percentage of its spending goes into expensive intelligence
hardware, especially surveillance satellites, even though the value of these machines is questionable in helping the United States understand such
contemporary global concerns as terrorism or China’s economic might. Cameras mounted on satellites or airplanes are unable to peer inside the
canvas tents, mud huts, or mountain caves in Afghanistan or Pakistan where terrorists plan their lethal operations, or into the deep
underground caverns where North Koreans construct atomic weapons. “Space cameras cannot see into factories where missiles are made, or
into the sheds of shipyards,” writes an intelligence expert. “Photographs cannot tell whether stacks of drums outside an assumed chemical-warfare plant contain
nerve gas or oil, or whether they are empty.” xxix As a U.S. intelligence officer has observed, we need “to know what’s inside the building, not what
the building looks like.” xxx Many of the best contributions from spy machines come not so much from pricey satellites as from the far less expensive UAVs.
On occasion, though, sophisticated sigint satellites do capture revealing telephone communications, say, between international drug lords. Moreover, the
photography that imint satellites produce on such matters as Russian and Chinese missile sites, North Korean troop deployments, or the secretive construction of
nuclear reactors in Iran, are of obvious importance. In the case of terrorism, though, one would like to have a human agent well placed within
the Qaeda organization; for America’s security, such an asset could be worth a dozen billion-dollar satellites. The value of techint, though
often exaggerated, cannot be denied. Yet, most intelligence experts agree, clandestine human collection has a place at the table, too. In the
aftermath of 9/11 and the WMD errors in Iraq, both the Kean and the Silberman-Robb Commissions expressed their faith in humint by criticizing the lack of a
sufficient number of assets in key parts of the world; and President George W. Bush authorized a 50 percent increase in the number of operations officers,
leading in 2004 to the largest incoming class of clandestine officers in the CIA’s history. xxxi On the diplomatic front, former DCI Stansfield Turner (1977-81)
has further emphasized the centrality of humint reporting in support of international negotiations involving the United States. “I have seen us
sit down at a negotiating table when we had the other guy’s plan for negotiating in hand, thanks to a humint asset,” he told me. “That’s
pretty useful.” xxxii Another DCI, William E. Colby (1973-76), offered this appraisal of humint: “It’s one of those things you can’t afford to say no to,
because sometimes it can be valuable.” xxxiii He added: “You can go through years with nothing much happening [with regard to CIA’s assets abroad], so then
you cut off the relationship. Since nothing had happened there for ten years, we were in the process of closing the [CIA’s] stations in El Salvador and Portugal—
just before these countries blew up!” His conclusion: “I think you’ll always have some humint, and it’ll pay off. And remember that the human agent is
also available to somehow engage in the manipulation of a foreign government” (the CIA’s covert action mission). The question of their relative merit aside for the
moment, it should be underscored that spies overseas can be enormously difficult to recruit in closed societies like Iran and North Korea, which possess effective
counterespionage and security defenses. Before the United States invaded Iraq the first time in 1990, for example, the CIA had only four human assets in that
closed society. xxxiv Michael Goodman has underscored the problem of acquiring humint about nuclear developments in the Soviet Union during the Cold War.
The West, he notes, was “attempting to gather intelligence on a strictly compartmentalized, highly secret programme within a secure police state.” xxxv Or, as
John Millis observed, humint faces its “greatest challenge going up against xenophobic states with effective counterintelligence services.” xxxvi Spies in nations
like Syria, Pakistan, and Iran today are particularly hard to recruit, because Americans have focused for decades on the communist world and largely ignored the
study of languages, history, and culture necessary to operate spies in the Middle East and Southwest Asia. How many Americans speak Pashto, Arabic, and
Farsi well? How many understand the nuances of slang and various dialects? How many are willing to serve as operational officers for government pay in some
of the world’s most perilous places, trying to recruit local assets? Near the end of the Cold War, I asked one of the most well-known of the former intelligence
directors, Richard Helms (1966-73), what the consequences would be if the United States were to eliminate humint altogether. Deeply agitated by this
prospect, he responded at length: You would eliminate most of the information that comes into the United States government. This idea
that photographic satellites, satellites that pick up electronic emissions, and all the rest of it—all those technical things . . . . they’re Jim-dandy when it comes to
photographing missile installations, listening to missile firings, checking on telemetry, looking at the number of tanks being produced in certain factories—in other
words, ‘bean-counting’ mostly. Great. But once you eliminate the issue of bean-counting, what good do those pictures do you? They’re nice to have, in a
situation like now [1990] in the Persian Gulf. You can count the tanks and so forth that the Iraqis are putting in there. It’s a wonderful device. But it doesn’t tell you
what’s inside [Iraqi leader Saddam] Hussein’s head. It doesn’t tell you what he is going to do. It doesn’t give you the price of oil; it doesn’t give you the price of
gold; it doesn’t tell you what the wheat production is going to be within a given place. Even though you photograph [something] and can make some assessments
from the photographs, that isn’t the final word that you want. In short, the end of the Cold War means that there’s going to be more emphasis on human
intelligence than there was before. xxxvii Former DCI (1991-93), and now Secretary of Defense, Robert M. Gates agrees with the notion that humint
has been highly valuable. Acknowledging the substantial contribution made by techint towards America’s understanding of Soviet strategic weapons, he
recalls nonetheless that “a great deal of what we learned about the technical characteristics of Soviet conventional weapons we learned through humint.” xxxviii
He adds that when it came to fathoming the Kremlin’s intentions, not just its capabilities, humint provided especially important insights. The question of intentions
is particularly significant and a matter that humint can address in ways that are impossible for machines. An asset well placed near a foreign leader might be in a
position to pose the question to him or her: “What will you do if the United States does X?” As former CIA officer Millis has written: “Humint can shake the
intelligence apple from the tree, where other intelligence collection techniques must wait for the apple to fall.” xxxix Yet, humint has limitations that go beyond the
already substantial challenge of how to successfully recruit well-placed agents. Even if this barrier is overcome, assets can be difficult to manage and are
frequently untrustworthy. After all, they are hardly choir boys, but rather the dregs of their society in many cases—individuals driven by greed, absent any moral
compass. In the recruitment of such individuals, the ethos of intelligence officers “would bring a blush to Machiavelli’s cheek,” suggests a close observer of the
Great Game. “Their job in life is to spot useful foreigners and then do what it takes—money, flattery, green cards and so on—to turn them into traitors.” xl The
untrustworthiness of agents is legendary. During the Cuban missile crisis, for example, of about 3,500 humint reports, “only eight in retrospect were considered
as reasonably valid indicators of the deployment of offensive missiles in Cuba.” xli Yet, at the same time, the 0.2 percent who proved correct in their reports
provided the vital geographic coordinates necessary to direct U-2 flights over the missile sites—an “int” synergy leading to the gathering of indispensable imint
about the missile sites. Foreign intelligence assets, then, will sometimes fabricate reports, sell information to the highest bidder, and scheme as false defectors or
double agents. Many of America’s humint recruits have turned out to be double agents who have secretly remained in the ongoing employment of the adversary.
The CIA’s assets in Cuba and in East Germany, for example, were doubled during the Cold War. xlii Unlike techint machines, humint assets can lie, cheat, steal,
and remain loyal to America’s enemies. A recent example of the risks involved in humint was the German agent in Iraq during 2002, Rafid Ahmed Alwan,
prophetically codenamed “Curve Ball.” He managed to convince the German intelligence service that WMDs did exist in Iraq; and the CIA, in turn, took this bait
through its intelligence liaison relationship with the Germans. Only after the war began in Iraq in 2003 did Curveball’s bona fides fall into doubt among German
and CIA intelligence officials; he was, it turned out, a consummate liar. xliii Pointing chiefly to this example, a leading intelligence scholar concludes that
“overwhelming evidence shows that humint can be very misleading.” xliv Certainly one important lesson to be learned in this case is that no U.S. intelligence
agency should rely on a single intelligence source as heavily as occurred with Curve Ball—especially without having full access to the asset and instead relying
on foreign liaison evaluations. Curve Ball aside, even the sharpest-tongued critics of humint are apt to acknowledge that now and then an agent can provide
extraordinarily helpful information, as did the Soviet military intelligence officer Oleg Penkovsky during the Cold War. Information from Col. Penkovsky helped the
United States identify the presence of Soviet nuclear missiles in Cuba in 1962, and also provided the Kennedy Administration with reports that suggested some
top figures in the Kremlin were arguing against a confrontation with the United States during the missile crisis. Aldrich Hazen “Rick” Ames, the most damaging
Soviet asset inside the CIA during the Cold War, provides painful evidence that humint can be valuable—in this case, unfortunately, for Moscow. At least U.S.
counterintelligence officials could find some comfort, perhaps, in Ames’s testimony during his trial for treason in 1994. He stated that the United States had
“effectively penetrated and manipulated the Soviet and Warsaw Pact intelligence services on a massive scale.” xlv With such successes in mind, the United
States and most other countries persevere in their quest for reliable and productive espionage assets, even though the cost-benefit ratio has been poor over the
years. Maybe the next recruit or the next walk-in will be a Penkovsky from the Al Qaeda camp, someone who has managed to steal the terrorist organization’s
next plot against the United States. As Kim Philby, the British intelligence officer who proved all too effective in serving as a KGB asset, concluded: “If one
attempt in fifty is successful [in recruiting a well-placed humint asset], your efforts won’t have been wasted.” xlvi The humint success rate would likely improve if
some of the CIA’s operational officers moved outside of America’s embassies abroad. The editor of Newsweek International has noted that “the best sources
of intelligence on jihadi cells have tended to come from within localities and neighborhoods [that is, from local humint]. This information has
probably been more useful than any we have obtained from waterboarding or sleep deprivation.” xlvii It goes without saying that CIA officers are unlikely to meet
members of Al Qaeda on the diplomatic cocktail circuit; a certain number of case officers need to be out in local society, making contact with the indigenous
population in foreign nations. As former CIA officer Reuel Marc Gerecht stresses, the “principle problem” when it comes to humint is “the inability of case officers
to meet Islamic terrorists . . . .” xlviii The best way to make local contacts, Gerecht and others stipulate, is through the use of NOCs with contacts in the local
community. xlix Synergy is important, too, for effective intelligence collection. The ints work best when they work together—what one former intelligence officer
has called the “Black & Decker approach” that uses every tool in the box. l DCI Woolsey once offered the example of spying against North Korea. “That nation is
so closely guarded that humint becomes indispensable to know what is going on,” he told me. “This humint then tips off sigint possibilities, which in turn may
suggest where best to gather imint. These capabilities, ideally, dovetail with one another.” li Conclusion The collection of intelligence is pursued as a means for
informing the foreign policy and security deliberations of America’s leaders. “Many elements make up a decision,” Secretary of State Dean Rusk has said. “First,
though, one must grapple with the facts. What is the situation?” lii In determining “the situation” overseas, no single “int” is sufficient. Success depends on the
synergism of all the ints working together, just as an engine performs best when each of its cylinders is firing. As a former CIA officer notes, the “‘ints’ can be
teamed to operate jointly.” liii Both survey data and qualitative case studies of collection operations indicate that humint contributes regularly, and sometimes
significantly to this synergism—particularly against certain targets like terrorists, narcotics dealers, and weapons proliferators, as well as events and conditions in
much of the developing world. Humint must be approached warily, however; assets are corruptible and a depressingly high number of foreign recruits have
proven to be doubled. Further, a large percentage have offered inaccurate and sometimes fabricated reports, as the postmortem results on the Cuban missile
crisis illustrate. Nevertheless, humint successes like Col. Penkovsky, Adolf G. Tolkachev (a Soviet aviation specialist liv ), and a host of others—including several
who managed during the Cold War to penetrate the Soviet intelligence services (though seldom the political or military leadership ranks in Moscow lv )—
underscore how crucial the payoff can be from classical espionage. Much can be done to improve humint as practiced by the United States. Even observers
sympathetic to this approach have serious reservations about its current effectiveness. The United States has a “moribund Clandestine Service,” writes one
experienced field officer. lvi The House Permanent Select Committee on Intelligence warned that humint is headed “over a cliff,” as a result of poor
management. lvii A former high-level humint operative in the CIA has complained about the “muddled requirements process” in the government, which leaves
humint managers with an unclear understanding of what the nation’s collection priorities are. “Clarity in tasking—that’s the main problem,” he said to me and
expressed his concern, too, about the lack of adequate cover abroad. lviii Based on the studies I have examined for this essay, as well as my own interviews, an
agenda for humint reform would embrace these initiatives:
- increase the number of operations officers in key parts of the world, especially NOCs;
- develop more cover arrangements overseas, inside and outside U.S. embassies;
- hold more frequent tasking (requirements) meetings between consumers and humint managers, with at least a once-a-year major revision of an administration’s threat assessment priorities;
- encourage more calculated risk-taking by operations officers in the recruitment of potentially important assets, especially in the domain of counterterrorism;
- boost the entrance requirements for operations officers, making this professional career as demanding and prestigious as a Foreign Service career;
- improve language training of operations officers, along with the deep study of the histories and cultures of other societies; lix
- recruit more U.S. citizens with ethnic backgrounds relevant to the strategic locations of the world, such as the Middle East and Southwest Asia, and encourage diversity generally throughout the humint services; lx
- provide easier access to U.S. embassies abroad to encourage walk-ins, relying on perimeter physical searches and metal detectors as means for thwarting terrorist attacks against these facilities;
- reduce the size of the U.S. humint bureaucracy at headquarters, building up a small, nimble clandestine service that focuses on high-priority foreign targets;
- reward operations officers on the basis of the quality of the assets they recruit, not the quantity;
- continue to encourage closer cooperation between the CIA’s Directorate of Intelligence (DI) and the National Clandestine Service—a partnership known as “co-location”;
- continue to encourage greater sharing of humint findings across the intelligence community;
- improve liaison relations with foreign intelligence services allied with the United States in the global struggle against terrorists, drug dealers, and other international criminals;
- provide longer periods of posting for humint officers in other countries, since asset recruitments take time; and
- resort to “global surge” only rarely, building “global presence” capabilities instead.
This is a challenging agenda, but one well worth pursuing if the United States
is determined to prevent another 9/11 catastrophe. The prudent policymaker will continue to seek information from all the collection sources, overt and covert, across the ints, with human intelligence continuing to have a valuable role to play. The motivation: the fewer the missing pieces in the constantly changing
jigsaw puzzle of world affairs, the more likely the puzzle might be solved.