NSA Metadata Collection Case in Court Today


A case challenging the constitutionality of the National Security Agency’s (NSA) collection of metadata on United States citizens reached an appellate court today, where Larry Klayman, a lawyer and prominent opponent of domestic spying, was grilled by a three-judge panel. At issue is the mass collection of metadata under an NSA surveillance program known as PRISM, which was first revealed to the public by Edward Snowden.

The case is difficult to cover adequately because of its complexity, and much of the news media’s commentary has necessarily truncated its descriptions of the PRISM program due to time and format constraints. This article makes a good-faith effort to describe the case in meaningful terms.

At the heart of the case is whether the NSA is in breach of the Fourth Amendment, which protects people and their property, including writings and other documents (the era’s equivalent of personal information), “against unreasonable searches and seizures.” The amendment also requires a demonstration of probable cause before a warrant may issue, and the warrant in turn must describe the specific scope of the search and seizure. The case came before the U.S. Court of Appeals for the District of Columbia Circuit today, following a ruling last December from Judge Richard Leon, who characterized the NSA’s actions as “Orwellian.”

A slide leaked to The Washington Post, detailing an NSA proposal to mine data from cloud storage.

According to slides leaked by Snowden, an NSA-contracted analyst turned defector, the PRISM program can access the servers of major telecom and internet companies and stores the data it collects in massive facilities. Earlier reports were contradictory as to whether these companies were complicit in providing data, but Snowden pointed out that the distinction matters little: the companies would be required to maintain secrecy if they were involved, and the NSA can extract the data regardless. According to Snowden, an analyst can query this enormous database without even a supervisor’s approval, simply by asserting that the search terms have a 51 percent chance (satisfying “probable”) of relating to terrorism. Snowden has also explained that search results inevitably include voice recordings, email contents, search histories, and other private information of citizens unrelated to the initial query.

The most obvious issue is that it is nearly impossible to argue that every phone call, email, or Google search involving Americans has a “probability” of being related to terrorism, which would be required to justify the initial massive and indiscriminate collection of data. Without probable cause, the Fourth Amendment holds, no warrant should issue. A second major issue is the “meta” scope of the NSA program, which targets not an individual but electronic communication as a whole. The broadness of the search terms seems to preclude the specificity a warrant requires, deepening the murkiness in which the NSA operates. However, legal precedents stemming from the implementation of the Patriot Act and other electronic surveillance measures have altered the requirements for the issuance and use of warrants by treating online information differently from physical documents. Precedent is rarely challenged.

There is a widespread expectation that today’s case will eventually be heard by the United States Supreme Court, which would ultimately determine the fate of the NSA’s collection and use of metadata.

by Brian Whittemore



International Business Times

The Washington Post

Photo by Markus Winkler, Wikimedia Commons

Inset: Public Domain, courtesy of Edward Snowden
