The NSA collects and analyzes significant amounts of data from US communications systems in the course of monitoring foreign targets. (Photograph: guardian.co.uk)
With bizarre names, including “EvilOlive,” “ShellTrumpet,” “MoonLightPath,” “Transient Thurible,” and “Spinneret,” a series of internal data-mining programs developed and operated by the National Security Agency indicates that, despite assurances from NSA officials, the massive collection of domestic internet data is not only ongoing but possibly expanding, according to documents reviewed by the Guardian.
In the latest revelations based on leaked documents provided by former NSA contractor Edward Snowden, journalists Glenn Greenwald and Spencer Ackerman reported Thursday that the internal communications “indicate that the amount of internet metadata harvested, viewed, processed and overseen by the Special Source Operations (SSO) directorate inside the NSA is extensive.”
“While there is no reference to any specific program currently collecting purely domestic internet metadata in bulk,” they continue, “it is clear that the agency collects and analyzes significant amounts of data from US communications systems in the course of monitoring foreign targets.”
Despite claims by the Obama administration that bulk “metadata collection” of domestic internet data ceased in 2011, one specific data-mining program, “ShellTrumpet,” was celebrated in the documents for processing “its One Trillionth metadata record” on December 31, 2012.
Greenwald and Ackerman continue:
It is not clear how much of this collection concerns foreigners’ online records and how much concerns those of Americans. Also unclear is the claimed legal authority for this collection.
Explaining that the five-year old program “began as a near-real-time metadata analyzer … for a classic collection system”, the SSO official noted: “In its five year history, numerous other systems from across the Agency have come to use ShellTrumpet’s processing capabilities for performance monitoring” and other tasks, such as “direct email tip alerting.”
Almost half of those trillion pieces of internet metadata were processed in 2012, the document detailed: “though it took five years to get to the one trillion mark, almost half of this volume was processed in this calendar year”.
Another SSO entry, dated February 6, 2013, described ongoing plans to expand metadata collection. A joint surveillance collection operation with an unnamed partner agency yielded a new program “to query metadata” that was “turned on in the Fall 2012”. Two others, called MoonLightPath and Spinneret, “are planned to be added by September 2013.”
Jon Queally is a staff writer for Common Dreams