Surveillance Technology: Can they really watch everything?


posted on Jun, 24 2009 @ 04:17 PM
A while back QuietSoul asked this question, wanting to know how much real-world data there was supporting Echelon-like surveillance of telephone and cellphone calls.

Here's the answer.


Just to be clear, Echelon isn't a question. It's a reality. I thought I was ahead of the curve because I was aware of it back in the mid-to-late '90s, when people first started to flip out about Carnivore, but as it turns out it has supposedly been operational since the '70s.

For a good several decades, intelligence agencies were largely restricted to keyword-searching email, faxes (using OCR), and other written communications. Outside of that, a person needed to sit down and manually sift through raw SIGINT. What's somewhat new is the ability to machine-transcribe speech to text. For more on this, read the articles by Suelette Dreyfus, Ph.D., on the 1997 NSA patent, or for a more directed search, dig around for Julian Assange's 2000 AUCRYPTO mail-group postings.

As for your logistical questions, I'll start by noting that the NSA operates the world's largest computer and communication centers and owns the single largest group of supercomputers (1, 2). The scale of the NSA's efforts is hard to determine from unclassified data, but one clue is the electricity usage at Fort Meade. The NSA's budget for electricity exceeds $21 million per year, making it the second-largest electricity consumer in the entire state of Maryland.

Storage Considerations

Back in 1999 the NSA contracted Cluster File Systems to tailor Lustre (lovingly referred to as the "inter-galactic filesystem") to its needs and those of the tri-labs (Los Alamos, Livermore, Sandia). It's safe to assume they're working in exabytes (($220 mil / 2 [dev & IT]) / $50 [1 TB HDD] = 2.2 exabytes!), if not zettabytes (1,000,000,000 1 TB HDDs!), of data at this point (and don't forget they probably also use robotic tape backup solutions). If they have a reliable means of converting speech to text, there's very little reason to keep the raw audio file unless it holds some evidential significance.
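The back-of-envelope arithmetic above can be sketched out explicitly. All the dollar figures here are the assumed numbers from this post (a $220 million hardware budget, half lost to dev & IT overhead, ~$50 per 1 TB drive circa 2009), not anything from an official source:

```python
# Back-of-envelope estimate of storage purchasing power, using the
# hypothetical figures from the post above.
BUDGET_USD = 220_000_000   # assumed hardware budget
DRIVE_SHARE = 0.5          # assume half is eaten by dev & IT costs
COST_PER_TB_USD = 50       # rough 2009 street price of a 1 TB HDD

drives = (BUDGET_USD * DRIVE_SHARE) / COST_PER_TB_USD  # number of 1 TB drives
exabytes = drives / 1_000_000                          # 1 EB = 1,000,000 TB

print(f"{drives:,.0f} drives = {exabytes:.1f} EB")     # 2,200,000 drives = 2.2 EB
```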

There is considerable debate about how much storage would be required to retain every word spoken by every human in all of history. The guesstimates range from 5 exabytes to 400 zettabytes.

The point being, even if the NSA doesn't have the ability to store everything in its raw format, they do have the capacity to retain everything once it's converted to text (give or take compression), or they can transcode digital conversations to a lossier format using something like Sony's DVF codec. The compression ratios for voice-encoded data are amazing: I recorded an hour-long meeting (1:02:09), and using DVF it consumes only 8.62 MB! When confronted with huge blocks of video / binary data, they very likely just hash TCP streams to reduce duplication of commonly recurring blobs. Furthermore, with rapidly increasing hard-drive sizes and speeds (SSDs are getting very affordable), increasing capacity gets a lot easier. Consider that just a year and a half ago the first production 1 TB drive was released, and now companies like Western Digital are manufacturing 2 TB HDDs. At some point space will be so readily available that storage will no longer be the problem; rather, write and read speeds will be the issue.
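The hash-to-deduplicate idea is speculation on my part about how they'd handle recurring blobs, but the technique itself is standard content-addressed storage. A minimal sketch (the payloads are made up):

```python
# Content-addressed deduplication: each payload is stored once, keyed
# by its hash, so a blob seen in a thousand streams costs one copy.
import hashlib

store = {}  # digest -> payload

def put(blob: bytes) -> str:
    digest = hashlib.sha256(blob).hexdigest()
    store.setdefault(digest, blob)  # only the first copy is kept
    return digest

# The same video chunk captured off two different TCP streams...
a = put(b"frame-data" * 1000)
b = put(b"frame-data" * 1000)
assert a == b and len(store) == 1   # ...occupies storage only once
```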

Next, let's consider our ability to passively capture data.

Passive Signal Collection

Back in the '60s, geostationary communications satellites presented opportunities to intercept international communications. The 2001 report to the European Parliament states, "If UKUSA states operate listening stations in the relevant regions of the earth, in principle they can intercept all telephone, fax and data traffic transmitted via such satellites." (source) Unfortunately for the NSA, the role of satellites in voice and data communications has largely been supplanted by fiber optics. As of 2006, 99 percent of the world's long-distance voice and data traffic was carried over optical fiber. (source)

Intercepting data off optical relays is a bit trickier, as it requires splitting light. However, as evidenced during the Bush warrantless-wiretapping scandal, in AT&T's room 641A, the NSA has technology in place to do just that.

This leads to questions about the processing power needed to sift through the huge amounts of data flowing across the backbone. According to a 2002 UC Berkeley study, the world produces 1 to 2 exabytes of information annually, or 2.7 to 5.4 petabytes per day.
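Sanity-checking that conversion from the Berkeley annual figures:

```python
# 1-2 exabytes per year works out to roughly 2.7-5.5 petabytes per day
# (the study rounds the upper bound down to 5.4).
EB, PB = 10**18, 10**15

for yearly_eb in (1, 2):
    per_day_pb = yearly_eb * EB / 365 / PB
    print(f"{yearly_eb} EB/yr = {per_day_pb:.1f} PB/day")
```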

[edit on 24-6-2009 by Xtraeme]

posted on Jun, 24 2009 @ 04:18 PM

Processing Considerations

Are you aware that there are fully functional 110 GHz ASIC chipsets (often referred to as Silicon Packet Processors) already on the market? They're used in Cisco's CRS-1 routers.

Before you get too excited: there's a big difference between a special-purpose ASIC network processor and a microprocessor like the one in your computer. However, the underlying technology, silicon germanium, is finding its way into more generic semiconductor research.

The point being, through specialized equipment (anyone remember the EFF's Deep Crack?) it's not only possible to transmit 40-100 Gbps of data, but to process and act on it!

This is just processing power as it relates to network operations like packet filtering. Another consideration is decryption and the computational requirements to perform cross-references or, in SQL-ese, joins.
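To make the "cross-references, or in SQL-ese, joins" workload concrete, here's a toy illustration using SQLite. All the numbers and table names are hypothetical, invented for the example:

```python
# A join: matching intercepted call records against a watch list.
# At intelligence scale the same operation spans billions of rows.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE calls (caller TEXT, callee TEXT, ts INTEGER);
CREATE TABLE watchlist (number TEXT PRIMARY KEY);
INSERT INTO calls VALUES ('555-0101', '555-0202', 1),
                         ('555-0303', '555-0404', 2);
INSERT INTO watchlist VALUES ('555-0202');
""")
hits = db.execute("""
    SELECT c.caller, c.callee
    FROM calls c JOIN watchlist w ON c.callee = w.number
""").fetchall()
print(hits)  # [('555-0101', '555-0202')]
```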

To address this issue the NSA is betting its money on quantum (petaflop-scale!) computing. What blows my mind is that as of Sep. 2007 a private company, D-Wave, has publicly demonstrated an operational (but still somewhat basic, in the sense that it's not designed to execute Shor's algorithm, the holy grail for the NSA) 28-qubit quantum computer.

This must have been somewhat to the NSA's chagrin, as it spent $60 million researching quantum-computing semiconductor fabrication in 2007, only to have a private company beat it (or so we're told) to the punch. As of 2008 the NSA, NSF, & DOE have been funding research efforts at Lawrence Berkeley National Laboratory. (1, 2)

Once this research pans out, cracking modern-day encryption will be a joke.

In the meantime they have to make do with FROSTBURG, aka "The Thinking Machine" (ca. 1991-1997, operating @ 65.5 gigaflops with ~2 TB of physical memory), the newer Cray XT Jaguar (operating @ 1.64 petaflops, potentially coupled with a 2.5 TB SSD RAM disk), and their distributed network of lesser machines, probably configured to work together using something like Beowulf, openMosix, or BOINC.
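The Beowulf/BOINC point is that keyword scanning parallelizes trivially, since each message can be checked independently. A minimal sketch, where a process pool stands in for worker nodes and the intercepts and keywords are made up:

```python
# Embarrassingly parallel keyword scanning: farm independent messages
# out to workers; here a local process pool plays the cluster.
from multiprocessing import Pool

KEYWORDS = {"echelon", "frostburg"}

def scan(message: str) -> bool:
    return any(k in message.lower() for k in KEYWORDS)

if __name__ == "__main__":
    intercepts = ["routine chatter", "ECHELON tasking update", "weather"]
    with Pool(4) as pool:
        flags = pool.map(scan, intercepts)
    print([m for m, hit in zip(intercepts, flags) if hit])
```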

Cooperative data-sharing

Most of the questions asked by QuietSoul relate to the feasibility of centralized voice-processing and whether or not all data can be sniffed and acted on instantaneously.

First, I'd like to point out that there's no reason this can't be a distributed task. When the FBI wanted to monitor email via Carnivore, it required all ISPs and telecommunication providers to install the DCS 1000 hardware at the central office (CO). In the case of the Bush/NSA warrantless wiretapping it was the same thing: the DHS/NSA approached AT&T, compelled it to help, and then, armed with a Patriot Act-authorized National Security Letter, prevented AT&T from divulging the operation.

Once field-deployed equipment breaks the signal intelligence down into manageable pieces, higher-priority messages are bubbled up to the NSA's database at Fort Meade, or to temporary data warehouses or colocation centers, for additional processing or storage. I imagine lower-priority data probably sits on the telecommunication companies' networks for upwards of several weeks or months before it's eventually overwritten.

Also realize that phone conversations are largely digitized at this point.


Basically, the NSA, in collusion with other countries and US agencies, can capture just about all signal intelligence. Total Information Awareness, if you will. What they can't necessarily do is cross-reference everything, other than sorting by aggregate data (like IP / geographic region / behavioral patterns).

The scary part is it will only get easier with better hardware. The only things helping privacy are the increasing volume of transmissions and relatively strong public encryption. However, if my predictions are correct, human signal growth is logarithmic. Which is to say, once quantum computing becomes a reality, if quantum encryption isn't as strong as we're led to believe (*cough* *cough*), then we'd better pray our laws are enough to prevent others from invading our privacy.

[edit on 24-6-2009 by Xtraeme]

posted on Jun, 24 2009 @ 04:50 PM
reply to post by Xtraeme

Great info. I must take a proper look at it.

Here's some light reading for you (lol). It's DARPA's Strategic Plan 2009, and it goes into things like quantum computing and lots of other research stuff.
E.g., look at p. 46:

3.9.1. Quantum Science and Technology

Until recently, quantum effects in electronic devices have not exhibited overriding significance. However, quantum effects not only have to be taken into account, but can dominate how devices perform as they shrink to atomic dimensions. DARPA is sponsoring research aimed at technology exploiting quantum effects to achieve revolutionary new capabilities.

DARPA’s Quantum Entanglement Science and Technology (QuEST) program is creating new quantum information science technologies, focusing on loss of information due to quantum decoherence, limited communication distance due to signal attenuation, protocols, and larger numbers of quantum bits (Qubits) and their entanglement.

Key among the program’s challenges is integrating improved single- and entangled-photon and electron sources and detectors into quantum computation and communication networks. Defense applications include highly secure communications, algorithms for optimization in logistics, highly precise measurements of time and position on the earth and in space, and new image and signal processing methods for target tracking.

[edit on 24-6-2009 by eniac]


posted on Jun, 24 2009 @ 05:41 PM
Very interesting indeed. You put a lot of effort into this. It will be interesting to see what they will be able to do in ten years time.

posted on Jun, 24 2009 @ 06:47 PM
reply to post by eniac

Quantum computing / communication has been making huge leaps recently.

I'm particularly excited by the idea of quantum networking. It's amazing to think we already have technology that can instantly transport data through free space at ~90 miles.

posted on Jun, 24 2009 @ 07:16 PM
Yeah, well, I have nothing to hide, just as long as I'm not caught up in some frame job by someone putting lying junk on my computer. And don't think that they cannot do that.

posted on Jun, 24 2009 @ 10:35 PM

Originally posted by pudgeego
Yeah, well, I have nothing to hide, just as long as I'm not caught up in some frame job by someone putting lying junk on my computer. And don't think that they cannot do that.

Well, think of it this way. If I wanted to, I could fake your IP and make it appear as though you're participating in a torrent dealing with some form of illegal content, without requiring access to your computer. I could also transmit data to a particular source, forging the IP, making it appear as though you were sending something salacious or illegal to a third party.

This is one of the reasons why I view the RIAA's lawsuits as resting on such a tenuous legal argument. Even if they do manage to find a large repository of music on a person's computer, it doesn't necessarily mean the person was doing anything illegal. It's quite common for people running botnets to offload vast amounts of data onto the nodes of their compromised computers.

It's really quite easy to manipulate people into voluntarily compromising their own security. Furthermore, there are exploits that aren't known to the general white-hat security industry, and black-hats do use these for their own personal benefit.

posted on Jun, 25 2009 @ 04:55 PM

Originally posted by xman_in_blackx
Very interesting indeed. You put a lot of effort into this. It will be interesting to see what they will be able to do in ten years time.

The march of technology is leaning more and more in favor of the concept of the semantic web. I've been a very big proponent of the idea of mapping natural language to database concepts. Borrowing a stub from the SMW project's introduction page,

Wikis have become a great tool for collecting and sharing knowledge in communities. This knowledge is mostly contained within texts and multimedia files, and is thus easily accessible for human readers. But though wikis are very good for storing and retrieving individual facts, they are less useful for getting queried or aggregated information. As a simple example, consider the following question:

«What are the hundred world-largest cities with a female mayor?»

Wikipedia should be able to provide the answer: it contains all large cities, their mayors, and articles about the mayor that tell us about their gender. Yet the question is almost impossible to answer for a human, since one would have to read all articles about all large cities first! Even if the answer is found, it might not remain valid for very long. Computers can deal with large datasets much easier, yet they are not able to support us very much when seeking answers from a wiki: Even sophisticated programs cannot yet read and «understand» human-language texts unless the topic and language of the text is very restricted. The wiki's keyword search does not help either.

These sorts of concepts are being greatly expanded on in the realm of user-relational database file systems and other similar areas. Given ten years, data relations will be much better understood and utilized in technology.
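The «female mayor» question from the quote becomes trivial once the facts live in a structured store rather than in prose. A toy illustration with SQLite, where every city, name, and population is invented for the example:

```python
# The SMW-style query: largest cities with a female mayor, answered
# in one statement once the facts are stored as relations.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE cities (name TEXT, population INTEGER, mayor TEXT);
CREATE TABLE people (name TEXT PRIMARY KEY, gender TEXT);
INSERT INTO cities VALUES ('Metroville', 9000000, 'A. Jones'),
                          ('Smalltown',  8000000, 'B. Smith');
INSERT INTO people VALUES ('A. Jones', 'female'), ('B. Smith', 'male');
""")
rows = db.execute("""
    SELECT c.name FROM cities c
    JOIN people p ON c.mayor = p.name
    WHERE p.gender = 'female'
    ORDER BY c.population DESC LIMIT 100
""").fetchall()
print(rows)  # [('Metroville',)]
```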

As evidence of this just look at the news today from IBM.

An IBM researcher has solved a thorny mathematical problem that has confounded scientists since the invention of public-key encryption several decades ago. The breakthrough, called 'privacy homomorphism,' or 'fully homomorphic encryption,' makes possible the deep and unlimited analysis of encrypted information — data that has been intentionally scrambled — without sacrificing confidentiality.


…might better enable a cloud computing vendor to perform computations on clients' data at their request, such as analyzing sales patterns, without exposing the original data. Other potential applications include enabling filters to identify spam, even in encrypted email, or protecting information contained in electronic medical records.
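Gentry's fully homomorphic scheme itself is far too involved to show here, but the core property it generalizes can be demonstrated with a much older observation: textbook (unpadded) RSA is multiplicatively homomorphic, so a server can multiply two ciphertexts without ever seeing the plaintexts. A toy, deliberately insecure sketch with a tiny keypair:

```python
# Textbook RSA with p=61, q=53: n=3233, e=17, d=2753.
# Insecure demo values only; real RSA needs padding and large primes.
n, e, d = 3233, 17, 2753

enc = lambda m: pow(m, e, n)
dec = lambda c: pow(c, d, n)

c1, c2 = enc(6), enc(7)
product_ct = (c1 * c2) % n        # computed on ciphertexts alone
assert dec(product_ct) == 6 * 7   # decrypts to 42
```

Fully homomorphic encryption extends this from a single operation to arbitrary computations, which is what makes the "analyze data you can't read" applications in the quote possible.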

I wouldn't be surprised if, ten years from now, the NSA has the ability to infer through computational models people's criminal tendencies, areas of interest, personal / professional / friendly relationships, psychological make-up, and many other hard-to-graph concepts that usually require some form of manual leg-work to evaluate.

posted on Jul, 5 2010 @ 11:59 AM
On Physorg today there's an article titled "Could some entangled states be useless for quantum cryptography?",

One of the widely accepted properties of quantum entanglement is secrecy. Since scientists and researchers began working with quantum key distribution, entanglement has been considered an essential part of keeping communications private. What if entanglement didn't always mean secrecy, though? New work is shedding light on the nature of entanglement and quantum key distribution - and possibly proving that a high degree of entanglement does not necessarily lead to complete secrecy.

This just goes to reinforce my initial observation that, as quantum cryptography becomes more of a reality, it's not going to be nearly as strong as we were originally led to believe.

[edit on 5-7-2010 by Xtraeme]

posted on Aug, 29 2010 @ 04:54 PM
reply to post by Xtraeme

Another article continuing the discussion of weaknesses found in quantum crypto can be found here:

The Norwegian University of Science and Technology (NTNU) and the University of Erlangen-Nurnberg together with the Max Planck Institute for the Science of Light in Erlangen have recently developed and tested a technique exploiting imperfections in quantum cryptography systems to implement an attack...

It's a fascinating read.

posted on Aug, 29 2010 @ 05:37 PM
You're too focused on the headquarters in Maryland.

Several NSA data centers are being built or are fully functional in Georgia, Texas, and Hawaii. Serious money is being spent and hardware installed at these facilities.

TIA, in theory, was considered illegal. The problem is, the agency is building facilities as if it will be legal in the future.

posted on Aug, 29 2010 @ 07:54 PM
reply to post by hinky

I will have to disagree with you. The OP, using information from the largest NSA data center, has shown his thesis to be entirely possible. Why even go into details of other NSA buildings that may or may not be used for the same purpose as the Maryland one?

I thoroughly enjoyed the analysis provided by the OP and find it disheartening that this all goes on as if it were legal to do. The billions (or trillions?) being spent on intercepting and recording every sort of communication could be used for so many other things.

posted on Aug, 29 2010 @ 08:25 PM
You missed my point. The opening is fairly accurate, but I'd question the largest data center being in Maryland. There is a huge data center going in at Lackland AFB.

Let's not focus too much on the head while the body works elsewhere.

posted on May, 13 2017 @ 05:21 PM
a reply to: Xtraeme

Surveillance Technology: Can they really watch everything?
For more on this read Suelette Dreyfus's, Ph.D, articles on the 1997 NSA patent or for a more directed search dig around for Julian Assange's 2000 AUCRYPTO mail-group postings.

"Can they really watch everything?"

Very nearly, within a year or two, yes.

It's impressive that this was written well before anyone knew much of anything about the IC's mass surveillance programs. Julian Assange wasn't even on the radar yet.

edit: found what was needed
edit on 13-5-2017 by MMMCCCI because: removed
