Originally posted by H1ght3chHippie
Let's just tie the biggest ISPs in here for a brief second, let's assume most major networks might serve a second purpose beyond what you use
them for, and let's assume there are algorithms and mechanisms in place...
They were already doing that 20 years ago; you're aware of that, I assume?
Now, that's something very, very different from concerns over what nefariousness might be possible via cookie abuse -- you're referencing "deep
packet inspection," which is a very different animal.
So after monitoring your online behaviour for a couple of years, they have the perfect profile of you...
Okay, there are a couple of issues here; many are related to misconceptions, but there are legitimate concerns.
The most important thing is to visualize the quantity of data and project forward to the plausibility of such a thing. Imagine a system that would be
able to track and quantify all the various IP addresses you use, in real time. Not necessarily impossible, but it would require highly sophisticated
deep-packet inspection by every network you use, engaged in real-time reconciliation and communication back to some central source. Then, consider how
such a system would engage in such real-time reconciliation for every HTTP packet you receive -- just the ATS home page would require more than 100
such packets, and many pages use far more. Then imagine the scale of such a thing as it attempts to recognize, reconcile, track, and record every packet
received by every person using the web in the United States for just one day. We're talking about dozens of petabytes of data being categorized and
exchanged in just one day.
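To put a rough number on that per-page packet count (the ~150 KB page size is an assumed round figure on my part, not a measurement; Python used for the arithmetic):

# Rough packet count for one page load, assuming a ~150 KB page and a
# typical ~1460-byte TCP payload per packet (standard Ethernet MTU)
page_bytes = 150 * 1024
payload_per_packet = 1460
print(page_bytes // payload_per_packet + 1)   # ~106 packets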
And if you want to scale that even further, consider the amount of data resulting from a month, a year, or several years.
And then, imagine how an inept "government" that is unable to keep an Army Private from stealing secure government communications could create and
manage such an unimaginably massive and sophisticated system.
Such a system would require massive bandwidth and well over a hundred exabytes -- more than 100 million terabytes -- of
data storage for 10 years' worth of information. Why... the hard drive maintenance alone would keep an army of IT geeks running in circles.
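As a back-of-envelope check on that storage figure (the 30 PB/day intake is an assumption standing in for "dozens of petabytes"):

# Back-of-envelope storage estimate for a ten-year retention window.
# The 30 PB/day intake is an assumption, not a measured value.
PB = 10**15                      # one petabyte, in bytes
daily_intake = 30 * PB           # "dozens of petabytes" per day
days = 10 * 365                  # ten years of retention

total_bytes = daily_intake * days
print(f"{total_bytes / 10**12:,.0f} TB")    # ~109,500,000 TB
print(f"= {total_bytes / 10**18:.0f} EB")   # ~110 exabytes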
That level of intelligence as a result of widespread deep-packet inspection has received a lot of speculation, but it's not plausible to believe an
inept government that can't keep its law enforcement agencies up to speed with computer technology less than five years old could pull it off.
However, that's not to say that some level of data reconciliation and inspection isn't going on... we know it is, just not on the grand scale
that would be required above.
Based on the tidbits we know, there are three strategies being used:
(1) Deep packet inspection of certain protocols (such as HTTP POST and SMTP) for important keywords, phrases, or destinations (see the sketch below).
(2) Monitoring of interconnected communications (using #1 at times) on certain subjects, some of which may be "seeded" by provocateurs or the purposeful
release of low-level, semi-classified information.
(3) Deep packet inspection and monitoring of specific computers that have been identified as a result of #1 and #2.
This is not only much more plausible, but also much more likely to result in a manageable amount of data on which law-enforcement action can be taken.
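For the curious, here's a minimal sketch of what a strategy #1 keyword filter looks like in principle. The ports, keywords, and tooling (Python with the scapy library) are illustrative assumptions on my part, not anything known about real systems:

# A toy keyword filter over HTTP and SMTP traffic -- illustrative only.
# Ports and keywords are hypothetical placeholders.
from scapy.all import sniff, IP, TCP, Raw

WATCH_PORTS = {80, 25}                              # HTTP, SMTP
KEYWORDS = [b"example-keyword", b"example-phrase"]  # hypothetical

def inspect(pkt):
    # Only consider TCP packets carrying a payload to a watched port
    if not (pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt.haslayer(Raw)):
        return
    if pkt[TCP].dport not in WATCH_PORTS:
        return
    if any(k in pkt[Raw].load for k in KEYWORDS):
        # A real system would flag the host for targeted monitoring
        # (strategy #3); here we just print the source address.
        print(f"hit from {pkt[IP].src}:{pkt[TCP].sport}")

# Capturing requires root privileges; store=0 discards packets after
# inspection instead of buffering them in memory.
sniff(filter="tcp port 80 or tcp port 25", prn=inspect, store=0)

Even this toy version makes the scaling problem obvious: it has to touch every payload byte on the wire, which is exactly why this approach only works when pointed at narrow protocols and targets rather than at everything.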
And on a side note, there's not really a difference between a single webserver and a server farm in terms of transferring data through whatever
backbone or WAN link to other domains.
Except when you start considering the massive scale of the amount of data you originally proposed. Even in our little cluster for ATS, we often flirt
with the upper limits of a 10-gigabit network connection between our database server and web server during spikes of high traffic, and that's just for the
posts and threads on ATS.
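To connect that back to the scale question (again assuming the 30 PB/day figure from the estimate above):

# Sustained bandwidth implied by "dozens of petabytes per day."
# The 30 PB/day figure is the same assumption used above.
daily_bytes = 30 * 10**15
bits_per_second = daily_bytes * 8 / 86_400          # seconds per day
print(f"{bits_per_second / 10**12:.1f} Tb/s")       # ~2.8 Tb/s sustained
print(f"= {bits_per_second / 10**10:.0f} saturated 10 Gb/s links")  # ~278

That's hundreds of links like ours running flat-out around the clock just to move the data, before any of it is inspected, reconciled, or stored.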