posted on Mar, 3 2003 @ 08:46 AM
Well, I'll tell people to be really careful searching
the internet, and using the net, wireless LAN, satellite,
or anything related to a communication medium,
because they have everything being checked every day.
They look for people who talk about them, who know about them, because we are the most dangerous
ones in this society; people who are asleep won't be a problem for them...

So I just want to show everybody that we are not safe anywhere, and they can spy at any time; the plan is almost complete. There is simply no way to use a communication system without the possibility of being listened to. So as you can see, they created "communication" in order to be able to control their creation. For example, if I were the elitist at the top, I would put a limit on the knowledge available to the public. The same
goes for the internet: it was created with two limits, and the truth about them lies outside those limits, so don't think we will find the truth on the net... at least form your own idea of it...

Having access to your messages, your mails, your conversations, your opinions and ideas, makes it easier for them to control with impunity...
And you will never surprise them...

TCP/IP History

The architecture of TCP/IP is often called the Internet architecture because TCP/IP and the Internet are so closely interwoven. The Internet standards were developed by the Defense Advanced Research Projects Agency (DARPA) and eventually passed on to the Internet Society.

The Internet was originally proposed by the precursor of DARPA, called the Advanced Research Projects Agency (ARPA), as a method of testing the viability of packet-switching networks. (When ARPA's focus became military in nature, the name was changed.) During its tenure with the project, ARPA foresaw a network of leased lines connected by switching nodes. The network was called ARPANET, and the switching nodes were called Internet Message Processors, or IMPs.

The ARPANET was initially to consist of four IMPs located at the University of California at Los Angeles, the University of California at Santa Barbara, the Stanford Research Institute, and the University of Utah. The original IMPs were to be Honeywell 316 minicomputers.

The contract for the installation of the network was won by Bolt, Beranek, and Newman (BBN), a company that had a strong influence on the development of the network in the following years. The contract was awarded in late 1968, followed by testing and refinement over the next five years.


Bolt, Beranek, and Newman (BBN) made many suggestions for the improvement of the Internet and the development of TCP/IP, for which their names are often associated with the protocol.


In 1971, ARPANET entered into regular service. Machines used the ARPANET by connecting to an IMP using the "1822" protocol, so called because that was the number of the technical paper describing the system. During the early years, the purpose and utility of the network was widely (and sometimes heatedly) discussed, leading to refinements and modifications as users requested more functionality from the system.

A commonly recognized need was the capability to transfer files from one machine to another, as well as the capability to support remote logins. Remote logins would enable a user in Santa Barbara to connect to a machine in Los Angeles over the network and function as though he or she were in front of the UCLA machine. The protocol then in use on the network wasn't capable of handling these new functionality requests, so new protocols were continually developed, refined, and tested.

Remote login and remote file transfer were finally implemented in a protocol called the Network Control Program (NCP). Later, electronic mail was added through File Transfer Protocol (FTP). Together with NCP's remote logins and file transfer, this formed the basic services for ARPANET.

By 1973, it was clear that NCP was unable to handle the volume of traffic and the new functionality being proposed. A project was begun to develop a new protocol. The TCP/IP and gateway architectures were first proposed in 1974, in a published article by Cerf and Kahn that described a system providing a standardized application protocol that also used end-to-end acknowledgments.

Neither of these concepts was really novel at the time but, more importantly (and with considerable vision), Cerf and Kahn suggested that the new protocol be independent of the underlying network and computer hardware. They also proposed universal connectivity throughout the network. These two ideas were radical in a world of proprietary hardware and software, because they would enable any kind of platform to participate in the network. The protocol was developed and became known as TCP/IP.

A series of RFCs (Requests for Comment, part of the process for adopting new Internet Standards) was issued in 1981, standardizing TCP/IP version 4 for the ARPANET. In 1982, TCP/IP supplanted NCP as the dominant protocol of the growing network, which was now connecting machines across the continent. It is estimated that a new computer was connected to ARPANET every 20 days during its first decade. (That might not seem like much compared to the current estimate of the Internet's size doubling every year, but in the early 1980s it was a phenomenal growth rate.)

During the development of ARPANET, it became obvious that nonmilitary researchers could use the network to their advantage, enabling faster communication of ideas as well as faster physical data transfer. A proposal to the National Science Foundation led to funding for the Computer Science Network in 1981, joining the military with educational and research institutes to refine the network. This led to the splitting of the network into two different networks in 1984: MILNET was dedicated to unclassified military traffic, whereas ARPANET was left for research and other nonmilitary purposes.

The beginning of ARPANET's demise came with the approval for the Office of Advanced Scientific Computing to develop wide access to supercomputers. It created NSFNET to connect six supercomputers spread across the country through T-1 lines (which operated at 1.544 Mbps). The Department of Defense finally declared ARPANET obsolete in 1990, when it was officially dismantled.

[Edited on 25-6-2003 by CoLD aNGeR]

posted on Mar, 3 2003 @ 09:06 AM
Now you know what DARPA is...

By the way, it says at the bottom:
Science is Power

posted on Mar, 3 2003 @ 09:51 AM

Originally posted by CoLD aNGeR

By the way, it says at the bottom:
Science is Power

"Scientia Est Potentia" means "Knowledge is Power."

posted on Mar, 3 2003 @ 09:51 AM
Pentagon database to spy on Americans

The US government is spending millions of dollars on databases designed to weed out terrorist threats
The US Defence Department has awarded millions of dollars to more than two-dozen research projects that involve a controversial data-mining project aimed at compiling electronic dossiers on Americans.

Nearly 200 corporations and universities submitted proposals to the Defence Advanced Research Projects Agency, according to government documents brought to light by a privacy group on Thursday.

John Poindexter, who oversees the agency's Total Information Awareness (TIA) program, approved 26 of them last autumn, including grants to the University of Southern California, the Palo Alto Research Centre, and defence contractor Science Applications International.

Over the last few months, TIA has become a lightning rod for criticism, with Republican and Democratic legislators speaking out against it on privacy and security grounds. On 20 February, as part of a large spending bill for the federal government, Congress approved additional scrutiny of research and development on the TIA project.

Those restrictions do not halt TIA research. They would permit the 26 grants to be fully funded if Poindexter sends Congress a "schedule for proposed research and development" that includes a privacy evaluation, or if President George Bush certifies that TIA is necessary for national security.

The Electronic Privacy Information Centre (EPIC), a civil liberties group in Washington, won a court order forcing Poindexter to disclose approximately 180 pages of documents under the Freedom of Information Act.

The documents show that funds for TIA and two related information-analysing projects, Genisys and Genoa II, have been awarded to companies including CyCorp of Austin, Texas, for a "Terrorism knowledge base," 21st Century Technologies of Austin, Texas, for "AUDIT: Automated Detection, Identification, and Tracking of Deceptive Terrorist Activity," and Evolving Logic of Topanga, California for "Confronting Surprise: Robust Adaptive Planning for Effective Total Information Awareness."

University recipients include the University of Southern California, for its "JIST: Just-In-caSe just-in-Time Intelligence Analysis" system, and Colorado State University.

Columbia University applied for, but did not receive, a grant for its proposal for "Behavior-Based Profiling of User Accounts to Detect Malicious, Errant, and Fraudulent Internet Activity."

"It shows the breadth and impact of this program, which its defenders have tried to describe as being on the drawing board or in the research stages," said EPIC director Marc Rotenberg. "The activity is extraordinary. This is a Defence-funded project for domestic surveillance, and it's very important not to lose sight of that."

It's unclear how much money is being awarded to the grant recipients. A Defence Department notice suggested that the "annual budget for each is in the $200,000 to $1m (£126,428 to £632,143) range."

A DARPA representative on Thursday declined to provide additional details about funding or on what the projects entail, saying the proposals would not be made public. "The companies put private information in their proposals," the representative said.

A representative of Veridian, of Arlington, Virginia refused to disclose details. The information-system provider won an unspecified amount for a grant proposal titled, "Human augmentation of reasoning through patterning."

"This is an area that I need to refer you directly to DARPA," the Veridian representative said. "They're the customer, and they've asked us to refer calls to them."

posted on Mar, 3 2003 @ 10:17 AM
High Priests of the Technocracy: The Information Awareness Office :.

Since the end of the Cold War, governments around the world have been increasingly viewing their own states' populations as threats to national security. What we are now witnessing is a struggle for control of ideas. The battlespace is the Internet.

In current national security parlance, terrorists, individuals, non-governmental organizations and other actors are considered asymmetric threats.

More: Link

May I tell you, in a friendly way, that your posts are too long?
A post is different from a woman. With a post, size doesn't matter.

P.S.: For "them", we are all an asymmetric threat.

posted on Mar, 3 2003 @ 10:21 AM

Originally posted by ultra_phoenix
control of ideas. The battlespace is the Internet.

May I tell you, in a friendly way, that your posts are too long?
A post is different from a woman. With a post, size doesn't matter.

P.S.: For "them", we are all an asymmetric threat.

LOL, okay, what I'll do next time is post just the link, but people don't click links much; it's easier to open the forum and read. But you're also right that it keeps this forum from filling up so fast...

posted on Mar, 3 2003 @ 10:27 AM
"so dont think we will find the truth on the net.... "

How do you know that a "truth" exists if it is impossible to find it on the net? How is it impossible that, of all of the people on this messageboard, not ONE could POSSIBLY post something that comes close to the "truth"? What do you think "they" will do if somebody posts the "truth"?

posted on Mar, 3 2003 @ 10:47 AM

Originally posted by CoLD aNGeR
Pentagon database to spy on Americans

The US government is spending millions of dollars on databases designed to weed out terrorist threats

If you read more current news articles on this topic, you will discover this has been killed in Congress. The concept was killed from two perspectives, budget and privacy.

posted on Mar, 3 2003 @ 10:59 AM
Some experience as a sysadmin/netgeek might change your perspective here.

It is impossible to completely monitor internet traffic... that would be like you sitting in a room with 200 televisions, all of them tuned to different channels, trying to pick out certain code words with them babbling at you constantly. Furthermore, if and when something becomes available, the netgeeks and webgeeks would be creating underground pathways and channels that could not be monitored.

Computers are complex things and unfortunately many of the people writing articles about computer issues for magazines and newspapers know basically how to turn their machines on and how to find email and word processors and nothing else. They wouldn't know a router from a mail daemon and couldn't even do a simple telnet HELO on a mailserver.

posted on Mar, 3 2003 @ 11:07 AM

Originally posted by CoLD aNGeR
Well, I'll tell people to be really careful searching
the internet, and using the net, wireless LAN, satellite,
or anything related to a communication medium,
because they have everything being checked every day.

Mr. Cold Anger...

I understand and appreciate your fear; however, in reality, there is very little to fear about the Internet being used to spy on you and others like you.

The sheer scale of the issues is astounding:

150 million Google searches every day.
Approximately 4.8 billion e-mails every day.

Without considering website access at all, let's examine what it would take to simply track these two slices of overall Internet activity.

The average Google search results page is about 20 kilobytes of data. If all we track are the searches, who searched, and what was clicked, we might be able to keep each search at only 20k of data: the result is 3 terabytes of data each day.

The average text e-mail is about 8k with all header and routing data intact: that results in 38.4 terabytes of data each day.

So, to keep track of one month's worth of e-mail and Google searches, we will need about 1,242 terabytes of storage space.

This is a 4.4 terabyte data server from Gorilla Systems, a reliable builder of mid-range server products.

For one month of data, you'll need 283 of these. At a cost of $17,000 each, that's nearly $5 million just to store the data.
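As a sanity check, the arithmetic above can be reproduced in a few lines. A back-of-envelope sketch, assuming decimal terabytes, a 30-day month, and the per-item sizes, server capacity, and server price quoted in this post (none of which are verified figures):

```python
import math

# Figures as quoted in the post (assumptions, not verified data):
# 150M Google searches/day at ~20 KB each, 4.8B e-mails/day at ~8 KB each,
# 4.4 TB servers at $17,000 apiece, and a 30-day month.
KB_PER_TB = 1_000_000_000  # decimal terabytes: 1 TB = 10^9 KB

searches_tb_per_day = 150_000_000 * 20 / KB_PER_TB   # 3.0 TB/day
email_tb_per_day = 4_800_000_000 * 8 / KB_PER_TB     # 38.4 TB/day

tb_per_month = (searches_tb_per_day + email_tb_per_day) * 30

servers = math.ceil(tb_per_month / 4.4)  # round up: partial servers don't exist
cost = servers * 17_000

print(f"{tb_per_month:.1f} TB per month")  # 1242.0 TB per month
print(f"{servers} servers, ${cost:,}")     # 283 servers, $4,811,000
```

Even under these generous assumptions (and ignoring web traffic entirely), the storage bill alone runs to millions of dollars per month.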

I think you can see where this is going.

The scale of such a project makes the reality absurd.

posted on Mar, 3 2003 @ 12:19 PM
Cold Anger, it seems that you haven't yet realized that the internet connects the whole world (minus a few *very* restrictive countries)...There's no way that the US Intelligence Agencies can cut off the rest of the world without cutting their own links for intelligence.

Merely the cost of equipment & manpower required to be able to simply *monitor* everything is prohibitive...There's only so much they *can* do; Even if they had the "legal justification" to cover *everything* in the world, they simply *can't* do it. The internet is simply too big & widespread.

posted on Mar, 4 2003 @ 04:38 AM

posted on Mar, 4 2003 @ 05:18 AM
Sorry about the above

Well, I'll tell you guys, it may be much easier than you think to monitor the internet. I've worked in IT
for six years now; I currently work at a company that
provides IT support for other companies, and we have
an intranet here, plus a proxy for outside access, etc. But if someone
sends a mail with certain keywords, or uses Novell messaging on the intranet and uses words like "joint"
or "weed" or "porno", it gets flagged and redirected to the mailboxes of management and second level. I know this
because I am second level, but agents are not allowed to know about that keyword alert.
My company has 3,000 people working here, and it takes seconds to get the ALERT MAIL.
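The kind of keyword alert described here can be sketched in a few lines. This is a minimal illustration of the idea, not the actual corporate system; the keyword list, the management address, and the `check_message` function are all hypothetical examples:

```python
# Minimal sketch of a keyword-based mail alert (hypothetical example).
FLAGGED_KEYWORDS = {"joint", "weed", "porno"}
ALERT_MAILBOX = "management@example.com"  # hypothetical address

def check_message(sender: str, body: str) -> list[str]:
    """Return one alert notice per flagged keyword found in the body."""
    # Normalize words: strip trailing punctuation, lowercase everything.
    words = {w.strip(".,!?").lower() for w in body.split()}
    hits = sorted(FLAGGED_KEYWORDS & words)
    return [f"ALERT to {ALERT_MAILBOX}: {sender} used '{kw}'" for kw in hits]

alerts = check_message("agent42", "Anyone selling weed after work?")
print(alerts)  # ["ALERT to management@example.com: agent42 used 'weed'"]
```

A real mail system would hook a filter like this into the delivery pipeline so that flagged messages are silently copied to the alert mailbox; the point is only that per-message keyword matching is trivial when you control the mail server.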
Also, because of cookies, IPs, and so on, we have procedures to follow, and believe me, if I want to know where an agent has been on the internet today, I can show him every single site he visited; Norton helps with that too. The registry that no one understands in regedit on Windows XP
keeps more than you think; it's not just the registry keys your OS needs to run.
Do you know about copyright? You should check the systems they have to catch people burning or illegally copying copyrighted material.
Actually, almost all the information we are getting about conspiracies is on the net, and almost all of it is searched through Google. Google is not just a simple server or a simple search engine; it's the fastest and biggest on the net, as you say, with millions of users searching for things.
Above I showed how TCP/IP was created, and also POP3 to allow mail, FTP (the File Transfer Protocol), and all of that. There was a computer registering every log file needed; because of remote login, people had to have STATIC FILES. That was the testing time. And think about another thing: the same kind of tool I have in my company is used in the Pentagon (yes, this little tool of theirs).
With special characters, letters, and keywords they will get the who, when, how, and what. Now, with communications also depending on satellites, we have even less privacy in that regard: people can be spied on via satellite, via phone, via mail (even 1024-bit encrypted), via mobile phone...
Look: "Cold Anger, it seems that you haven't yet realized that the internet connects the whole world (minus a few *very* restrictive countries)...There's no way that the US Intelligence Agencies can cut off the rest of the world without cutting their own links for intelligence."
If they've created the system to connect the world, they can perfectly well and easily do the rest. I actually already know
about internet restrictions in some countries, especially
Arab ones. If you don't believe me, check for yourselves: send conspiracy-related links to friends all over the world. Some get "no permission", or "page encrypted", or "not authorized", or "page cannot be displayed", while for me it works. That means they make sure not everyone gets the same information to believe in. The truth is harder to believe than science fiction, and of course it is not public. Just be careful, I mean it...

I don't say EVERYTHING is being checked, but imagine
that you are one of the rulers of the world. You create a system of communications so people can communicate with each other; would you let what you do, know, and plan appear on one of these systems? NO. Obviously
I would first make sure that the two limits of this source of information have been set, and that no one is able to get more than that: the public, media opinion, encyclopedias, internet news, newspapers, radio, TV...
I would make sure that keywords searched on MY MAIN SERVERS THAT I HAVE CREATED would show me the people searching
for secret societies, or brotherhoods, or esoteric theories, or aliens, or conspiracies, etc., because those are the only ones
who really know that I am there and who will fight me. So I don't even need to fight them or get rid of them, just drive them
crazy over "what to believe in". They really play hard with our psychology, seriously, it's not a joke. Look how they create
revolutions, concepts of freedom, liberty, rights, etc. They are really playing at shaping puppets for the final time.

Just take care, and start to think about this film, because
we are just actors in their big movie, and we have to know that everything in this society has been created for a purpose... (not a good one for us, of course)

posted on Mar, 4 2003 @ 07:14 AM

Originally posted by CoLD aNGeR

Well, I'll tell you guys, it may be much easier than you think to monitor the internet. I've worked in IT
for six years now;

I'm sorry, Mr. Cold Anger, but your IT experience does not express itself in the nature of your opinions.

The systems in your company can indeed parse through the inbound and outbound e-mail to look for certain words, because the e-mail is being managed by the same system.

The TCP/IP nature of Internet data communications requires that any technology attempting to monitor it must reassemble the data packets before any reliable parsing of keywords is possible. It's not possible to simply "sniff traffic". Some have tried; all have failed.

According to public data published by Sprint, there are over 10 billion active packets on their backbone, at any given moment. This is just one service provider.

No sir, I'm sorry. But the only way to monitor traffic is the situation you're currently experiencing -- dedicated hardware, monitoring specified connections. Even the most advanced equipment of the NSA/CIA is limited to this.

posted on Mar, 4 2003 @ 07:23 AM
Winston has very clearly put the main "technology" counter-argument, but the sheer logistics referred to elsewhere are even more convincing. No doubt every letter posted in the US could be opened, read, and re-sealed in such a way that scarcely anyone would know, except the many millions of staff required.
Government control is essentially limited to shutting the whole thing down, or to highly selective hacking based on publicly available data (or spying): the one effective but crude and commercially (let alone legally) unacceptable; the other too hit-and-miss, as recent events have all too painfully shown us.
Deliberate mega-spamming is not unknown, either.

posted on Mar, 4 2003 @ 07:53 AM
We are not talking about monitoring everything; we are talking about redirection. When you run a command that requires an ID on the net, you are sending information somewhere and you will get packets back as an answer. It all works
with ASCII (the American Standard Code for Information Interchange, used to map bits to characters) and the binary system, right? So every flow
of packets sent between locations is transferring data.
We are not talking about the transfer itself; we are saying it is redirected to an unknown place depending on the content of the mail or file. Since it uses the binary system, and that is represented in ASCII code as letters and numbers, it just depends on the machine doing the decoding and the program used. It's the same as receiving something encrypted: depending on your system and which hex program you have, you will decode it faster or slower. With this it is the same; there has to be a main MACRO SERVER that was there from the beginning. We have the internet because of American research, and we have the net because they keep it running. We as individuals don't know everything about how the net stays up 24/7. People talk about uploading information to the net, or putting it on a server, but the net is just a channel
for transferring all this data. Computers were created before
the internet was, that is obvious; they just needed a way
to communicate, and you can be sure that one of the first main goals of the project was how to take the flow
of information, decode it, and be able to read it. Maybe for all of you (as for me at the beginning) it is very difficult to imagine the net being monitored, but that isn't what I meant; just
that specific information, related to specific people and facts, is being monitored. NOT THE ENTIRE NET, but the stuff that can hurt them, yes.

Any suggestions, please feel free to post.


ASCII stands for American Standard Code for Information Interchange. Computers can only understand numbers, so an ASCII code is the numerical representation of a character such as 'a' or '@', or of an action of some sort. ASCII was developed a long time ago, and the non-printing characters are now rarely used for their original purpose. ASCII was actually designed for use with teletypes, so the descriptions are somewhat obscure. If someone says they want your CV in ASCII format, however, all this means is that they want 'plain' text with no formatting such as tabs, bold, or underscoring - the raw format that any computer can understand. This is usually so they can easily import the file into their own applications without issues. Notepad.exe creates ASCII text, and in MS Word you can save a file as 'text only'.
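The character-to-number mapping described above is easy to see for yourself; a quick illustration using standard Python, which assumes nothing beyond the ASCII table itself:

```python
# Each ASCII character is just a small number; ord() and chr() convert
# between the character and its numeric code.
print(ord('a'))   # 97
print(ord('@'))   # 64
print(chr(65))    # prints: A

# The non-printing control characters occupy codes 0-31, e.g. TAB is 9:
print(ord('\t'))  # 9

# A 'plain text' string is simply a sequence of these numbers:
print([ord(c) for c in "CV"])  # [67, 86]
```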

So as you read here, ASCII (the American Standard Code for Information Interchange, or "thank you for using a code that we can understand")
is possible to read no matter what. And if you think connection speed is a problem, I'll tell you that the speed of light is fast enough to do it.

posted on Mar, 4 2003 @ 12:24 PM
Still, you're not considering the sheer number of programs available to the general public that allow you to control what types of info get *into* your computer... A common one on the open market named "Web Washer", for example, allows you to control what type of "cookies" actually get saved to your computer; it can "wash" out items in your browser that allow the website you're visiting to track your navigation through that site, etc.

Not to mention the technically "illegal" washing programs written by hackers that even allow you to "jink" your computer's IP address, making it difficult-to-impossible for your address to be tracked through IP route tracing...

In other words, the variety of people that have access to the internet & the types of self-protection programming available *still* make it impossible for US Intelligence agencies to track & monitor *everything*. And this is *not* even considering the financing, hardware & manpower restrictions that they'd have to deal with to make such a task possible.

posted on Mar, 4 2003 @ 12:37 PM

Originally posted by CoLD aNGeR
you are sending information somewhere and you will get packets back as an answer. It all works with ASCII and the binary system, right? So every flow of packets sent between locations is transferring data.

You seem to have a limited understanding of (or exposure to) the packet routing technology in place that manages the sending of data on the Internet.

Once all TCP/IP packets are assembled on the receiving end, decipherable binary data does indeed exist. However, it's not always simple ASCII text. If the Internet worked as you seem to think, it would be possible to intercept data and decode it; however, this is not the case. Let's examine:

A typical e-mail will often be broken down into at least 20 TCP/IP packets for transmission. Even on the most reliable backbones, 15% of all packets in a single data-set will take a different route than the majority of packets (published data often referred to by the likes of Sprint, AT&T, Global Crossing, etc.). So at least three of the TCP/IP packets for our small 20-packet e-mail will take a different multi-server hop in finding their way to your inbox. If your hypothetical Internet sniffer only discovers 17 packets, critical data about the structure of the e-mail may be lost, and the resulting data will be unreliable.
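The effect of those missed packets can be illustrated with a toy simulation. This is a sketch of the general sniffing problem only, not a model of real TCP (which retransmits and reorders segments at the endpoints); the 20-packet and 15% figures come from the argument above, the message text and the choice of which 3 packets go astray are arbitrary:

```python
import math

# A short "e-mail" is split into 20 packets; 3 of them (15% of 20) take
# a different route and never pass the hypothetical sniffer.
message = "meet me at the usual place at noon, bring the documents"
size = math.ceil(len(message) / 20)  # bytes per packet
packets = {i: message[i * size:(i + 1) * size] for i in range(20)}

missed = {4, 11, 17}  # packets that took another multi-server hop
seen = {i: data for i, data in packets.items() if i not in missed}

# The real endpoint, which eventually receives every packet, reassembles
# the message perfectly:
assert "".join(packets[i] for i in range(20)) == message

# The sniffer, working only from the packets it saw, cannot:
sniffed = "".join(seen.get(i, "???") for i in range(20))
print(sniffed)  # gaps where packets 4, 11 and 17 should have been
```

The endpoint succeeds because TCP guarantees delivery to the destination; a passive observer sitting on one route gets no such guarantee, which is the crux of the argument.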

The very nature of packet routing makes general Internet monitoring impossible without direct installation of dedicated hardware, monitoring specified servers.

I hope that helps.

posted on Mar, 5 2003 @ 01:53 PM
What about computers not connected to the internet or via wireless?

posted on Mar, 5 2003 @ 04:27 PM
Just you putting that post here on the internet seems hypocritical, no?
