Dumbing down computers in the name of national security?


posted on Jan, 1 2011 @ 05:29 PM
Dumbing down computers in the name of national security?
Next week you will hear about the new direction for computers: motherboards with the CPU and graphics card built into them. They will tell you it's the way of the future and an improvement. But what is really going on?

Military computers run on memory, not hard drives. The military uses memory because, in case the enemy overruns the camp, you can turn the machines off and the computers will be zeroed out or erased. Their computers don't have a hard drive because they don't want anything saved to disk the way Bradley Manning did with his flash drive. Makes sense, right?

Well, the real story is kind of unbelievable. Why would anybody want to dumb down computers, and how exactly does building one-piece motherboards with the CPU and graphics card built in do that? Let's look at the tech out there that scares them.

The supercomputer. "Today, parallel designs are based on "off the shelf" server-class microprocessors, such as the PowerPC, Opteron, or Xeon, and coprocessors like NVIDIA Tesla GPGPUs, AMD GPUs, IBM Cell, FPGAs. Most modern supercomputers are now highly-tuned computer clusters using commodity processors combined with custom interconnects." So today's supercomputers are built from off-the-shelf components linked together, just like the US military's PlayStation cluster. And yes, Wikileaks has its own PlayStation cluster.

But what does this have to do with the dumbing down of computers? Well, the idea behind those clusters has moved to desktop computers. Yes, the ability to connect multiple graphics cards together made it to home machines: SLI and CrossFire. First it was 2-way, now it's 4-way, and soon it will look like the US military's PlayStation cluster. The other half of the PlayStation-cluster idea has arrived too: the ability to use those graphics cards for more than graphics. Nvidia PhysX and now Nvidia CUDA offer the ability to use those video cards as scientific processors working alongside the CPU, making the whole system far more powerful.
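To make that GPGPU idea concrete, here is a minimal CUDA sketch of the sort of job CUDA lets an ordinary gaming card do: plain number crunching (adding two large arrays) instead of drawing graphics. The array size and variable names are illustrative only, not taken from any military or Wikileaks system.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread adds one pair of elements: number crunching on the video card.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                 // a million elements, purely illustrative
    const size_t bytes = n * sizeof(float);

    // Ordinary (CPU) buffers.
    float *ha = (float *)malloc(bytes), *hb = (float *)malloc(bytes), *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Video card (GPU) buffers.
    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    vecAdd<<<(n + threads - 1) / threads, threads>>>(da, db, dc, n);
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);  // also waits for the GPU

    printf("hc[0] = %f\n", hc[0]);         // expect 3.0
    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```

Swap that one-line kernel for matrix math or signal processing and you have the same trick the PlayStation clusters use, just on one box.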

The video card market never got flooded with military funding because the military chose the PlayStation gaming console as its video processor for scientific processing. The CPU makers Intel and AMD, on the other hand, both got in on the military funding, along with software companies and other component manufacturers. They were involved in Project Aurora, Project Boralis and Project Meddusa, secret projects funded by the military after 9/11 to create new processors for military use. Stream processing engines, in fact. That's why Project Aurora got confused with an aircraft: it was a secret Air Force-funded project to create a new engine.

So you can see why the military and Big Brother have control over computer technology: they pumped billions into those companies and still do. Video card manufacturers were left out, so their technology was not limited by government control. How does the government regain control? By buying up those video card manufacturers through companies it funds and controls (Intel and AMD), then doing away with video cards by incorporating them into the motherboard and CPU, which it does control. Motherboards will delete the slots for video cards in the future, and their problem is solved. They will say it's the next big step in the computer market, combining all these components into a single board, but the real reason is to control the computers of the future and avoid any more surprises like Wikileaks building its own supercomputer. They will tell you it's because we live in a throwaway society where it costs more to repair than to buy new. But nobody ever repaired their components anyway; they bought new and better ones and upgraded their computers.

And how about memory on a computer? Intel motherboards have six slots for memory (24 gigs max), and AMD motherboards have four slots for memory (16 gigs max). The military has been using 40 gigs or more of memory for a long time. Back in 2001 they decided to create laptops for the military that ran on memory with no hard drives. If the laptop was taken, all information on it would be gone once the power ran out or was turned off; with no hard drive, nothing could be saved on the computer. Good if your base was overrun: just turn the power off and all the intel is zeroed out or erased. So why has the memory on public computers been stuck for so long? Big Brother does not want you to have it. If you did, then you could possibly spy on them like Wikileaks is doing. If you can't open their files on your computer, you can't see their files.

And I don't know if anybody tried to buy a computer for Christmas and found it was put on back order. Well, Taiwan cut down on the number of motherboards it produced, and all the computer makers ran out. Could the real reason be that they are preparing to stop making them and start making their one-piece design? I thought America was supposed to be the land of the free, where we have more choices than anywhere else thanks to the free market. Don't make me laugh!

The military's PlayStation cluster.
i.bnet.com...

Nvidia CUDA
www.nvidia.com...

Nvidia PhysX
www.nvidia.com...

Supercomputer info
en.wikipedia.org...

The motherboard they don't want you to get.
www.evga.com...
When you get all that on one board, don't forget to water cool it and cut those temperatures in half. The average CPU under heavy load will run at 150F; your memory under heavy load, 90F. Not to mention your video cards and even the motherboard itself.
Computer water cooling.
Simple CPU Cooler
www.coolitsystems.com...
Or even a whole System Cooler
www.thermaltakeusa.com...
You can even go bigger, water cooling your video cards with a larger system.
www.koolance.com...



posted on Jan, 1 2011 @ 05:30 PM
I'm not trying to sell any products. I gave examples of new tech.
Do your own search if you want to buy something.
And if the government wants me stuck playing with a netbook and playing on their databases called Facebook and Myspace, they're only fooling themselves. Buck the Big Brother dumb-down and build your own; it's not hard. And hopefully the products to build your own won't disappear in the near future.
edit on 1-1-2011 by JBA2848 because: (no reason given)



posted on Jan, 1 2011 @ 06:17 PM
Here's the link to the article about building the graphics card into the CPU and motherboard.
gadgetwise.blogs.nytimes.com...




Intel’s Sandy Bridge processing chip will have its formal debut at next week’s Consumer Electronics Show when it arrives in a number of high-end, $1,000-plus, laptop computers.

Of course, there had to be a leak before CES formally opens and Intel draws the curtain, and this time, Hewlett-Packard has shown its goods early.

H.P. released some details about two variations of the new Pavilion dv7, which were subsequently taken off its product support pages. The chips, based on Core i7 technology and architecture, have graphics technology integrated on the same die as the main processing chip, or C.P.U. A result is a smaller chip, improved graphics performance and more efficient use of power. There is also a feature in the Core i7 chips called Turbo Boost, which bumps up the clock speeds by about a third.




posted on Jan, 1 2011 @ 06:18 PM
If anything it probably has little to do with "dumbing down" and more to do with profit margins. Years ago you could buy motherboards with the CPU soldered onto the board (common on laptop boards too). These were generally dirt-cheap budget boards used by backstreet independent retailers who sold them in low-cost systems. The problem was that if there was a fault with either the CPU or the board, you couldn't just replace the faulty component. Also, integrated sound and graphics have been the norm forever. There isn't really anything new here.

It might be this: say you buy one of these new all-in-one solutions and your graphics go pear-shaped. Rather than just popping out to buy a replacement, you will have to buy a whole new all-in-one solution. At the moment only Nvidia or ATI would see your cash, whereas everyone gets a slice of the pie if individual components cannot be sourced and fitted.

Personally I cannot see it happening in this way. I suspect there will always be an upgrade path in the form of expansion slots, e.g. PCI/PCIe. The integrated solutions are nothing more than a continuation of existing practices. Advancements such as CPU/GPU integration just make sense from a performance standpoint, reducing the data path between the two considerably and increasing bandwidth in the process.



posted on Jan, 1 2011 @ 06:28 PM
reply to post by quackers
 

I don't know. Even looking at the CPUs: the i7 has one-way sharing, which only allows sending info out to a device, whereas the 5700 and 5600 Xeons have two-way sharing that allows for multiple CPUs on a motherboard. Why not have the ability to add multiple CPUs to a computer when the technology is already there? They took the technology out, not put it in.

www.tyan.com...

To me that is dumbing down computers for the general public. They want you to use no more than a netbook any more. They fear their secrets will be discovered, as Wikileaks has already done to them.
edit on 1-1-2011 by JBA2848 because: (no reason given)



posted on Jan, 1 2011 @ 06:35 PM
Even water cooling is not being used in computers when they should be adding it as a feature. Water cooling can cut temperatures in half, allowing overclocking of CPUs and even memory. There are people out there running 3 GHz CPUs overclocked to 6 GHz. Why not add a $70 CPU water cooler and run computers faster? It can't be because it costs too much.
edit on 1-1-2011 by JBA2848 because: (no reason given)



posted on Jan, 1 2011 @ 06:35 PM
The idea of integrating the CPU, graphics, audio, etc. into the motherboard is quite old, and the industry has been manufacturing these for quite some time.

All of the retail manufacturers have sold millions of computers without an AGP or PCIe graphics slot over the years. Look at some old machines like the Dimension 2000 or 3000 series, though there are many others.
The trend now is to include that slot on new retail machines, not take it out. But there will always be a portion of the industry that builds a cheap machine without this slot, and that also cuts down on memory slots to keep cost and size down. An example might be an ITX motherboard.

The military's main computers do not run on memory only. They do, however, have terminals that run from the server and have no permanent storage capabilities; this may be what you're thinking of. Their other computers do have hard drives, though some of them are not anything the consumer can get hold of. And yes, they are quickly erasable.

The capability to change your CPU and graphics card is not going away any time soon. There is a whole industry out there that caters to hobbyists, power users, and professionals like myself. It is a huge industry.

The black-project military computers are so far ahead of the best consumer or even business machines out there that they make what we have look like toys.

All technology goes through national security before it's allowed on the market. We have exactly what they want us to have.

And BTW, my CPU does not run at 150F under load, and my memory does not run at 90F. However, you are correct that many retail machines do.

I have heard of these laptops you mentioned. But my understanding is these are not common. They are for special use only.

Good post. S&F


edit on 1-1-2011 by Klassified because: (no reason given)



posted on Jan, 1 2011 @ 06:42 PM
Interesting concept but realistically 95% of the population don't have the know-how or financial ability right now to create a "mini supercomputer" with networked hardware anyway...
Also, what would happen to companies like ASUS and aftermarket mod companies like Sapphire?
You think they would just decide to close up shop?
No... they would still deliver unique, non-conglomerated PC parts to the customers that demanded them... albeit at a premium... but how is that any different from today?
I think the poster above me is right in suggesting that it's probably more about the bottom line than a conscious effort to reduce the power of consumer technology...



posted on Jan, 1 2011 @ 06:46 PM
Apple computers are already dumbed down.
No offense, but you are a bumbling idiot if you buy an Apple computer for anything.



Guess who the drones sitting down are?



posted on Jan, 1 2011 @ 06:47 PM
Very interesting stuff! Now if this were to happen, how would it affect the IT industry? It is really neat to see how they make the clusters.

I would have thought "they" would want us to have hard drives, because it gives them a way to put people in jail.



posted on Jan, 1 2011 @ 06:47 PM
I'm not sure. Outside the server environment there is little need for multiple CPUs when the same is achieved by multiple cores in a single CPU. The rather fine example you gave, the dual-socket board, is purely an enthusiast product. The enthusiast market is huge, and I'm not convinced that manufacturers will voluntarily cut off their nose to spite their face, so to speak.

To me a dumbing down would be indicated by an obvious lack of performance gains in newer tech compared to previous generational leaps in similar technology. Let's not forget that Moore's Law plays a role here. There's going to come a time, and soon, when the conventional methods of fabrication simply become redundant. Perhaps this is just a way of stalling, giving the illusion that we're getting something new when really it's just the same old thing in a shiny new box? A reluctance to release the real groundbreaking technology we all know they are capable of producing, for whatever reason.


Originally posted by JBA2848
Even water cooling is not being used in computers when they should be adding it as a feature. Water cooling can cut temperatures in half, allowing overclocking of CPUs and even memory. There are people out there running 3 GHz CPUs overclocked to 6 GHz. Why not add a $70 CPU water cooler and run computers faster? It can't be because it costs too much.
edit on 1-1-2011 by JBA2848 because: (no reason given)


To be fair, water cooling has been prohibitively expensive in past years, and considering most aftermarket air coolers do just as good a job with modern low-power CPUs, it was never anything more than a gimmick to jazz up the inside of your case. If used in combination with phase-change cooling we start to see an actual significant reduction, but again that is something only an enthusiast would be interested in, and it is completely wasted on the majority who won't be trying to squeeze 8 GHz from their desktop. Besides, Intel and AMD have both locked down their chips over the years for little other reason than to sell "faster" CPUs at an inflated price. That, and Joe Average would just stare blankly at the mention of FSB, multipliers and memory timings.

edit on 1-1-2011 by quackers because: (no reason given)



posted on Jan, 1 2011 @ 07:07 PM
It's really all about price. You can build your own computer and put in whatever motherboard you wish.

Think about it like this: what percentage of computer users would you say use their machine for more than simple Internet and solitaire, maybe importing a few .jpegs from their digital camera? My guess is not that many.

I believe what you might be referring to (.mil) is something like a VPN (Virtual Private Network). If at some point down the line you can only purchase retail computers from Best Buy etc., then I might have to agree. Otherwise it is really all about the retail price point.



posted on Jan, 1 2011 @ 07:27 PM
Consumers already have more power than they need relative to the software that is available to them.

Multiple-core CPUs have been out for years, with hexacores (6 cores) slated to become the norm in 2011. Very few software packages take advantage of more than 2 cores at the present time (mostly video editing and encoding). Multi-tasking, while common in the consumer market, comes nowhere near needing 6 cores at this time. The same could be said of memory: I have seen consumer boards that can hold more than 16 GB, but why on earth would you need more than 16 GB at this point? Just how many programs are you running simultaneously to actually use 16 GB of memory?

How long has 64-bit processing been out? 32-bit processors are not even manufactured any more by Intel and AMD. How many programs take advantage of 64-bit computing? Not very many.

Mechanical hard drives are also on the way out. The price of SSDs has been dropping over the last couple of years, and they are starting to become a common option on laptops.

The AMD Fusion processor (CPU and video in one chip) is aimed more at low-end to mainstream users. AMD is doing this to capture more of the video sub-processing market (remember, AMD owns ATI, just as Intel builds its own graphics). By combining the two, the average consumer is less likely to go out and buy the competitor's product (Nvidia, in this case).
There are definite advantages to the GPU having a direct line to the CPU by being on the same die. You are not going to get a 6900-series GPU on the same die, but a lesser GPU attached directly to the CPU could give a mid- to high-end discrete card a run for its money. The only big disadvantage is if the GPU is forced to use main memory instead of its own discrete memory.

In this case I do not see a conspiracy, just smart marketing on AMD's part. Even GPUs have been taking on more roles than just video. Nvidia PhysX, for example, allows a program to use the GPU to do physics calculations. AMD has something similar that allows spare GPU cycles to be used like a general processor.
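As a toy illustration of that idea (spare GPU cycles doing general physics work), here is a minimal CUDA-style sketch: one GPU thread per particle, each stepping its particle forward one tick. Every name and constant below is made up for the example; this is the general GPGPU pattern, not the actual PhysX API.

```cuda
// Toy GPGPU physics sketch: one GPU thread integrates one falling particle.
// Names and constants are illustrative only, not from any real PhysX interface.
__global__ void stepParticles(float *y, float *vy, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        vy[i] -= 9.81f * dt;  // gravity accelerates the particle downward
        y[i]  += vy[i] * dt;  // position follows the updated velocity
    }
}

// Host code would launch this once per frame, e.g. for n particles:
//   stepParticles<<<(n + 255) / 256, 256>>>(d_y, d_vy, n, 0.016f);
```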

Anyway, not a conspiracy just a marketing ploy to try and lock in more market share.



posted on Jan, 1 2011 @ 08:04 PM
reply to post by Dreamwatcher
 

All well said, Dreamwatcher. There is, however, a lot of deception in the computer industry. So much, I don't know where to start. And consumers pay dearly for that deception. TPTB are indeed guilty of withholding technology far beyond what is being sold to the public as the latest and greatest. What we have is not only outdated but obsolete. The industry itself is guilty of having the next handful of "new technology" sitting on the shelf for years, waiting for the OK to release it as the newest and best. They also charge outrageous prices for new tech that didn't cost them any more to make than the old stuff. Like SSDs. Microsoft and Apple play both ends of the consumer market against the middle. Both their operating systems are the best spyware out there, with glitches and bugs purposely built in. Er... uhm... IMHO. And to top it all off, new viruses, rogueware, and malware just seem to find their way right past your AV software. Wonder how that happens? And I could go on and on.

So conspiracy? To some degree. Yes.



posted on Jan, 1 2011 @ 09:46 PM
To help understand where computers are going, it helps to know where they came from: first as clockwork devices in the 1800s, then on to valves during WW2 to help decode Nazi transmissions; then the introduction of the transistor shrank the size, and Moore's Law went into effect with rapid size reduction and capability expansion. Hard to say just where exactly this trend will end. The military has been a motivating factor in its progress, with public release a few years behind. When it comes to computers, smaller does not mean dumber. Just keep your head up and watch out for any negative trends, as this technology could end up anywhere.



posted on Jan, 1 2011 @ 10:17 PM
reply to post by JBA2848
 


Personally, I think you are on the money.
S&F!

Like the quacker above, many will doubt that technology is controlled by TPTB.
Those who know, know you're right.

Great post, and thanks for giving us the big heads up.



posted on Jan, 1 2011 @ 10:25 PM
reply to post by burntheships
 


Now, I never said that TPTB do not control the general level of technology that is available to the public.

Some estimates indicate military technology may be 50 to 75 years ahead of public technology.

My response to the OP was directly related to the topic at hand: whether the Fusion chip is dumbing down computers. My response was purely about the technology available to the public, not the technology that may or may not be withheld from the public.



edit on 1-1-2011 by Dreamwatcher because: Spelling

edit on 1-1-2011 by Dreamwatcher because: Grammer..ugh..



posted on Jan, 1 2011 @ 10:30 PM
reply to post by Dreamwatcher
 


No offense intended; I was actually referring to a different post and poster.
I agree, the technology that is marketed to the public is 50 to 70 years behind the curve.



posted on Jan, 1 2011 @ 10:37 PM
reply to post by burntheships
 


No offense taken.


I just picked yours because it was the last post related to the suppression of technology, and I wanted to clear up that my post was not about that particular subject, and that I most definitely believe technology is suppressed to some extent.

How much is it suppressed? I do not have an answer for that, but being a techie myself, I would be like a kid in a candy store if ever offered the opportunity to find out.



posted on Jan, 1 2011 @ 10:58 PM
reply to post by Dreamwatcher
 


No doubt about it.

Although there is a lot of potential in those PS2s.
More than most of us know about.



