A Chip With The Power Of A Supercomputer


posted on Feb, 13 2007 @ 08:22 AM
Intel has unveiled an 80-core microprocessor chip. Intel plans to demonstrate the teraflop chip at the Solid State Circuits conference in San Francisco. The chip is codenamed Polaris. It can make a trillion calculations per second; eleven years ago that would have taken a supercomputer occupying 2,500 square feet. This new teraflop chip uses about the same power as today's dual-core processors. The new processor could be used for image recognition, or for games with graphics that look like filmed television shows.
 



www.informationweek.com
The 80-core Polaris can crunch a trillion mathematical calculations a second, a feat that 11 years ago would have required a 2,500-square-foot supercomputer. Polaris's size and the fact that it uses about the same power as today's 2-core processors mean that it could someday provide unprecedented processing power within a home computer.

Such muscle would be useful, for example, in developing videogames that look as real as TV shows, or in providing imaging recognition, which means a computer could recognize photos and catalog them appropriately. But beyond gaming and specialty applications, the need for a desktop supercomputer in the home is not yet clear. "That's a little bit nebulous," Martin Reynolds, analyst for Gartner, said.

But assuming that the technology industry finds a use for Polaris in the five to eight years it will take Intel to move from experiment to production, there are still a number of challenges that will have to be met before the chip enters the consumer or business markets.




Please visit the link provided for the complete story.


We all like using computers here, so a new processor this fast sounds great to all of us. The article, however, does say that the chip is five to eight years away from production. Even then we may not see the eighty-core chip on the market, though we probably will see twenty- or forty-core processors. Eleven years ago that much computing power required a supercomputer larger than my house; soon it will be on a chip. So think back a little and compare what you were using eleven years ago to now. I also wonder what that much computing power will be used for in a desktop PC. A lot of good, I am sure, but some bad also. Could that much power be used for, say, optical cloaking?
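As a back-of-envelope illustration (my own arithmetic, not from the article), here is what one teraflop buys you compared with a mid-1990s desktop; the 200 MFLOPS figure for a ~1996 desktop CPU is an assumed ballpark, not a quoted spec:

```python
OPS = 1e12             # one trillion floating-point operations
polaris_flops = 1e12   # Polaris's claimed rate: one teraflop

seconds_on_polaris = OPS / polaris_flops
print(seconds_on_polaris)             # 1.0 second

# A ~1996 desktop CPU peaked at very roughly 200 MFLOPS (assumed figure).
desktop_1996_flops = 2e8
hours_on_desktop = OPS / desktop_1996_flops / 3600
print(round(hours_on_desktop, 1))     # about 1.4 hours
```

A second of work on the demo chip versus well over an hour on what most of us had eleven years ago.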

Related News Links:
www.nytimes.com
www.dailytech.com
news.xinhuanet.com
www.technewsworld.com

Related AboveTopSecret.com Discussion Threads:
Chip Implants could be a key issue in 2008!
Plans are Underway to Microchip every Newborn in U.S. and Europe…




posted on Feb, 13 2007 @ 08:40 AM
On the BBC website, it says that programmers will find it difficult to write software that splits into 80 tasks at one time. The article says it is even hard to find two that need to be done at the same time.

Looks like there is technology already behind the scenes that is way ahead of everything we already have, but the software is not there.

Just think: on your computer, what's the maximum number of things you have going at one time?

It will take a whole new era of software to take advantage of this. I saw the other day on CNN a video from Asia of a massive touch screen showing 30-40 different TV feeds at once. Maybe this is the type of thing it may be used for.
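A minimal sketch (my own toy example, not from the BBC piece) of the problem being described: to feed 80 cores you must first carve the work into 80 independent pieces, which most desktop software simply doesn't have.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """One independent task: sum of squares over a slice of the data."""
    return sum(x * x for x in chunk)

data = list(range(1_000_000))
chunks = [data[i::80] for i in range(80)]  # carve the work into 80 tasks

# 80 workers, one per task. (Threads here just illustrate the split;
# real speedup needs real cores and work that divides this cleanly.)
with ThreadPoolExecutor(max_workers=80) as pool:
    total = sum(pool.map(partial_sum, chunks))

print(total == sum(x * x for x in data))  # True: the split changes nothing
```

Summing numbers divides into 80 pieces trivially; a word processor or web browser does not, which is exactly the software problem the article raises.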



posted on Feb, 13 2007 @ 08:52 AM
Andy,
Yes, you are right. I have a dual core, and in the task manager one core always works harder than the other, but at least both are being used.
Having eighty different screens hooked up is one way to utilize this new processor, but think about what could be done with something similar, like optical cloaking: a single chip running eighty different high-resolution monitors mounted all around a vehicle, or maybe a plane, all run by one chip.
If something with that much power is that small, think what kind of chip could be put in your credit card.



posted on Feb, 13 2007 @ 01:49 PM
I think maybe this article leaves us mainly with questions about what processors they have in the military. Computing technology of tomorrow (I mean by that 10 years' time) may be something totally unimaginable. Of course they release technology gradually, but I have always heard they have chips and technology ready for release but the software is not there yet.

People say that the military has technology 500 years ahead of what is out in public, so what kind of tech would be 500 years down the road, if these sorts of things are coming out or being talked about now?

Just what will computers look like with these things? Will they just be pieces of glass we talk to, and touch screens? Makes one wonder what kind of tech is around the corner, and what they are hiding.



posted on Feb, 13 2007 @ 02:04 PM
I brought this topic up yesterday in Advanced Technology. It's interesting to see these large leaps in microprocessing. It doesn't surprise me that this happened with the 80-core chip; a 45nm process was just announced recently. Looking at this from a programmer's perspective: I have a program that could use 80 processes. In fact, it would take about 8 seconds on that new chip. On a P4 it takes about 10 minutes.



posted on Feb, 13 2007 @ 04:27 PM
Well, you really have to ask yourself the Marcus Aurelius question: "For any particular thing, what is it, in itself?" In this case, it's a test part that Intel is making in order to play around with mesh bus structures and data routing algorithms. The end goal is to find efficient, simple routing algorithms and structures, probably as a prelude to doing it photonically in the future.

Polaris isn't intended to be a general purpose computer. It's what we call a vector processor. A vector processor is a lot of little floating point computation engines. In this case, each tile in the Polaris is a pair of FP engines, a router, a bit of ram and a few simple peripherals. There's no general purpose logic like you have in an x86 core; they're not really good for general computational work.

Vector processors don't run Doom, or Sim City, or balance your checkbook. They're used for crunching numbers. Lots and lots of numbers. A thing you could do with it, if it were a production chip, would be to run crypto and voice analysis on it: the NSA and a few others buy a freaking TON of vector processors. The NSA is the world's largest consumer of general purpose vector processor and DSP boards. They're always in the market for something new, and if you have something unique, fast or aimed at a particular problem you can nearly always sell them some. At one time Motorola had the world's most kickass VP chip, you just didn't ever see it on the shelf. We did the demo boards and development kits for them long ago.
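To make "crunching lots and lots of numbers" concrete, here is a rough sketch with NumPy standing in for real vector hardware (my own illustration, not Polaris code): the same math expressed one element at a time versus one whole-array operation at a time.

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.random(10_000)   # some data to crunch
coeff = 0.97

# Scalar style: one multiply-accumulate per step, like a general CPU loop.
acc = 0.0
for x in signal:
    acc += coeff * x

# Vector style: a single whole-array operation -- the lockstep shape of
# work that small FP engines like Polaris's tiles are built for.
vec_acc = float(np.sum(coeff * signal))

print(abs(acc - vec_acc) < 1e-6)  # True: same answer, very different shape
```

The point is not the library, it's the shape of the computation: no branches, no general-purpose logic, just the same arithmetic repeated across a huge block of data.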

Now, there are other vector processors you may be familiar with, one of which is the Cell processor that's in the PS3. One or two of them is ok, but when you start cross connecting 256 of them, you put Polaris to shame. Guess who buys lots of Cells.

IBM wants a flat million bucks per vendor per year to buy in for DoD development, we won't spend it, so we don't do Cell designs.

Someone discovered that a lot of GPUs like the ones on Nvidia or ATI video cards had an architecture that was almost useful for a vector processor and spent a good bit of development time to see if the latest game video processor would work. At one time, there was a part from 3D Labs called a P10 which was sort of a more flexible video part than an Nvidia that solved some of the issues with using them as general vector processors, and the Navy was all over it for sonar processing.

We sort of like the Clearspeed CSX600: it has 96 cores, runs on 10 watts and has 25 GFLOPS of throughput. It's not in the same class as the Polaris, but it scales really well and is a lot more producible than a Cell.

As far as optical cloaking goes, it's not a matter of computational power, it's more a material science issue.

There are also a lot of other things you use something like this for in the real world other than crypto and "voice processing", although that's probably the biggie in terms of volume. You can also use it for sonar image processing, for calculating SAR for radars, for doing some types of imaging probably better not discussed from satellites etc. Image crunching for radar, sonar and counter measures is a huge computational burden that fits a vector processor perfectly.



posted on Feb, 13 2007 @ 04:32 PM
Tom Bedlam, you seem to understand this quite well. Where do you think computers will be in 10 years' time? Do you have any inside info?



posted on Feb, 13 2007 @ 05:00 PM
Hm. Well, Intel's new hafnium process is really interesting. I think they're going to hurt some people with that.

Microsoft and Intel bought up all the patents that they could get their hands on on organic semiconductors about 5 years ago. There's a lot of ponies in that barn in terms of displays, but you can also use organic transistors for really dense low power logic. I don't know if that has hit a dead end or not.

I think you'll see the hafnium thing hit, then multicore chips, then photonic links, then probably quantum-dot logic, then organic. Organic and QD may not compete in the same apps. You'd use organic semiconductors for wall-sized screens, for example, but which one is better for general computation I don't know.

When I say quantum-dot, I don't mean quantum computing in the sense of using qubits, although you can do that with it too. A qubit computational cell is more like a vector processor, there's a lot you can do with it but it doesn't tend to lend itself to running Doom 2010, at least not the way we program now.

However, with QDL (and there are several types: QD opto, QD magnetic, etc.) you can make sequential logic as small as it gets. Very fine organic logic can sort of compete with it, but as you get really small the organic starts to run into quantum indeterminacy issues that the QD logic does not. LLNL, Sandia and a couple of others are pouring your tax dollars into QD logic, so take that for what it's worth.

There's something else that is occasionally used and extremely cool, but it's real specialized and I don't think you'll ever see it in the civilian world. It also isn't great for building a desktop around but is hell on wheels for some types of processing. I think qdl will supplant it over the next decade or two.



posted on Feb, 13 2007 @ 05:06 PM

Originally posted by andy1033
where do you think computers will be in 10 years time.


I posed this question a couple of years ago to a comp. prog. and he said holograms. Is he right? Time will tell but the speeds of comp's are getting amazing.



posted on Feb, 13 2007 @ 05:15 PM

Originally posted by intrepid

Originally posted by andy1033
where do you think computers will be in 10 years time.


I posed this question a couple of years ago to a comp. prog. and he said holograms. Is he right? Time will tell but the speeds of comp's are getting amazing.


That's what I was alluding to in the last paragraph. The military uses it for some really specialized processing but it's not very useful as a general purpose computer.



posted on Feb, 13 2007 @ 05:17 PM

Originally posted by Tom Bedlam

Originally posted by intrepid

Originally posted by andy1033
where do you think computers will be in 10 years time.


I posed this question a couple of years ago to a comp. prog. and he said holograms. Is he right? Time will tell but the speeds of comp's are getting amazing.


That's what I was alluding to in the last paragraph. The military uses it for some really specialized processing but it's not very useful as a general purpose computer.


I think the public side of computing will maybe not move that fast. I remember Bill Gates saying voice recognition and touch screens are going to be the norm.

Public technology does not move that fast, unfortunately.



posted on Feb, 13 2007 @ 05:27 PM

Originally posted by andy1033
I think the public side of computing will maybe not move that fast. I remember Bill Gates saying voice recognition and touch screens are going to be the norm.

Public technology does not move that fast, unfortunately.


I don't know about that. Computers are all about size, speed and capacity, and their development seems to be exponential. How long ago was a gig unthought of? Not 10 years, I believe. Now we're looking at tera. What's next, and how soon?



posted on Feb, 13 2007 @ 05:40 PM
Well, the next iteration would be the petaflop chip. They have already built a supercomputer capable of this speed. Who knows how long it will take for this technology to become available to the public. Which will come first, the petaflop chip or the domestic quantum computer?

en.wikipedia.org...

These chips would undoubtedly make my Snood at work run much faster.



posted on Feb, 13 2007 @ 05:41 PM

Originally posted by andy1033
Public technology does not move that fast, unfortunately.


Hrm. Maybe not. I may be spoiled.

I don't think your estimate of 500 years ahead is right. I'd say 10-20, depending on the technology, for developmental stuff. Well, a couple of the things I've seen are right off the chart, but for the most part, 10-20 years ahead is about right. The real problem is keeping stuff working, because by the time it gets fielded you are 5 years BEHIND.


We are still under NDA on the project so I can't much comment on it, even though it's ended. I saw it go to the civilian sector released through an Israeli company back in late 2003 (?), at least part of it.

I will, however, tell you that I just peeked into the SCIF server archives, and the date code on our part of the design is in the spring of 1996. So, it was in field test by that fall.

Wow, memory lane. That was one of the first designs we did as a company.



posted on Feb, 13 2007 @ 06:26 PM

Originally posted by Tom Bedlam


That's what I was alluding to in the last paragraph. The military uses it for some really specialized processing but it's not very useful as a general purpose computer.


Tom,
first off, thanks a bunch for all the good information you have been relaying. You really do seem to know what you are talking about. The above was in response to holographic computers. I knew they were working on them a few years back but do not know what became of them. Could you say anything about them? What are they capable of? Where are they being used, etc.?



posted on Feb, 13 2007 @ 07:02 PM

Originally posted by RedGolem

Tom,
first off, thanks a bunch for all the good information you have been relaying. You really do seem to know what you are talking about. The above was in response to holographic computers. I knew they were working on them a few years back but do not know what became of them. Could you say anything about them? What are they capable of? Where are they being used, etc.?


Hm. Well, if we didn't do that job 10 years back, I could. I see a lot of stuff out there on it. I would give you explicit links, but believe it or not I got my little face slapped for THAT in 2004 as part of a comprehensive ass kicking. Even though I saw them selling it over the counter, technically I'm not released.

I guess I could say that lenses can be used to perform mathematical operations at breathtaking speeds. Early on, they wanted to do SAR scans of various areas. SAR takes a LOT of processing. In the '60s, there wasn't enough processing horsepower in the world to do it. But they did it anyway! How did THAT happen, you may ask. Well, you can take SAR radar returns and put them on photographic film just SO. It doesn't look like anything at all but gray grainy mush. But you can run that mushy junk through some lensing, and the lenses will perform teraflops of processing just like magic. If you can grind your lenses just right, you can do a lot of really amazing mathematical operations. So in the '60s, teraflop processing was done by some whizbang lens grinders slaving away in an attic in some governmental installation we won't mention. No one seems to consider a lens as a floating-point processor of amazing capacity.

So, you can solve some problems optically by letting photons and lenses do all your calculations for you. It's not the sort of thing that you could make a computer out of, but it does computations. If you're really smart, you can make it do all sorts of unexpected amazingly neat tricks, that I'm sure you can find described all over but I can't describe them to you. The really nasty part, though, is that they're fixed. You have to have your lens just right for a SAR scan at a certain speed, altitude, some radar parameters I won't describe, etc. If you didn't hit the scan just right with your...scanner...you can't do the magic lens trick without getting Hans out of the closet and having him grind a few more lenses to match. Messy.
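In the open literature, the core of the "magic lens" trick is that a thin lens optically performs a 2-D Fourier transform of the field at its focal plane, which is the heavy step in focusing SAR film. A digital stand-in for that optical computation (my sketch with random data, not anyone's actual processing chain):

```python
import numpy as np

# "Gray grainy mush": raw returns recorded as a 2-D pattern on film.
rng = np.random.default_rng(42)
film = rng.standard_normal((256, 256))

# What the lens does for free, at the speed of light: a 2-D Fourier
# transform of the recorded field.
spectrum = np.fft.fftshift(np.fft.fft2(film))
image = np.abs(spectrum)

# Every output pixel mixes every input pixel; done digitally, that is
# on the order of N^2 log N multiply-adds per frame.
print(image.shape)  # (256, 256)
```

Which also shows why the optical version is "fixed": the lens is the transform, ground into glass, while the digital version can change its parameters per scan.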

Now extrapolate to infinity.



posted on Feb, 13 2007 @ 07:14 PM
Tom,
thanks for the info again. I will try looking a few things up.
Just so I know for sure what you're talking about, could you say what SAR stands for?



posted on Feb, 13 2007 @ 07:38 PM
I had read in SEED magazine (I believe) a while back about the guy who connected a bunch of CPUs together to form a supercomputer that runs an evolution-type program. I think it made a better antenna for NASA, even though the thing was bent in many places. Would these chips make those CPUs faster in their computations, given that the program does many calculations to evolve whatever is input over several generations?
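A toy sketch of the evolutionary approach described above (a generic genetic algorithm of my own, not NASA's actual antenna code; the target value and mutation size are made-up illustration parameters). Each generation scores many candidate designs independently, so the work is embarrassingly parallel and would map well onto a many-core chip like Polaris.

```python
import random

random.seed(1)
TARGET = 0.5  # hypothetical "ideal" design parameter

def fitness(genome):
    # Closer to TARGET is better (a stand-in for an antenna simulation).
    return -abs(genome - TARGET)

population = [random.random() for _ in range(40)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                     # keep the best designs
    population = parents + [
        p + random.gauss(0, 0.05)                 # mutated offspring
        for p in parents for _ in range(3)
    ]

best = max(population, key=fitness)
print(f"best design: {best:.3f} (target {TARGET})")
```

The inner `fitness` calls are what eat the compute in a real run, and they can all happen at once, one candidate per core.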

Also, what about the future of aerospace? I know that the CPUs in fighter jets do many calculations to adjust as the pilot flies.



posted on Feb, 13 2007 @ 08:55 PM

Originally posted by RedGolem
Tom,
thanks for the info again. I will try looking a few things up.
Just so I know for sure what you're talking about, could you say what SAR stands for?


Whoops, sorry. Synthetic Aperture Radar. It's a way of forming images from radar returns. Think of it as sort of taking a CAT scan of the terrain...it looks like a picture but it's not. That's not technically accurate, but it's a good visual.

I have some good books that sort of hit on how it works but I can't find a good overview online.

SAR is used a LOT. In all sorts of things. The basic math lends itself to all sorts of unexpected image extraction opportunities. You can form SAR-like images from stuff you wouldn't think was returning you enough data.



posted on Feb, 13 2007 @ 10:00 PM
Imagine a hundred of these chips running cloaking cameras as well as hydraulic or electromechanical shape-changing surfaces: a stealth craft that has the attributes of an octopus.

Even better: holodecks become reality!

[edit on 13-2-2007 by linuxfueled]


