
Cray unveils supercomputer with 50 PETAflops of power


posted on May, 29 2011 @ 09:16 PM
And that's about 20 times more powerful than the current fastest machine, located in China, which runs at 2.56 petaflops.

Cray's XK6 Supercomputer Promises 50-Petaflop Performance

AMD 16-core Opteron processors, Nvidia Tesla graphics processors, and a Linux-based operating system are the foundation of Cray's fastest machine.

The Top 500 list Schulthess mentioned is usually revealed at the International Supercomputing Conference, which takes place in June in Hamburg, Germany. Last year, China took the top prize for fastest supercomputing with its Tianhe-I, beating out Cray's XT5 supercomputer at the Oak Ridge National Lab. The XK6 will not be competing at this year's conference, however, since it will become available in the second half of 2011, Bolding said.

And it's gonna be released THIS YEAR.

What's crazy is that one human brain has an estimated 100-1000 petaflops of computing power. So we're "almost" there... if we go by the lower estimate. Of course, a human brain is way smaller and uses way, way less power (about 20 W compared to several terawatts). And that's not counting memory: the human brain can store an estimated 500-1000 terabytes, and that's from the low-end estimates.
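Just to put rough numbers on that comparison, here's a minimal sketch using the ballpark figures quoted above (the brain numbers are loose estimates, not measured specs):

```python
# Ballpark comparison using the figures quoted in this post:
# 50 PFLOPS (XK6), 2.56 PFLOPS (Tianhe-1A), 100-1000 PFLOPS for the
# brain. All of these are rough estimates, not measured specs.
xk6_pflops = 50.0
tianhe_pflops = 2.56
brain_pflops_low = 100.0

print(f"XK6 vs Tianhe-1A: {xk6_pflops / tianhe_pflops:.1f}x")               # ~19.5x
print(f"XK6 vs brain (low estimate): {xk6_pflops / brain_pflops_low:.0%}")  # 50%
```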

I bet the singularity believers are gonna like this piece of news.



posted on May, 29 2011 @ 09:34 PM
Damn, that's crazy.
All I'd do with it is play Battlefield though :/ haha



posted on May, 29 2011 @ 10:08 PM
I have often wondered if even half of such a computer is used efficiently. I mean, if you are looking to model Earth's natural processes over the last billion years down to the individual tree, then sure, I could see it being handy, but it still seems like something that could be done with my home computer (given a little time).



posted on May, 29 2011 @ 10:13 PM
reply to post by Vitchilo
 



LOL @ "terawatts".

Let's see:

Kilowatt - 1,000 watts
Megawatt - 1 million watts
Gigawatt - 1 billion watts
Terawatt - 1 trillion watts

I would love to see their electric bill.
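For scale, here's a quick sketch of how those prefixes relate; the 7 MW draw below is only an illustrative guess at what a large supercomputer might pull, not a published figure for this machine:

```python
# SI power prefixes, and how an illustrative multi-megawatt machine
# compares to a terawatt. The 7 MW draw is an assumed figure used for
# illustration only, not a published spec for the XK6.
KILO, MEGA, GIGA, TERA = 1e3, 1e6, 1e9, 1e12  # watts

machine_watts = 7 * MEGA                 # hypothetical 7 MW machine
print(machine_watts / TERA)              # 7e-06 of a terawatt
print(int(TERA / machine_watts))         # ~142,857 such machines per terawatt
```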



posted on May, 29 2011 @ 11:15 PM
Computer speed will ultimately be finite until there is a multi-probe transistor like our very own brain neurons. It was about three years ago that I read about a crystalline molecule with a 4-prong transistor connected to 15 other such molecules to create a cell computer point capable of transmitting impulses at a rate of 4x16^2. I thought I saved a link to that, but it was a computer ago and now I cannot find a satisfactory search 'call phrase' to find it.

I do remember that the original article mentioned in its headline somewhere 'the world's smallest computer' or 'world's fastest nano brain'. I wish I could find that link, and more so wish it wasn't just woo.

Binary seems so archaic, no matter how many times you stack it up and link it together. It seems to me we should be way beyond simple off/on, 1/0. Don't you think?


BTW, I wonder if that Chinese machine really will prove faster than what they have at Oak Ridge National Laboratory. We shall see.



posted on May, 30 2011 @ 03:06 AM
docs.google.com...:2tMexwdgMr0J:citeseerx.ist.psu.edu/viewdoc/download%3Fdoi%3D10.1.1.86.7890%26rep%3Drep1%26type%3Dpdf+4+prong+molecule+transistor&hl=en&gl=us&pid=bl&srcid=ADGEESgilXUQLgziQu4BZv41eU1XuTU1mekjYtzW5C72mcV3_wEqTzsck4TvWOw0xJoQlA6TKEbgnRN-9hEqYlNg8gBRoFlOqTKfv_nWwVi1cgBkb8VRuYHEYW5wZs64NWuDjTUdXi0s&sig=AHIEtbQ-xi8wQilvyiVCBcY0ZWBLmJ6I0Q

Perhaps this?

www.cns.cornell.edu...

Or this?

Could one of those be what you are talking about?




posted on May, 30 2011 @ 03:52 AM
Linux-based, eh?

Not a surprise. I could just see a Windows-based supercomputer constantly losing its internet connection, crashing, getting viruses, etc...



posted on May, 30 2011 @ 04:04 AM
reply to post by Version100
 


Watts and calculation speed are not linked. It probably only consumes power in the kilowatt range.

If they have Or*gin as their provider, I would hate to see their power bill too...



posted on May, 30 2011 @ 04:11 AM

Originally posted by OZtracized
reply to post by Version100
 


Watts and calculation speed are not linked. It probably only consumes power in the kilowatt range.

If they have Or*gin as their provider, I would hate to see their power bill too...


I did not say anything about power consumption relating to computational speed.

My point is that NOTHING on Earth consumes terawatts of power.

Power consumption from all sources on Earth is estimated at 15 terawatts, total.





posted on May, 30 2011 @ 04:16 AM
Finally!!!!

I can watch porn the way it was intended to be seen on the internet!!

WOO-HOO



posted on May, 30 2011 @ 01:47 PM

Originally posted by Illustronic
Computer speed will ultimately be finite until there is a multi-probe transistor like our very own brain neurons. It was about three years ago that I read about a crystalline molecule with a 4-prong transistor connected to 15 other such molecules to create a cell computer point capable of transmitting impulses at a rate of 4x16^2. I thought I saved a link to that, but it was a computer ago and now I cannot find a satisfactory search 'call phrase' to find it.


4x16^2 is a K. What are you talking about?
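For anyone checking the arithmetic, a one-liner confirms it:

```python
print(4 * 16 ** 2)   # 1024, i.e. 1K
```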



posted on May, 30 2011 @ 02:07 PM

Originally posted by OZtracized
Linux-based, eh?

Not a surprise. I could just see a Windows-based supercomputer constantly losing its internet connection, crashing, getting viruses, etc...

Windows is closed source and thus cannot be customized for such bespoke applications.



posted on May, 30 2011 @ 02:09 PM


Binary seems so archaic, no matter how many times you stack it up and link it together. It seems to me we should be way beyond simple off/on, 1/0. Don't you think?

"Simple off/on" means simple, cheap and reliable hardware. It is by no means a measure of architectural complexity.



posted on May, 30 2011 @ 11:13 PM

Originally posted by Version100
reply to post by Vitchilo
 

LOL @ "terawatts".

Let's see:

Kilowatt - 1,000 watts
Megawatt - 1 million watts
Gigawatt - 1 billion watts
Terawatt - 1 trillion watts

I would love to see their electric bill.



Yeah, I just wanted to say that it uses a hell of a lot of power compared to our brain.

Terawatts was pushing it a little.



posted on Jun, 1 2011 @ 03:18 PM

Originally posted by SmokeandShadow
I have often wondered if even half of such a computer is used efficiently. I mean, if you are looking to model Earth's natural processes over the last billion years down to the individual tree, then sure, I could see it being handy, but it still seems like something that could be done with my home computer (given a little time).


They use parallel processing; note the 16-core processors... times many, versus our PCs, which have at best 4 cores (most are dual core).
But besides that, our operating systems aren't optimized to fully use multi-core hardware yet.

This is the Cray advantage and why they use Linux: open-source software allowing them to scale to additional processors, where each instruction in a program is pipelined and executed in parallel.

Imagine your car having 16 engines running in parallel, transmitting power to your vehicle's wheels, versus your single engine with a turbocharger.

This is Cray's method of supercomputing...

It is hardware and software combined... Our PCs, running Windows, created for ease of use for the consumer market, are beleaguered with a ton of bloatware and are a millennium behind in computing power and throughput.
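As a rough illustration of the divide-the-work idea (ordinary Python multiprocessing on a desktop PC, not Cray's actual software stack), here's a minimal sketch:

```python
# Minimal illustration of splitting one job across several cores.
# This is plain Python multiprocessing, not Cray's actual stack.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(x * x for x in range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, 4
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # same answer as a single-core loop, just computed in parallel
```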



posted on Jun, 1 2011 @ 03:25 PM
reply to post by nh_ee
 


Windows is perfectly capable of utilising multi-core CPUs. The problem is that the majority of computing tasks simply do not lend themselves to parallelisation, so scaling them across multi-core configs doesn't yield much of a performance boost. The reason Linux was used is that it is open-source and thus can be modified to accommodate such bespoke hardware and tasks.
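The limit being described is usually quantified with Amdahl's law; a quick sketch (the 90% parallel fraction is just an example figure, not a measurement of any particular workload):

```python
# Amdahl's law: overall speedup is capped by the serial fraction of a task.
# The 0.90 parallel fraction is just an example figure.
def amdahl_speedup(parallel_fraction, n_cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

for cores in (4, 16, 1024):
    print(cores, "cores ->", round(amdahl_speedup(0.90, cores), 2), "x")
# Even with 1024 cores, a 10% serial portion keeps the speedup under 10x.
```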


