
Science supercomputer tackles first questions


posted on Apr, 10 2009 @ 06:32 PM
reply to post by The Godfather of Conspira
 


You might find Bechtel an 'interesting' company. I'll plead the 5th for now, though it pays to be ever vigilant. We are in some dangerous times, compadre.



posted on Apr, 10 2009 @ 06:38 PM
reply to post by Perseus Apex
 



You might find Bechtel an 'interesting' company.


They handle a lot of pipeline and underground construction contracts, right? Especially military-related ones.



posted on Apr, 10 2009 @ 07:25 PM
Ian's right. The core is partitioned into 2-gig 'mini-spaces' ...

The way these simulations are typically run is to assign a 'finite-element' of the 'problem space' to each processor. Then they all 'run in parallel,' 'propagating' values that 'cross boundaries' after each 'iteration.'

For example, an 'atmospheric simulation' would 'divide up' the Earth's atmosphere, and whatever else constitutes the 'problem space,' into 'hexahedrons,' which is geek-speak for '3d-cubes.' Then each processor 'handles' all the 'simulation dynamics' for its assigned cube. When the 'boundary values' for its cube change (things like pressure, temperature, density, humidity, velocities, etc.), this 'state information' must be 'communicated' to the processors 'handling' the adjacent 'cubes.'

Running the 'dynamics model' is where the 'flops' make a difference, and 'propagating boundary values' is where 'communications bandwidth' comes in. It turns out that 'marshalling' all this data around takes up the lion's share of each processor's resources. It may have to just sit there and 'spin,' waiting for 'updates' to or from adjacent processors to arrive or complete ...

This 'boundary value propagation' is what Ian referred to in his post when he mentioned that the 'real question' was 'how fast could it move data.'
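
To make the pattern concrete, here is a rough sketch of the idea in one dimension, using Python with mpi4py. This is only an illustration of the general 'divide the domain, exchange boundary values, update the interior' technique, not the actual code, problem, or sizes run on Jaguar; the grid size, step count, and diffusion constant below are made up for the example.

[code]
# Toy 1-D domain decomposition with halo (boundary-value) exchange.
# Run with, e.g.:  mpirun -n 4 python halo_sketch.py
# Assumes mpi4py and NumPy are installed; all parameters are illustrative.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

N_LOCAL = 100   # interior cells owned by this rank (its 'cube', flattened to 1-D)
STEPS = 50      # iterations of the 'dynamics model'
ALPHA = 0.1     # diffusion coefficient for a toy heat equation

# Local array with one ghost ('halo') cell on each side to hold neighbour data.
u = np.zeros(N_LOCAL + 2)
if rank == 0:
    u[1] = 100.0  # a hot spot at the left edge of the global domain

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for step in range(STEPS):
    # Communication phase: propagate boundary values to/from neighbours.
    # This is where 'how fast can it move data' matters.
    comm.Sendrecv(sendbuf=u[1:2],   dest=left,  recvbuf=u[-1:], source=right)
    comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[0:1], source=left)

    # Compute phase: update the interior cells. This is where the 'flops' go.
    u[1:-1] += ALPHA * (u[2:] - 2.0 * u[1:-1] + u[:-2])

total = comm.reduce(u[1:-1].sum(), op=MPI.SUM, root=0)
if rank == 0:
    print("global heat after", STEPS, "steps:", total)
[/code]

The real codes do the same thing in three dimensions with far more physics per cell, but the shape is identical: every iteration is a compute phase followed by a boundary exchange, and if the exchange is slow the processors just sit and 'spin.'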

Regarding that question - it seems this beast is pretty 'capable' indeed!


Jaguar - Technical Specs

[atsimg]http://files.abovetopsecret.com/images/member/3fb0812789d2.jpg[/atsimg]
Source : NCCS



posted on Apr, 10 2009 @ 08:56 PM
I bet if they threw Vista at it, it wouldn't be so super.
I bet Microsoft isn't allowed within ten miles of supercomputers.



posted on Apr, 10 2009 @ 08:57 PM
A more likely purpose for justifying all that computing power is to scan the flood of data from spy satellites. Imagine a satellite that could monitor the entire surface of the Earth down to pixel-size detail, and do it in ten different spectra: visible light, x-ray, neutron, gravitic, infrared, and so on.

Going over that data for what you want would be like sending a team to actually monitor the entire surface of the planet ten times over.

Weather monitoring likewise involves measuring various values at different altitudes and at different distances between samples.

It is all very much the same thing: gathering all the possible data and mining it for all the possible high-value information. I truly doubt that there is a separation of technology between surveillance and weather.

I have some small experience in that field, with those organizations.

I find it hard to celebrate. Faster, bigger, better real-time data acquisition and analysis has been going on for decades. The one thing the government never issued me was a sense of comfort from their efforts.



posted on Apr, 10 2009 @ 08:59 PM

Originally posted by Azador
I bet if they threw Vista at it, it wouldn't be so super.
I bet Microsoft isn't allowed within ten miles of supercomputers.

Well, first they would have to invent an OS that could take advantage of the kind of hardware a supercomputer has. They have hardly taken full advantage of even the x86 PC architecture yet.



posted on Apr, 10 2009 @ 10:40 PM
I think you are right on the money about the OS. I would take that line of reasoning even further: you need really advanced artificial intelligence, tied to really advanced neural-network computers, to analyze the data.

There is such a thing as a collection of data so large that a human mind cannot see enough of the patterns within it at one time to recognize that the patterns exist, or the patterns may simply be too complex.

You also want the logic to be as fast as the data analysis, or you could never keep up with the data flow.

It really makes you wonder what they have in their computer rooms.

I don't think it is all just so some government uber-weatherman can look at a 3D hologram of the Earth and make weather forecasts that are no better than they have ever been.



posted on Apr, 11 2009 @ 12:27 AM
The OS is most likely a Unix kernel.




will be tackling questions about climate change,


The most powerful computer in the world is being used to "tackle questions" about the biggest BS, NWO-contrived scare tactic in the world... interesting.



TheAssociate

Edit: grammar

[edit on 11-4-2009 by TheAssociate]



posted on Apr, 11 2009 @ 12:42 AM
The OP states that one of the problems that the big bad computer is trying to solve is the "structure of water".

I thought that we settled that question a long time ago? Unless there is something to water beyond its composition of H2O?

My Chemistry 101 class, way back when, showed the structure of the water molecule.



posted on Apr, 11 2009 @ 01:16 AM

Originally posted by sunny_2008ny

Science supercomputer tackles first questions


www.newscientist.com

In the real world, a newly built supercomputer that is the most powerful ever dedicated to science will be tackling questions about climate change, supernovas, and the structure of water.

Jaguar is located at the National Center for Computational Sciences (NCCS), part of Oak Ridge National Laboratory, Tennessee, and has a peak operating performance of 1.64 petaflops, meaning it can perform more than a million billion mathematical operations every second.
(visit the link for the full news article)
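
As a quick aside on that number: 1.64 petaflops really does mean 'more than a million billion' operations per second. Here is a back-of-the-envelope check in Python; the 1-gigaflop desktop figure below is just an assumed comparison point, not something from the article.

[code]
# Sanity check of the quoted '1.64 petaflops' figure.
peak_flops = 1.64e15            # 1.64 petaflops = 1.64 x 10^15 operations/second

million_billion = 1e6 * 1e9     # 'a million billion' = 1e15
print(peak_flops > million_billion)    # True -- the article's phrasing checks out

# For scale: reproducing one second of Jaguar's peak output on an assumed
# 1-gigaflop desktop PC would take about 1.64 million seconds.
pc_flops = 1e9
print(peak_flops / pc_flops / 86400, "days")   # roughly 19 days
[/code]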




The problem is not the power of the computer but the oldest problem with computers.

That is, garbage in, garbage out.

The answer depends on the data the computer is given, and if the data is bad, the answer will be bad.

If the input data is biased to prove climate change, the computer will answer with proof of climate change.

We already have proof that the data on global warming is biased by the researchers trying to prove it's real.
www.climateaudit.org...

If you feed the computer their data, what do you expect it to answer but that global warming is real?

Garbage in, garbage out.




posted on Apr, 11 2009 @ 08:58 AM
reply to post by lunarminer
 


I thought that we settled that question a long time ago? Unless there is something to water beyond its composition of H2O?

Water is one of the very, very few 'materials' which 'expands' when it freezes. The 'scientists' don't really understand why this is so. I'm sure they would like to find out why.



