
Giant Leap for Mankind...or World's Largest Ruse?


posted on Nov, 5 2018 @ 04:13 PM
a reply to: proximo

That's getting into optimal control theory, which does involve a lot of computing. A single problem can take a mathematician weeks to solve. The essential concept is that you define a starting and ending manifold, then establish a cost function that must be minimized using the Hamiltonian. It's quite complex, but it explains many things in robotic control... one of my projects is the development of a robotic base that is able to use optimal control theory to navigate. The exciting thing about that is that, unlike other robotic bases, this one would be able to determine a cost function based on the need to move quickly versus the need for caution, and adjust the Lagrangian term accordingly. By doing so, it would mimic the biological capacity for running versus walking.

For instance, if this base were deployed on, say, Mars, and it sensed no danger, it would amble easily, conserving energy. If confronted with sensory input that indicated some type of danger, say a sandstorm, it could quickly adjust its gait to run to nearby shelter. The cost of the speed would simply outweigh the cost of the energy. Available energy could also be taken into account, so it could run as fast as possible without exhausting its energy reserves too soon.
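The speed-versus-energy tradeoff described above can be sketched as a weighted cost function. This is a toy illustration only, not the actual controller: the weights, the quadratic energy model, and the brute-force search are all made-up stand-ins for the real Hamiltonian minimization.

```python
# Toy sketch of an optimal-control style tradeoff: pick a travel speed v
# minimizing J(v) = w_time * (distance / v) + w_energy * E(v), where energy
# use is assumed to grow roughly with v^2. The weights shift when danger
# is sensed. All numbers here are illustrative, not real rover parameters.

def cost(v, distance, w_time, w_energy):
    time_cost = distance / v   # slower travel costs more time
    energy_cost = v ** 2       # faster travel burns more energy
    return w_time * time_cost + w_energy * energy_cost

def best_speed(distance, w_time, w_energy, speeds):
    # Brute-force search over candidate speeds, standing in for solving
    # the minimization analytically via the Hamiltonian.
    return min(speeds, key=lambda v: cost(v, distance, w_time, w_energy))

speeds = [s / 10 for s in range(1, 51)]  # candidate speeds 0.1 .. 5.0 m/s

# No danger sensed: energy dominates the cost, so it ambles.
calm = best_speed(100, w_time=1, w_energy=10, speeds=speeds)

# Danger sensed: time dominates the cost, so it runs flat out.
danger = best_speed(100, w_time=50, w_energy=1, speeds=speeds)
```

Re-weighting the same cost function is what lets one controller produce both an energy-saving amble and a flat-out run.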

Of course, most of the Hamiltonian calculations would have to be pre-calculated and programmed in. How much, I'm not yet certain.

An additional use would be adjusting to damage. If a leg were damaged, the gait could be adjusted to operate with five legs instead of six, still maintaining the Lagrangian but adjusting the functions specifying energy requirements. In other words, it would learn to limp. And that, of course, has further applications in things like aircraft control, where damage to the body of the plane changes the aerodynamics, but the plane could adjust on the fly to still (hopefully) make a controlled landing.

A computer is still a computer, but the faster speed allows quantum computers to handle linear algebra much more easily, letting them consider more options and solutions.
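One way to see why qubit counts bear on linear algebra: an n-qubit state is a vector of 2^n complex amplitudes, so the size of the problem a classical machine must juggle doubles with every added qubit. A tiny size check (pure Python, no quantum library; the qubit counts are just examples):

```python
# An n-qubit quantum state is described by 2**n complex amplitudes.
# Classically simulating it means linear algebra on vectors of that size,
# which is why each additional qubit doubles the classical workload.

def state_vector_size(n_qubits):
    return 2 ** n_qubits

small = state_vector_size(1)    # one qubit: 2 amplitudes
medium = state_vector_size(18)  # 18 qubits: 262,144 amplitudes
large = state_vector_size(50)   # 50 qubits: ~1e15, already memory-bound
```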

Wikipedia has a good primer on optimal control

TheRedneck




posted on Nov, 5 2018 @ 05:16 PM
a reply to: chris_stibrany

I started to respond with "Well, not exactly...", but after I thought about it for a moment, yes, that's exactly why I posted my OP as I did. Hence the "ruse" comment.

What they're going to do is a test. So in that context, yes, the net result will be far slower, which is why they've qualified it the way they have. But that's not their point. To run the test, NASA will have to ship the data and the software to Google, who will then crunch whatever the data is, benchmark it, and then they and NASA will compare results against other methods. The comparison won't include the time it took to move the data and the software, only the time it took to compute. However, notice how they've constructed their argument...

They (Google) can't come to NASA (their stuff is too hard to move), and NASA can't come to them, so: use our service to bridge the gap, to get your data to us. (Flag #1) Then, if they like what they see, they can do it this way all the time, BUT (Flag #2), as chris_stibrany noted on ATS, this method is too slow. Soooo, they've got a deal you can't refuse; they'll come to you, build a replica of the HAL 6000 at your house, and crunch all the data for you. Of course, NASA will need to pay for this, but Google still owns the technology, including all the storage arrays (because it's integral to the processing), okay?

I think some are missing the forest for the trees here. Google is only in the data processing business because of the data they can process. In other words, Google is in the storage business, but as you've correctly observed, the classical differences between storage and compute get pretty murky when you get into supercomputing, because it's all about access times (on a technical level). Traditionally, there have always been data and compute, and then a connection between the two (a pipeline, if you will). As compute capability has increased, so has storage capability, but something else happened which wasn't so obvious (interestingly, this is why I've always said the Cloud was a ruse, but that's another thread). What happened was storage had to be physically moved closer and closer to the compute. It used to be that the two could be separated by scores of miles, but then they had to be in the same building. You start getting into supercompute and they have to be in the same room, then in the same cabinet, then in the same box, and before you know it, it's all become one.

Google is smart, and they see this. But they're looking at things from the value of the data, not the value of the technology leap. They want the data. Then they can mine it, parse it, and sell it, and with what they can't sell, they want to be so far in the middle of the owner's business that they can't be easily extracted or replaced.

So yes, it's all about the data.

Information is power! And data is King.

ETA - My 'not exactly' part at the beginning was going to address differences in the Cloud (i.e. there being lots of different flavors of the Cloud; there's Cloud compute, Cloud storage, Cloud networking, etc.)



posted on Nov, 5 2018 @ 05:43 PM
a reply to: Flyingclaydisk

Two developments are likely to decentralise quantum computing.

It's not so widely discussed, but the quantum internet is developing faster than quantum computing. In fact, I believe the last technical hurdles were overcome recently, so it's just a matter of engineering the infrastructure. Data could then be transferred while maintaining its quantum state.

The other is less certain. It is not at all clear at this stage which technology will prevail, but several labs have taken promising steps in the development of room temperature quantum computer chips.



posted on Nov, 5 2018 @ 06:30 PM
a reply to: Flyingclaydisk

The goog is NSA

What else would you expect to happen?



posted on Nov, 5 2018 @ 06:38 PM
This would be remarkable, especially considering 18 qubits was achieved only this July, and even that is far from a working computer.

phys.org...



posted on Nov, 5 2018 @ 06:45 PM
Are they trying to create a Rickymouse?


This computer won't be good at telling jokes, just like me.



posted on Nov, 5 2018 @ 06:45 PM
a reply to: EvilAxis

Quantum internet is nothing more than de-centralized networking. Compute is a far different matter. Super-compute is different still.

I've actually been inside data centers from MS, Google, Amazon, and even the IRS (which has a G component). I've never been inside a supercompute center (and most people will never be allowed in one either). The amount of energy these guys use per square foot is staggering! I've seen upwards of 30 kW per rack, which is what?... about 2.5 kW per square foot??? Can you believe that??? It's HUGE!!

Put 100 racks in a data center and you have THREE MEGAWATTS!! We put in 1 MW generators on a regular basis. For one of our "little" data centers we have two (2) 1 MW generators running in parallel, fully redundant, with transfer switching and 100% UPS backup for the transition. A nuke might take it out, but nothing less.
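The arithmetic in those figures hangs together. As a back-of-envelope sketch (the ~12 sq ft rack footprint is an assumption, implied by 30 kW per rack over 2.5 kW per square foot):

```python
# Sanity check of the data-center power figures quoted above.
rack_kw = 30          # observed per-rack draw
sqft_per_rack = 12    # assumed rack footprint (aisle space included)

kw_per_sqft = rack_kw / sqft_per_rack   # power density per square foot
racks = 100
total_mw = racks * rack_kw / 1000       # 100 racks, in megawatts
```

At 3 MW for 100 racks, a pair of paralleled 1 MW generators covers only a third of that load, which is why per-rack density is the number that matters when sizing backup power.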



posted on Nov, 5 2018 @ 06:47 PM
a reply to: zardust

BINGO!!

It's a symbiotic relationship!

Anyone who can't see this is asleep!



posted on Nov, 5 2018 @ 10:47 PM
a reply to: Flyingclaydisk
I would not worry to much. Technology has not come all that far, or will go all the much farther. Really the only thing higher output supercomputers seems to have done is just get bogged down with more endless spyware. Which is why windows 3.1 is a dozen times more efficient then windows 10.

Computers may be faster and better, but there is also a ton more crap # they're processing, making things go absolutely nowhere.

I think this will end up like that giant leap for mankind that happened in 1969. That is, in a hundred years or so, people may start wondering if anything is going anywhere, and where the moon rocks are at.



posted on Nov, 6 2018 @ 07:55 AM
Maybe I'm missing the point here, but isn't most supercomputing time accessed remotely via the cloud? When people today want to buy time on a supercomputer, they don't actually go to that supercomputer anymore, do they?

How is this different? If it isn't different, then what's the issue?



posted on Nov, 6 2018 @ 08:21 AM
a reply to: proximo

Seems quantum annealing is the way they get results from the QC... very interesting indeed.
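For readers unfamiliar with the term: an annealer returns low-energy bit assignments for an objective of the form E(x) = Σᵢ aᵢxᵢ + Σᵢ<ⱼ bᵢⱼxᵢxⱼ (a QUBO). The sketch below uses classical brute force as a stand-in for the annealer on a three-bit toy problem; the coefficients are made up, and a real annealer (e.g. D-Wave's hardware) samples such minima physically rather than enumerating them.

```python
from itertools import product

# Toy QUBO objective: E(x) = sum_i a_i * x_i + sum_{i<j} b_ij * x_i * x_j,
# with x_i in {0, 1}. An annealer would return low-energy assignments;
# here we simply enumerate all 2**3 bit strings.

linear = {0: -1.5, 1: -1.0, 2: 2.0}      # per-bit terms a_i (made up)
quadratic = {(0, 1): 2.0, (1, 2): -1.0}  # coupling terms b_ij (made up)

def energy(bits):
    e = sum(linear[i] * bits[i] for i in linear)
    e += sum(quadratic[(i, j)] * bits[i] * bits[j] for (i, j) in quadratic)
    return e

# Brute-force ground-state search, standing in for the annealer.
best = min(product((0, 1), repeat=3), key=energy)
```

Brute force works only for a handful of bits (the search space doubles per bit), which is exactly the regime where annealing hardware is pitched as the shortcut.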



posted on Nov, 6 2018 @ 08:24 AM

originally posted by: chris_stibrany
a reply to: Flyingclaydisk

Call me stupid, but isn't the Cloud just 'file sharing on the net'? If that's so, wouldn't putting the Cloud, whatever that is, as an intermediary to move the data from the new chip to NASA be pointless? In other words, wouldn't it be far, far slower than real-time transfer of the data from the chip to the hard drive...
Did you read the post by TEOTWAWKIAIFF before yours? I thought it provided an interesting perspective on this question.

www.abovetopsecret.com...

a reply to: Box of Rain
This doesn't sound like just buying time on a supercomputer; it's more like testing a new technology. Again, see the response by TEOTWAWKIAIFF describing the experience of researchers trying to access IBM's computer over the net. It will be interesting to see if NASA's project evolves similarly, with them wanting to set up on-site if they find the internet is too slow.





posted on Nov, 6 2018 @ 06:03 PM

originally posted by: Flyingclaydisk
a reply to: EvilAxis

Quantum internet is nothing more than de-centralized networking. Compute is a far different matter. Super-compute is different even still.


Not sure what you mean by 'super-compute'. I thought we were discussing quantum computing. The quantum internet is nothing less than de-centralized networking, i.e. it facilitates decentralised quantum computing when such computers come online.


