
Google plans 'watershed' quantum computing announcement in December

page: 2

posted on Nov, 17 2015 @ 01:38 PM
a reply to: Bedlam

That's exactly right. Remember: the way we classify algorithms by their degree of difficulty and scaling properties on a classical (Turing-equivalent) computer, e.g. linear, polynomial, sub-exponential, or exponential, is entirely different on a quantum computer.

In fact, there isn't just one kind of "quantum computer" yet, in the sense that most conventional computers in typical use are theoretically equivalent to paralleled Turing machines in computational capability.

There are multiple kinds of quantum computation (which, ironically, rely on a near-continuum of wavefunction values, versus the discrete levels in a classical computer), and different ways various problems map onto them.

The equivalence classes we know now are busted: problems that are all equivalently exponential or polynomial for classical algorithms may behave rather differently across the various quantum computing classes.

Next the question becomes whether some clever mathematician can map certain code-breaking problems into discrete optimization (Sudoku-like?) problems. If you can reduce a problem from a hypothetical full quantum computer algorithm (which we don't have in practical use) to a D-Wave-compatible problem... you'll either get a great bonus or polonium in your tea.

A 'watershed' event would be some results from Google (almost surely not on encryption) which map a hard problem into a D-wave compatible problem, and then solve it.
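To give a rough sense of what a "D-Wave compatible" formulation looks like: problems are typically expressed as QUBO (quadratic unconstrained binary optimization) instances, where the answer is the bit-string minimizing a quadratic energy function. A minimal sketch (illustrative only, not D-Wave's actual API): the toy constraint "exactly one of three bits is 1" becomes the penalty (x0 + x1 + x2 - 1)^2, and we brute-force the tiny search space.

```python
# Illustrative QUBO sketch (not D-Wave's actual API). The constraint
# "exactly one of x0, x1, x2 is 1" becomes the penalty
# (x0 + x1 + x2 - 1)^2, whose minima are exactly the valid assignments.
from itertools import product

def qubo_energy(x):
    # Expand (x0 + x1 + x2 - 1)^2 using x_i^2 == x_i for binary bits:
    # energy = -sum(x_i) + 2 * sum_{i<j} x_i x_j + 1
    s = sum(x)
    pairs = x[0] * x[1] + x[0] * x[2] + x[1] * x[2]
    return -s + 2 * pairs + 1

# Brute-force the 2^3 search space; an annealer would minimize
# the same energy function in hardware instead of enumerating.
best = min(product((0, 1), repeat=3), key=qubo_energy)
print(best, qubo_energy(best))  # (0, 0, 1) 0 -- a one-hot vector, energy 0
```

On real hardware the same energy function would be handed to the annealer rather than enumerated; the hard part, as noted above, is the reduction itself.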






posted on Nov, 17 2015 @ 03:17 PM

originally posted by: Bedlam
The bigger issue is that the D-Wave isn't properly a quantum computer. It's a hardware implementation of simulated annealing. That's useful for some problems, but it's not actually going to be doing a lot of code breaking.


This is just not the case and I'm sure you got this from a Wired article that was actually pretty good because it looked at both sides of the argument.

First off, these are the first quantum computers, so they're not going to be more efficient than a classical computer in every way. Things have to be tweaked, as with any new technology. This is why the 1,000-qubit D-Wave that Google, NASA and Lockheed Martin just got does some things differently than the 512-qubit computer.

So the question is: does it perform faster than a classical computer in some areas, and is it exploiting quantum behavior to carry out these calculations? For both questions the answer is yes. This is why Google, NASA and Lockheed want the 1,000-qubit D-Wave.

It's interesting to note that Google is using this in their artificial intelligence research, and they're getting results they couldn't get with a classical computer.

At Lockheed Martin, Greg Tallant has found that some problems run faster on the D-Wave and some don’t. At Google, Neven has run over 500,000 problems on his D-Wave and finds the same. He’s used the D-Wave to train image-recognizing algorithms for mobile phones that are more efficient than any before. He produced a car-recognition algorithm better than anything he could do on a regular silicon machine. He’s also working on a way for Google Glass to detect when you’re winking (on purpose) and snap a picture. “When surgeons go into surgery they have many scalpels, a big one, a small one,” he says. “You have to think of quantum optimization as the sharp scalpel—the specific tool.”

www.wired.com...

The problem here is that people don't know how this technology will look in the end, so it's all trial and error. Here's more from Lockheed.


Ned Allen sent D-Wave a sample problem to run on its system. It was a 30-year-old chunk of code from an F-16 aircraft with an error that took Lockheed Martin’s best engineers several months to find. Just six weeks after sending it to D-Wave, the software error was identified.

In late 2010 Lockheed Martin became the first D-Wave customer. Their D-Wave One system, which is the first commercially available quantum computer in the world, was installed at USC’s Information Sciences Institute so that they could explore its potential.


www.dwavesys.com...



Again, these are some of the first quantum computers. With technology there's always trial and error at first. For instance, here's one of the first early cars.



Now, this car wasn't more efficient than a horse and buggy, but nobody will say it wasn't a car. These are the first steps, and it doesn't make sense to say "this doesn't do everything faster than a classical computer, so it's not a quantum computer." Look at the first airplane flight.


Near Kitty Hawk, North Carolina, Orville and Wilbur Wright make the first successful flight in history of a self-propelled, heavier-than-air aircraft. Orville piloted the gasoline-powered, propeller-driven biplane, which stayed aloft for 12 seconds and covered 120 feet on its inaugural flight


Now would you say this isn't an airplane because it wasn't more efficient than riding a train? Of course not.

The truth is, everything from single-qubit quantum computers to the 1,000-qubit quantum computers from D-Wave will be part of the history of quantum computers, just like the Kitty Hawk flight is part of the history of airplanes.

The thing with quantum computers is that with very few qubits the search space exceeds the number of particles in the entire universe. Here's more:


D-Wave Systems Inc., the world's first quantum computing company, today announced that it has broken the 1000 qubit barrier, developing a processor about double the size of D-Wave’s previous generation and far exceeding the number of qubits ever developed by D-Wave or any other quantum effort. This is a major technological and scientific achievement that will allow significantly more complex computational problems to be solved than was possible on any previous quantum computer.

D-Wave’s quantum computer runs a quantum annealing algorithm to find the lowest points, corresponding to optimal or near optimal solutions, in a virtual “energy landscape.” Every additional qubit doubles the search space of the processor. At 1000 qubits, the new processor considers 2^1000 possibilities simultaneously, a search space which dwarfs the 2^512 possibilities available to the 512-qubit D-Wave Two. In fact, the new search space contains far more possibilities than there are particles in the observable universe.


www.dwavesys.com...
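The "energy landscape" search described in the quote can be illustrated with its classical cousin, simulated annealing. This is a sketch only: D-Wave's hardware performs quantum annealing, not this loop, and the toy energy function here is made up purely for illustration.

```python
# Classical simulated-annealing sketch of an "energy landscape" search
# (illustrative only -- D-Wave's hardware does quantum annealing, not this).
# We search 16-bit strings for the lowest-energy state of a toy landscape.
import math
import random

random.seed(0)
N = 16

def energy(bits):
    # Toy landscape: penalize neighboring bits that agree, so the
    # ground state is the alternating string 0101... (or 1010...).
    return sum(1 for i in range(N - 1) if bits[i] == bits[i + 1])

state = [random.randint(0, 1) for _ in range(N)]
temp = 2.0
while temp > 0.01:
    # Propose flipping one random bit.
    i = random.randrange(N)
    candidate = state[:]
    candidate[i] ^= 1
    delta = energy(candidate) - energy(state)
    # Accept downhill moves always; uphill moves with Boltzmann probability,
    # which shrinks as the temperature is lowered.
    if delta <= 0 or random.random() < math.exp(-delta / temp):
        state = candidate
    temp *= 0.999  # slow cooling schedule

print(energy(state))  # close to 0 (the alternating ground state)
```

The quantum version explores the same kind of landscape, but uses quantum tunneling rather than thermal jumps to escape local minima.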

What changes did they make going from 512 qubits to 1,000 qubits?


Lower Operating Temperature: While the previous generation processor ran at a temperature close to absolute zero, the new processor runs 40% colder. The lower operating temperature enhances the importance of quantum effects, which increases the ability to discriminate the best result from a collection of good candidates.

Reduced Noise: Through a combination of improved design, architectural enhancements and materials changes, noise levels have been reduced by 50% in comparison to the previous generation. The lower noise environment enhances problem-solving performance while boosting reliability and stability.

Increased Control Circuitry Precision: In the testing to date, the increased precision coupled with the noise reduction has demonstrated improved precision by up to 40%. To accomplish both while also improving manufacturing yield is a significant achievement.

Advanced Fabrication: The new processors comprise over 128,000 Josephson junctions (tunnel junctions with superconducting electrodes) in a 6-metal layer planar process with 0.25μm features, believed to be the most complex superconductor integrated circuits ever built.

New Modes of Use: The new technology expands the boundaries of ways to exploit quantum resources. In addition to performing discrete optimization like its predecessor, firmware and software upgrades will make it easier to use the system for sampling applications.


This is why you're getting this announcement from Google. While people are saying this isn't a quantum computer, Google and others are using it as a quantum computer to enhance their research into artificial intelligence.

I'll leave you with a couple of videos:






posted on Nov, 17 2015 @ 03:26 PM
a reply to: neoholographic

I'll be amazed when quantum computing can cure cancer, eliminate most disease and create a plethora of new materials.



posted on Nov, 17 2015 @ 03:33 PM

originally posted by: Freenrgy2
a reply to: neoholographic

I'll be amazed when quantum computing can cure cancer, eliminate most disease and create a plethora of new materials.


That makes no sense.

That's like saying, in 1885, "I will not be amazed until a car can go 0 to 60 in 5 seconds," or, in 1903, "I will not be amazed until an airplane can fly around the world."

Technology doesn't work this way. You're not going to build the first quantum computers and expect them to be a panacea to cure cancer and every other disease. That's just not logical.



posted on Nov, 17 2015 @ 04:05 PM
a reply to: neoholographic

I am pretty curious about that claim of having more computing power than the entire universe.
That seems like a really dumb claim to me.

How the **** could anyone know that? We are not even sure about other life forms across the nearest galaxies...

On the other hand, I'm pretty sure there are already working quantum computers that are faster than whatever they will announce on Dec 8th. It's just that our handlers think we are not yet ready for this.

Peace out



posted on Nov, 17 2015 @ 08:17 PM
Faster than the universe...still used to watch porn...



posted on Nov, 17 2015 @ 08:59 PM

originally posted by: neoholographic
This is just not the case and I'm sure you got this from a Wired article that was actually pretty good because it looked at both sides of the argument.


Actually, no. I never read the article. I understand what they're doing, and it's an annealer. As your cite goes on to confirm.

Your Wired article was dated 2014, right? Here's me commenting on it in 2013...

not from wired!

Not that you can't use annealing in some processing. But it's not going to run many 'classic' quantum computational processes.

It can't do Grover's Algorithm, for example.
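For reference, Grover's algorithm is a gate-model procedure that quadratically speeds up unstructured search by amplifying the amplitude of a marked item. A tiny classical statevector simulation (illustrative only, n = 3 qubits) shows the coherent amplitude bookkeeping an annealer has no way to perform:

```python
# Toy statevector simulation of Grover's algorithm for n = 3 qubits
# (8 basis states) and a single marked item. Illustrative only: real
# Grover runs on gate-model quantum hardware, not an annealer.
import math

n = 3
N = 2 ** n
marked = 5  # the item the oracle recognizes

# Start in the uniform superposition over all N states.
amps = [1 / math.sqrt(N)] * N

# Optimal number of Grover iterations is about (pi/4) * sqrt(N).
iterations = round(math.pi / 4 * math.sqrt(N))
for _ in range(iterations):
    # Oracle: flip the sign of the marked state's amplitude.
    amps[marked] = -amps[marked]
    # Diffusion operator: reflect every amplitude about the mean.
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]

probs = [a * a for a in amps]
print(max(range(N), key=probs.__getitem__))  # prints 5: the marked state dominates
```

After just 2 iterations the marked state carries over 94% of the probability, versus the 1/8 a uniform random guess would give.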



posted on Nov, 17 2015 @ 11:28 PM
a reply to: Bedlam

This is kind of silly.

It's a quantum computer in its infancy. The first cars weren't Jaguar F-Types, so of course this computer can't do everything at this point. That's why they're updating it; this is what happens with new technology.

The computer keeps getting better and faster and this is what Google will probably announce.

This is why Google, NASA, Lockheed and others are working with their 1,000-qubit computer. This is why they have raised over $123 million from people like Jeff Bezos and Goldman Sachs, and why they just closed a $30 million round of financing.

None of this would have happened if they didn't have a quantum computer. As I pointed out earlier, Google and Lockheed talked about the areas that run much faster than on classical computers. This is how technology works.


D-Wave Systems has broken the quantum computing 1000 qubit barrier, developing a processor about double the size of D-Wave’s previous generation, and far exceeding the number of qubits ever developed by D-Wave or any other quantum effort, the announcement said.

It will allow “significantly more complex computational problems to be solved than was possible on any previous quantum computer.”

At 1000 qubits, the new processor considers 2^1000 possibilities simultaneously, a search space which dwarfs the 2^512 possibilities available to the 512-qubit D-Wave Two. “In fact, the new search space contains far more possibilities than there are particles in the observable universe.”
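The quoted figures are easy to sanity-check. A commonly cited estimate puts the number of particles in the observable universe at around 10^80, and a quick bit of arithmetic shows how thoroughly 2^1000 dwarfs that:

```python
# Quick check of the quoted numbers: each added qubit doubles the
# search space, so 1000 qubits index 2^1000 states versus 2^512 for
# the 512-qubit D-Wave Two. The ~10^80 particle count for the
# observable universe is a commonly cited estimate.
import math

def digits(n_qubits):
    # Number of decimal digits in 2^n_qubits.
    return math.floor(n_qubits * math.log10(2)) + 1

print(digits(512))           # 155: 2^512 has 155 decimal digits
print(digits(1000))          # 302: 2^1000 has 302 decimal digits
print(2 ** 1000 > 10 ** 80)  # True: far beyond the particle estimate
```

Note this is the size of the search space being indexed, not a claim about raw "computing power"; whether the annealer explores it usefully is exactly what's being debated in this thread.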


Again, this is why Google, Lockheed, Los Alamos and NASA are getting these quantum computers.


Greg Tallant, who leads USC’s Lockheed Martin Quantum Computation Center, said in a statement the defence contractor would use the new system to “address the real-world problems being faced by our customers.”

D-Wave has also sold the new 1,000 qubit system, which hit the market in August, to an artificial intelligence lab operated by Google, NASA and the Universities Space Research Association.

It also secured a deal last week with the Los Alamos National Laboratory to help it further its research into high-performance computing.


Why buy a quantum computer to address real-world problems if it's not a quantum computer, LOL? Just get some really powerful classical computers and solve these real-world problems.



posted on Nov, 17 2015 @ 11:43 PM
a reply to: darkbake

Well, it depends on what you call a quantum computer. D-Wave's machine, while quite possibly revolutionary for solving certain problems, is not what most scholars consider a quantum computer. It is quantum, but it can only solve certain kinds of problems, and actually one of the biggest issues right now is programming it. Basically it solves math equations, and the math involved is very complex.

I have been following this company for years, and am convinced they are for real and that they will revolutionize certain things, but it will be in helping to develop software, possibly including AI, and in visual and speech recognition improvements.

But consumers will never directly own one of these.



posted on Nov, 18 2015 @ 12:45 AM

originally posted by: neoholographic
Why buy a quantum computer to address real world problems if it's not a quantum computer LOL??Just get you some real powerful classical computers and solve these real world problems.


Because your problems are amenable to solution by a hardware annealing simulator?



posted on Nov, 18 2015 @ 05:54 AM
Good.

Humans are defunct occupants of this planet, and the sooner A.I. takes over, the better.



posted on Nov, 18 2015 @ 12:00 PM
a reply to: neoholographic

Don't hold your breath Neo, you might suffocate.

A few years ago I read about molecular computation using things called oxetanes and catenanes, and as I remember it could replace binary, but as far as I know it never came to fruition.

Quantum computation could be incredibly hard to achieve, let alone perfect. I wouldn't get too excited.



posted on Nov, 18 2015 @ 12:13 PM
I want a quantum computer in every cell phone! Then and only then will I be a little impressed. . . . .



posted on Nov, 18 2015 @ 02:42 PM

originally posted by: Phage
a reply to: VoidHawk
Actually, with binary coding, the 8080 did pretty well.
A real pain doing the input for the code. And of course, the cassette deck...that was another matter.

pastraiser.com...


You are correct, it did do very well! So well that it caught me out on my first ever bit of coding. I wrote a bit of code to detect input on the arrow keys, which moved a character about the screen. However, the character would only appear on the four edges of the screen!
Any ideas why?
I'd done this many times using Sinclair's BASIC without problem.

ok I'll tell you


Written in machine code, it was too fast to be able to see it! That was one of those WOW!!! moments. I was hooked.


Your comment "A real pain doing the input for the code": I'm wondering what you were using the chips for? I used to input huge strings of hex via BASIC.

Loading via cassette: I managed to speed that up! The cassette deck I had was stereo, so I broke the code in half and fed one stream to each channel when recording, rebuilding it when loading. Obviously that only worked with my own stuff, though.



posted on Nov, 18 2015 @ 02:45 PM

originally posted by: Trillium

it has a 16k extra plug-in module


You'll know all about RAM Pack Wobble then!



posted on Nov, 19 2015 @ 12:35 AM
a reply to: VoidHawk



Written in machine code, it was too fast to be able to see it! That was one of those WOW!!! moments. I was hooked.

I got into "star fields". Left to right, up and down. That was pretty easy bit manipulation.
The expanding field from a central point, that got trickier.


I used to input huge strings of hex via the basic.
Yup. Poke...poke...poke. Goddam membrane keyboard.






posted on Nov, 19 2015 @ 12:45 AM

originally posted by: FormOfTheLord
I want a quantum computer in every cell phone! Then and only then will I be a little impressed. . . . .


Quantum computing isn't sequential processing like an Arm core; it's more for number crunching.



posted on Nov, 19 2015 @ 02:16 AM

originally posted by: FormOfTheLord
I want a quantum computer in every cell phone! Then and only then will I be a little impressed. . . . .



originally posted by: Bedlam
Quantum computing isn't sequential processing like an Arm core, it's more for number crunching.

Don't the quantum computers run a few degrees above absolute zero? By the time you got the cooling and insulation in place, I think it would have to be even larger than the old "bricks" people used to carry around for portable phones, and probably a lot larger than that, unless there's some cooling and insulating tech I'm not familiar with. You might fit that cooling and insulation in a cell phone of this size:


Probably not what you had in mind for your next cell phone, eh FormOfTheLord?



posted on Nov, 23 2015 @ 06:18 PM
There have been great advancements in on-chip cryogenics that will probably be the solution for hand-held quantum devices in the future. Still in their infancy, these "cryogenic wells" have already reached extremely low temperatures. Not quite absolute zero (−273.15 °C), but getting there.


Researchers at UT Arlington have created the first electronic device that can cool electrons to -228 degrees Celsius (-375F), without any kind of external cooling. The chip itself remains at room temperature, while a quantum well within the device cools the electrons down to cryogenic temperatures.


Cryogenic Quantum on-chip cooling



posted on Nov, 23 2015 @ 06:32 PM
Google to announce the technological singularity happened when their Quantum computer started improving itself ?


