
Future of computing..

posted on Feb, 24 2005 @ 01:50 AM
OK, so this is how it is. According to Moore's Law, computing speeds will double every 18 months, and this will go on for another 20 years. THEN we will probably have a situation where silicon chips are DIRT cheap, like a few pence a chip, as opposed to $600+ for a top-of-the-line FX-55 processor at the moment.

Moore's Law: "Moore observed an exponential growth in the number of transistors per integrated circuit and predicted that this trend would continue."
www.intel.com...
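
As a rough sketch of that compounding (assuming the 18-month doubling period and the 20-year horizon quoted above; the numbers are only illustrative):

```python
# Minimal sketch of Moore's Law compounding, assuming an 18-month doubling
# period and a 20-year horizon (both figures taken from the post above).

def moores_law_factor(years: float, doubling_months: float = 18.0) -> float:
    """Return the growth factor after `years` of doubling every `doubling_months`."""
    doublings = (years * 12.0) / doubling_months
    return 2.0 ** doublings

if __name__ == "__main__":
    for years in (1.5, 5, 10, 20):
        print(f"after {years:>4} years: ~{moores_law_factor(years):,.0f}x")
    # after 20 years: ~10,321x (roughly 13 doublings)
```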

Silicon will be obsolete, and I'm guessing it's from there that we will move on to quantum computers. I heard they have made a prototype quantum computer, but not a true one.

The quantum computer relies on the properties of how quanta behave to carry out calculations, and whilst currently we have computing info measured in bits, in a quantum computer it will be measured in qubits. OK, now I'm confused; basically Wikipedia says:

A qubit (quantum + bit; pronounced /kyoobit/) is a unit of quantum information. That information is described by a state in a 2-level quantum mechanical system, whose two basic states are conventionally labelled |0⟩ and |1⟩ (pronounced: ket 0 and ket 1). A pure qubit state is a linear quantum superposition of those two states. This is significantly different from the state of a classical bit, which can only take the value 0 or 1.

en.wikipedia.org...

I would like it if someone could help explain this to me! Thanks.
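
Not an authoritative explanation, but here is a minimal numerical sketch of what the quote describes: a qubit's state is a pair of complex amplitudes over |0⟩ and |1⟩, and measuring it gives 0 or 1 with probabilities equal to the squared magnitudes of those amplitudes.

```python
# Minimal sketch of a single qubit as a 2-component complex state vector.
# This only illustrates the "superposition of |0> and |1>" idea from the
# Wikipedia quote above; it is not a real quantum simulator.
import numpy as np

ket0 = np.array([1.0, 0.0], dtype=complex)   # |0>
ket1 = np.array([0.0, 1.0], dtype=complex)   # |1>

# An equal superposition: (|0> + |1>) / sqrt(2)
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
p0, p1 = np.abs(psi) ** 2
print(f"P(measure 0) = {p0:.2f}, P(measure 1) = {p1:.2f}")   # 0.50 and 0.50

# Simulate a few measurements: each one comes out as a definite 0 or 1.
rng = np.random.default_rng(0)
print(rng.choice([0, 1], size=10, p=[p0, p1]))
```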

Anyway, I can also imagine that when the silicon chip stops improving and becomes obsolete, if quantum computers haven't come of age, it could have grave economic consequences!

One last thing: this site has a really interesting roadmap for processor (silicon) development going right up to 2006 and even 20xx, for anyone wondering when is the right time to upgrade or buy a new PC/notebook.
Of course, under Moore's Law, no matter when you buy, it will be outdated in 18 months, lol.

freespace.virgin.net...

thanks for reading



posted on Feb, 25 2005 @ 10:37 PM
Moore's law does say this, but have we doubled in speed since September 2003? I think not; that's around when we got our computer, and it's still about the average speed of everyone else's. It's a Pentium 4 2.67 GHz. The top speed we have now for a home or office PC, besides something like a multi-processor Xeon, is around 3.6, maybe 4.0 GHz with all the nicest parts and a lot of overclocking, and that's not nearly double even mine. So as for Moore's law, the only thing he really has right is that they will get faster, and that's a given.

Also, a quantum computer would be fast, but wouldn't it also conflict with some laws if it got too fast, like the unwritten law against creating your own supernova in your bedroom? I don't think you're allowed to do that. What if the computer got so fast that it collapsed into a black hole or something, and then what? LOL, just playing, but they couldn't possibly get any faster than they will be by, say, 2020, could they?



posted on Feb, 26 2005 @ 01:31 AM
That 2.67 you have runs at 533 FSB... the new Intels run at 1066 FSB and some of the new AMDs run at 1600 FSB (FSB = front side bus).
Intel

AMD
The silicon is really limiting the GHz levels; new materials should make things interesting in the coming CPU generations.

Soooo, technically they doubled and then some, I think.
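
A quick sanity check of those bus figures (using the speeds quoted above; FSB speed is only a rough proxy for overall performance):

```python
# Rough sanity check of the FSB figures quoted above. Note that front-side-bus
# speed is only a loose proxy for overall CPU performance.
old_fsb = 533          # MHz, the Pentium 4 2.67 GHz system mentioned above
new_intel_fsb = 1066   # MHz
new_amd_fsb = 1600     # MHz (effective figure quoted above)

print(f"Intel: {new_intel_fsb / old_fsb:.2f}x the old bus speed")   # 2.00x
print(f"AMD:   {new_amd_fsb / old_fsb:.2f}x the old bus speed")     # 3.00x
```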


apc

posted on Feb, 26 2005 @ 01:59 AM
Don't forget when they thought more than 500 MHz was impossible because they couldn't fit enough transistors onto the chip, even at 0.18 micron. They just made the chips bigger.



posted on Feb, 26 2005 @ 02:01 AM
In addition to the FSB, both IBM and AMD have just announced they are beginning production on their dual-core 64-bit processors.

www.eet.com...

www.technewsworld.com...



posted on Feb, 26 2005 @ 04:44 PM
Well, silicon will be dropped because of heat limits. It has been hinted that diamond is the next silicon, due to its amazing heat resistance and the fact that we're beginning to fabricate 100% pure diamonds. Also, don't forget about Cell chips. I'm putting my money on those as the next huge leap in CPU performance. The PS3 is clocked at 4 GHz. Cell is going to be the first to give 16-teraflop computer cabinets and petaflop server rooms (according to the article, bringing the power for AI within reach... Singularity?).
www.theregister.co.uk...
www.wired.com...
An 81 GHz semiconductor, anyone? www.geek.com...



Dual-core CPUs are neat, but they're unlikely to boost performance in areas that wouldn't be boosted by a dual-CPU setup. So gamers and everyday users won't get much bang for their buck unless you plan on playing HL2 and encoding your entire MP3 collection at the same time... unlikely. But if you're into folding, then this is your baby.

If you want to talk about Moore's law and the "fiery end of our world because of evil tech", I suggest you look here: yudkowsky.net... It's far-fetched for me, but who knows what people on this board will believe.

CPUs are limited by hard drives: your RAM is working over 1000x faster than your hard drive, creating the biggest bottleneck in your system. I say we move to flash drives : )
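
To put that bottleneck claim in perspective, here is a back-of-the-envelope comparison using ballpark latency figures I'm assuming (tens of nanoseconds for DRAM, several milliseconds for a desktop hard-drive seek), not measured numbers:

```python
# Back-of-the-envelope latency comparison. The figures below are rough
# assumptions (order-of-magnitude ballpark), not measurements.
dram_latency_ns = 60             # ~tens of nanoseconds for a DRAM access
hdd_seek_latency_ns = 8_000_000  # ~8 ms average seek for a 2005-era hard drive

ratio = hdd_seek_latency_ns / dram_latency_ns
print(f"A hard-drive seek is roughly {ratio:,.0f}x slower than a DRAM access")
# => on these assumptions, well over the 1000x quoted above
```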



posted on Feb, 27 2005 @ 11:03 AM

Originally posted by Civil44
Also, don't forget about Cell chips. I'm putting my money on those as the next huge leap in CPU performance. The PS3 is clocked at 4 GHz. Cell is going to be the first to give 16-teraflop computer cabinets and peta
...snip...


Careful there... just go back and look at the promised specs for the PS2, and then look at the reality. Also, just because a CPU is capable of a huge number of operations per clock doesn't mean that you can write code that utilizes them.

The only way to achieve those kinds of numbers is with massive amounts of parallelism in the code. Sure, you can push polygons like mad in parallel, but doing physics is another matter entirely.

It's all about IPC (instructions per clock). Today's best CPUs don't average much above about 1.7, even though most are capable of 4 or more. In short, while it may be capable of some ungodly numbers, it remains to be seen whether they can be realized. The PS2 Emotion Engine, for example, is a nightmare to program for and its potential has never been truly tapped.
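
As an illustration of that gap, using the 1.7 achieved vs. 4 peak IPC figures from this post and an assumed 3.6 GHz clock purely as an example:

```python
# Illustration of the gap between peak and achieved throughput, using the
# IPC figures from the post above and an assumed 3.6 GHz clock as an example.
clock_ghz = 3.6        # example clock speed (assumption for illustration)
peak_ipc = 4.0         # what the core can issue per cycle (from the post)
achieved_ipc = 1.7     # what real code averages (from the post)

peak_gips = clock_ghz * peak_ipc          # billions of instructions/sec, peak
achieved_gips = clock_ghz * achieved_ipc  # billions of instructions/sec, typical

print(f"peak:     {peak_gips:.1f} G instructions/s")
print(f"achieved: {achieved_gips:.1f} G instructions/s "
      f"({achieved_gips / peak_gips:.1%} of peak)")   # well under half of peak
```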

Osiris (ex-cpu designer)



posted on Feb, 27 2005 @ 07:21 PM
True, good points, and I didn't even think of the IPC. So after some research, this is what I came up with; basically a copy-paste from the forum where I got it. I know forums aren't the best places to list sources, but whatever, he seems to know what he's talking about and I know most of what he's saying to be true.

The PS3 is going to have nine 64-bit cores which will operate at 4.5 GHz. Of the nine, there will be one main core that controls the other eight. XDR memory will be used for the RAM.

The AMD K7s had an IPC of 9. One Cell SPE = one 1 GHz Athlon. How did he get that? Performance = clock x IPC. SPE performance = 4.5 GHz x 2 = 9 GHz. Athlon 1 GHz performance = 1 GHz x 9 = 9 GHz. 9 GHz SPE = 9 GHz Athlon.

So there are 8 SPEs, but today's games only use 1 SPE/core, so playing Doom 3 on a PS3 will be like playing it on a 1 GHz Athlon.

However, the PS3 has 32 SPEs and 256 MB of XDR RAM (wtf is that? You'd need 512+ I'd think, in this day and age). If all cores were used, it'd be the same as 32 1 GHz Athlons with 8 MB of RAM each. RAM might be a problem, but they're not going to use all 32 SPEs right off the bat.
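
As a rough sketch of the arithmetic above (taking the quoted forum figures at face value; whether they're accurate is another question):

```python
# Rough sketch of the "clock x IPC" equivalence argued above, taking the
# quoted figures at face value (they may well be inaccurate).

def effective_perf(clock_ghz: float, ipc: float) -> float:
    """Crude effective performance figure: clock (GHz) times IPC."""
    return clock_ghz * ipc

spe = effective_perf(clock_ghz=4.5, ipc=2)      # one Cell SPE, per the post
athlon = effective_perf(clock_ghz=1.0, ipc=9)   # 1 GHz Athlon (K7), per the post

print(f"one SPE ~ {spe}, one 1 GHz Athlon ~ {athlon}")   # 9.0 vs 9.0

# If a game only uses one SPE, the PS3 behaves like one 1 GHz Athlon; if it
# could use all of the quoted SPEs, the equivalence scales accordingly.
for n_spes in (1, 8, 32):
    print(f"{n_spes:>2} SPEs ~ {n_spes * spe / athlon:.0f}x a 1 GHz Athlon")
```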

Now, that's just the PS3. If you had Cell chips in your case right now, with over a gig of RAM and of course the right software and programming, you'd be tearing it up.

And Cell also incorporates distributed computing, right? So everyone is helping everyone else. Or was that just early rumours from when Cell first hit the tech news sites?

Peace, and thanks for the reality check : )



posted on Feb, 27 2005 @ 08:05 PM
Anyone heard of the ChaOS Chip yet? It's freakin' amazing... it utilizes the laws and rules of chaos to make the chip super fast. Instead of using only certain paths, it can use any path for any task. I'm not a genius on it, but you might wanna look it up.



posted on Mar, 11 2005 @ 08:59 AM
Damn, there are some smart people on here! Had me lost though; I think I need to spend some more time on Wikipedia, lol.
I found this on chaos theory and CPUs, waaay back from 1999:

Order from Disorder
Researchers from many areas of study look to chaos theory for progress:
www.business2.com...

From here onwards the article gets juicy:
Inside a small laboratory at the Georgia Institute of Technology in Atlanta, a revolutionary method of computing floats in a small plastic dish underneath a microscope's lens. The brain of this computer is quite literally that — not silicon but brain matter itself: neurons, the basic building blocks of the brain, dissected from leeches.

Digital computers that work by turning bits on or off don't make the grade when it comes to "creative" tasks like handwriting or speech recognition. What's needed are computers "that are more like human brains and can rewire themselves" to find solutions, says William Ditto, head of Georgia Tech's Applied Chaos Lab.

"We're hoping within five years to have a chip with living matter on it to solve specialty problems in ways that conventional computers can't," Ditto says. "These problems range from pattern recognition to simulating extremely complex real-world physical systems like economic and environmental models." (See "Big Ideas from Small Creatures," Jan. '99, p100.)

Still a technological infant, the biocomputer consists of just two leech neurons, chosen because their relatively large size makes them simpler to wire. Eventually, the biological neurons may even be replaced with silicon substitutes, Ditto points out. Completely organic or not, though, in this early incarnation, the wetware computer currently isn't doing anything a 4-year-old can't. The impressive thing is that it's doing it in much the same way that a 4-year-old does.

Once the leech neurons are interfaced via electrodes to a PC, each one can be programmed to represent a specific number. The PC then "tunes" the neurons to undertake a specific task. Addition is its only trick thus far. After the PC stimulates them, the cells link with one another, talking to each other very much like the neurons do in a human brain. Within several seconds, the neurons come to a solution. Not a groundbreaking result, but it's the brain-like way in which the leech computer tackles a problem that may revolutionize data processing, Ditto explains.

"A Pentium computer does really dumb calculations very quickly," he says. "But a 4-year-old can come to decisions based on partial information and figure out how to do additional operations."

The key to making a computer more brainlike, explains Ditto, is that "each element can't behave randomly or you'd just get gibberish. You need the elements, neurons in this case, to behave chaotically so they can interact with each other adaptively."

And that's an entirely new direction for computing, veering off the traditional path that "faster is better." According to Ditto, the key is not to increase the speed of the processor but to increase the number and flexibility of connections between its elements. Then, he says, we'll be closer to mimicking the chaotic complexity of our own biocomputers.

"Living brain cells are like nature's transistors," he says. "We're not really smart enough to know what they're doing, but we can still use them."



posted on Mar, 11 2005 @ 09:11 AM
Two things I'm waiting for: the Cell processor and the physics engine processor. Anyone heard of that last one?

Supposedly ATI and NVIDIA are both thinking of putting them on next-gen graphics cards. Demos show an incredible amount of rigid bodies interacting: grass bends under the feet of characters with ease, snow as well... 100 ragdoll characters on screen at once... amazing stuff for a 3D man like myself.

It's not the speed, it's the intelligence of the design.


pao

posted on Mar, 11 2005 @ 09:59 AM
Man, all of this new stuff and I'm stuck with a P3 733.



posted on Mar, 11 2005 @ 10:55 AM
I was looking around at the graphics thing and found this:

CPU and GPU, meet the PPU:
arstechnica.com...

Starting off nicely with:
"By performing advanced physics simulations in real time, the PPU can respond to gamer actions as well as environments contributing to pervasive interactive reality. By introducing dramatic amounts of physics, games can now react uniquely to each input, adding a tremendous variety of game play. Physics will offer a host of advanced features including universal collision detection, rigid-body dynamics, soft-body dynamics, fluid dynamics, smart particle systems, clothing simulation, soft-body deformation with tearing, and brittle fracturing for destruction of objects in gaming environments."



posted on Mar, 11 2005 @ 10:59 AM
Oh yeah, about the dual-core CPUs... I was wondering if I should buy one of those (but wait ages, I bet), or get an FX-55 when the FX-57 is out and there's a price drop, or is the FX-57 going to have developments that mean I should get it instead?
I saw some benchmarking tests where they had to overclock a 4 GHz Intel CPU (which isn't going to be released any more, incidentally) to over 5 GHz, I think 5.2 GHz to be exact, to JUST (only just) beat the AMD FX-55 processor, which only clocks at 2.6 GHz itself!

here's the article and the results:
www.hardwareanalysis.com...



posted on Mar, 15 2005 @ 02:52 PM
Well, if you think about it, there is a field of computing where performance has increased by OVER TWO TIMES EACH YEAR for a long time now: the processing power/speed of video cards. For example, the famous ATI Radeon 9800 XT has less than HALF the processing power of the newer ATI Radeon X850 XT, and that card is said to have less than ONE-THIRD the power of the upcoming ATI R5xx chipset. That is about a 6x increase in power in about 2 years... much faster than the rate at which CPU computing power is increasing.
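
A quick check of that compounding, taking the 6x-in-2-years claim above at face value:

```python
# Quick check of the compounding claimed above: a 6x increase over 2 years.
total_factor = 6.0
years = 2.0

annual_factor = total_factor ** (1.0 / years)
print(f"~{annual_factor:.2f}x per year")   # ~2.45x per year, i.e. over 2x/year

# For comparison, Moore's-Law-style doubling every 18 months:
print(f"~{2 ** (12 / 18):.2f}x per year")  # ~1.59x per year
```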


[edit on 15-3-2005 by beyondSciFi]



posted on Mar, 15 2005 @ 03:20 PM
I think we're going to see a revolution in computing in the next decade. Binary has severe limitations, boundaries transcended artfully by wetware.

The future of computing is trinary computing: three-state processors... MY three-state processors. Hehehehe.

You could do it with plasma, you could do it with Q., you could do it with magnetism - one way or another it WILL be done.

If we can get the computer to think beyond yes/no, off/on, 0/1, we have a real data crunching monster, I mean a bit muncher like pacman, yum yum.

-1/0/1 That's the way of the future.
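
For what it's worth, the -1/0/1 scheme is known as balanced ternary; here is a small sketch of encoding integers in it (my illustration of the idea, not any particular hardware design):

```python
# Small illustration of balanced ternary, the -1/0/1 digit scheme mentioned
# above. This is just a software sketch of the encoding, not a hardware design.

def to_balanced_ternary(n: int) -> list[int]:
    """Return the balanced-ternary digits of n, least significant first."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        n, rem = divmod(n, 3)
        if rem == 2:        # a digit of 2 becomes -1 with a carry into the next place
            rem = -1
            n += 1
        digits.append(rem)
    return digits

def from_balanced_ternary(digits: list[int]) -> int:
    return sum(d * 3 ** i for i, d in enumerate(digits))

for value in (-5, 0, 7, 42):
    digits = to_balanced_ternary(value)
    assert from_balanced_ternary(digits) == value
    print(value, "->", digits)
```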

Of course I'll probably perfect it Dec. 20, 2012, and the world will end the next day when I push the power button on my prototype..assuming Zero Point doesn't eradicate existence first.

It's a charmed life we Icari lead...



posted on Mar, 15 2005 @ 03:34 PM
The importance of chips becoming dirt cheap is that software will be able to be hard-coded into the chips. It used to be called firmware. Think of it: no limit to the performance of any software doing any type of computing. Wow, now artificial intelligence can really take off. Hmm, it is unfortunate that the human brain will become obsolete. In fact, replacing organs with systems will be the rage. Don't forget NonStop: the old Tandem could now become the new personal nervous system, with non-stop organs. The only problem is that we then don't need any of us. Oh well, just put us in the ground with the 286s.



posted on Mar, 16 2005 @ 01:04 AM
Ya, I saw those PPU reviews on some tech sites I check. I'd gladly shell out my money on this thing. I don't think we're very far away from photorealistic gaming and such.

I hadn't even heard of trinary computing, and I'd say it's a ways off from any type of mainstream application. Like, I'd say decades, because you'd have to fabricate a completely new programming language, new OSes, new everything really. Every computer part in existence is built off of 1/0 logic.

I've heard of bio-computers, but that is also far, far away... still cool though : ). What a day it will be when you have a brain in a jar on your desk writing out your spreadsheets : P



posted on Mar, 16 2005 @ 01:22 AM
The future of computers will be the same buggy software and corporate crap that we've had for years, plus a lot more Big Brother-type additions. They make so much money by selling antiquated junk that they have no reason to make any major changes. Those who say the market rules can take a look at how many PCs still have 1.44MB floppy drives. We are supporting the tooled-up factories, and they have a schedule for their next craptastic flavours of the same junk. Notice how the trickle of CD burners took way too long to complete itself? Planned obsolescence at its best.

I've worked in tech support for the last decade and over that time I have grown to hate Micro$oft even as I make my living milking their buggy cow, Windows. The future looks controlled and grim, if you ask me. UNIX is not a solution due to the learning curve. A suitable rival to M$ would have to be as user-friendly as they are.

Personally, I want Sony to develop an OS for the PlayStation platform. Their hardware is far superior to the DOS PC. I realize there are people hacking their PS2s for better use, but I think a fully featured OS would chop into M$'s dominance. With the recent sale of IBM's PC division to China's Lenovo, however, that's not looking likely.

A man can dream, can't he?


d1k

posted on Mar, 16 2005 @ 01:57 AM

Originally posted by Schmidt1989
Moore's law does say this, but have we doubled in speed since September 2003? I think not...


Just because the public does not have it does not mean it does not happen. Chip developers probably more than double their technology every 18 months.



