The current state of GPU technology


posted on Jan, 12 2011 @ 09:25 AM
The technology of our computers never ceases to astound me. It always seems that we think the visuals of things like video games and CGI movies just can't get much better, and then a couple of years later we look back and wonder how we ever thought some particular game or movie looked so good. These YouTube videos showcase just how incredible our current GPU (Graphics Processing Unit) capabilities are: scenes like these can be rendered in real time on a high-end consumer PC.

www.3dmark.com...

P.S. No, I'm not just trying to push their product or anything. Their tests are pretty much the standard for PC performance, and the things their artists manage to do constantly impress me.



posted on Jan, 12 2011 @ 09:32 AM
Is that global illumination I see!?
Can't wait until games look like this. I will never leave my apartment.



posted on Jan, 12 2011 @ 12:06 PM
reply to post by R3KR
 


Global illumination has been used in games for a while... still, there's ALWAYS room for improvement



posted on Jan, 12 2011 @ 01:58 PM

Originally posted by john_bmth
reply to post by R3KR
 


Global illumination has been used in games for a while... still, there's ALWAYS room for improvement


Hm, has it? I believe Metro 2033 had some global illumination available (hidden behind console commands, not enabled normally), but I am not aware of any other games using it.
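For anyone wondering what global illumination actually computes: instead of shading a surface only from explicit light sources, the renderer also accounts for light arriving indirectly from the rest of the scene. A cheap cousin of the idea, ambient occlusion, just asks how much of the sky hemisphere a surface point can see. Here is a minimal Monte Carlo sketch of that in Python; the scene (one sphere floating over a ground plane) and every name in it are made up for illustration, and real engines use much faster approximations.

import math
import random

def sample_hemisphere():
    """Uniformly sample a direction on the upper hemisphere (z >= 0)."""
    z = random.random()                      # uniform z gives uniform solid angle
    phi = 2.0 * math.pi * random.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def ray_hits_sphere(origin, direction, center, radius):
    """Ray/sphere intersection test (assumes the ray starts outside the sphere)."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    return b * b - 4.0 * c >= 0.0 and -b > 0.0   # real roots, sphere in front

def ambient_occlusion(point, occluder_center, occluder_radius, samples=1024):
    """Fraction of hemisphere directions NOT blocked by the occluder."""
    unblocked = 0
    for _ in range(samples):
        d = sample_hemisphere()
        if not ray_hits_sphere(point, d, occluder_center, occluder_radius):
            unblocked += 1
    return unblocked / samples

# A point on the ground plane, shadowed by a sphere hovering nearby:
print(ambient_occlusion((0.0, 0.0, 0.0), (0.5, 0.0, 1.0), 0.5))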

S+F OP, interesting topic!



posted on Jan, 12 2011 @ 02:20 PM
And just think: in a couple of years' time we'll be looking back on the technology showcased here and saying we don't know how we ever thought it looked incredible.



posted on Jan, 12 2011 @ 02:44 PM

Originally posted by warbird03
And just think: in a couple of years' time we'll be looking back on the technology showcased here and saying we don't know how we ever thought it looked incredible.

Hah! I remember being blown away by Wolfenstein 3D



posted on Jan, 12 2011 @ 06:05 PM
I am actually at times still impressed by how good some older games look.



posted on Jan, 12 2011 @ 10:23 PM
I can't remember where I saw this, but they are developing new TVs with about four times the resolution of our current HDTVs. I've been trying to find it for a while now. If the whole 3D thing doesn't take off, this would be the successor. They say it is so detailed that it's like looking through a window.

Ah wait, just found it: it's UHDTV, and we're looking at an ETA of 2020.
UHDTV
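A quick sanity check on the numbers, assuming the UHDTV meant here is the 7680x4320 format NHK has been demonstrating: "four times the resolution" holds in each dimension, which works out to sixteen times the pixels of a 1080p HDTV.

# Quick pixel arithmetic (7680x4320 for UHDTV is an assumption here).
hdtv = 1920 * 1080       # 1080p:  2,073,600 pixels
uhdtv = 7680 * 4320      # UHDTV: 33,177,600 pixels
print(uhdtv / hdtv)      # -> 16.0 (4x horizontally * 4x vertically)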




posted on Jan, 12 2011 @ 11:47 PM

Originally posted by warbird03
The technology of our computers never ceases to astound me. ...


Okay, this does not quite have to do with GPUs, but it does with computers. What I am interested in is how far off quantum computers are on the horizon. I imagine at least 30 years?



posted on Jan, 13 2011 @ 01:17 AM

Originally posted by tomcat ha
I am actually at times still impressed by how good some older games look.

Me too.

Two games that come to mind specifically are Far Cry and its successor, Crysis.

A huge effort was made by Crysis's developers, in conjunction with GPU makers, to take Crysis to the next level above Far Cry. The new game was so resource-hungry at the highest settings that you needed massive CPU and GPU upgrades to keep up with it. Yet with all that, when I compared how good the new features of Crysis looked against the older Far Cry, yes, it was slightly better, but for me not enough to justify replacing my existing system with one far more expensive.

The differences actually seemed very subtle to me. Of course, Far Cry was already a pretty amazing game for its time if you ask me.

So yes, they are making improvements, but am I the only one who sees them as kind of marginal and maybe not worth spending a fortune on upgrades?



posted on Jan, 13 2011 @ 02:58 AM
reply to post by tomcat ha
 




Originally posted by tomcat ha
I am actually at times still impressed by how good some older games look.


Yeah, for example Doom 3. The engine was very advanced for its time; even newer engines like UT3 do not have fully dynamic, shadowed lighting. Check out the Sikk graphical mod for Doom 3: it adds soft shadows and some modern shader effects (HDR, bloom, motion blur, depth of field). With this mod, a game from 2004 looks almost like the above-mentioned Metro 2033.


Another example is FreeSpace 2. It was released in 1999 and praised by critics as the best space shooter game of all time, but it failed to sell well due to poor marketing and distribution, so the developer decided to release the game's source code. The modding community immediately picked it up, and now the engine (which was also very advanced for 1999, featuring fully dynamic lighting, just without shadows) looks wonderful. If you are into space games, definitely check out FreeSpace Open, along with all the total conversions and mods based on the engine.




posted on Jan, 13 2011 @ 09:19 AM

Originally posted by d00d557
Okay, this does not quite have to do with GPUs, but it does with computers. What I am interested in is how far off quantum computers are on the horizon. I imagine at least 30 years?


How does it not focus on the GPU? That's one of 3DMark's main purposes: to measure the performance of the most advanced GPUs. Everything from lighting to polygons to textures, and sometimes even physics, is handled on the GPU.



posted on Jan, 13 2011 @ 10:00 AM
reply to post by warbird03
 


I've been saying for a few years now that we'll soon be at the point where we can do in real time what used to take render farms weeks or months to accomplish. I think we're just about there, and we can do it on a single desktop computer.

My perspective goes back to the Commodore 64 that I grew up with. Technology has jumped further than I could have imagined since then. I can't imagine where we'll be in another 30 years...hell, I'm not even sure about the next 10 years.



posted on Jan, 13 2011 @ 10:28 AM
As they get closer to reality, things will get more and more complex. For every step ahead there will be twice as many changes as in the step before. Nature has so many little things that together form our view of reality. Ultimately, from now on it's just computing power that stands in the way.

Hats off to 3D game creators. Some of the games of the 3D era were true masterpieces in the sense that they had to adapt and compensate for the CPUs available at the time. They were always pushing the edge of what could be done.



posted on Jan, 13 2011 @ 10:43 AM
GPUs would be so much more amazing right now if we had those graphene CPUs (capable of 1,000 gigahertz). GPUs are majorly bottlenecked by the silicon barrier. Developers are still not utilizing multithreading effectively: many multithreaded games will only use two cores, while Intel and AMD are at 8, going on 16, cores now. Is it just too impractical to write software for that many cores?







posted on Jan, 13 2011 @ 10:54 AM
reply to post by sliceNodice
 


Actually, graphics processing is one of the tasks that scales very well with the number of cores (the processing cores in a GPU are called "stream cores"), because rendering pixels in one area of the screen is independent of the results of rendering in other areas, so the task can be easily parallelized. That's why modern graphics cards have around 1,600 stream cores (Radeon 5870), whereas CPUs have only 4-6 cores.
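A minimal sketch of that independence in Python, with a made-up shade() standing in for the real per-pixel work (ray casting, texturing, and so on): because each pixel depends only on its own coordinates, whole rows can be farmed out to worker processes with no coordination between them.

from multiprocessing import Pool

WIDTH, HEIGHT = 640, 480

def shade(x, y):
    """Toy per-pixel computation; depends only on (x, y), nothing else."""
    return (x * y) % 256

def render_row(y):
    """Shade one full row; no other row's result is needed."""
    return [shade(x, y) for x in range(WIDTH)]

if __name__ == "__main__":
    with Pool() as pool:                     # one worker per CPU core by default
        image = pool.map(render_row, range(HEIGHT))
    print(len(image), "rows rendered independently")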





posted on Jan, 13 2011 @ 10:58 AM

Originally posted by sliceNodice
GPUs would be so much more amazing right now if we had those graphene CPUs (capable of 1,000 gigahertz). GPUs are majorly bottlenecked by the silicon barrier. Developers are still not utilizing multithreading effectively: many multithreaded games will only use two cores, while Intel and AMD are at 8, going on 16, cores now. Is it just too impractical to write software for that many cores?


It's difficult to write software that is multithreaded, especially games, but some newer games like Bad Company 2 are starting to utilize multiple cores very effectively. Expect more of this in the future, scaling to an even greater number of cores. Also, AMD and Intel are at six cores for their fastest desktop processors; some server products have more, but generally each core on those products is slower, so performance for desktop users would go down because the software cannot utilize all the cores. A balance between per-core throughput and total-core throughput is needed, and quad core is definitely that point IMHO. The 2nd-generation Core i7 (Sandy Bridge) has the highest per-core throughput of any processor right now, and it has four cores.
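The per-core versus total-core trade-off can be put in rough numbers with Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of the work that parallelizes and n is the core count. The sketch below uses a made-up parallel fraction of 60% purely for illustration, not a measurement of any real game.

def amdahl_speedup(p, n):
    """Best-case speedup on n cores when fraction p of the work is parallel."""
    return 1.0 / ((1.0 - p) + p / n)

for cores in (2, 4, 6, 16):
    print(cores, "cores:", round(amdahl_speedup(0.60, cores), 2), "x")
# -> 2 cores: 1.43x | 4 cores: 1.82x | 6 cores: 2.0x | 16 cores: 2.29x
# Gains flatten quickly past ~4 cores, which is why per-core throughput
# still matters so much for desktop software.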




posted on Jan, 13 2011 @ 11:05 AM
I wrote that all wrong... I meant CPUs are bottlenecked by the silicon barrier. Also, a lot of what I was talking about was CPU-related. *wants the ground to open up and swallow him*


