
People “probably won’t” need discrete graphics cards anymore – Intel


posted on Apr, 14 2008 @ 07:09 PM
www.tgdaily.com...



NO MORE PAYING 500 DOLLARS FOR A VIDEO CARD =P.




Fosner told us that multi-core CPUs are more than capable of rendering complex scenes that used to be reserved for top-end graphics cards. He argued that Intel processors offered “more bang for the buck” and that it was more economical to go from single to multiple core processors versus popping multiple graphics cards into a machine. “The fact of the matter is that you’re going to have one graphics card, you may have a dual graphics card, but you’re not going to have a four graphics card or eight graphics card system,” said Fosner.

Another advantage to CPU graphics and physics programming is that people won’t need to continually keep up with the latest programming techniques of all the newest cards – this means futzing around with shader models and DirectX programming will be a thing of the past. Fosner said that “everybody” knows how to program for a CPU and that this new way of programming will “get rid of” a path of graphics obsolescence.
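
Just to illustrate what "programming for a CPU" instead of a shader model means in practice, here is a minimal, purely illustrative Python sketch of per-pixel work written as an ordinary loop; nothing here is from Intel's demo, it's just a toy gradient.

import numpy as np

# Toy "pixel shader" written as plain CPU code: fill a 64x64 image with a
# simple gradient, one pixel at a time, no DirectX or shader model involved.
WIDTH, HEIGHT = 64, 64
image = np.zeros((HEIGHT, WIDTH, 3), dtype=np.float32)

for y in range(HEIGHT):
    for x in range(WIDTH):
        u, v = x / WIDTH, y / HEIGHT          # normalised pixel coordinates
        image[y, x] = (u, v, 1.0 - 0.5 * (u + v))

# On a multi-core CPU the rows of this loop could simply be split across
# cores, which is the kind of scaling the article is pointing at.
print(image.shape, float(image.max()))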




[edit on 4/14/2008 by die_another_day]



posted on Apr, 14 2008 @ 07:13 PM
Sounds good. They just need to make sure it's powerful enough, and soon.



posted on Apr, 14 2008 @ 07:51 PM
8-core Nehalem. 16x multiplier. This thing can be overclocked from 2.133 GHz (stock) to at least 6 GHz.

2.133/16 = .1333

So... with RAM going to over DDR-1000... and some hardcore cooling systems, this CPU has a bright, I mean literally bright, future.



posted on Apr, 17 2008 @ 03:41 AM
I guess that's possible, but I'd hope they'd give the cores intended for that their own RAM controller and RAM.

Anyway, I sure hope they can do better than the GMA-750 integrated graphics my parents' computer has.



posted on Apr, 17 2008 @ 06:16 AM
Yeah, let's see if that thing can come close to 10% of what a dedicated graphics card can do.
And who the hell pays $500 for a video card? And who the hell buys a DELL screen? LMAO! There is a VERY good reason why, in applications such as folding, specialised GPUs and the Cell kick a PC CPU's arse. They're SPECIALISED; you should have a look at the folding@home numbers.

And the chip won't have a multiplier of 16. LMAO. You mean it can process 16 THREADS in parallel, much like hyper-threading. And 6 GHz... please don't pull numbers out of your arse.




2.133/16 = .1333

Care to explain?

Typically, clock speed is dictated by FSB times multiplier. Nellie does away with the front side bus and uses something similar to AMD - 'quick connect'. Any guess at the multiplier, or at the FSB (which it does not have), is pure speculation.
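
To put numbers on that (illustrative figures only, since as said the real multiplier and base clock are speculation at this point):

# Core clock is just a base/reference clock times the multiplier.
def core_clock_ghz(base_mhz, multiplier):
    return base_mhz * multiplier / 1000.0

# Illustrative only: a 133 MHz reference clock with a 16x multiplier gives
# the 2.133 GHz figure quoted above; a QX9770 runs a 400 MHz FSB at 8x.
print(core_clock_ghz(133.33, 16))   # ~2.13 GHz
print(core_clock_ghz(400, 8))       # 3.2 GHz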


It's also not just about the memory - you can get DDR3-1800 and beyond right now - it's also about the latencies. For example, CAS 4 DDR2-1066 has EXACTLY the same latency as CAS 3 DDR2-800. When you get into speeds of DDR3-1440 or beyond with latencies of 6... then you've got something that's seriously quick.
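
For anyone who wants to check that, first-word latency is just the CAS cycles divided by the memory I/O clock (half the DDR data rate); a quick sketch:

# First-word latency in nanoseconds from DDR data rate (MT/s) and CAS cycles.
def latency_ns(data_rate_mts, cas_cycles):
    io_clock_mhz = data_rate_mts / 2.0        # DDR transfers twice per clock
    return cas_cycles / io_clock_mhz * 1000.0

print(latency_ns(1066, 4))   # DDR2-1066 CAS 4 -> ~7.5 ns
print(latency_ns(800, 3))    # DDR2-800  CAS 3 -> 7.5 ns, identical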

Thank god Intel did away with the FSB that was limiting bandwidth.

Cooling. On newer processors such as the QX9770, you can overclock the bastard from 3.2 GHz to 4.5 GHz with a $50 cooler.

[edit on 17/4/2008 by C0bzz]



posted on Apr, 17 2008 @ 06:32 AM
Hmm, this sounds nice and all, but I'll believe it when I see it.
What is shown in the video doesn't impress me, and it isn't very nice looking.
Sure, it runs smoothly, but it's not exactly eye candy. That's where GPUs come in:
running stunning eye candy at acceptable framerates.

[edit on 17/4/2008 by Jeroenske]



posted on Apr, 17 2008 @ 07:17 AM
Completely false. A GPU is much better at processing graphics than any general-purpose processor. A simple processor can in no way take over all the things a fairly large 3D card can do.



posted on Apr, 17 2008 @ 08:42 AM

Originally posted by tomcat ha
Completely false. A GPU is much better at processing graphics than any general-purpose processor. A simple processor can in no way take over all the things a fairly large 3D card can do.


True, that's why they originally made the GPU: because the CPU wasn't suited to floating-point calculations (or something to that effect). That demo wasn't graphically intense; from what I could see there were minimal textures, basic effects, etc... they emphasized the physics side of things too much. If they can't show off a good tech demo and show how much better their system is compared to at least mid-range systems, then they are going to lose the attention of gamers very quickly.

-fm


apc

posted on Apr, 17 2008 @ 08:58 AM
Video used to be done by the CPU back in the 80x25 days. Hell, I use a chip in one of my products that is basically a tiny multi-core processor running at about 80 MHz, and I use three of its cores to drive an RGBHV output. But in PCs the whole point of having a graphics card is to take the load off the CPU, freeing it up to do CPU things. So what if it has a couple of idle cores? If you need them, they're there to be used. If they're busy running graphics, they're not. Sounds like Intel is just trying to get people to develop hardware using only Intel parts.

And I've never paid more than $100 for a video card. I'm happy staying a couple years behind bleeding edge to let obsolescence work its magic.



posted on Apr, 17 2008 @ 04:40 PM

Originally posted by C0bzz
And who the hell pays $500 for a video card?


I used to make that mistake


Now it's more like 200 bucks



posted on Apr, 17 2008 @ 04:41 PM
I hope this $500 savings isn't just factored into the price of the new processor.



posted on Apr, 18 2008 @ 12:38 AM
What is really impressive about Nehalem is when it's paired up with a next-generation graphics card. Then combine the graphics... AND physics! That would be pretty sweet.



posted on Apr, 18 2008 @ 01:03 AM
reply to post by C0bzz
 


Well, Intel bought Havok, so it could be a reality. Intel has made huge strides with its 45nm development, and 32nm will be coming online soon. That means more room to fit the CPU/GPU/PPU all on the same die. Technology is the only thing that I am still optimistic about.



posted on Apr, 18 2008 @ 07:47 AM
Well, usually spending $500 on a graphics card is money better spent than $500 on a processor.



posted on Apr, 18 2008 @ 09:23 AM
I try to keep under $300 for my GPU; $200 seems to be a sweet spot lately. I've been a PC gamer since the 286. I can see how unused cores may HELP performance by assisting some of the rendering, but I still don't think it's a good idea to use them alone without a GPU. Sure, they can do the calculations, but think of it as a general-purpose processor: it can't do as good a job as something that was built with graphics rendering in mind. Another example: hi-def video requires a pretty hefty CPU when played on a PC, but a far less powerful, specialized processor powers all your Blu-ray players. Intel likes to believe they can provide an all-in-one solution for gamers, but their GPUs have always been sub-par. If games are going to start taking advantage of multiple cores for physics and AI calcs, then that sort of throws out their idea of using unused cores for graphics processing.



posted on Apr, 21 2008 @ 09:43 PM
With the popularity of integrated graphics nowadays, I don't see the need for a discrete video card among the general population. Most people I know aren't even aware of what a video card is, let alone the difference between integrated and discrete cards.

They think they're fine until they get into gaming. I think Nehalem will be a success for Intel.

However, I became suspicious when Intel claimed that they will have a new video card that beats both Nvidia and ATI.



posted on Apr, 22 2008 @ 05:15 AM
reply to post by die_another_day
 


I don't think integrated cards are very popular. They might be popular in the sense that lots of people have them, but then how many people that have them actually know that they have a graphics card?

-fm


sty

posted on Apr, 22 2008 @ 06:46 AM
There will always be applications that cannot be handled by standard on-board graphics. Graphics cards will get more cores per processor too, so I guess it is not realistic to expect the death of high-end video cards.



posted on Apr, 22 2008 @ 07:05 AM
All I can say is - thank god for my 8800GT.


Cheap, fast, and quick. No friggin' way even dual Nehalems come close to what this thing delivers!



posted on Apr, 22 2008 @ 07:17 AM
Being an avid gamer who reads The Inquirer and Tom's Hardware daily, I can tell you that the physics demo using all 8 cores of Nehalem, that's 16 threads, was running a particle simulation with 65,000 particles at around 18 FPS.

Nvidia did the same test and was running the simulation at 300+ FPS...
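
For a sense of scale, those figures work out to roughly 1.2 million versus 19.5 million particle updates per second, and the per-particle work is trivially parallel, which is exactly the kind of thing a wide GPU eats up. A toy sketch of one simulation step (made-up data, just to show the shape of the workload):

import numpy as np

PARTICLES = 65_000
dt = 1.0 / 60.0

# One naive integration step: the same tiny bit of maths for every particle.
pos = np.zeros((PARTICLES, 3), dtype=np.float32)
vel = np.random.rand(PARTICLES, 3).astype(np.float32)
gravity = np.array([0.0, -9.81, 0.0], dtype=np.float32)

vel += gravity * dt
pos += vel * dt

# Throughput implied by the quoted frame rates:
print(PARTICLES * 18)    # ~1.2 million particle updates/s for the CPU demo
print(PARTICLES * 300)   # ~19.5 million for the GPU figure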

If Intel brings more cores to the table then I'm sure it'd have a bigger impact. But as of yet, no, Nehalem will not be the saviour that frees us from discrete graphics cards, and nor will Larrabee, which will have about 16 CPUs, 32 threads...

And BTW, PhysX was bought by Nvidia, and has already been pretty much sewn into the Nvidia drivers. Any graphics card from the 8800 onwards will be able to calculate physics on the GPU.

Sure, one day in the far future you will be able to play the latest games at decent resolutions on a CPU alone, but I can tell you that a dedicated graphics card from Nvidia or ATI (AMD) will be able to do it all much faster and at much higher framerates. And I sure as hell bet you that the picture quality will look way better too.

[edit on 22-4-2008 by DaRAGE]



