Intel Shows 48-core x86 Processor

posted on Dec, 4 2009 @ 11:02 AM
The programs will definitely have to be changed to take advantage of this chip. While data transfer will be rapid between processors on the chip, how fast will it be between chips? Bus design will be limiting.
It should be apparent that while I am involved with using parallel machines, I am not a computer designer or systems person, so I appreciate all of your comments and discussion.
Rejected heat is the limiting factor in the 2,000-processor blade systems I deal with, and air-conditioning failures can shut down the works. Using less power is really a big advantage.
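
To make that concrete, here's a minimal sketch of the kind of restructuring involved, in C with OpenMP purely for illustration (nothing says this particular chip will be programmed this way, and shared-memory OpenMP may not even fit its design):

    /* Minimal sketch: spreading a loop across all available cores
     * with OpenMP. Compile with: gcc -fopenmp sum.c -o sum
     * Purely illustrative -- not tied to Intel's 48-core chip. */
    #include <stdio.h>
    #include <omp.h>

    #define N 1000000

    static double a[N];

    int main(void) {
        double sum = 0.0;

        for (int i = 0; i < N; i++)
            a[i] = 1.0;                 /* dummy data */

        /* Each core sums a slice; partial sums are combined at the end. */
        #pragma omp parallel for reduction(+:sum)
        for (int i = 0; i < N; i++)
            sum += a[i];

        printf("sum = %.0f using up to %d threads\n",
               sum, omp_get_max_threads());
        return 0;
    }

The point is that the loop had to be annotated and kept free of cross-iteration dependencies; code written for one fast core gets none of that for free, which is exactly why existing programs will have to change.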



posted on Dec, 4 2009 @ 11:43 AM
Well, first up, you guys won't ever see something like this in the home, or anything close to it.

When they finally migrate over to the Cloud, you won't need anything more than a fancy net PC that handles all the inputs and outputs.
A crazy example of this is OnLive.

www.onlive.com...

Based on what I've heard from beta testers, this thing will kill consoles and PC gaming as you know it... and the beta testers love it. Now you can play the great games on your less-than-great PCs, since there are no hardware upgrades and nothing to install. It's like cable for video games.
From a developer's perspective, there's only one system to develop for: no boxes to ship, no software to steal and copy.
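
To picture what that kind of net PC would actually do, here's a hedged sketch in C. The host, port, and message formats are all invented for illustration; OnLive's real protocol isn't public:

    /* Hypothetical thin-client loop: forward local input to a remote
     * server and read rendered frames back. Host, port, and formats
     * are made up -- this is NOT OnLive's actual protocol. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/socket.h>
    #include <arpa/inet.h>

    int main(void) {
        int s = socket(AF_INET, SOCK_STREAM, 0);
        struct sockaddr_in srv = {0};
        srv.sin_family = AF_INET;
        srv.sin_port = htons(9000);                     /* hypothetical port */
        inet_pton(AF_INET, "127.0.0.1", &srv.sin_addr); /* hypothetical host */

        if (connect(s, (struct sockaddr *)&srv, sizeof srv) < 0) {
            perror("connect");          /* no game server here, of course */
            return 1;
        }

        char frame[65536];              /* buffer for compressed video data */
        const char *input = "KEY_W\n";  /* pretend keypress event */

        for (;;) {
            write(s, input, strlen(input));           /* input goes up...    */
            ssize_t n = read(s, frame, sizeof frame); /* ...frames come down */
            if (n <= 0)
                break;
            /* a real client would decode and display the frame here */
        }
        close(s);
        return 0;
    }

All the heavy lifting, the game logic and the rendering, happens server-side, which is exactly where racks of many-core chips like this one would live.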

Right now, companies like Microsoft are trying to build up Cloud-based real estate to handle future computing.
news.cnet.com...

news.cnet.com...

But if you want, just keep dreaming about those CPUs that will never come to you.



posted on Dec, 4 2009 @ 11:49 AM
This amount of power will allow small artist/film production teams, or even single artists, to produce movie-scale graphics at almost no cost compared to Hollywood. And it consumes very little power to boot!



posted on Dec, 5 2009 @ 03:51 AM
Nice, it certainly makes my Q8200 pale in comparison (it doesn't even support CPU-level OS virtualization).

A lot of software releases are starting to take advantage of multiple CPU cores, Adobe Photoshop being one notable example. I'm sure it's harder to develop software for a multithreaded platform, but I'm fairly convinced that's the direction things are going. As some people have already pointed out, simply pushing for a higher and higher GHz ceiling is an exercise in failing at cost-benefit analysis. For now, a technically slower but more efficient architecture seems to be the solution.
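
As a tiny, generic illustration of why multithreaded development is harder (plain C with POSIX threads, not tied to Photoshop or any real product): two threads sharing one counter silently lose updates unless the programmer remembers to lock.

    /* Sketch of the classic multithreading pitfall: without the mutex,
     * two threads incrementing one counter lose updates.
     * Compile with: gcc race.c -o race -lpthread */
    #include <stdio.h>
    #include <pthread.h>

    static long counter = 0;
    static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

    static void *worker(void *arg) {
        (void)arg;
        for (int i = 0; i < 1000000; i++) {
            pthread_mutex_lock(&lock);   /* remove this pair and the   */
            counter++;                   /* final count comes up short */
            pthread_mutex_unlock(&lock);
        }
        return NULL;
    }

    int main(void) {
        pthread_t t1, t2;
        pthread_create(&t1, NULL, worker, NULL);
        pthread_create(&t2, NULL, worker, NULL);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        printf("counter = %ld (expected 2000000)\n", counter);
        return 0;
    }

Bugs like that lost-update race are a big part of why scaling software to 48 cores isn't free.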

As far as cloud computing goes, is anyone else not really convinced of its greatness? With more and more talk from internet service providers about throttling bandwidth, I'm not sure it's logical to move all of the core functionality of a PC onto a connection-dependent platform. And that's to say nothing of the potential security pitfalls such a shift could bring.

I mean, I'm sure that part of the industry is going in that direction; between the money Microsoft has thrown at it and Google's development of the Chrome OS, that much is obvious. I just think there will always be a place for independent computing of some kind...



posted on Dec, 5 2009 @ 02:14 PM

Originally posted by Shadow
Nice, it certainly makes my Q8200 pale in comparison (it doesn't even support CPU-level OS virtualization).

A lot of software releases are starting to take advantage of multiple CPU cores, Adobe Photoshop being one notable example.

As far as cloud computing goes, is anyone else not really convinced of its greatness?


Excellent post, I agree 100%

Software is the biggest limitation preventing most users from truly benefiting from multiple cores. I have some software that can use multiple cores, but its most CPU-intensive task can only use one core, and that's frustrating! I have to wait hours, or sometimes days, for my results. So even 48 cores won't speed that up one bit without better software.
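
That frustration is Amdahl's law in action: speedup = 1 / ((1 - p) + p/n), where p is the fraction of the work that can run in parallel and n is the number of cores. A back-of-the-envelope sketch in C (the 90% figure below is an assumed example, not a measurement of any real program):

    /* Amdahl's law: even a small serial fraction caps the speedup
     * extra cores can deliver. The parallel fraction p is assumed. */
    #include <stdio.h>

    int main(void) {
        double p = 0.90;                 /* assumed: 90% parallelizable */
        int cores[] = {1, 2, 8, 48};

        for (int i = 0; i < 4; i++) {
            int n = cores[i];
            double speedup = 1.0 / ((1.0 - p) + p / n);
            printf("%2d cores: %.1fx speedup\n", n, speedup);
        }
        return 0;
    }

Even with 90% of the work parallel, 48 cores deliver only about an 8.4x speedup, and a task that is 100% serial (like the one described above) sees no speedup at all.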

And I'm also unconvinced of the greatness of cloud computing, though I can see the applications for it. Most users will still prefer to have local applications, and the security aspects you mention have to be a big concern for many users.

Those who may benefit most from cloud computing are undemanding or recreational users who have no need to spend big money on dedicated local software, don't plan to handle private information, and so on. But even where cost is a concern, I've tried the free OpenOffice.org suite and it's as good as the paid alternatives, though not 100% compatible in some of the more esoteric functions. So users may still opt for OpenOffice-style local apps over cloud computing.

In any case, those who use supercomputers and already run software designed for massively parallel computing will see the biggest benefit from these 48-core CPUs. The energy savings will be incredible!



posted on Dec, 6 2009 @ 06:09 AM

Originally posted by pteridine
"Intel unveiled a completely new processor design today the company is dubbing the "Single-chip Cloud Computer" (but was previously codenamed Bangalore). Justin Rattner, the company's CTO, discussed the new product at a press event in Santa Clara and revealed some interesting information about the goals and design of the new CPU."
www.pcper.com...

For the computer folks out there, this thing is amazing. It can behave as 24 dual-core processors with different operating frequencies and will use between 25 and 125 watts. It is a small cluster on a single chip. Imagine a unit with 1,000 or more of these.


This sounds great, but as a computer guy, my only concern would be heat. Though the wattage is low, I can still imagine it being very hot.

Is there an ETA?



posted on Dec, 6 2009 @ 06:14 AM
Bring on organic chips!


A bug has MORE power than this lot, lol.

Good post though, but still far too primitive.

[edit on 6-12-2009 by 13579]



posted on Dec, 6 2009 @ 08:01 AM

Originally posted by mossme89
This sounds great, but as a computer guy, my only concern would be heat. Though the wattage is low, I can still imagine it being very hot.

Low wattage means low heat and high wattage means high heat; for a CPU the two are practically synonymous, since essentially every watt the chip draws is dissipated as heat.

With light bulbs the correlation isn't quite as tight, because some of the watts leave as visible light, which isn't heat. But even with light bulbs, especially incandescent ones, most of the wattage is directly correlated with heat.
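
The rough rule behind that, for CPUs, is the first-order CMOS dynamic power model P ~ C * V^2 * f: power, and therefore heat, scales with the square of the supply voltage and linearly with the clock frequency. Here's a small sketch in C with purely illustrative scaling numbers, not Intel's figures:

    /* First-order CMOS dynamic power model: P ~ C * V^2 * f.
     * Shows why several slower, lower-voltage cores can beat one
     * fast core on heat. All numbers are illustrative assumptions. */
    #include <stdio.h>

    static double rel_power(double v, double f) {
        return v * v * f;   /* capacitance term folded into the units */
    }

    int main(void) {
        double base = rel_power(1.0, 1.0);  /* baseline fast core */
        double slow = rel_power(0.8, 0.7);  /* assumed scaled-down core */

        printf("one scaled core:   %.0f%% of baseline power\n",
               100.0 * slow / base);
        printf("two scaled cores:  %.0f%% of baseline power\n",
               100.0 * 2.0 * slow / base);
        return 0;
    }

Because of the V^2 term, two slower cores can roughly match one fast core's throughput (about 1.4x the clock cycles in this example) while drawing about 90% of its power, which is the whole bet behind many-core parts like this one.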

[edit on 6-12-2009 by Arbitrageur]



