



Intel's dual quad-core "V8" system
Intel makes 8-core processing the new playground

At CES 2007 today, Intel revealed a proof-of-concept PC designed specifically to counter AMD's 4x4 platform for gamers. Dubbed the "V8" system, Intel demonstrated a system running on a pair of quad-core Kentsfield Xeon processors for a total of eight physical cores.

The system runs at 2.4GHz on a 1066MHz front-side bus and is loaded with FB-DIMM memory. Graphics are handled by a single NVIDIA GeForce 8800 GTX. According to Intel, the "V8" system dished out a score of 6089 on the 3DMark CPU benchmark.

DailyTech previously reported on AMD's 4x4 platform, which was later officially named Quad FX. AMD announced its dedication to the gaming community early in 2006 and received praise from the general enthusiast community. Dual-processor systems have not been as popular as they were several years ago, due to the advent of multi-core processors and the cost and complexity of such systems in general.

Unlike AMD's Quad FX platform, which works with regular unbuffered memory, Intel's "V8" system requires FB-DIMMs. The Quad FX platform can also support multiple GeForce video cards in an SLI configuration, while the "V8" is currently limited to a single graphics card.

Despite the enormous amount of processing power packed into Intel's "V8" system, AMD is not far from releasing something along the lines of an "8x8" system utilizing quad-core Opteron processors. AMD demonstrated its Barcelona core in November 2006 and said the new Opterons would make a showing in mid-2007.

Apple launched its dual-Xeon Mac Pro platform last year.  The Mac Pro slots two Core-based Xeon DP processors, which are currently dual-core.  However, only weeks after the Mac Pro launch, the system was spotted running quad-core processors instead.  Official quad-core support for the Mac Pro is expected to launch shortly, presumably at the MacWorld convention tomorrow.


Comments



Will this be an end to multi-bit?
By Comdrpopnfresh on 1/9/2007 12:47:00 AM , Rating: 2
I don't know much about the subject, but with 64-bit not being utilized very much by the majority of people, will the use of multiple processors mean an end to performance increases from scaling the bits, say going from 64 to 128? And is it true that using a 64-bit system disables some architectural features in current chips (like the intelligent look-ahead on C2D)? I think I remember reading a blurb about that.




RE: Will this be an end to multi-bit?
By IntelUser2000 on 1/9/2007 1:04:44 AM , Rating: 2
quote:
by Comdrpopnfresh on January 9, 2007 at 12:47 AM

I don't know much about the subject, but with 64-bit not being utilized very much by the majority of people, will the use of multiple processors mean an end to performance increases from scaling the bits? Say going from 64 to 128?


The performance increase coming from 64-bit comes mainly from the following two reasons:
1. Support for more than 4GB of RAM (that includes virtual memory)
2. 64-bit support means new programs need to be compiled for it. That means CPU manufacturers can put in additional enhancements, like more registers, since the code needs to be re-compiled anyway.

NONE of the 64-bit CPUs support full 64-bit memory addressing (the highest is Itanium 2 with 50 bits, that is, support for 1 petabyte). Simply nothing requires that much. Full 64-bit memory addressing equals 16 exabytes (giga --> tera --> peta --> exa). It took us about 20 years to go from the 16-to-32-bit transition to the 32-to-64-bit one. Considering many users aren't as starved for memory capacity as they were 20 years ago, I bet it'll take much longer than 20 years to go to 128-bit.
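A quick back-of-the-envelope check of those capacity figures, written as a minimal C++ sketch (the labels and bit widths are just the numbers quoted above):

// Sanity check of the address-space sizes quoted above: 2^bits bytes,
// expressed in GiB. Any C++11 compiler should build this, e.g. g++ -std=c++11.
#include <cmath>
#include <iostream>

int main() {
    struct Width { const char* label; int bits; };
    const Width widths[] = {
        {"32-bit", 32},                       // 4 GiB
        {"Itanium 2 physical (50-bit)", 50},  // 1 PiB
        {"full 64-bit", 64},                  // 16 EiB
    };
    std::cout.setf(std::ios::fixed);
    std::cout.precision(0);
    for (const Width& w : widths) {
        long double bytes = std::pow(2.0L, w.bits);
        std::cout << w.label << ": 2^" << w.bits << " bytes = "
                  << bytes / (1024.0L * 1024 * 1024) << " GiB\n";
    }
    return 0;
}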

quote:
And is it true that using a 64-bit system disables some architectural features in current chips (like the intelligent look-ahead on C2D)? I think I remember reading a blurb about that.


Not the intelligent look-ahead. It's macro-op fusion, and for the most part the performance impact is minor (less than 5%).

(I think there are two things people are overreacting about nowadays. One is the system requirements for Vista, and the other is the 64-bit support in Core 2 Duo.)


RE: Will this be an end to multi-bit?
By masher2 (blog) on 1/9/2007 9:29:10 AM , Rating: 2
quote:
"The performance increase coming from 64-bit comes mainly from the two following reasons:
1. Support for RAM greater than 4GB(that includes virutal memory)
2. 64-bit support means new programs need to be compiled for it. That means CPU manufacturers can put additional enhancements, like more registers

And a third reason, extremely important for some scientific and simulation/modeling applications...native support for numbers larger than 2^32.


By jak3676 on 1/9/2007 4:36:14 PM , Rating: 2
This will probably be the reason we'll eventually move past 64-bit processing too. I don't think it will be for lack of addressable memory.

I can already see where native support for numbers bigger than 2^64 or even 2^128 would help cryptography.
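To make the point concrete, here is a minimal, hypothetical C++ sketch of what software has to do when a value is wider than the machine word: split it into limbs and propagate the carry by hand. A wider native word halves (or quarters) that work, which is exactly what big-number and crypto code cares about.

// Hypothetical sketch: adding two 128-bit numbers on a CPU whose native word
// is 64 bits. Each value is two 64-bit "limbs" and the carry is propagated
// manually; on a 32-bit CPU the same value would need four limbs and three
// carry steps.
#include <cstdint>
#include <iostream>

struct U128 {
    std::uint64_t lo;
    std::uint64_t hi;
};

U128 add128(U128 a, U128 b) {
    U128 r;
    r.lo = a.lo + b.lo;                           // low limbs (may wrap around)
    std::uint64_t carry = (r.lo < a.lo) ? 1 : 0;  // wrap-around means a carry occurred
    r.hi = a.hi + b.hi + carry;                   // high limbs plus the carry
    return r;
}

int main() {
    U128 a{0xFFFFFFFFFFFFFFFFULL, 0};             // 2^64 - 1
    U128 b{1, 0};
    U128 s = add128(a, b);                        // expect hi = 1, lo = 0 (i.e. 2^64)
    std::cout << "hi=" << s.hi << " lo=" << s.lo << "\n";
    return 0;
}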


RE: Will this be an end to multi-bit?
By rippleyaliens on 1/9/2007 1:05:00 AM , Rating: 2
Well, to answer a few questions:
1 - With regards to the thin client guy: YES, we still use that. It is called Windows Terminal Services, with Citrix being the leading technology. More cores = more users on one server = cheaper installation.
With multi-core CPUs and virtualization, businesses can run the same amount of services on fewer and fewer physical servers.

Now with regards to home users: I see a lot of people here are young, easily under the age of 30. A true personal computer runs in the $3,500-4,000 range. That is how much a decent computer cost in 1991, 1996, 2001, and 2007. If you have a $1,000 computer now, well, that is an entry-level machine by those standards. These aren't one-year investments, but 3-4 year investments.

Piss-poor programmers are to blame today for the sluggishness of MS Windows, not necessarily Windows itself. Read your history: EVERY 18 months, computational power DOUBLES. PERIOD. Quit fighting the grain. What can I do with 4 or 8 cores? LOL, what do you need 1TB hard drives for?
In fact, MS Win2k3 Server only needs 800MB to actually install, and XP 600MB. Why do you need all the extra GBs and TBs? MP3s... lol... EASY.
Because it is evolution.

My personal machine is an E6400, 4GB RAM, 7950GX2, 1x 36GB 15K drive, 4x 250GB drives, 2x 400GB drives, and I can fully saturate the CPU easily with the apps I run. I welcome dual quad cores, 32GB of RAM, etc...



RE: Will this be an end to multi-bit?
By Pandamonium on 1/9/2007 4:00:45 AM , Rating: 2
1) "True" personal computers don't run in the $3.5-4k range. It's more like $1-2k.

2) Computational *power* does not double every 18 months. Intel's founder said that processor *complexity* will double every 18 months, and the company has tried to honor its founder by adhearing to that statement. Intel can roughly double complexity by doubling L2 cache, for example.

3) I need 1TB drives for porn, dvd backups (I like virtual mounting better than loading physical discs), and cd archives. I can fit all important documents on one CDR or less. I would argue that this is true for most people, but few are bold enough to admit it.

4) I don't need extra stuff "because it's evolution". I don't need extra stuff, period. I *want* extra stuff because I am a packrat for files. I *need* extra stuff to play some of the latest games.

5) Nobody cares what your PC's specs are. We might, however, wonder what apps you run. Way to be specific!


By Pandamonium on 1/9/2007 4:04:02 AM , Rating: 2
We really need edit buttons. Preview is just a hassle.

adhearing = adhering

*need* was supposed to have quotes around it. None of us need new technology, we just want stuff badly enough that we think we need it. That was the message.


By rippleyaliens on 1/13/2007 8:29:55 PM , Rating: 2
What apps?
Maya, VMware, games...
I play my game with antivirus running, Fraps, instant messaging, etc. More cores = better response. I am not a cheap sucker. I pay so that when I do anything on my PC, it is relayed to me how I desire it: fast and efficient.
When I say a PC is in the $3,500-4,000 range, I do mean with legal software included. Hell, my laptop cost me $4,000, a measly $3 a day. Having a PC that costs me $4 a day is an acceptable cost.
If you can't afford it, then you don't need it.


Tick Tock fashion
By crystal clear on 1/9/2007 1:07:51 AM , Rating: 2
*The moment AMD ticks, Intel tocks.

*The moment AMD comes up with something, there comes Intel with something better.
Back again to square one.
It appears Intel is sitting on a whole bunch of new CPUs just waiting to be released, in a tick-tock fashion.
From high end to entry level, Intel has one ready for release.

If that's not enough, it responds with pricing the same way.

This brings me to my point:

Recently there was an article in DigiTimes which really laid out the facts, the situation, and the conditions in the market today.

Intel CPU launches too frequent for second-tier mobo makers

Mobo, chipset | Dec 28, 16:52

Intel's fast and frequent launches of CPUs put Taiwan-based second-tier motherboard makers under pressure for their product development and prices, according to industry sources.

also this

Mobo makers: CPU wars to enter new level with 45nm Penryn in 2007

Mobo, chipset | Nov 29, 15:56

In response to the news that Intel has started making prototypes of 45nm Penryn processors, motherboard makers said they expect CPU wars to enter a new...

and-
Intel to launch pricing campaign in 2Q 2007

Mobo, chipset | Dec 21, 14:37

Intel will launch a pricing campaign in the second quarter of 2007, with the price for the Core 2 Quad Q6600 processor falling to US$530, according to industry sources.

Unquote-
I have not given the link as it's only available by paid subscription.

In short, Intel barely gives other component manufacturers, and of course OEMs, time to catch up, only for a new CPU to be released, sometimes one not mentioned in its roadmaps at all.
This applies to pricing as well: price cuts come on a regular basis.
Theoretically it sounds great for the consumer and looks good on paper, but in practice this creates a sort of instability in the market for both buyers and manufacturers.

It started with Duo, then Quad, all in a short span, and now 8 cores and more...
They appear to be entering IBM and Sun territory with 8-plus cores.

It's like flying through bad weather...




RE: Tick Tock fashion
By Comdrpopnfresh on 1/9/2007 2:58:32 AM , Rating: 2
Agreed. And what about the mobile market, where they keep changing sockets? At least with AMD, Socket 939 was around for a long time, and revisions are being made to AM2 to make AM3, but the socket remains the same.


RE: Tick Tock fashion
By crystal clear on 1/9/2007 3:51:48 AM , Rating: 2
That's a very valid argument you put forward.
Food for thought.


RE: Tick Tock fashion
By Master Kenobi (blog) on 1/9/2007 8:34:13 AM , Rating: 2
LGA 775 has been around since the later P4 models; it's been around for several years now. So what if they are changing it?


RE: Tick Tock fashion
By Comdrpopnfresh on 1/9/2007 3:44:37 PM , Rating: 2
Here is a situation in which a socket change would have been needed: because it spans such a wide era, some mobos can't handle this or that (LV, ULV, C2D, EE).


RE: Tick Tock fashion
By IntelUser2000 on 1/10/2007 12:35:54 AM , Rating: 2
quote:
Agreed. And what about the mobile market, where they keep changing sockets? At least with AMD, Socket 939 was around for a long time, and revisions are being made to AM2 to make AM3, but the socket remains the same.


The problem on mobile is really only apparent to the laptop makers. Most people couldn't care less because they buy a new laptop to fit their purpose anyway.


RE: Tick Tock fashion
By crystal clear on 1/10/2007 2:40:06 AM , Rating: 2
"Intel now offers a total of nine quad-core processor versions in the desktop and enterprise market segments."

http://digitimes.com/mobos/a20070109PR200.html

Unquote-

Just look at the time span from Duo to Quad and the number of processors launched.
It's amazing; it barely gives the industry time to digest the Duo before up comes the Quad.


One ups
By qwerty1 on 1/8/2007 10:29:39 PM , Rating: 2
When the world is just starting to brace itself for the upcoming onslaught of quad core, out comes octa. Guess the 80 core prototypes really aren't that far away from consumer use after all. But what's the point when most software developers can't even catch up with the physical prowess?




RE: One ups
By slashbinslashbash on 1/8/2007 10:52:55 PM , Rating: 2
Yeah, it's to the point where it's kind of absurd. The number of cores is getting way past the point where a single person can generate enough input to fully utilize them.

Which brings me to wonder whether we'll see a return to the thin client (as Sun and others have been predicting for a while). When I was in college, we logged into 16 and 24 CPU Sun SPARC servers to read email, edit and compile programs, etc. It seems that a small office could be serviced with one or two 8-core machines. I don't know if it'd be cost-effective, but it's just about the only way I can think of to really utilize this kind of CPU power in everyday life.


RE: One ups
By qwerty1 on 1/8/2007 11:00:52 PM , Rating: 2
It's definitely cost-effective. In fact, I've got a friend who provides network-computing consulting. He says they've found that, assuming no heavy graphics or computation is involved, the needs of a group of 30-40 office workers can be fully satisfied by networking them to one above-average server. No matter what, 30 computers are going to cost a lot more than a server plus networking.


RE: One ups
By BigToque on 1/8/2007 11:21:33 PM , Rating: 2
Makes sense to me...

Each home can have one main server, and everyone, or every room, can have its own terminal. Access whatever content you want from wherever you want in the house.


RE: One ups
By ADDAvenger on 1/9/2007 2:01:48 AM , Rating: 2
I was just thinking about that actually. You could even put a terminal in front of the crapper so you'd never have to duck out of a great conversation (or game or whatever suits your fancy) because you're about to make a mess in your pants.


RE: One ups
By VooDooAddict on 1/9/2007 12:42:03 PM , Rating: 1
quote:
I was just thinking about that actually. You could even put a terminal in front of the crapper so you'd never have to duck out of a great conversation (or game or whatever suits your fancy) because you're about to make a mess in your pants.


Yeah ... you've never been an MMO addict. :)


RE: One ups
By marvdmartian on 1/9/2007 9:48:38 AM , Rating: 1
I'd just like to see AMD push out a CRAY supercomputer with a half dozen 8800 video cards spliced to it, and say, "Match this, beyatch!!" LOL

Granted, while the one-upmanship game goes on, eventually we'll see real-world applications that take advantage of the technology, and, as always, the prices will fall to the point that people (other than the ultra-rich or uber-1337) can afford to utilize it.

Eventually, we'll build the super computer, put it in charge of the nuclear arsenal, it'll achieve self awareness, we'll try to pull the plug........well, you know where that's heading, don't you?? ;)


Quad FX still pushes ahead
By KillerNoodle on 1/9/2007 1:25:47 AM , Rating: 2
An important aspect to look at, IMO, is the fact that the I/O subsystems are so different. AMD basically has two motherboards in one, with the ability to run multiple graphics cards (2x PCIe x16 and 2x PCIe x8), not to mention the storage options (12 onboard SATA ports). If Intel is just showing that it can put two quad-core processors onto one board, it lacks a lot of options to me.

Honestly, I believe there is a limit to the number of cores a consumer will eventually need. Even though you might have 8 cores, can you feed the 8 cores fast enough? Meaning, do you have fast enough I/O and storage (e.g. solid-state drives)?

Also, since AMD has implemented NUMA (Non-Uniform Memory Access), it should see better performance when Vista is released, because each processor's cores won't have to "fight" for all the system's memory; each processor gets half the memory attached directly to it while still being able to request memory from the other if need be.
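For readers who want to see what "memory attached directly to each processor" looks like in code, here is a minimal sketch using Linux's libnuma (an illustrative assumption; neither platform requires this library) that asks for a buffer on each node so the owning socket never has to cross the inter-processor link for it:

// Minimal libnuma sketch (Linux; build with: g++ numa_sketch.cpp -lnuma).
// Allocates one buffer per NUMA node so each socket works on memory that is
// physically local to it, the behaviour described in the post above.
#include <numa.h>
#include <cstddef>
#include <iostream>

int main() {
    if (numa_available() < 0) {
        std::cerr << "NUMA is not supported on this system\n";
        return 1;
    }
    const std::size_t bytes = 64 * 1024 * 1024;      // 64 MiB per node
    const int nodes = numa_max_node() + 1;           // e.g. 2 on a dual-socket board
    for (int node = 0; node < nodes; ++node) {
        void* buf = numa_alloc_onnode(bytes, node);  // memory placed on this node
        std::cout << "node " << node << ": "
                  << (buf ? "allocated locally" : "allocation failed") << "\n";
        if (buf) numa_free(buf, bytes);
    }
    return 0;
}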




RE: Quad FX still pushes ahead
By ADDAvenger on 1/9/2007 2:15:14 AM , Rating: 2
Very likely more important than the options is the FB-DIMMs: they're just high-latency and expensive. They're good for having tons of high-speed memory, but plain DDR2 supports more memory than Quad FX's target market even wants. So basically it's an unnecessarily expensive system with advantages you don't care about.


RE: Quad FX still pushes ahead
By Master Kenobi (blog) on 1/9/2007 8:40:09 AM , Rating: 1
Lack of FB-DIMMs is why the AMD system did so poorly on the memory side. And NUMA is as yet unproven; AMD simply says it will do better. I'd be willing to bet that it's not going to make the big difference AMD says it will. Now, on the other side of this, the AMD system is a dog because it literally is two motherboards glued together. Intel here looks like it took a Xeon workstation board, changed the sockets, and bam, done. This makes it highly interchangeable. FB-DIMMs are standard on any Intel server/workstation, and Socket 775 processors are a dime a dozen, so you can stick whatever you want in here.


From a platform perspective, Intel's is better because it uses tried and true parts in a tried and true design. AMD's is a shot in the dark.


RE: Quad FX still pushes ahead
By masher2 (blog) on 1/9/2007 9:59:01 AM , Rating: 2
> "Lack of FB-DIMM's is why the AMD system did so poorly on the memory side..."

As of right now, FB-DIMMs are more about capacity and reliability than any performance advantage. The situation may change in the future as speeds increase, but that's what we have today.


By Master Kenobi (blog) on 1/9/2007 3:41:45 PM , Rating: 2
I would argue that, while that's true, FB-DIMM takes some of the overhead off the memory controller (or, in AMD's case, off the processor), which might lead to a performance advantage whether you are using two memory controllers (AMD) or one integrated into the north bridge (Intel). But that's all in theory.


RE: Quad FX still pushes ahead
By IntelUser2000 on 1/10/2007 12:33:10 AM , Rating: 2
quote:
Very likely more important than the options is the FB-DIMMs: they're just high-latency and expensive. They're good for having tons of high-speed memory, but plain DDR2 supports more memory than Quad FX's target market even wants. So basically it's an unnecessarily expensive system with advantages you don't care about.


Well, the Intel 5000X chipset and FB-DIMM have very good performance; we just don't see that performance advantage in the market this V8 system is targeting.

quote:
Lack of FB-DIMMs is why the AMD system did so poorly on the memory side.

Uhh... no. AMD systems have always had a good memory subsystem thanks to the IMC. No review showed that a lack of FB-DIMMs was why the AMD system did badly. You may be right at the much higher end, like database servers, but certainly not for what Quad FX is targeted at.

quote:
And NUMA is as yet unproven; AMD simply says it will do better.


That I agree with, to an extent. In the comments on the Quad FX review, AnandTech said there wasn't really an advantage to going NUMA. PC apps don't care about NUMA; it's made for a target market much higher up, in the server range.

Go see benchmarks for SCSI hard drives: they fall below 7200RPM drives in PC apps but pull ahead in workstation/server apps, since they are optimized for that workload.



For those saying Quad FX doesn't need more than 4GB: that just proves the point of why Quad FX and V8 systems are worthless. Every market segment from workstations on up is about memory, memory, memory. That's why 32-bit Xeon systems tolerated Physical Address Extension (PAE) even though it slowed performance down; the greater capacity potential outweighed the performance difference.

Quad FX/V8: for people who need a cheap workstation, even more true for V8 with its FB-DIMMs.

The Core 2 Extreme X6800 is far faster in PC enthusiast apps.


Dual socket systems
By IntelUser2000 on 1/9/2007 2:27:32 AM , Rating: 2
OK, I seriously think this dual-socket thing has to die off in the PC market. We BARELY, and I mean BARELY, need dual core for the PC market, and we are talking about dual-socket quad-core systems??

Come on!! I hope the AMD 4x4 system dies off so Intel's V8 system dies off, and we all go back to 1P systems.




RE: Dual socket systems
By Choppedliver on 1/9/2007 3:10:01 AM , Rating: 2
Why do people continually ask "who will ever need this much power?" You ALL WILL eventually, and some of us need that much power NOW.

WHY? Because if it ain't "instant", it ain't "fast enough".
Obviously there is more to life than running MS Word and Internet Exploder, especially in this increasingly digital, increasingly hi-def world.

To quote a line from "Field of Dreams"...

"If you build it, they will come."

When more power is available, programmers will create more elaborate and/or bloated programs to make use of it. Your f'in operating system will be so freaking huge in a few years it won't even start with less than 16 gigs of RAM and 4 cores.

Some of us need it NOW; everyone will need it EVENTUALLY. Unless you want to still be running Windows 98 in 2018, then maybe you won't. The rest of us will be chillin' on our holodecks with holobabes generated by our 4096-core CPUs and 128 terabytes of RAM, playing Halo 14 while removing the DRM crap from our latest 3D holomovie and ripping it to the 128-exabyte holocube we keep on our keychains, which also double as FTP servers and Pez dispensers. The RIAA will still be selling CDs for 20 bucks and suing old ladies because, let's face it, some things never change :)



RE: Dual socket systems
By IntelUser2000 on 1/9/2007 5:49:45 AM , Rating: 2
People like you are pretty rare. What we need is more single-threaded performance. Dual core is enough.

quote:
Why do people continually ask "who will ever need this much power?" You ALL WILL eventually, and some of us need that much power NOW.


The situation is DIFFERENT for multi-core products. It's like saying bring on more extensions (like SSE), because they'll be faster in the future. Unless you run a hell of a lot of demanding apps, you won't notice going from dual core to quad, unlike single-threaded performance, which we ALL benefit from.

Rather than scaling Core 2 Duo to 3.33GHz/1333FSB while staying at 95W, as originally intended, we get the useless Core 2 Quad, which is slower for most people and raises the TDP to 130W. Great.

As a poster a couple of posts below you says: if you need multi-core, GET A WORKSTATION.


RE: Dual socket systems
By Anosh on 1/9/2007 3:18:16 AM , Rating: 2
That's a really stupid thing to say, and it points to how you use a computer.

Going from single to multi-core is a natural progression, enabling several heavy tasks to be processed at the same time. This sort of parallel processing is aimed more at business than at consumers, but eventually it will also benefit the mainstream, and like many other evolutionary changes it will force developers to a new, more demanding level of programming; in areas such as warehouses, banking, etc. it will greatly benefit users.

Here's a scenario some may relate to:
Remember when Amazon had that $100 Xbox 360 special where all of the units sold in under a minute? A lot of people were complaining that the page wasn't responding.

Web servers with the ability to handle more processes at the same time would help solve this kind of problem.

Saying we should go back to single-processor systems is like saying we should go back to the inefficient cars of 30-40 years ago.





...
By Frank M on 1/9/2007 8:08:19 AM , Rating: 5
Begun, the Core Wars have.




RE: ...
By johnsonx on 1/9/2007 11:25:29 PM , Rating: 3
LOL! I'd vote you up for that, but you're already at 5. Well done sir!


Xeon Quad + 5000X chipset = 3 months old
By Elian on 1/9/2007 3:14:34 AM , Rating: 2
Hi,

So, it's just a Xeon workstation with a 5000X chipset. Is it worth a news story?




RE: Xeon Quad + 5000X chipset = 3 month old
By lufoxe on 1/9/2007 8:53:06 AM , Rating: 2
quote:
is it worth a news story

Because this is the internet, anything is newsworthy. But I agree with the posts above: it is just a marketing ploy. Actual proof of this is that they stuck two quad-core Xeons into a desktop, added an 8800, and said... WE CAN DO IT TOO! I normally would not have a problem with this, except that (like the original Extreme Edition) it was done half-a**ed. Anyone arguing with me just has to look at one minor detail that no one has touched upon: it uses FB-DIMMs, while at least AMD's 4x4 uses unbuffered DIMMs. C'mon Intel, if you're gonna play marketing games, do it right. The latency issues with FB-DIMMs are what revolt gamers. Now, if they found a way to make two Core 2 Duos (quad core) work together, or got a Xeon to work with unbuffered DIMMs, then we would have a true copycat.


By hstewarth on 1/9/2007 10:32:52 AM , Rating: 2
This system is basically a dual Clovertown (Xeon X53xx) system with a high-end graphics card. That is why it uses FB-DIMMs.

The real news here is if Intel has found a way to make these cheaper. I have dual 5160s and the system was almost $10,000, but it is the fastest single-core-performance computer on the market. Yes, the quad cores are faster with multithreaded applications, but the 5160's per-core speed is higher.

It would only be a copycat if they found a way to use unbuffered DIMMs with dual Xeons; that would mean Intel adapting to an environment that doesn't need the extra reliability of buffered DIMMs.

I was hoping Intel was announcing future 8-core CPUs; I would not doubt that one will come later, after the spring/summer refresh, similar to how Core 2 Duo led to Core 2 Quad. Next fall is my guess.


Proof of Concept ??
By nosuchuser on 1/9/2007 3:17:33 AM , Rating: 2
Surely this is just a 5000X-based dual-Xeon workstation?

I would have thought that the HP XW6400/XW8400 are essentially this spec, and they have been around (and quad-core compatible) for ages.

This sounds a bit like a marketing dept at work to me.




RE: Proof of Concept ??
By Marlowe on 1/9/2007 5:26:58 AM , Rating: 2
Yes, it's exactly the same. It's nothing new at all, just aimed at different customers, so it's all marketing of course. Sorry for repeating what you just said :)

AMD doesn't need to wait to launch any 8x8 platform; they already have that in their 4-socket servers, which are otherwise just like Quad FX. They also have 16x16 and 32x32 machines that they are selling to people right now. What are they waiting for? Bring it on to us POWER USERS! :P

So stupid. Why don't they just point these "power users" in the direction of their server/workstation departments instead of making these BS products?


More than 1 processor?
By encryptkeeper on 1/9/2007 11:28:54 AM , Rating: 2
So does this mean Vista will license more than one processor? The XP license is supposedly only for one processor. And I would assume that if you were going to buy one of these systems you wouldn't skimp on the OS.




RE: More than 1 processor?
By Wonga on 1/9/2007 12:20:16 PM , Rating: 2
XP Home has a licence for 1 CPU socket, XP Pro for 2 sockets. It doesn't matter how many cores are put in the socket.

I assume Vista will go along the same lines: Basic will only support 1 socket, and Business/Premium et al. will support a few more.


heh
By Polynikes on 1/9/2007 2:29:17 PM , Rating: 2
Sounds like they took a server board, put in two Kentsfields and an 8800, and said "Look, it's 8 cores!" So what? Most enthusiasts don't want or need a server mobo, nor do they want to pay the premium for FB-DIMMs. But hey, it's a proof of concept, so of course they could eventually put out a more enthusiast-oriented solution.




By Acanthus on 1/9/2007 6:48:34 PM , Rating: 2
The PhysX API in "software" mode spawns 4 threads on launch.

That means PhysX-enabled titles can use at least 5 cores. (If they aren't BSing; I haven't personally tested this as I don't have any PhysX-enabled titles. UE3 games will likely be my first taste.)




lol no applications for it
By theteamaqua on 1/8/07, Rating: -1
RE: lol no applications for it
By nerdtalker on 1/8/2007 10:56:40 PM , Rating: 4
I cannot possibly wait for the contents of your post to become as laughable as Bill Gates' infamous "Nobody will ever need more than 640k RAM!" statement.

No applications? Try multithreaded applications; there really is more out there than just playing games, and even games are becoming more and more multithreaded.


RE: lol no applications for it
By feelingshorter on 1/8/2007 11:03:21 PM , Rating: 2
Well, Gates didn't know that porn was going to be such a big industry and that you'd need more than 640K to store it. Since finishing high school, I haven't touched a game on my computer. As long as my computer is fast enough for web browsing and some multitasking, I'm fine. Not everyone plays demanding games (nor do some have the time). Some might mess with MP3s or Photoshop, but that's about it. As long as the price remains cheap, I can see reasons for more cores, because some people like to run Folding@home and such. But I refuse to buy some 600-800 watt PSU to power some darn quad-core/dual-core system just to surf the web and do the minor things the average person does. What I want is more power efficiency and more bang for the buck.


RE: lol no applications for it
By ADDAvenger on 1/8/2007 11:33:25 PM , Rating: 4
It'll take some time to filter down, but in two years I'm sure quadcore will be as common as dual is today.


RE: lol no applications for it
By theteamaqua on 1/9/07, Rating: 0
RE: lol no applications for it
By Locutus465 on 1/9/2007 12:26:03 AM , Rating: 2
Quad would be nice for me, as I typically run a couple of Visual Studio instances, a few web browsers, web and database servers, Trillian, and iTunes for my podcasts all day long at work... Octo might be a bit of overkill for what I do at the moment, but then again Visual Studio isn't very well threaded ATM. Once MS gets its act together with its dev tools, perhaps octo will look a bit nicer.


RE: lol no applications for it
By ADDAvenger on 1/9/2007 1:58:40 AM , Rating: 2
The electricity thing is an utter red herring. How many light bulbs do you have in your house? Assuming 150W bulbs, you're looking at five of them using as much power as the computer's PSU is rated for.

Stuff rarely, if ever, pulls 100% load. Also, these quad-core chips pull about as much power as a high-end dual core; they do that by lowering the voltage and clocks of each core. Nowadays the CPU is hardly to blame for highly rated PSUs; it's SLI, RAID, and all that other stuff that's the cause, but mostly it's the graphics card(s).

And even if your computer pulled 1kW continuously, it'd be maybe twenty bucks a month. Electricity is simply cheap; it only becomes a problem when you're running a server farm like big businesses do. The only reason I care about TDP is cooling; the cost has nothing to do with it.


RE: lol no applications for it
By Wonga on 1/9/2007 4:25:49 AM , Rating: 2
Trying not to sound like a hippie, but if everyone did that, then there sure would be a lot of extra CO2 going into the atmosphere that didn't need to.

Plus, I've got to say... 150W light bulbs?!? Back in the day, when the people I live with didn't know any better, we'd buy the odd 100W light bulb, but even that was overkill. In this day and age, for most household tasks, why do you need more than a 20W energy saver? Unless it needs to be really bright, an 11W one will suffice.

Anyway, I agree with you that SLI cards are what's pushing the energy bar higher now, but AMD's Quad FX certainly didn't help matters either.


By masher2 (blog) on 1/9/2007 9:20:20 AM , Rating: 3
> "even if your computer pulled 1KW continuously, it'd be maybe twenty bucks a month..."

Assuming 10 cents/kWh, it'd be $72/month. And that's assuming a 100% efficient PSU; most run in the 70-80% range, which means the system is drawing more like 1250W from the wall to supply your components with 1000W. So you can assume closer to $90/month. In months where you run your air conditioning, it'll be closer to $120/month, just to pump out all the extra heat you're generating.



RE: lol no applications for it
By gumpster on 1/9/2007 10:18:29 AM , Rating: 2
Uhh... $20 in your dreams. Even at the national average of just over $0.08 per kWh, your monthly bill would be over $70 for a PC running at 1kW.


RE: lol no applications for it
By Ringold on 1/9/2007 5:15:48 PM , Rating: 2
1) I get 1 kW x 24 hrs x 31 days x $0.08 = $59.52 for running ClimatePrediction@Home 24/7 all month, or one god-awful massive video project.

2) Assume more realistic usage, say on for 12 hrs a day: $29.76.

3) Assume, like most computers, it idles 90% of the time at around 200 watts, as even the beefiest systems do; I'll add 100 extra watts for fun: $11.01.

My bill really would run around $60 for one of those, since I let it go 24/7 doing various BOINC projects or F@H, but for the average person? Negligible.
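Since the figures above are easy to mangle, here is a tiny C++ sketch that reproduces this sub-thread's arithmetic (the wattages, hours, and $/kWh rates are just the numbers quoted in the posts, not measurements):

// Monthly electricity cost = (watts / 1000) * hours per day * days * $/kWh.
#include <iostream>

double monthly_cost(double watts, double hours_per_day, double dollars_per_kwh) {
    const double days = 31.0;
    return (watts / 1000.0) * hours_per_day * days * dollars_per_kwh;
}

int main() {
    // 1 kW running 24/7 at $0.08/kWh: ~$59.52 (the first case above)
    std::cout << monthly_cost(1000, 24, 0.08) << "\n";
    // Same box on 12 hours a day: ~$29.76
    std::cout << monthly_cost(1000, 12, 0.08) << "\n";
    // 1 kW of components behind a ~80%-efficient PSU at $0.10/kWh: ~$93,
    // roughly the ~$90 estimate a few posts up
    std::cout << monthly_cost(1250, 24, 0.10) << "\n";
    return 0;
}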


RE: lol no applications for it
By Clienthes on 1/9/2007 4:53:12 AM , Rating: 2
If they don't build the hardware, no one will bother building the apps. They may not be common yet, but they will be.


RE: lol no applications for it
By leidegre on 1/9/2007 4:42:20 AM , Rating: 2
Even though the first post might seem "laughable", think about it: how many applications actually benefit from concurrency?

Running several single-threaded processes on the same machine is not concurrency, but it's becoming more and more feasible with dual-core desktop CPUs. Good multi-threaded software is rare and difficult to engineer, and I feel as if the technology is outrunning the engineering.

We know what to do, but few realize the potential difficulty of multi-threading.

Today, a dual-core system should be the way to go. Quad is nice too, but you would end up paying for more than you would actually use. If I had the money, I would spend it on a faster dual-core CPU instead.

8-core desktop solutions are just insane. I can't think of why they put this on the market; it was simply a publicity stunt.

A well-designed multi-threaded application is rare. The only games being built today with multi-threading in mind would be Alan Wake and Half-Life 2: Episode Two.
The new Source engine in Half-Life 2 uses lock-free concurrent algorithms to actually distribute rendering passes between cores. Now that is the way to go, but it is very rare and extremely difficult. Simply putting sound on one thread, rendering on another, and AI on a third (Supreme Commander) is not good multi-threading design; it's just a cheap trick to impose multi-threading.
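As a rough, hypothetical illustration of the difference (not Valve's actual code): instead of pinning whole subsystems to dedicated threads, independent work items can be handed out to a pool of workers through a lock-free atomic counter, so every core stays busy no matter how uneven the subsystems are.

// Hypothetical sketch of lock-free work distribution with C++11 threads:
// a shared atomic cursor hands out work items to however many cores exist,
// rather than dedicating one thread each to sound, rendering, and AI.
#include <atomic>
#include <iostream>
#include <thread>
#include <vector>

int main() {
    const int num_items = 1000;                    // e.g. objects to update this frame
    unsigned workers = std::thread::hardware_concurrency();
    if (workers == 0) workers = 4;                 // fallback if the count is unknown

    std::vector<double> results(num_items, 0.0);
    std::atomic<int> next{0};                      // lock-free work cursor

    auto worker = [&]() {
        // Each thread claims the next unprocessed item; no mutex is ever taken.
        for (int i = next.fetch_add(1); i < num_items; i = next.fetch_add(1)) {
            results[i] = i * 0.5;                  // stand-in for real per-item work
        }
    };

    std::vector<std::thread> pool;
    for (unsigned t = 0; t < workers; ++t) pool.emplace_back(worker);
    for (auto& th : pool) th.join();

    std::cout << "processed " << num_items << " items on " << workers << " threads\n";
    return 0;
}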


By masher2 (blog) on 1/9/2007 9:24:11 AM , Rating: 2
> "how many applications actually benefit from concurrency?...8 CPU core desktop solutions are just insane..."

Any application in the fields of simulation, 3D modeling, financial analysis, scientific computation, transcoding, or a dozen others. Most of these can benefit from 8 or more cores. I've had a four-CPU machine as my desktop for over six years, and it's seen good, hard use all that time.

Don't make the mistake of thinking every usage pattern is just like your own.



RE: lol no applications for it
By ralith on 1/9/2007 9:50:51 AM , Rating: 2
I've been writing multithreaded apps for 6 years now, and I don't think it is significantly harder than writing a single-threaded app. It really just requires a different frame of mind. I'll admit it takes a little getting used to, but it was MUCH easier than learning a new programming language.

Also, there are new tools coming out all the time that insulate the coder from multithreading issues (OpenMP, Boost, the Intel compiler, etc.). In fact, since you cited it, Half-Life 2 apparently has a very well-designed multithreading library; Valve said that with very little training, or even no training, the average Valve coder can start writing safe multithreaded code with it. Their design was really quite fascinating, and it is work like their libraries that will make it easier for everyone else who doesn't want to learn the multithreading mindset.

As for your last comment:
"Simply putting sound on one thread, rendering on another, and AI on a third (Supreme Commander) is not good multi-threading design; it's just a cheap trick to impose multi-threading."

My response to that is:
1. It does buy you a performance increase with VERY little effort.
2. From what I remember about Alan Wake, they just have their major tasks broken out onto threads, i.e. sound, rendering, physics, etc. each running on its own thread.
3. You have to remember that these companies generally are not willing to redesign and build new libraries, as the Valve folks did, once they've already started a project, so the natural result of that mindset is this simple task-splitting multithreading.
4. Since there are next-gen projects like Valve's and Alan Wake, it will only be a matter of time before most games can fully utilize all the cores in your box. Exactly how much time is still debatable, but I thought there were a couple scheduled for this year.
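For reference, the kind of help tools like OpenMP give looks roughly like this minimal sketch (assuming an OpenMP-capable compiler, e.g. g++ with -fopenmp): one pragma spreads an independent loop across all available cores, insulating the programmer from thread creation and joining.

// Minimal OpenMP sketch: the pragma parallelizes the loop across the cores
// the runtime finds; no explicit thread management is written by hand.
#include <omp.h>
#include <iostream>
#include <vector>

int main() {
    const int n = 1000000;
    std::vector<float> data(n, 1.0f);

    #pragma omp parallel for
    for (int i = 0; i < n; ++i) {
        data[i] = data[i] * 2.0f + 1.0f;   // independent per-element work
    }

    std::cout << "ran on up to " << omp_get_max_threads() << " threads\n";
    return 0;
}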


By Ralph The Magician on 1/9/2007 1:18:24 AM , Rating: 4
quote:
I cannot possibly wait for the contents of your post to become as laughable as Bill Gates' infamous "Nobody will ever need more than 640k RAM!" statement.
You do know that Bill Gates never actually said that, right?


RE: lol no applications for it
By greenpea on 1/9/2007 1:26:02 AM , Rating: 3
About the 640KB quote:

http://en.wikiquote.org/wiki/Bill_Gates

Look under 'Misattributions'


By masher2 (blog) on 1/9/2007 9:26:21 AM , Rating: 2
True, Gates never said it. However, I do clearly remember an editor for DDJ who was actually angry about PCs being released with more than 48K (not 48M) of RAM. He felt it would lead to massive, bloated code, as any program written should be able to easily fit within that.


RE: lol no applications for it
By slickr on 1/9/07, Rating: 0
RE: lol no applications for it
By Felofasofa on 1/9/2007 5:07:48 AM , Rating: 5
You really don't know how big digital content creation is. I use LightWave and have produced scenes in which a single frame took 14 hours to render. We are crying out for CPU horsepower, and we'll still bring these twin quads to their knees. As for the apps, try all the 3D apps: LightWave, Softimage, Maya, 3ds, and a bunch more. Then the compositing apps, like Inferno, Flame, Flint, Combustion, AfterFX, Shake, plus a bunch more. The editing apps: Avid, Edit, Premiere, etc. This isn't even touching on the engineering and simulation apps that need loads of grunt. There are thousands of people out there who need this grunt, and most of us don't play games! You, sir, don't know shit!


RE: lol no applications for it
By Donegrim on 1/9/2007 6:04:51 AM , Rating: 2
I'm with this guy. I'm studying visualisation at uni and need to render XSI and LightWave scenes that take a long, long time on one or two cores. 8 cores would just be amazing to have on your own desktop. For "embarrassingly parallel" stuff like rendering, more cores are always a good thing; just fling cores at it, it's all good. We have a 100-CPU render farm at uni, and even that's small for rendering.


By retrospooty on 1/9/2007 10:51:51 AM , Rating: 2
Yup... Many games (and other apps too) are already being worked on that will require an absolute minimum of 2 cores and won't be able to thrive unless there are 4. Check back in a year's time.


RE: lol no applications for it
By Tsuwamono on 1/14/2007 5:41:35 PM , Rating: 2
Most high-tech servers (not these kiddie things you run your CS servers on), such as the ones airports use, are multi-core, because air traffic control software, for example, is multithreaded. Multithreading is the future, and the sooner we get it into our PCs the better.

Remember how everyone thought the computer on the Enterprise was impossible because everyone could use it at the same time and it could output voice and such? Well, think of it this way: when Gene thought that thing up, he envisioned clustered computers attached to a main core (server). The server in that ship spans something like 4 decks. We have 8 cores now in a 6-inch space. Imagine how many cores, hard drives, and RAM, among other soon-to-be-invented PC parts, we could fit into a 4-deck area.


RE: lol no applications for it
By elegault on 1/9/2007 10:47:53 AM , Rating: 2
I didn't realize you had to buy such a computer. People only buy computers if they need them and can afford them.

There are a number of industrial/scientific applications for WORKSTATION computers.

One such application that I've worked with is CFD modelling, which can take days to calculate solutions depending upon the resolution of the meshes.


RE: lol no applications for it
By hstewarth on 1/9/2007 1:44:52 PM , Rating: 2
There are applications right now that would take advantage of multi-core processing. Not everyone just plays games on their machines; people also use things like 3D graphics and database engines. I run my dual 5160s with LightWave 3D 9.0 and Vue 6 xStream, and both applications take full advantage. My machine is 9.5 times faster than my 3.2GHz P4. With 8 cores it would likely be 14 to 15 times faster, since the quad-core processors are slower (per core) than my 5160s.

By the way, looking at this picture, it looks close to my machine, a Supermicro 7045A-3, but with a lot more fans on it; probably unnecessary, because these chips run very cool. My system has passive heat sinks on the memory, and my P4 machine runs a lot hotter than my Xeon 5160 system.


By the way, my Xeon system uses 367 watts of power, even though it has a 645-watt power supply. With 1000 watts you could easily run two of these machines; Supermicro actually has a 1U server that does that now.


"And boy have we patented it!" -- Steve Jobs, Macworld 2007

Related Articles
Update: AMD Demonstrates Native Quad-Core
November 30, 2006, 2:27 PM
AMD 4x4 Named Quad FX
November 29, 2006, 3:25 PM
Apple Mac Pro Running Dual Quad-Core Spotted
September 13, 2006, 2:22 PM
Apple Launches the Mac Pro
August 7, 2006, 1:08 PM












