
Intel prepares to take on AMD's Quad FX

Intel hinted at the possibilities of a high-end gaming system based around its Xeon workstation processors earlier this year at CES 2007. The Intel V8 system featured two Xeon Clovertown processors clocked at 2.4 GHz and a single GeForce 8800 GTX graphics card. Although the system was only a proof of concept to counter AMD’s Quad FX enthusiast platform at the time, Intel has announced that the V8 concept will become a reality at its Spring Intel Developer Forum in Beijing, China.

Intel has named its new enthusiast platform Skulltrail. Intel has remained very tight-lipped about Skulltrail, revealing only a few basic features. Skulltrail will arrive later this year and accommodate two Core microarchitecture quad-core processors. It is unknown whether Intel will develop a new platform around dual LGA775 sockets or rework its LGA771 Xeon workstation platform to accept unbuffered DDR2 memory; Intel’s Xeon workstation platform currently accepts only FB-DIMM memory.

On the graphics side of things, Skulltrail will sport four PCIe slots for graphics expansion. Intel has not divulged which multi-GPU technology Skulltrail will take advantage of; however, Intel’s current 975X Express chipset supports AMD’s CrossFire technology.

Intel has not yet released any details on the chipset powering Skulltrail. Nevertheless, Intel expects to unveil its next-generation single-processor enthusiast chipset, X38 Express, later this year.


Comments



blar
By hughlle on 4/17/2007 6:04:11 AM , Rating: 2
Total overkill, totally useless, I don't really care tbh :)

A G80 and even an E4300 will suffice for pretty much anything you want




RE: blar
By mlau on 4/17/2007 6:31:22 AM , Rating: 2
Probably intended for graphics designers who can put the raw processor and GPU power to good use (and I certainly wouldn't mind 8 cores if they can speed up compiler runs). For gaming purposes it is total overkill, I agree.
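Compiling is a good example of work that spreads across cores almost perfectly, since each translation unit builds independently. Below is a minimal sketch of that idea in Python, assuming a hypothetical list of C source files and a local gcc install; it illustrates the parallelism, not any particular build system.

import multiprocessing
import subprocess

# Hypothetical translation units; any independent .c files would do.
SOURCES = ["main.c", "render.c", "physics.c", "audio.c",
           "net.c", "ui.c", "input.c", "save.c"]

def compile_unit(src):
    """Compile one translation unit to an object file."""
    obj = src.replace(".c", ".o")
    subprocess.run(["gcc", "-c", src, "-o", obj], check=True)
    return obj

if __name__ == "__main__":
    # One worker per core: eight cores build eight units at once.
    with multiprocessing.Pool() as pool:
        objects = pool.map(compile_unit, SOURCES)
    print("built:", objects)

This is essentially what "make -j8" does at the build-system level.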


RE: blar
By Hare on 4/17/2007 6:44:02 AM , Rating: 2
quote:
For gaming purposes it is total overkill, I agree.
There are many companies that sell "overkill products" to a niche market and are very successful. There are also plenty of people with money to burn who buy equipment like this. I'm sure Alienware etc. would be interested in these systems and would sell more than a few computers.

This proof of concept also has decent PR value. Flagship products always get press coverage and increase brand recognition.


RE: blar
By mlau on 4/17/2007 8:11:15 AM , Rating: 2
quote:
There are many companies that sell "overkill products" to a niche market and are very successful.


That shows these companies sell to people with too much money and/or no brains. Just by waiting ~6 months, the price of such a high-end system usually halves, and games that can actually USE its power are 6 months closer to release.


RE: blar
By Hare on 4/17/2007 9:23:05 AM , Rating: 2
True.

But to be perfectly honest, if I had ridiculous amounts of cash I might do the same...


RE: blar
By MrTeal on 4/17/2007 10:42:17 AM , Rating: 2
You can't take it with you. If you have enough money and enjoy gaming, why wouldn't you buy the top of the line?


RE: blar
By JamRockaz on 4/17/2007 1:00:35 PM , Rating: 2
Well, you either have little/no money and/or no brains to know that for some people money is not an issue. While you and I have to scrape the bottom of the pot for that extra $20, they spend $20K like it's $2... or maybe it's just me


RE: blar
By Xietsu on 4/17/2007 1:20:35 PM , Rating: 2
Seriously, even if you're single and you're a motivated gamer with at least $60K+ annual income, you could obtain at minimum five of the best the market has to offer so you and some buds can coagulate into the one nerd team to rule them all -- albeit, provided that this is a primary and almost sole hobbyist recreation of the "gamer" in question. All private PC tech distributors pretty much have large-purchase financing built in at least some of the time. I think the question isn't "if you can/will/would afford it", but for those who don't know, "how can applications of the generation's highest-throughput systems reach the boundary, if even possible"? This is what beholds centricity in such a particular scenario, and, IMO, only that. ;]

Beyond such simple construances of fact, there is then only the arena of whether one has or has not the case for speculation to grade superiority -- whether Intel or AMD will eventually encompass the general reign. Personally, I think this is something you can only perceive with too great variability and subjectivity, as even AMD's X2 6000+ is said to be at least barely on par with C2D 6600s. It seems to me AMD will be able to fashion a technological solution with enough robustness to challenge the current scenario of dominance, but again, I surely wouldn't know, and would only allot weight to AMD's capacity for the aforementioned given the force their current prior-gen relatively holds in the market.


RE: blar
By typo101 on 4/17/2007 5:22:37 PM , Rating: 4
completely unnecessary masturbatory use of vocabulary


RE: blar
By wushuktl on 4/17/2007 6:45:34 AM , Rating: 3
What avid gamer ever says that they don't really need the extra power?


RE: blar
By subhajit on 4/17/2007 7:39:56 AM , Rating: 3
I think these systems are overkill (for gaming) because no game developer will ever design their games around this kind of hardware. So the only benefit you get is an insanely high frame rate, which you really don't need. Another benefit, of course, is that it is future-proof. But a new generation of products always reaches the same performance at lower cost and power consumption.


RE: blar
By OrSin on 4/17/2007 10:21:32 AM , Rating: 2
What planet have you been on? I remember when Doom 3 and Oblivion came out and they overtaxed every system out there, and every damn website was riding their jocks.

Look at how great a system you need for these games. They are that good. I thought everyone alive was a complete fool for having to spend $600 in upgrades to play one game. People bashed me on this site for saying the same thing.

Now high-end systems are overkill because the new must-have game hasn't come out yet. People are so flaky.


RE: blar
By kamel5547 on 4/17/2007 11:29:43 AM , Rating: 2
Yes, but... mostly what people need is GPU power, not CPU power. Most games would take advantage of more GPU power, but having two quad-cores vs. one dual-core is not going to net you much gain. Even if games go more multi-threaded, one quad-core will be more than enough for the near future, while a new GPU will be a must-have.
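Amdahl's law puts a number on that intuition: if only a fraction of a game's frame time parallelizes, extra cores quickly stop helping. Here is a quick sketch, where the 40% parallel fraction is purely an assumption for illustration:

def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when only part of the work parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Hypothetical engine where 40% of each frame's work is parallel.
for cores in (1, 2, 4, 8):
    print(f"{cores} cores: {amdahl_speedup(0.4, cores):.2f}x")

Eight cores only reach about 1.5x under that assumption, which is why a faster GPU is usually the better buy for games.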


RE: blar
By bobobeastie on 4/17/2007 11:44:39 AM , Rating: 2
quote:
and every damn website was riding their jocks

I don't know what THAT type of website has to do with Oblivion or Doom 3. I kid, I kid.


RE: blar
By mlau on 4/17/2007 8:07:50 AM , Rating: 2
Heh, so true :)


RE: blar
By Vanilla Thunder on 4/17/2007 1:10:39 PM , Rating: 2
quote:
Intel hinted at the possibilities of a high-end gaming system based around its Xeon workstation processors earlier this year at CES 2007.


By the looks of this quote, I wouldn't say it's "Probably intended for graphics designers...."

Vanilla


RE: blar
By GoatMonkey on 4/17/2007 8:46:27 AM , Rating: 3
I would love to have one as my HTPC, especially if they can ever work out the CableCARD issues. Eight cores would do a pretty good job of recompressing video to DivX.
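Transcoding does scale well with cores because a video can be cut into independent segments and encoded one per core. Below is a minimal sketch of that fan-out, with a stand-in function instead of a real DivX encoder; the chunk names and the busy-work are hypothetical.

import concurrent.futures

# Hypothetical segment list; a real pipeline would first split the
# source video into independently encodable chunks.
CHUNKS = [f"chunk_{i:03d}.raw" for i in range(64)]

def encode_chunk(name):
    # Stand-in for a per-chunk encode; burning CPU here makes the
    # core-count scaling visible without a real encoder installed.
    checksum = sum(i * i for i in range(500_000))
    return name, checksum

if __name__ == "__main__":
    # One worker process per core: eight cores, eight chunks in flight.
    with concurrent.futures.ProcessPoolExecutor() as pool:
        results = list(pool.map(encode_chunk, CHUNKS))
    print(f"encoded {len(results)} chunks")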


RE: blar
By Phynaz on 4/17/2007 10:38:45 AM , Rating: 2
quote:
A G80 and even an E4300 will suffice for pretty much anything you want


How the f**k do you know what will suffice for what I want?


RE: blar
By JamRockaz on 4/17/2007 12:52:25 PM , Rating: 2
I see your reasoning; however, a gaming enthusiast is not the only kind of enthusiast around. Granted, if you like playing demanding games, with that setup you will most likely be forced to upgrade or overclock well over the top between the end of this year and Q1 of next year.


RE: blar
By Visual on 4/18/2007 5:06:54 AM , Rating: 2
Since the price of the lowest quad-core Xeon or Core 2 will be just $266, a desktop mobo that can take two will be a big hit, especially if it can overclock some.

It's not important whether it's necessary, overkill, or useless, or whether you care or not. The important thing is that it will be available, and even relatively affordable, and that's a good thing. I'm sure people will manage to find some use for it :)


2 Chips, 4 Dies, 8 Cores?
By TheDrD on 4/17/2007 9:49:42 AM , Rating: 2
I want 1 Chip, 1 Die, 4 Cores

None of this 4 Die crap




RE: 2 Chips, 4 Dies, 8 Cores?
By KristopherKubicki (blog) on 4/17/2007 10:06:21 AM , Rating: 4
Why?

http://www.dailytech.com/article.aspx?newsid=6928

There's no performance difference, and multi-chip packages are cheaper to build.


RE: 2 Chips, 4 Dies, 8 Cores?
By Alexvrb on 4/17/2007 7:52:19 PM , Rating: 2
Although I would tend to agree with that sentiment in general, I did not see any conclusions about performance differences in that link. It said it was more efficient from a development standpoint, since you don't have to redesign the whole chip, just the die in question. This allows for cheaper, faster development of new chips, as some dies may be upgraded while others are left completely unchanged.

However, with regard to performance, that very much depends on the specific implementation of the multi-die chip (or multi-chip module). With multiple dies, your inter-die communication has to be well thought out. Some chips may do it just as well as a single-die chip, and there would be equal performance. However, if the two dies have to communicate over the FSB, that could be a performance disadvantage. In the future I'm pretty sure all MCM packages will take a better approach and there will not be a difference in performance. But that doesn't mean there is never a difference, because it really depends on how they do things.
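One way to see the inter-die cost described above is to bounce a shared value between two processes pinned to different cores: when the cores share a die and cache, the hand-off is cheap; when the cache line has to cross the bus between dies, every bounce pays the round trip. Here is a rough, Linux-only sketch; core numbering is machine-specific, so the pairs below are assumptions, and the timing is crude since it includes process startup.

import multiprocessing
import os
import time

PINGS = 50_000

def bouncer(core, flag, my_turn):
    os.sched_setaffinity(0, {core})       # pin this process to one core
    for _ in range(PINGS):
        while flag.value != my_turn:      # spin until the other side writes
            pass
        flag.value = 1 - my_turn          # hand the cache line back

if __name__ == "__main__":
    for a, b in [(0, 1), (0, 2)]:         # assumed same-die vs. cross-die pairs
        flag = multiprocessing.Value("i", 0)
        procs = [multiprocessing.Process(target=bouncer, args=(a, flag, 0)),
                 multiprocessing.Process(target=bouncer, args=(b, flag, 1))]
        start = time.perf_counter()
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        elapsed = time.perf_counter() - start
        print(f"cores {a}<->{b}: {elapsed / PINGS * 1e6:.2f} us per bounce")

A visibly higher per-bounce time on the second pair would be the cross-die round trip the parent comment is talking about.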


RE: 2 Chips, 4 Dies, 8 Cores?
By deeznuts on 4/17/2007 1:15:52 PM , Rating: 1
As stated, why? Because you're a sucker for marketing?


who cares if it's overkill...
By livelouddiefast on 4/17/2007 11:02:55 AM , Rating: 1
I'm not about to drop $5K on a new comp by any means, just because I'm confident with myself as a male and don't need ridiculous specs to prove it. However, if I had $5K+ to waste on a new machine, why not go for this?

Just think how much you could make your fellow nerds drool... "I have 8 CPUs, 4 GPUs, and while in Vista I can just barely manage a playable framerate in BF2142 at 1680x1050 with settings maxed..."




RE: who cares if it's overkill...
By Munkles on 4/18/2007 9:06:01 AM , Rating: 2
quote:
who cares if it's overkill...
By livelouddiefast on 4/17/07, Rating: 2

"I have 8 cpus, 4 gpus, and while in vista i can just barely manage a playable framerate in bf2142 in 1680x1050 with settings maxed..."


Spoken like a guy who either doesn't have a computer sufficient to run a Vista gaming rig, or someone who's never really used Vista.

As a guy in the IT line of work and a lifelong PC gamer, I can most assuredly tell you that a C2D and an 8800 GTX on Vista Ultimate are enough to run EVERY game maxed out, even on a 1600x1200 monitor.

I don't know why some people persist in trying to trash a very well-built OS that runs smoothly, is more secure than XP, and has a lot more features and eye candy to boot.

I just installed the DreamScene beta yesterday, and while running that on my 1600x1200 display, I had instances of C&C3, WoW, and Supreme Commander running all at the same time, without hiccups and at rock-smooth frame rates.

I'm not saying don't have an opinion, just that if you're going to make claims about a product being poor, at least make sure you're qualified to make that claim.


By TheMailMan78 on 4/18/2007 12:07:15 PM , Rating: 2
I'm not sure what you're doing, but I can run 2142 maxed out on my rig, and I don't have half of what you claim to need. But I can see where more cores and a better GPU might help with STALKER; after all, that game doesn't even move maxed out on my rig. Read my specs below.

OS: Vista Home Premium
CPU: AMD X2 4200+
RAM: 3GB
GPU: ATI X1900GT 256


May I just say...
By Trippytiger on 4/17/2007 7:26:53 PM , Rating: 2
Best. Codename. Ever.




RE: May I just say...
By superkdogg on 4/18/2007 11:51:41 AM , Rating: 2
I read the headline and expected this to be some sort of April Fools' joke (albeit a bit late).

Way to come out of your shell, Intel. More, please.

As to price/overkill, the fact is there are those out there who will buy it because it's the best. Ferrari and others have been capitalizing on this principle for decades. These systems will be overkill for most applications, similar to high-end sports cars. The "who cares if your car goes 200 MPH when the speed limit is 65?" argument is applicable. So is the "but it's nice to know I have the power to get to 65 in 5 seconds, and I look better going 65 than you" retort.

Honestly, there's no good reason to argue the point here. Argue it with your wallets: either buy it or don't.


I say...
By suryad on 4/17/2007 8:56:00 AM , Rating: 2
Did anyone read the article about the octa-core Mac Pro benchmarks? It shows that most of the gains are not that impressive, even in multi-core software, because the antiquated FSB is not enough to feed the cores! So before worrying about how this is overkill, let's talk about how there may be no performance increase even by jumping to 1333 MHz, since these new Penryn cores will be eating up data faster. Now, if the processors are based on the 1600 MHz bus, as the new Xeons are supposed to be, I could see that alleviating the problem somewhat... but only somewhat. Thoughts? Comments?
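The arithmetic behind that worry is simple: the front-side bus is 64 bits (8 bytes) wide, so its peak bandwidth is the transfer rate times 8 bytes, and every core behind the same bus splits that figure. A quick sketch using the standard quad-pumped FSB rates (the per-core split is the worst case, assuming all cores stream at once):

BUS_BYTES = 8  # 64-bit front-side bus

for mt_s in (1066, 1333, 1600):  # quad-pumped transfer rates in MT/s
    total_gb_s = mt_s * 1e6 * BUS_BYTES / 1e9
    for cores in (4, 8):
        print(f"FSB-{mt_s}: {total_gb_s:5.1f} GB/s total, "
              f"{total_gb_s / cores:.2f} GB/s per core with {cores} cores")

Four cores sharing a 1333 MT/s bus get roughly 2.7 GB/s each, well short of what Penryn-class cores can consume, which is why the 1600 bus helps, but only somewhat.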




V8?
By dice1111 on 4/17/2007 9:27:35 AM , Rating: 2
quote:
Intel has announced the V8 concept
I wonder if it will be liquid vegetable powered?
http://www.v8juice.com/
More and more things are these days...
http://www.dailytech.com/article.aspx?newsid=6635
Either that or the people at Intel are taking their health too seriously and it's crossing over into their work.




Bring it
By russki on 4/17/2007 12:17:23 PM , Rating: 2
I work with CAD software and use a dual-core machine with 4GB of RAM; some of the things I work on run like a slide show. I'd love to have a V8 at work...




Current Relations?
By Xietsu on 4/17/2007 12:31:27 PM , Rating: 2
With how much efficacy can current solutions churn out top-end, unjittered output when it comes to throwing tier upon tier of graphical and processing input at them? Gamers could use this if they intend to have the maximum number of model renders decked out in the highest tier of skinnings and affectual appendings available for visual quality within some of the latest MMORPGs (or, alternatively, MMOFPSes). Whether any tests have been conducted as to the threshold for reaching performance on contemporary and/or upcoming titles I don't know, but because of that, I believe this stands as an envisionable usage of such power.




Nothing New!
By ButterFlyEffect78 on 4/17/07, Rating: -1
RE: Nothing New!
By rippleyaliens on 4/17/2007 9:42:36 AM , Rating: 2
No such thing as overkill. When you need it and don't have it... you sing a different tune.

Nothing new, but instead they will use non-ECC memory.
And as for copying stuff... well, for all the AMD riders out there (riding on something): every CPU they make, they still pay a license fee to Intel. So who is copying whom?

They are in competition, numbnuts. The more extreme they get, the faster it is for us...


Nothing New!
By ButterFlyEffect78 on 4/17/07, Rating: -1
RE: Nothing New!
By lplatypus on 4/17/2007 6:50:39 AM , Rating: 3
quote:
...to only see it grow to 8x8 in short time
8 GPUs? Please, no! :-)


RE: Nothing New!
By FITCamaro on 4/17/2007 7:52:42 AM , Rating: 2
8 CPU cores, not GPUs. 8x8 would be two 8-core processors, totaling 16 cores.


Nothing New!
By ButterFlyEffect78 on 4/17/07, Rating: -1
More Glue-Blobs from InHell !
By cornfedone on 4/17/07, Rating: -1
RE: More Glue-Blobs from InHell !
By suryad on 4/17/2007 9:48:44 AM , Rating: 3
You sound more like a fanboy yourself... but giving the benefit of the doubt: if Intel is putting out processors with glue blobs and they keep beating AMD's, it sounds like they are using one heck of a glue blob!


By retrospooty on 4/17/2007 10:31:59 AM , Rating: 5
"but giving (cornfedone) the benefit of the doubt"

You must be relatively new here, that is far more then he deserves ;). Cornfedone is best described as an fanboytrollcreepchild. he pretty much is here to bash anything and everything Intel and MS do without much forethought or reason.


RE: More Glue-Blobs from InHell !
By JamRockaz on 4/17/2007 1:56:59 PM , Rating: 2
Aahahahaahah, nice one, that should shut them up


O god.
By Metroid on 4/17/07, Rating: -1
RE: O god.
By JamRockaz on 4/17/2007 2:18:11 PM , Rating: 2
Oh crap!!! They're still coming!!
Quick, someone bar the doors, shoot them up with comments filled with facts. We take no prisoners...


"We basically took a look at this situation and said, this is bullshit." -- Newegg Chief Legal Officer Lee Cheng's take on patent troll Soverain

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki