


Say hello to the new Quadro FX 4600 and Quadro FX 5600

NVIDIA today released three new Quadro products – the Quadro FX 4600, Quadro FX 5600 and Quadro Plex VCS Model IV. The new Quadro FX 4600 and Quadro FX 5600 feature NVIDIA G80-derived graphics processors tweaked for CAD/CAM and visualization applications.

With the G80-derived graphics processor, the new Quadro FX 4600 and Quadro FX 5600 have 128 unified shader units. The new Quadros are also compatible with CUDA, NVIDIA’s answer to AMD’s Stream Computing technology. DirectX 10 compliance and support for Shader Model 4.0 round out the new Quadros' feature set.

Differentiating the Quadro FX 4600 and Quadro FX 5600 is the amount of memory. The lower-end Quadro FX 4600 features 768MB of video memory – the same as NVIDIA’s GeForce 8800GTX. A whopping 1.5GB of video memory is available on the Quadro FX 5600, besting the 1GB of graphics memory on ATI’s FireGL V7350. Both Quadros have 384-bit memory interfaces.

Although NVIDIA announced the Quadro Plex VCS Model IV at the same time as the Quadro FX 4600 and Quadro FX 5600, there are no details on the Quadro Plex VCS Model IV in the press release or on the Quadro Plex VCS product pages. However, expect the Quadro Plex VCS Model IV to feature the new Quadro FX 4600 or Quadro FX 5600 graphics processors.

NVIDIA prices the new Quadro FX 4600 at $1995 and the Quadro FX 5600 at $2999.


Comments



Joke Question...
By Nightmare225 on 3/5/2007 8:07:43 PM , Rating: 1
How's gaming with these? :P




RE: Joke Question...
By livinloud on 3/5/07, Rating: -1
RE: Joke Question...
By Nightmare225 on 3/5/07, Rating: -1
RE: Joke Question...
By lplatypus on 3/5/2007 9:19:48 PM , Rating: 2
Jokes aside, can someone explain the differences which make these cards bad for gaming? I know they contain extra features which aren't useful to gamers, but why would they be worse than a GeForce based on the same core?

Obviously a $500 low-end workstation graphics card would perform much worse in games than a $500 high-end gaming graphics card due to the disparate pricing structure. I'm suspicious that this has led to the rumour that all workstation cards are bad for games.


RE: Joke Question...
By Trippytiger on 3/5/2007 9:41:11 PM , Rating: 2
Very often the only difference between a regular gaming card and a workstation card is the firmware, which allows them to work with different OpenGL-optimized drivers. Well, that and the profit margin.

At least, that's how it is with my 9800 Pro.


RE: Joke Question...
By keitaro on 3/5/2007 9:44:09 PM , Rating: 2
Clock speed differences are one thing I can think of... it's likely that the cards won't be as fast as their speedier 8800 counterparts. Also, I seem to remember that these cards can do anti-aliased lines accelerated by the chip, which would help in CAD design as well as in 3D modeling when viewing wireframes.

I forgot what other differences there are to it... I haven't followed the workstation-class graphics cards in ages so someone will have to fill in the blanks for me.


RE: Joke Question...
By smitty3268 on 3/5/2007 10:26:54 PM , Rating: 3
The firmware and drivers are optimized for different uses, which means they end up a bit slower in games for a lot more money. Technically, they could be made just as fast though.


RE: Joke Question...
By theapparition on 3/5/2007 11:37:49 PM , Rating: 5
The Quadros are not clocked slower than the GeForce line, as someone suggested (reference designs). The workstation cards from Nvidia differ in only three areas. First, they support hardware-based anti-aliased lines, which smooth out the "jaggies". This has almost no benefit for games, since most objects are texture-wrapped. Second, the Quadros support hardware overlay clipping – for example, when multiple windows overlap. Once again, something with almost no value for gaming. The third feature is driver support. It takes a lot of testing to certify the drivers to work with a specific workstation application, and this effort is passed on to businesses through higher prices. If Oblivion is locking up at a specific scene, you can complain all you want about your GeForce across the internet and Nvidia will fix it "when they get to it". If a CAD model is crashing because of a hardware problem with your Quadro, Nvidia has a team of engineers working on a fix for the next day. You pay for that service.

The short version is that there is not much difference between the consumer and workstation cards. Games will run a "tiny" bit faster on a GeForce than on its workstation brethren from a similar series of card, and it costs a lot less. However, for large CAD assemblies, visualization, or rendering, the extra price of the Quadros can be justified, and performance can be significantly improved over the GeForce line.

ATI has basically the same model for their Radeon/FireGL series.
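
For illustration, here is a minimal OpenGL 1.x sketch of the anti-aliased line path described above; the calls are standard OpenGL, but whether GL_LINE_SMOOTH runs in dedicated hardware or falls back to a much slower software path depends on the card and driver, which is exactly the Quadro/GeForce split being discussed.

/* Minimal OpenGL 1.x sketch: requesting anti-aliased ("smoothed")
 * lines for a wireframe view. On workstation cards the driver
 * typically accelerates this in hardware; on consumer cards it can
 * drop to a much slower path. */
#include <GL/gl.h>

void draw_wireframe_edge(float x0, float y0, float x1, float y1)
{
    /* Smoothed lines need blending to composite coverage values. */
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    /* Request anti-aliased line rasterization, highest quality. */
    glEnable(GL_LINE_SMOOTH);
    glHint(GL_LINE_SMOOTH_HINT, GL_NICEST);

    glBegin(GL_LINES);
    glVertex2f(x0, y0);
    glVertex2f(x1, y1);
    glEnd();
}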


RE: Joke Question...
By Ecmaster76 on 3/6/2007 12:17:17 AM , Rating: 2
That, and the precision of all operations is higher. "Close enough" only works in games, not for pro stuff.


RE: Joke Question...
By Alpha4 on 3/6/2007 12:42:35 AM , Rating: 2
Thanks a million for the clarification. The explanation makes a lot of sense. In the case of GeForce cards being used for CAD modeling, though, do the wireframes remain aliased, or is anti-aliasing forced at a software level? Or does it depend on the app?


RE: Joke Question...
By theapparition on 3/6/2007 9:46:22 AM , Rating: 2
It depends on the app, but if the app implements AA lines itself, it would be software-based, hence the potentially HUGE reduction in performance. Some apps will be drawn (<-- pun intended) to their knees with a Radeon or GeForce. Others only take a small performance hit. You'd have to look at your application to see if there's a favorable cost/benefit ratio. While the FX 5600 may get the majority of the "press", at $2999 it's hard to justify for anything but the highest-performance applications. Personally, I'm going with the FX 4600 line. The extra $1000 for 768MB more memory gets me almost no performance increase for my applications.

Just an aside: back in the GeForce2/Quadro2 days, all you had to do was swap resistors on the board to turn a GeForce into a Quadro. By the GeForce4/Quadro4 days, the register was set in the chip package; however, a hacked driver enabled all the Quadro features. I had a few boards from PNY and compared two. One was a GeForce 4600, the other a Quadro 950. The part numbers on the PCBs were the same, and they had the same components on them, which shows you how related they were. I don't know of anything to turn a 5/6/7/8-series into a Quadro.

Most of my workstation experience has been with Nvidia, because they simply had much better OpenGL support than ATI for my applications. But you have to look at what you use to make the right determination. It's hard to go wrong with either these days! Long live competition.


RE: Joke Question...
By leidegre on 3/6/2007 3:21:30 AM , Rating: 2
Here's an article at Wikipedia about the ATI FireGL cards, http://en.wikipedia.org/wiki/FireGL

I would guess that the differences between the Quadro/GeForce cards are roughly the same.

As for the cost, I think someone made a point about customer care, which would mean that these cards come with some serious customer support for the workstation applications; hence the asking price, as well as the obviously smaller demand.


games be runnin fine
By wetwareinterface on 3/6/2007 12:19:44 AM , Rating: 2
actually to clear up a few misconceptions...

the only differences are clock speed, extra ram, firmware i.d., and drivers.

the clock speeds are adjusted up or down from the gaming version as needed, depending on the model: down for workstation parts not aimed at 3d visualization, typically up for all others.

the only difference in the firmware is the amount of ram listed and reported, and the id being different from the gaming version. the firmware is not really any different from the gaming version other than the id, which the drivers won't work with unless it's in a certain range. which brings us to the next part: the drivers...

the drivers for the workstation cards are the same as the ones for the gaming cards, only with the addition of the cuda features and the 3dstudio max/autocad certified drivers. these additional drivers have the anti-aliased line code and are designed to accelerate the respective apps in what they each need. the cards themselves don't suffer at all from these additional drivers, and in fact some workstation-level cards in the past have benefited in game performance because the extra features in the cards handled tasks the cpu normally does.

the workstation drivers can be "unlocked/hacked" to work on non-workstation gaming cards by faking the firmware recognition. hence the gaming drivers are there with the workstation cards, as well as identical hardware, with the exception of a few extra resistors for the firmware to check against inside the gpu itself.

so in summary, these would play games fine, especially that 1.5 GB model; however, for the money you could buy a quad-cpu and quad-gpu setup instead and get better performance...
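
To make that device-ID gating concrete, here is a rough C sketch (an illustration under stated assumptions, Linux sysfs only) of the kind of check being described: read the card's PCI device ID and branch on it. The bus address below is an example, and 0x019E is the Quadro FX 4600 device ID from the INF listing quoted in a later comment.

/* Rough sketch of a driver-style ID check. The sysfs path is an
 * example; the PCI bus address differs per machine. */
#include <stdio.h>

int main(void)
{
    const char *path = "/sys/bus/pci/devices/0000:01:00.0/device";
    unsigned int device_id = 0;

    FILE *f = fopen(path, "r");
    if (!f) {
        perror("fopen");
        return 1;
    }
    fscanf(f, "0x%x", &device_id);
    fclose(f);

    /* 0x019E is the Quadro FX 4600 device ID (per the INF listing
     * in a later comment); anything else is treated as consumer. */
    if (device_id == 0x019E)
        printf("Quadro FX 4600: enabling workstation features\n");
    else
        printf("Device 0x%04X: consumer feature set\n", device_id);
    return 0;
}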




RE: games be runnin fine
By theapparition on 3/6/2007 10:14:47 AM , Rating: 2
quote:
the workstation drivers are able to be "unlocked/hacked" to work on non-workstation gaming based cards due to the faking of the firmware recognition.

This worked up to the GeForce4 line with NVStrap and SoftQuadro4. Past that, the chips themselves are slightly different. The hardware AA lines and clip overlay hardware are not even present in the GeForce chips (according to Nvidia), so it is impossible to turn them on using the firmware hack. I've seen a few hacks that allow the Quadro drivers to install, but performance does not equal that of a real Quadro, and they are buggy and often crash.


RE: games be runnin fine
By emboss on 3/7/2007 1:59:14 AM , Rating: 3
Close... but it's the NV40 (6-series GeForce cards) that was the last that could be SoftQuadro'd. So the best SoftQuadro card you can get is a 6800 Ultra turned into a Quadro FX 4000. Performance of the SoftQuadro'd card is actually slightly higher than the real Quadro's (in relevant apps like 3DSMax) due to slightly higher memory and core speeds. The only thing you lose is the dual-link DVI ports.

The 7- and 8-series GeForce cards (G70 and G80 respectively) can't at this point be SoftQuadro'd or hardmodded to enable the additional features. NVidia has essentially locked the chips in the same way AMD/Intel do. There is no indication that the functionality has been physically removed from the die.

And indeed, it would be silly for NVidia to do so. The cost of having another die mask for Quadros would almost certainly be more than the cost of a few extra transistors on all the consumer chips.


hot!
By lplatypus on 3/5/07, Rating: 0
RE: hot!
By caater on 3/6/2007 3:24:51 AM , Rating: 2
fyi: ati's new offerings will go well over 200w.

what really interests me is the performance.
as we know, workstation stuff mostly likes vertex shaders. in earlier generations, counts like 4, 6 or 8 were standard... now, suddenly, there are 128 of them.
i wanna see the benchmarks :)


RE: hot!
By defter on 3/6/2007 5:18:28 AM , Rating: 2
Why compare high-end video cards to "most CPUs"? High-end CPUs from AMD and Intel have TDPs of 120W (Kentsfield, FX-70), thus the GPUs in the new Quadros are actually consuming slightly less power...


RE: hot!
By lplatypus on 3/7/2007 5:37:51 PM , Rating: 2
How did you conclude that 120W TDP CPUs use slightly more power than 134W and 171W TDP GPUs?

The comparison with CPUs is just a reference point that emphasises how high the power usage of these Quadros is. The FX-70s that you mention are gaming CPUs... AMD's workstation CPUs are available at 68W, 95W or 103W. There were the 125W 1220SE/2220SE models, but now you can also get 95W 1220/2220 models at the same speed.


Dispelling the false association
By scrapsma54 on 3/5/2007 9:50:47 PM , Rating: 1
These are not for gaming; they will barely even play your games. All these do is offload work from the CPU and improve 3D development. These cards are meant to parallel their consumer counterparts, and also to develop the games that will run on those non-workstation counterparts. Enough with the joke threads; you guys are misinformed. The reason they are so expensive is that it costs so much to develop CUDA products, and the rendering has to be done on heavy hardware.




RE: Dispelling the false association
By jabber on 3/6/2007 6:09:31 AM , Rating: 2
I have used both FireGLs and Quadros for gaming over the years. I've had several of the lower-spec versions and they have played Doom 3 etc. Nothing too special though.

The main reason for buying them was that I've found, at times, slightly sharper image quality from them. My girlfriend's PC has a bottom-spec PCI-E Quadro in it and it looks great with Photoshop etc. Worth checking out the passive bottom-of-the-range ones on eBay. Quality kit.


By scrapsma54 on 3/6/2007 4:41:04 PM , Rating: 2
Lol, I love how I get flagged for criticizing people who don't like facts. Really, you don't deserve to be posting comments if you can't be bothered to be informed, or if you flag people down either way.

anyway...

Quadros do deliver quality benefits, and that's why they are so expensive; however, they will not provide a benefit in games.


Don't wanna be an "old school arse" ...
By DeepBlue1975 on 3/6/2007 8:07:52 AM , Rating: 2
But I really liked it more when consumer graphics chips and pro chips were designed independently and tended to be optimized, right from the core, for the task they were aimed at.
For example, these Quadros include a lot of "gaming friendly" stuff in their cores, stuff that's not needed for pro applications at all and could be gotten rid of.
Well, yes, economically my argument just plain sucks, because I'm negating economies of scale, but I'm a high-tech diehard and not an economist :D
Forgive my "there was a time when..." kind of post, but this era doesn't look as good to me as when there were a lot of players around competing, making lots of very specific and optimized stuff... Now we have just two big boys in every IT area, and competition looks ever more like political negotiation and marketing strategy than technological diversity and innovation.

Just a nostalgic post, don't take it too seriously, as I might switch my mind off this kind of thinking in something like 5 minutes :)




By theapparition on 3/6/2007 10:03:54 AM , Rating: 2
Interesting way to look at that. I remember the old days too. Workstation cards were specialized beasts that cost several thousand dollars, like 3Dlabs' Oxygen series: they had specialized separate chips for geometry and lighting, and separate memory. Along came the GeForce line, which integrated geometry and lighting (T&L in DX parlance) with unified memory, and suddenly those cards were more powerful than anything out there. And because of scale, they were cheap to boot. The FireGL series was not based on game chips initially, but eventually followed Nvidia's model. Because of the cheap Quadros, with better performance, 3Dlabs eventually collapsed, with 3dfx not too far behind. Now prices are climbing right back up to where they were, and higher. I'd like to see a little more competition myself, and see someone come out with specialized hardware, lightning quick for workstations, yet at a "bargain" price of $1,000. $2,999 is getting ridiculous.


wonder why
By fliguy84 on 3/5/2007 11:32:14 PM , Rating: 2
Wonder why Nvidia doesn't drop the 'FX' brand from their Quadro series




1.5 GB of video memory
By knowom on 3/6/2007 4:00:36 AM , Rating: 2
A whopping 1.5 GB of video memory is available on the Quadro FX 5600, on a 384-bit memory interface. So is it pretty safe to assume NVIDIA could release a 1.5 GB 8900GTX or something once AMD/ATI releases their new video card? Because that'd rock. It'd be really nice if MMORPGs could actually precache most of their textures on the video card, since hard drive lag is awful.
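
As a sketch of that precaching idea, assuming a hypothetical load_tga() helper (not a real API): upload every texture with standard OpenGL calls at load time, so the driver can keep them resident in the card's video memory and gameplay never stalls on the hard drive.

/* Sketch of up-front texture caching. load_tga() is a hypothetical
 * loader that fills w/h and returns malloc'd RGBA8 pixels. */
#include <GL/gl.h>
#include <stdlib.h>

unsigned char *load_tga(const char *path, int *w, int *h);

GLuint precache_texture(const char *path)
{
    int w, h;
    unsigned char *pixels = load_tga(path, &w, &h);
    GLuint tex;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* After this upload the driver can keep the texture resident in
     * video memory (up to 1.5 GB on the FX 5600). */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    free(pixels);  /* the system-RAM copy is no longer needed */
    return tex;
}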




By crystal clear on 3/6/2007 6:51:03 AM , Rating: 1
Latest Forceware drivers for Vista :

NVIDIA_G80.DEV_0191.1 = "NVIDIA GeForce 8800 GTX"
NVIDIA_G80.DEV_0193.1 = "NVIDIA GeForce 8800 GTS"
NVIDIA_G80.DEV_0194.1 = "NVIDIA GeForce 8800 Ultra" ??????
NVIDIA_G80.DEV_019E.1 = "NVIDIA Quadro FX 4600"
NVIDIA_G84.DEV_0400.1 = "NVIDIA GeForce 8600 GTS"
NVIDIA_G84.DEV_0402.1 = "NVIDIA GeForce 8600 GT"
NVIDIA_G86.DEV_0421.1 = "NVIDIA GeForce 8500 GT"
NVIDIA_G86.DEV_0422.1 = "NVIDIA GeForce 8400 GS" ??????
NVIDIA_G86.DEV_0423.1 = "NVIDIA GeForce 8300 GS"

http://www.vr-zone.com/

Note this-

NVIDIA_G80.DEV_0194.1 = "NVIDIA GeForce 8800 Ultra"

NVIDIA_G86.DEV_0422.1 = "NVIDIA GeForce 8400 GS"

THIS IS NEWS!

Other minor stuff-

NVIDIA is set to launch the mainstream 8600GTS (G84-400) and 8600GT (G84-300), as well as the 8500GT (G86-300), on the 17th of April. The GeForce 8600GTS and 8600GT will have 256MB of GDDR3 memory onboard and sport a 128-bit memory interface, but no HDMI yet. The GeForce 8600GTS is meant to replace the 7950GT and 7900GS, while the 8600GT replaces the 7600GT, and the 8500GT the 7600GS.




*Joke 2*
By vze4z7nx on 3/5/07, Rating: -1
Joke Question...
By Nightmare225 on 3/5/07, Rating: -1
"The whole principle [of censorship] is wrong. It's like demanding that grown men live on skim milk because the baby can't have steak." -- Robert Heinlein
