
Say hello to the new Quadro FX 4600 and Quadro FX 5600

NVIDIA today released three new Quadro products – the Quadro FX 4600, Quadro FX 5600 and Quadro Plex VCS Model IV. The new Quadro FX 4600 and Quadro FX 5600 feature NVIDIA G80-derived graphics processors tweaked for CAD/CAM and visualization applications.

With their G80-derived graphics processor, the new Quadro FX 4600 and Quadro FX 5600 have 128 unified shader units. The new Quadros are also compatible with CUDA, NVIDIA’s answer to AMD’s Stream Computing technology. DirectX 10 compliance and Shader Model 4.0 support round out the new Quadros' feature set.

Differentiating the Quadro FX 4600 and Quadro FX 5600 is the amount of memory. The lower-end Quadro FX 4600 features 768MB of video memory – the same amount as NVIDIA’s GeForce 8800GTX. A whopping 1.5GB of video memory is available on the Quadro FX 5600, besting ATI’s FireGL V7350 and its 1GB of graphics memory. Both Quadros have 384-bit memory interfaces, though.
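To put the 384-bit figure in perspective, peak memory bandwidth follows directly from bus width and memory clock. The sketch below assumes an effective 1.8 GHz GDDR3 clock (the GeForce 8800 GTX reference figure) purely for illustration; the article does not state the new Quadros' actual memory clocks.

```python
# Peak theoretical memory bandwidth = (bus width in bytes) x (effective clock).
# Assumption: 1.8 GHz effective GDDR3 clock, as on the GeForce 8800 GTX;
# the Quadro FX 4600/5600 clocks are not given in the article.

BUS_WIDTH_BITS = 384
EFFECTIVE_CLOCK_HZ = 1.8e9  # assumed, not from the article

bytes_per_transfer = BUS_WIDTH_BITS // 8             # 48 bytes per clock
bandwidth_gb_s = bytes_per_transfer * EFFECTIVE_CLOCK_HZ / 1e9

print(f"{bandwidth_gb_s:.1f} GB/s")  # 86.4 GB/s at the assumed clock
```

At the assumed clock this works out to the 8800 GTX's well-known 86.4 GB/s; the workstation parts on the same 384-bit bus would scale with whatever clocks NVIDIA actually ships.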

Although NVIDIA announced the Quadro Plex VCS Model IV at the same time as the Quadro FX 4600 and Quadro FX 5600, there are no details of the Quadro Plex VCS Model IV in the press release or Quadro Plex VCS product pages. However, expect the Quadro Plex VCS Model IV to feature the new Quadro FX 4600 or Quadro FX 5600 graphics processors.

NVIDIA prices the new Quadro FX 4600 at $1995 and the Quadro FX 5600 at $2999.


Joke Question...
By Nightmare225 on 3/5/2007 8:07:43 PM , Rating: 1
How's gaming with these? :P

RE: Joke Question...
By lplatypus on 3/5/2007 9:19:48 PM , Rating: 2
Jokes aside, can someone explain the differences which make these cards bad for gaming? I know they contain extra features which aren't useful to gamers, but why would they be worse than a GeForce based on the same core?

Obviously a $500 low-end workstation graphics card would perform much worse in games than a $500 high-end gaming graphics card due to the disparate pricing structure. I'm suspicious that this has led to the rumour that all workstation cards are bad for games.

RE: Joke Question...
By Trippytiger on 3/5/2007 9:41:11 PM , Rating: 2
Very often the only difference between a regular gaming card and a workstation card is the firmware, which allows them to work with different OpenGL-optimized drivers. Well, that and the profit margin.

At least, that's how it is with my 9800 Pro.

RE: Joke Question...
By keitaro on 3/5/2007 9:44:09 PM , Rating: 2
Clock speed differences are one thing I can think of... it's likely that these cards won't be as fast as their speedier 8800 counterparts. Also, I seem to remember that these cards can do anti-aliased lines accelerated by the chip, which would help in CAD design as well as in 3D modeling when viewing wireframes.

I forgot what other differences there are to it... I haven't followed the workstation-class graphics cards in ages so someone will have to fill in the blanks for me.

RE: Joke Question...
By smitty3268 on 3/5/2007 10:26:54 PM , Rating: 3
The firmware and drivers are optimized for different uses, which means they end up a bit slower in games for a lot more money. Technically, they could be made just as fast though.

RE: Joke Question...
By theapparition on 3/5/2007 11:37:49 PM , Rating: 5
The Quadros are not clocked slower than the GeForce lines, as someone suggested (reference designs). The workstation cards from NVIDIA differ in only three areas.

First, they support hardware-based anti-aliased lines, which smooth the "jaggies". This has almost no benefit for games, since most objects are texture-wrapped. Second, the Quadros support hardware overlay clipping – for example, multiple windows overlapping. Once again, something that has almost no value for gaming.

The third feature is driver support. It takes a lot of testing to certify the drivers to work with a specific workstation application, and this effort is passed on to businesses through higher prices. If Oblivion is locking up at a specific scene, you can complain all you want about your GeForce across the internet, and NVIDIA will fix it "when they get to it". If a CAD model is crashing because of a hardware problem with your Quadro, NVIDIA has a team of engineers working on a fix for the next day. You pay for that service.

The short version is that there is not much difference between the consumer and workstation cards. For a similar series of card, games will run a "tiny" bit faster on a GeForce than on its workstation brethren, and the GeForce costs a lot less. However, for large CAD assemblies, visualization, or rendering, the extra price of the Quadros can be justified, and performance can be significantly improved over the GeForce line.

ATI has basically the same model for their Radeon/FireGL series.

RE: Joke Question...
By Ecmaster76 on 3/6/2007 12:17:17 AM , Rating: 2
Also, the precision of all operations is higher. "Close enough" only works in games, not for pro stuff.

RE: Joke Question...
By Alpha4 on 3/6/2007 12:42:35 AM , Rating: 2
Thanks a million for the clarification. The explanation makes a lot of sense. In the case of GeForce cards being used for CAD modeling, though, do the wireframes remain aliased, or is anti-aliasing forced at the software level? Or does it depend on the app?

RE: Joke Question...
By theapparition on 3/6/2007 9:46:22 AM , Rating: 2
It depends on the app, but if the app implements AA lines, it would be software-based, hence the potentially HUGE reduction in performance. Some apps will be drawn (<-- pun intended) to their knees with a Radeon or GeForce. Others take only a small performance hit. You'd have to look at your application to see if there is a favorable cost/benefit ratio. While the FX 5600 may get the majority of "press", at $2999 it's hard to justify for anything but the highest-performance applications. Personally, I'm going with the FX 4600 line. The extra $1000 for 768MB more memory gets me almost no performance increase for my applications.

Just an aside: back in the GeForce2/Quadro2 days, all you had to do was swap resistors on the board to turn a GeForce into a Quadro. In the GeForce4/Quadro4 days, the register was set in the chip package; however, a hacked driver enabled all the Quadro features. I had a few boards from PNY and compared two: one was a GeForce 4600, the other a Quadro 950. The part number on the PCBs was the same, and they had the same components on them, which shows you how related they were. I don't know of anything to turn a 5/6/7/8-series card into a Quadro.

Most of my experience for workstations has been with NVIDIA because they simply had much better OpenGL support than ATI for my applications. But you have to look at what you use to make the right determination. It's hard to go wrong with either these days! Long live competition.
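For readers wondering what "software AA lines" look like in practice, here is a minimal coverage-based sketch in the spirit of Xiaolin Wu's classic algorithm: each column along the line lights two vertically adjacent pixels with intensities proportional to how much of the line passes through each. This is purely illustrative, not any vendor's implementation, and it handles only shallow, left-to-right lines for brevity.

```python
import math

def draw_aa_line(x0, y0, x1, y1):
    """Coverage-based anti-aliased line, illustrative only.

    Assumes integer x0 < x1 and |slope| <= 1.
    Returns a dict mapping (x, y) pixel coordinates to intensity in [0, 1].
    """
    gradient = (y1 - y0) / (x1 - x0)
    pixels = {}
    y = float(y0)
    for x in range(x0, x1 + 1):
        base = math.floor(y)
        frac = y - base
        # Split unit intensity between the two pixels the line crosses.
        pixels[(x, base)] = 1.0 - frac
        pixels[(x, base + 1)] = frac
        y += gradient
    return pixels

# A shallow line from (0,0) to (4,2): where y falls between rows,
# intensity is shared, which is what hides the "jaggies".
print(draw_aa_line(0, 0, 4, 2)[(1, 0)])  # 0.5
```

Doing this per-pixel on the CPU (or in a driver fallback path) is exactly the kind of work that hardware AA-line support on the Quadros avoids.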

RE: Joke Question...
By leidegre on 3/6/2007 3:21:30 AM , Rating: 2
Here's an article at Wikipedia about the ATI FireGL cards.

I would guess that the differences between the Quadro/GeForce cards are roughly the same.

As for the cost, I think someone made a point about customer care, which would mean that these cards come with some serious customer support for the workstation applications – hence the asking price, as well as the obviously smaller demand.
