
[Image: The GF100 die]

[Image: GF100 SM unit]

[Image caption: Will hopefully give ATI some healthy competition]

NVIDIA has been under a lot of competitive pressure over the last year, and especially during the last four months, as its primary competitor ATI has launched six desktop DirectX 11 GPUs and a complete Mobility Radeon lineup for notebooks. Over two million DX11 GPUs have been sold so far, all of them from ATI. NVIDIA had a fairly large presence at the 2010 International Consumer Electronics Show in Las Vegas, and also held a series of briefings for select members of the press the week after CES. The embargo lifts today, and some of the details that were discussed can now be revealed.

There are three things that most enthusiasts have expected from NVIDIA's next-generation gaming GPU based on the Fermi architecture. There was no doubt that it would sport a large die, run hot, and be expensive. The big question was whether it would be more powerful than anything the graphics division of AMD could muster. NVIDIA has been promising everybody since its GPU Technology Conference in October that it would "blow ATI away", but we've been waiting on the hardware while ATI's GPUs dominated the holiday shopping season.

NVIDIA is saying that GF100 chips are in production, but we don't have details on yields or how many wafers are being run at the Taiwan Semiconductor Manufacturing Company (TSMC). GF100 chips are being produced on the 40nm process, which ATI has had a hard time with since it first began the transition in March of last year. The GF100 has over 3 billion transistors, roughly 40% more than AMD's Cypress GPU, which is itself fairly large at 334mm^2. Initial reports are that the GF100 will exceed 500mm^2, which means that many dies will carry defects that keep them from running at full capability. We can probably expect those defective chips to be used in cut-down GF100 variants.
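For a rough sense of why die size matters so much, here is a back-of-the-envelope sketch using the standard dies-per-wafer approximation and a simple Poisson defect model. The GF100 area and the defect density below are assumptions for illustration; neither NVIDIA nor TSMC has published such figures.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross die candidates per wafer (standard edge-loss approximation)."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Fraction of dies with zero random defects under a Poisson model."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

D0 = 0.4  # assumed defects per cm^2 -- illustrative, not a TSMC figure
for name, area in (("Cypress, 334 mm^2", 334), ("GF100, ~530 mm^2 (assumed)", 530)):
    gross = dies_per_wafer(area)
    good = gross * poisson_yield(area, D0)
    print(f"{name}: {gross} candidates per wafer, ~{good:.0f} fully working")
```

Under those assumptions the larger die is penalized twice: fewer candidates fit on each wafer, and a smaller fraction of them escape defects, which is exactly why salvaged, cut-down parts become important.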

The first boards will be launched at the end of February, with first availability in March. However, volumes will be a problem, and we have been hearing concerns from some board partners that there won't be enough chips to meet demand until April, at best. The initial flagship card will launch with 512 stream processors (which NVIDIA calls CUDA cores), 48 ROPs, and a 384-bit GDDR5 memory bus.
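For context, the bus width translates to peak memory bandwidth as bus width times effective data rate. A quick sketch; the GF100 data rate is an assumption, since NVIDIA has not announced final memory clocks (the Radeon HD 5870 figure is its shipping spec).

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus bits x transfers per second) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

print(peak_bandwidth_gbs(384, 4.0))  # GF100, assumed ~4.0 Gbps GDDR5 -> 192 GB/s
print(peak_bandwidth_gbs(256, 4.8))  # Radeon HD 5870, 4.8 Gbps GDDR5 -> 153.6 GB/s
```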

There are sixteen Streaming Multiprocessors (SMs), each consisting of 32 CUDA cores. Each SM also has 64KB of on-chip memory, configurable as a 16KB/48KB split between L1 cache and shared memory, four texture units, and a PolyMorph Engine. The PolyMorph Engine handles geometry on the GPU and is responsible for Vertex Fetch, Tessellation, Viewport Transform, Attribute Setup, and Stream Output.
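Summing those per-SM resources across the chip reproduces the headline figures; a small bookkeeping sketch (the counts come from this article, and this is arithmetic, not a hardware model).

```python
from dataclasses import dataclass

@dataclass
class StreamingMultiprocessor:
    cuda_cores: int = 32
    texture_units: int = 4
    l1_shared_kb: int = 64   # configurable as a 16/48 KB split between L1 and shared memory
    polymorph_engines: int = 1

SM_COUNT = 16
sm = StreamingMultiprocessor()
print("CUDA cores:       ", SM_COUNT * sm.cuda_cores)         # 512
print("Texture units:    ", SM_COUNT * sm.texture_units)      # 64
print("PolyMorph engines:", SM_COUNT * sm.polymorph_engines)  # 16
```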

One of the biggest features of DX11 is hardware tessellation, and NVIDIA is looking to beat ATI at its own game. Tessellation is one of the few DX11 features that is plainly visible on screen while gaming, and it can have a very large visual impact.

NVIDIA has been talking about 3D gaming for a couple of years, even as display makers have been slow to build the necessary hardware. The company was showing off its "3D Vision Surround" concept at CES as a marked improvement over ATI's Eyefinity multi-display technology. However, while ATI's Radeon HD 5000 series cards all support three monitors on a single card, NVIDIA's version can drive only two displays per card; more graphics cards must be added in order to support three or more monitors.

There is one question that no one seems to be asking NVIDIA: Where are the next-generation mainstream DirectX 11 graphics cards? Over 90% of graphics cards sold are priced at less than $200, and NVIDIA will have to come up with something soon if it wants to stop losing mainstream market share.



Comments



Fermi is...
By Amiga500 on 1/18/2010 10:42:53 AM , Rating: 4
Too big,
Too slow,
Too hot,
Two months+ away.

Nvidia are screwed this round. They better be developing a strong counter to Northern Islands or they are looking at a proper capitulation in the discrete graphics market.




RE: Fermi is...
By shin0bi272 on 1/18/10, Rating: -1
RE: Fermi is...
By redbone75 on 1/18/2010 11:18:52 AM , Rating: 5
By the time Nvidia has any serious volume out with Fermi it will be close to refresh time for AMD. Even if it proves to be the faster overall card, I'm thinking AMD will just play the price war game with them to retain/increase market share. Nvidia are painting themselves into a corner here.


RE: Fermi is...
By jurassic512 on 1/23/2010 4:09:33 PM , Rating: 2
By the time Nvidia has any serious volume out with Fermi it will be close to refresh time for AMD.

Except when looking at AMD/ATi's 2010 roadmap for discrete graphics... I don't see ANY for ALL of 2010. Same for any new CPU architectures... aside from stitching 2 quads together to get Bulldozer (a la Pentium D ;)), AMD has nothing new to offer. Why else are they releasing the same Phenom II chips with clock bumps (again, a la Intel's Pentium 4 (EE))? Phenom II 975 coming out at 3.6GHz (YAY!), but at 140W TDP/ACP (BOOOOOO!!!). AMD is desperate. The $200 i5 750 demolishes everything AMD has for desktop... and at a much lower clockspeed. Tom's has a nice clock-for-clock review. Not like you couldn't tell from any of the Phenom II 965 vs Q9550-and-up chip reviews MONTHS ago. With the time ATi had with the 40nm 4000 series chips, you'd think they'd have done more than just refresh the 4000 series and add DX11 for the 5000 series.

When you're as big as nVIDIA, you don't need to lower prices. Oh, and in case you forgot, it's the cards UNDER $200 that make both companies their money, so yea, if Fermi is better, nVIDIA will need to think hard about what they will charge, especially coming to the table this late in the DX11 game. Another bit of news... remember no AA for ATi in Batman? I just read yesterday they have ONE guy working on that... and it's still not done. I think ATi needs to get a lil closer to the developers and work together. Of course the ATi fanboys will say nVIDIA just pays the devs to add TWIMTBP to their games, which is 100% false. Look at motherboards and Zalman CPU coolers for example. Both take forever, if ever, to drop in price. Zalmans never drop (the 9700NT is still $70-80), but ppl still bought them saying they were the best, when cheaper coolers performed better (my $40 CDN OCZ Vendetta 2, for example).

But that's the disease of the fanboy/uneducated.


RE: Fermi is...
By jurassic512 on 1/23/2010 4:12:06 PM , Rating: 2
wow, i jumped all over the place with that post. had a lot of stuff to get out, so forgive me... yea, like that will happen. can't wait to see what kinda replies i'll get...


RE: Fermi is...
By AstroGuardian on 1/25/2010 8:59:18 AM , Rating: 2
Wohohooo... i guess no replies for you my friend. But you too sounded like a fanboy. Still, I generally share your opinion.


RE: Fermi is...
By clovell on 1/18/2010 11:35:04 AM , Rating: 3
While the OP wasn't exactly comprehensive, it was moreso than yours. From a business standpoint, those are all critical factors. Whether they affect your personal purchasing decision is an entirely different issue.

Die size will affect yields and availability, which, as AMD has ironically demonstrated, will kill revenues even with a far better performing product. Power consumption and heat could also limit how many mainstream users choose Fermi as an upgrade path. There are some big issues here, and Nvidia will have to either pull a crazy whodunit here with Fermi, or make up some ground in the next gen.


RE: Fermi is...
By Reclaimer77 on 1/18/10, Rating: -1
RE: Fermi is...
By nuarbnellaffej on 1/19/2010 12:21:01 AM , Rating: 5
Instead of making stuff up, how about you do 2 minutes of research on Wikipedia before you post.

"Obscene amounts of voltage"!?

Obviously you have no idea what you are talking about, because you do not measure power usage in volts; volts measure potential difference. Power usage is measured in watts.

The 4870, by the way, sucks up ~150 watts under load, which is not very much; the GTX 280 alone draws about 236 watts.
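To put numbers on the distinction: power is voltage times current (P = V x I), so at the same voltage it's the current that differs between cards. A back-of-the-envelope sketch using the wattage figures above, assuming the load comes entirely off the 12 V rails (a simplification):

```python
def rail_current_amps(power_watts, rail_volts=12.0):
    """P = V * I, so the current drawn at a given power is I = P / V."""
    return power_watts / rail_volts

for card, watts in (("HD 4870", 150), ("GTX 280", 236)):
    print(f"{card}: ~{watts} W at 12 V -> ~{rail_current_amps(watts):.1f} A")
```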


RE: Fermi is...
By Reclaimer77 on 1/19/10, Rating: 0
RE: Fermi is...
By luke84 on 1/20/2010 2:50:10 PM , Rating: 3
Now that's just a load of c*ap.

Tom's is known for siding with nVidia, but put that aside and it's still complete and utter nonsense from you. The 4870 pulls 13W more than the ORIGINAL GTX 260 with 192 shaders, which is pummeled by the 4870, so comparing it with the GTX 280 isn't illogical.
And btw, those 280W are not all from the 4870; that's probably total system consumption.


RE: Fermi is...
By Amiga500 on 1/18/2010 11:40:59 AM , Rating: 5
quote:
Well with AMD's 5870 and 5970 hard to find for the past couple of months and the lack of dx11 games (well a lot of them anyway) there hasnt been a huge rush to AMD's side of the fence. A lot of people (like me for instance) are waiting to see what nvidia puts out and how it compares (both performance and price wise) before they plunk down their coin.


Well done.

I've made many bad posts here and elsewhere, but even I've struggled to contradict myself within 1 sentence, then go on to add to that contradiction in the subsequent sentence.


RE: Fermi is...
By B3an on 1/21/2010 12:27:40 AM , Rating: 2
I don't know why you said Fermi is too slow. No one knows how fast it will be. But with those specs and that transistor count I wouldn't be at all surprised if it's significantly faster than ATI's top end.

Fermi is a pretty big step for GPUs; there are lots of interesting features in it. It could be a great card. People are too quick to judge.

I have two 5870's btw, but i'm SICK of fanboys.


RE: Fermi is...
By islseur on 1/18/2010 11:16:01 AM , Rating: 2
I think you are rushing to conclusions before the battle has even begun. Have some patience.

I'm personally already waiting with my new system, holding out for Nvidia. AMD/ATI don't have good support for Linux, while Nvidia's driver support is great. Plus there are other considerations, like GPGPU for developers, and other stuff coming right around the corner like OpenCL, which combined with OpenGL is one step ahead of DirectX.

So take a seat and wait for the movie to begin first.


RE: Fermi is...
By mcnabney on 1/18/2010 6:18:11 PM , Rating: 2
Why do you need a high powered graphics card like Fermi if you are running Linux?

The few Linux games don't need the power and there are a lot cheaper ways to do video conversion (and start right now) than waiting for Fermi.


RE: Fermi is...
By HotFoot on 1/18/2010 10:08:27 PM , Rating: 4
My guess would be dual-boot.


RE: Fermi is...
By KoolAidMan1 on 1/19/2010 1:02:24 AM , Rating: 2
I agree. I have time, I'll wait and see on Fermi prices and performance before deciding which card I'm going to go with next.


RE: Fermi is...
By nafhan on 1/18/2010 11:35:41 AM , Rating: 2
I think they should have done the minimum possible to get DX11 running smoothly and then focused on getting a top to bottom DX11 40nm product line. They probably won't realize all the advantages from their new general purpose shaders until the next gen anyway. Time will tell, though.


RE: Fermi is...
By bighairycamel on 1/18/2010 11:39:57 AM , Rating: 4
quote:
Too big,
Too slow,
Too hot,
Two months+ away.

Nvidia are screwed this round. They better be developing a strong counter to Northern Islands or they are looking at a proper capitulation in the discrete graphics market.

Too much speculation.

Looking at the hardware there are some impressive features coming, notably the polymorph engine. There is a reason they took so long to release the cards and as anand says they are clearly looking to outperform 5970. But speculation gets us nowhere at this point so just hold on to your panties.

And did you know that NVidia sold MORE cards during the ATI 5xxx launch than before? The GTX 260s and 280s were constantly in low stock, not because NVidia was ramping down production but because people were buying them up while 5xxx cards were hard to find.

It was actually ATI who shot themselves in the foot this round, whether they won the performance crown or not. They had poor yields which caused low stocks. After waiting for months many (including myself) gave up and decided to wait and see what NVidia had to offer with Fermi. If ATI had the supply at launch we wouldn't have seen prices soar and their market share would have increased. The 5850 was ~$260 at launch and within two months was $310+ causing people like myself to wait it out. Even though we're starting to see 5xxx prices fall again, the NVidia launch is so close we all might as well wait.

After Fermi finally launches I'll be buying the best price/performance/power, whether it's ATI or Nvidia. If nothing else, the Fermi launch will bring ATI prices back down to where they should be.


RE: Fermi is...
By Mitch101 on 1/18/2010 12:17:16 PM , Rating: 5
AMD/ATI has already sold over 2 Million DX11 Cards. NVIDIA Still hasn't released theirs.

NVIDIA under-ordered silicon wafers, which is why their chips are in short supply, not because of demand. AMD did the same, speculating Fermi was going to be released around the same time ATI offered their DX11 chip, and not wanting to get caught with a pile of chips that wouldn't sell. Luckily for them, NVIDIA can't get the chip out the door.

NVIDIA doesn't appear to have low- and mid-level DX11 cards coming with Fermi, and this is where the money is.

NVIDIA requires 2 graphics cards to output to 3 monitors, something the ATI cards can do out of the box. If you want to drive 3 1680x1050 monitors, you can buy a single $160.00 Radeon 5770 to do it well. To drive 3 1920x1080/1920x1200 monitors you could buy a single Radeon 5870. How much will 2 Fermis cost?

NVIDIA also has yield problems with Fermi; that's why it went through another respin. It's also more difficult to get good working chips from 3-billion-transistor dies than from 2-billion-transistor dies. Both companies still have yield problems, but they are improving. AMD/ATI has the advantage with higher yields and cheaper chip production.

Prices also soared on AMD/ATI because there is no competition.

I have no doubt that Fermi will be the fastest out there, but at what price? Can they even produce a DX11 mid-to-low-end card? While I want competition from NVIDIA to drive prices down, the cost of Fermi leaves AMD/ATI with the major advantage.

We still haven't seen the Radeon 5890 which could close some of the gap between Fermi and the current 5870.

Until Fermi actually shows up on a shelf, it's all talk.


RE: Fermi is...
By bighairycamel on 1/18/2010 5:25:43 PM , Rating: 1
quote:
NVIDIA under ordered silicon wafers which is why their chips are in short supply not because of demand. AMD did the same speculating FERMI was going to be released around the same time ATI offered their DX11 chip and didn't want to get caught with a number of chips not selling. Luckily for them NVIDIA can't get the chip out the door.
I have no idea where this information came from; I can't seem to find anything related to it. NVidia's stock price grew a whopping 43% in December from increased sales. As for ATI, low supply was related to 40% yields on the 40nm side. Source: http://www.digitimes.com/news/a20091105PD213.html
quote:
NVIDIA doesn't appear to have a Low and Mid level DX11 card with FERMI and this is where the money is. NVIDIA requires 2 graphics cards to output 3 monitors. Something the ATI cards can do out of the box. If you want to drive 3 1680x1050 monitors you get buy a single $160.00 Radeon 5770 to do this well. To drive 3 1920x1080/1920x1200 monitors you could buy a single Radeon 5870. How much will 2 Fermi's cost?
Again, you have no valid source for this information simply because no cards have been released or manufactured yet!!! No one can claim what Fermi cards are capable of because no Fermi card has been released for testing. Unless NVidia has come out and specifically said you need two cards for three monitors, the information came out of someone's ass.

The CES Fermi demo was in no way indicative of the final products because they were using the GPU on prototype cards. They were demoing the GPU only. Besides, triple monitors is a niche market, and like you said, low-to-mid-end is where the money is, and surely NVidia won't be abandoning that market. There are plenty of things that can be done on the low end, like recycling GPUs with faulty cores or cores that can't hit full clock speeds.

ATI still may very well come out on top, but my overall point is that ATI failed to capture the market share they could have from Oct-now. They were unprepared for the 40nm transition across the board; from 4xxx cards to 5xxx. Now people are waiting with patience to see a comparison before shelling out hundreds for a DX11 card.


RE: Fermi is...
By mcnabney on 1/18/2010 6:24:02 PM , Rating: 3
You can't argue the positive 'likely' features of Fermi and ignore the negative 'likely' features.

Since Fermi does not exist as a saleable product all you can safely say is that Nvidia does not have a competing product in the $150-$500 segment and is currently not capable of supporting DX11.


RE: Fermi is...
By Mitch101 on 1/18/2010 9:41:33 PM , Rating: 3
http://www.anandtech.com/video/showdoc.aspx?i=3721...

Regardless of to what degree this is a sudden reaction from NVIDIA over Eyefinity, ultimately this is something that was added late in to the design process. Unlike AMD who designed the Evergreen family around it from the start, NVIDA did not, and as a result they did not give a GF100 the ability to drive more than 2 displays at once. The shipping GF100 cards will have the traditional 2 monitor limit, meaning that gamers will need 2 GF100 cards in SLI to drive 3+ monitors, with the second card needed to provide the 3rd and 4th display outputs. We expect that the next NVIDIA design will include the ability to drive 3+ monitors from a single GPU, as for the moment this limitation precludes any ability to do Surround for cheap.

ATI 60%-80% Yields
NVIDIA yields are as low as 20% and are not improving quickly.
http://www.fudzilla.com/content/view/17205/1/

As for the NVIDIA shortages, it's believed NVIDIA was EOL'ing the chips, and when Fermi was delayed they had to quickly order wafers to make them again. Had Fermi been on time this would have been a no-brainer and they wouldn't have had any excess inventory; you really don't want to get caught with a pile of chips you can't sell. So instead of announcing the EOL, they had to order wafers to keep making the chips that were supposed to EOL. Thus a shortage, and a cover story saying they are selling so well. There is always an increase in sales around X-mas, and I'm sure they sold more than usual, but they were betting on Fermi getting out the door. It would have been a good move if they hadn't hit the delay.

It's believed that because of poor yields and the size of Fermi, it costs NVIDIA about $180.00-$200.00 a chip to produce. That's just the chip, and it doesn't leave any room to produce a low-to-mid-level card and very little room at the high end.
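That per-chip figure is just wafer cost divided by good dies per wafer. A back-of-the-envelope sketch; the wafer price and die count here are rumor-mill assumptions from around that time, not disclosed numbers:

```python
def cost_per_good_die(wafer_cost_usd, gross_dies, yield_fraction):
    """Foundry cost attributable to each sellable chip."""
    return wafer_cost_usd / (gross_dies * yield_fraction)

# Assume ~100 GF100-sized candidates per 300 mm wafer at ~$5,000 per wafer:
for y in (0.20, 0.40):
    print(f"yield {y:.0%}: ~${cost_per_good_die(5000, 100, y):.0f} per good die")
# 20% -> ~$250, 40% -> ~$125; the quoted $180-200 falls in between
```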

Read this; it's a very well-composed piece about Fermi's chip size and yield and how that plays out against ATI's chips.
http://www.jasoncross.org/2009/09/30/a-few-thought...

Sure ATI lost some sales, but they are still the only DX11 game in town. Like you said, no cards have been released or manufactured yet!!! If the card were an ATI killer, NVIDIA would be showing benchmarks by now to stop sales of ATI's DX11 parts; if Fermi is only marginally better than ATI, then ATI sales go through the roof now.


RE: Fermi is...
By Calin on 1/19/2010 2:34:35 AM , Rating: 2
NVidia will probably soon have "half a Fermi" cards, as cutting them down to size will be easier than with previous iterations. Maybe even "a quarter of Fermi" for the lower end. For now, however, they hope to sell full cards, for more money, to early adopters who would be happy with half a Fermi.


RE: Fermi is...
By Mitch101 on 1/19/2010 9:03:15 AM , Rating: 2
I hope so. A little price war goes a long way in my wallet.


RE: Fermi is...
By jurassic512 on 1/23/2010 10:23:22 PM , Rating: 2
Fermi is much more scale friendly than GT200 was. It was designed that way.


RE: Fermi is...
By Phoque on 1/19/2010 7:29:05 PM , Rating: 2
I don't know if it's feasible, but perhaps a Matrox DualHead2Go or TripleHead2Go would be cheaper than a second Fermi.


RE: Fermi is...
By Calin on 1/19/2010 2:31:04 AM , Rating: 2
"NVIDIA requires 2 graphics cards to output 3 monitors."
This is based on the engineering sample shown to Anandtech. While it's possible the production cards will be different, it's a safer bet that they won't be.


RE: Fermi is...
By jurassic512 on 1/23/2010 10:58:57 PM , Rating: 2
Even if nVIDIA, or ATi for that matter, could do 3x DVI on a single card (which Fermi won't; they already said so)... does anyone truly think monitor sales and $400+ gfx card sales would gain ground? LCDs are cheap enough for consumers to dump their CRTs, or to get one at a good size for little money, or one just happens to come with the computer they wanted because the price was decent and LCD is the norm. But does that mean they'll grab 2 more the next time they're out? $400 desktop GeForce/Radeon cards make the least amount of money. Again, where is the money in that? Eyefinity depends too much on how much of a gamer you are. What else is Eyefinity even good for that 3 monitors without the technology can't do?

Example:

3 x 24" LCD = $650 minimum (if you're hardcore enough to spend that on 3 monitors for gaming you're not gonna buy garbage)

1 x $400+ ATi 5870 (DX11 Graphics card
that can run at gamer resolutions and detail)

So the average person is gonna spend $1050 on LCD's and a graphics card alone? Amazing... whatever you're on... gimme a double.


RE: Fermi is...
By jurassic512 on 1/23/2010 10:04:38 PM , Rating: 2
quote:
quote: NVIDIA doesn't appear to have a Low and Mid level DX11 card with FERMI and this is where the money is.


ATi ain't billionaires with a few 5000 series GPUs sold so far. nVIDIA didn't have Fermi, but they did get a ton of 40nm wafers from TSMC for Tegra 2, mobile, and desktop parts... which is where the money is. 90% of consumers don't even know DX11 is out, let alone that hardware is available to run it. What good would it do them? How is it gonna make their photos transfer any faster from their cameras? How is it gonna make their printers run faster with better colour? How is it gonna make Facebook load faster? There are still too many computers being sold with nVIDIA 8000/9000+ series and ATi 3000/4000 series GPUs. It's sad people constantly assume consumers buy computers to play games. If they did, they wouldn't be buying them from HP, Acer, and Dell, making them numbers 1, 2, and 3 in the PC game (no pun intended).

If i wanted a supercar, I certainly wouldn't go to Dodge.


RE: Fermi is...
By jurassic512 on 1/23/2010 10:20:32 PM , Rating: 2
PS. have you even seen DiRT 2 benchmarks on a 5700 series card at 1920x1080 resolution, max detail? It's not pretty. Don't even bother looking at benches with a 5600 series card. :( Your <=$200 cards hard at work pumpin out that DX11 goodness... I don't think so.

Gaming reference i know (after what i said in a previous post), but why would today's consumer need a 5600 or 5700 series DX11 card right now if not for gaming? They don't.

PPS, where are the effing DX11 games?! Or did you forget? ATi still sucks at Folding@Home and hardware-accelerated video encoding (CUDA vs Stream), so really your (fanboys') only "win" is that you got DX11 first... which doesn't mean it's better.


RE: Fermi is...
By The0ne on 1/19/2010 11:45:59 AM , Rating: 2
Many chips are in short supply, or take even longer to produce until enough demand builds up. We're suffering long lead times on many chips; even parts as generic as RS-232 transceivers from TI have lead times of 7 months!

Part of the decision is to reduce inventory and thus have more cash on hand. This is always a good thing, of course, but then you also have to be able to meet demand when the orders come in. Given the market situation we're already projecting at least a 4-month lead time to get going. That is not productive at all.


RE: Fermi is...
By Zapp Brannigan on 1/18/2010 1:32:56 PM , Rating: 4
quote:
There is a reason they took so long to release the cards and as anand says they are clearly looking to outperform 5970.


Err, on anandtech's fermi article it states

quote:
NVIDIA is clearly aiming to be faster than AMD’s Radeon HD 5870, so form your expectations accordingly.


I expect Fermi will be the fastest single-GPU card but not faster than the 5970, and it will find itself on its own in the marketplace: faster and more expensive than the 5870, but slower and cheaper than the 5970. As long as it doesn't have the same thermal characteristics as the 5970 it'll be a decent enough card.


RE: Fermi is...
By clovell on 1/18/2010 5:21:19 PM , Rating: 3
> And did you know that NVidia sold MORE cards during the ATI 5xxx launch than before?

Just in time for the holiday season, too. People spend money around Christmas...

> It was actually ATI who shot themselves in the foot this round, whether they won the performance crown or not. They had poor yields which caused low stocks. After waiting for months many (including myself) gave up and decided to wait and see what NVidia had to offer with Fermi. If ATI had the supply at launch we wouldn't have seen prices soar and their market share would have increased.

Fantastic point. Now, what makes you think that with such a gargantuan die size and transistor count Fermi will somehow launch immune to these same problems? And what do you think everyone will be buying when they can't find a Fermi card in stock? Certainly not a GTX 260.

You just about come to a good point - IF Nvidia can have enough stock at launch, and IF they can compete in the price/performance race, THEN they may do alright for themselves. More likely than not, though, AMD's yields will have improved, and they'll just slash prices to compete (which is great for consumers).

After reading through Anand's review I think it'll be another generation before Fermi really begins to make progress. Then we'll see the dividends of the investment Nvidia has made. However, I wouldn't count on AMD to stand still in the meantime.


RE: Fermi is...
By bighairycamel on 1/18/2010 5:33:29 PM , Rating: 2
I couldn't agree more. If NVidia fails to produce ample supply, it's still a win-win for consumers because now ATI will be lowering prices.

So in scenario one, NVidia has a better offering and the wait will have been worth it.

Scenario two, NVidia underperforms or undersupplies, in which case I can pick up an ATI card at a lower price point and live with the peace of mind that I'm definitely getting my money's worth.


RE: Fermi is...
By mcnabney on 1/18/2010 6:33:55 PM , Rating: 3
Nvidia may have a huge financial problem in marketing Fermi. Fermi will likely have a huge die and will by definition have lower yield percentages. It is likely that AMD/ATI could manufacture the 2 chip 5970 for nearly the same price as a single Fermi board. That will give AMD/ATI the freedom to cut prices to meet their production and maximize their profit.

I am guessing that Fermi will beat the 5870 by an average of 20-25%. The 5970 should beat Fermi by a similar figure. That leaves AMD/ATI in the driver's seat on pricing. Let's all hope that AMD/ATI orders a ton of chips from TSMC so they can put the hurt on Nvidia and get prices down nice and low. The PC gaming market can really use a shot in the arm right now.


RE: Fermi is...
By beck2448 on 1/18/10, Rating: -1
RE: Fermi is...
By Calin on 1/19/2010 2:39:59 AM , Rating: 2
Everybody seems to think a quantum leap is big.
Well, to rain on your parade, a quantum leap is microscopic


RE: Fermi is...
By luke84 on 1/20/2010 2:59:34 PM , Rating: 2
Now that deserves a +5, hands down.


RE: Fermi is...
By rzrshrp on 1/21/2010 9:10:15 AM , Rating: 2
It seems that you're right and wrong at the same time. A quantum leap in physics, if I'm reading correctly, is a sudden jump of an electron from one quantum state to another. Even though, it happens on a molecular level, it is still "big" and maybe more importantly, sudden and discontinuous. Because of that, sudden changes concerning anything are often called quantum leaps.


RE: Fermi is...
By jurassic512 on 1/23/2010 3:47:09 PM , Rating: 1
It's amazing how ppl think release dates are written in stone. Not everything runs smoothly, ESPECIALLY when it comes to technology. It's sad that ppl spend so much time (other than analysts/reporters) on stuff that happens quite often (delays etc), but make it look like the beginning of Doomsday.

PS. GF100 has more DX11 AND CUDA hardware than ATi has. Look at both architectures. Seems ppl in IT these days are suffering from ADD or just immaturity. It gets old hearing the crying over stuff that happens EVERY DAY in tech land.

PPS, if TSMC's 40nm yields were so bad when ATi was getting <=40%, what sense would it make for nVIDIA to start production at the same time? Hell, using fanboy logic, you should be thanking nVIDIA for only taking 40nm chips for mobile and GTS cards. Shit happens, stores run out of stock, things don't always go as planned... it's called LIFE/technology. Deal with it.

Something for the fanboys to chew on...

Phenom I (LATE and sucked)
ATi 2000 series (LATE and sucked)
Fusion (3+ years late)
Hardware Physics (recently with Havok) (Failed)
ATi Driver Hotfixes (way too many of them)
ATi Stream (Still not in the Catalyst Driver wtf!?)

ATi can charge as little as they want. Other than the fact they have NO choice but to... nVIDIA comes out on top because of what it brings to graphics beyond just gaming. That's like saying Linux is better because it kinda does the same thing, and it's free. Your business would fail instantly with that kind of thinking.

These are facts, don't confuse em with fanboyism.




Fixed...
By XZerg on 1/18/2010 9:25:35 AM , Rating: 3
quote:
There is one question that EVERYONE seems to be asking NVIDIA.




RE: Fixed...
By shin0bi272 on 1/18/2010 11:00:29 AM , Rating: 3
Actually I can think of 3, not one.

1. When is it coming out?
2. When can we finally see some benchmark numbers from it?
3. How much is it?


RE: Fixed...
By BuckinBottoms on 1/18/2010 11:09:31 AM , Rating: 3
There is one question that no one seems to be asking NVIDIA: Where are the next generation Intel comic strips?


RE: Fixed...
By Dreamwalker on 1/18/2010 1:08:40 PM , Rating: 2
4. Will it have PAP and bitstreaming support out of the box, maybe even HDMI 1.4?


RE: Fixed...
By Developer on 1/18/2010 11:59:40 AM , Rating: 2
Leaving all ATI-NVIDIA fanboyism aside, Nvidia is in a somewhat tight position right now. AMD/ATI has been nicely filling its coffers for the last 7 months with its next-generation DX11 GPUs, and it also had the holiday season all to itself.

Nvidia's Fermi is still about 2 months away. By the time it comes out, AMD/ATI will presumably lower its DX11 card prices. Also, most of the cards ATI & its partners are offering right now are overpriced, because there is no real competition when it comes to DX11, and demand is somewhat greater than availability because TSMC had problems with its production lines.

Also, as the author of this article pointed out, about 90% of the cards sold out there are mainstream offerings (the sub-$200 range).

I really don't see any of Nvidia's next-generation cards coming out that cheap at the moment. I believe Fermi will be a very good performing card and may even beat AMD/ATI's top offerings (time will tell), but Nvidia had better come out with its mainstream offerings soon if it plans to compete with AMD/ATI.


RE: Fixed...
By LRonaldHubbs on 1/18/2010 7:54:17 PM , Rating: 2
Agreed with most of what you said. However, DX11 GPUs have not been out for 7 months, as the 5870 launched in late September. It's been just shy of 4 months.


RE: Fixed...
By Belard on 1/19/2010 2:25:28 AM , Rating: 2
No... you gotta go with speculative time.

4 months now, + 3 months (Feb/March/April) = 7... it'll take 2-3 months just to get production up.


RE: Fixed...
By Calin on 1/19/2010 2:39:04 AM , Rating: 2
"By the time it comes out, AMD/ATI will presumably lower it's dx11 cards prices."
Or, based on the GF100's performance figures and its price, they might keep the same prices. It's hard to imagine NVidia won't go through the same ordeal AMD has, with cards selling well above MSRP due to shortage.


RE: Fixed...
By Bateluer on 1/19/2010 1:19:46 PM , Rating: 2
AMD can cut the prices on their DX11 chips, which are smaller than Fermi's 500mm^2+ die. Fermi will be massive, even on the 40nm process, compared to Evergreen.


GTX 395 and power
By nafhan on 1/18/2010 11:29:05 AM , Rating: 2
I'm wondering: if this thing is hotter and more power hungry than Cypress, is a dual GPU solution going to be possible? ATI already has to downclock to meet PCI express 2.0 power specs on their dual GPU solution. That could leave us in the interesting position where Nvidia has the most powerful GPU, but ATI has the fastest single card solution. We'll see...




RE: GTX 395 and power
By Zapp Brannigan on 1/18/2010 1:37:03 PM , Rating: 2
quote:
Nvidia has the most powerful GPU, but ATI has the fastest single card solution.


That's exactly what i see happening.


RE: GTX 395 and power
By SavagePotato on 1/18/2010 4:49:33 PM , Rating: 2
It has 3 billion transistors instead of 2 billion like ATI, and is built on 40nm as well. Chances are it will be quite hot, and quite expensive if its yields are as poor as expected.

Competition in the graphics card world is a great thing however for the consumer.

I would have never expected nvidia's last gen chip to make it to a dual gpu solution either, and it needed a die shrink to do it if I am not mistaken. That may be needed for a dual gpu fermi as well.

I really wish nvidia's CEO would die in a fire or be replaced, or possibly open the final can of whoop ass on himself. Nonetheless from a techie standpoint I'm interested in seeing the performance results, and welcome anything that keeps prices competitive.


RE: GTX 395 and power
By FITCamaro on 1/18/2010 6:17:20 PM , Rating: 2
Yeah, with the size of this chip and the heat it's going to put out, not to mention the power requirements, I foresee them having a difficult time doing a dual-GPU card until after a die shrink, even with downclocking.


GPUs and Games Consoles
By Aloonatic on 1/19/2010 6:52:52 AM , Rating: 2
As there are many people who say that (on the whole) PC gaming is becoming more and more of a niche market...

When do consoles usually make their decisions about their hardware choices? Also, How will the current state of play with nvidia and amd affect the next generation of games consoles? How do you think that MS and Sony view the 2 companies offerings?

There seems to be a general consensus that the mid-range/value markets are actually the most important for revenues too, but how much of a difference does a console contract make?

It seems that consoles are pretty much gaming HTPCs these days. After the RROD hoo-har (though RROD probably had as much to do with overall Xbox 360 design in the end), how much will thermal characteristics make companies' decisions for them too?

I hope this rambling comment makes sense. Am tired and seem to be super busy at work at the moment.




RE: GPUs and Games Consoles
By The0ne on 1/19/2010 11:50:37 AM , Rating: 2
I honestly wouldn't worry about the people making "those" comments. I think PC gaming is healthy and will only grow, due mostly to FPSs, MMOs, and hopefully soon RTS games again.

I would like more diversity, but I take what I can get if the game is good :) Wouldn't mind a grand sequel to Heroes of Might and Magic 3 myself :D


RE: GPUs and Games Consoles
By The0ne on 1/19/2010 5:20:53 PM , Rating: 2
For the console hardcore gamers to drool over, purely based on their "graphics" taste, check out Medal of Honor for 2010:

http://www.youtube.com/watch?v=C3VLXGMACzs

Simply amazing and realistic graphics, above anything out today. And people say PC gaming is dying anytime soon ^_-

I welcome the refreshed remake of Medal of Honor. Still have fond memories of the first game :)


RE: GPUs and Games Consoles
By Aloonatic on 1/20/2010 5:37:06 AM , Rating: 2
I think that video probably demonstrates the point though. What hardware was needed to produce that? Will you be buying it?

I don't think anyone is arguing that PCs are not the peak of gaming graphical power. However, the number of people who are more than happy to pay for that, compared to those who are willing to "settle" for console gaming, is shrinking it seems. I am mostly going on anecdotal evidence and my own experience however.

Then, many of those who do still want to play PC games are not going to pay for the best, because they simply can't, and the value/mid-range market is where their coin will go. When you get the most powerful cards, you suddenly need a more powerful processor to feed them, and a good monitor (or 2 or 3) to display their output on, and the cost escalates rather quickly.

So back to the nvidia point: all they seem to be able to produce and make money on at the moment are cards at the more expensive end of the scale, which is the problem for them, it seems.

I know that I would love to be able to afford a PC with a quad-Fermi SLI rig in it, but it's not going to happen, no matter how good the demo video is. And it's becoming harder and harder to justify the massively increased cost, only to play games that are essentially the same as their console counterparts, but looking nicer.


GF100 Ouch!
By Belard on 1/19/2010 3:21:54 AM , Rating: 2
But we're going to see Nvidia do the same thing they have for the past 2 years... making big, heavy chips that cost a lot of money.

I think Anandtech published a piece on ATI's game plan when they came out with the HD 3000 & 4000 series, because the 2000 series pretty much bombed.

Even now, Nvidia doesn't have a smaller version of the GT200 chip design. Meanwhile, AMD shrank its designs and introduced new products for the entire market, rather than just slapping a new label on a card and calling it NEW. Anyone want to buy the GeForce GT 310? Anyone? Oh yeah, it's 100% the EXACT same card as the junky GT210?! Check it out yourself on their website.

Okay, so Fermi/GF100 comes out (Nvidia's own code names are all over the place); it's most likely going to be like the GTX280, with an MSRP of $600, if that. Last round, ATI kicked Nvidia in the balls with the $350 4870.

But history shows time and time again that it's the $100~200 cards that gamers buy the most of, with the best profit margins. When the 8800GT and ATI 3870 came out at $200, people went nuts for them... not as fast as the $500/600 8800GTX/Ultras, but sure as hell a lot cheaper.

So, when Fermi/GF100/GTX 480/Voodoo7, or whatever they pull out of the hat, comes out, it may be faster than the 5970... it may require a 700-watt PSU, who knows. All ATI has to do is get the 5850 down to $200 and not many people will care. ATI has done a better job of getting their prices down with newer tech (and AMD needs this money).

By the way, when YOU watch that Rocket Sled demo being played: it is a demo game in which you can control how powerful the engine is, so the results can be very different. Looks very neat and all... but the IMPORTANT thing about that smooth-running demo is...

There were three (3) GF100 cards in the computer in SLI! That's about $1500~2000 in graphics cards?! W T F ! !

Yeaaah... I see a huge market for that! By the time games demand that much power, something with that horsepower WILL need to be on a single card that costs $100~150. In about 2 more generations, or 2014.

ATI does need to come up with a PhysX solution of some sort... who knows, maybe they are doing that in a back room. Both Intel and Nvidia bought such IP, but it's not impossible to make something from scratch. And that is the GF100's strongest weapon. Even though PhysX is still mostly software running on the GPU (where it belongs), game developers won't embrace something like that until ATI has their own and/or it's something added to DX12 / DX13. Who knows...

Competition is good... GF100 may win the crown again, but it won't have much of a kingdom to rule.




RE: GF100 Ouch!
By Developer on 1/19/2010 5:49:54 AM , Rating: 2
I don't understand why Nvidia showed 3 x GF100 in SLI with some random demo & no real benchmarks. I think any mainstream card out there in a 3x SLI configuration could play this demo with "silky smooth" framerates.

Right now I assume 99% of people out there see no need for such powerful configurations (3 x GF100 in SLI) at a price of $1500~2000, as Belard pointed out. And probably in a few years' time all this power will be available, if needed, in a single-card solution with more reasonable power usage, heat output, and mainly price.

Also, most gamers out there don't really benefit from this "superpower" in terms of real gameplay, as there is actually no real use for frame rates hovering around 100-200 FPS or even more (all maxed out). This only looks good on paper, really. The eye needs about 30-40 FPS to enjoy smooth gameplay.
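Frame rates are easier to reason about as per-frame time budgets; a quick conversion sketch:

```python
def frame_time_ms(fps):
    """Time the GPU has to render one frame at a given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 105, 200):
    print(f"{fps:3d} FPS -> {frame_time_ms(fps):5.1f} ms per frame")
# 30 FPS = 33.3 ms, 105 FPS = 9.5 ms: past a 60 Hz display's ~16.7 ms
# refresh budget, the extra frames are largely invisible on screen
```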

As a review of the HD 5970 showed, Modern Warfare runs at 105 FPS on average (!) at 2560x1600 with 16xAF and 4xAA. http://www.guru3d.com/article/radeon-hd-5970-revie...

So what was Nvidia thinking when they showed 3 Fermi cards in SLI with no benchmarks running some random self made demo?


RE: GF100 Ouch!
By Belard on 1/19/2010 1:13:19 PM , Rating: 2
There will be a need for such power; video cards are always advancing. Back in 2004, my $400 ATI 9800 Pro was top dog. Now ATI sells a $35 card that performs just as well.

The big problem with $600+ graphics gaming cards is... uh, games that actually use the graphics. We're bleeding games to the consoles, and eventually consoles may suffer too, if companies like ATI & Nvidia can't make money selling PC GPUs to fund the R&D behind console GPUs.



