
From enthusiast to low-end, all new NVIDIA chipsets will feature an enabled integrated graphics core

In conjunction with the Consumer Electronics Show in Las Vegas, NVIDIA launched its Hybrid SLI technology along with its newest chipset. Hybrid SLI is NVIDIA’s first foray into visual computing.

NVIDIA’s Hybrid SLI links NVIDIA integrated graphics chipsets with NVIDIA discrete GPUs, allowing them to work together. The company claims that the new technology lowers power consumption and improves performance.

NVIDIA’s announcement of Hybrid SLI also signals a major shift in the company’s chipset-feature policy. NVIDIA chipsets with integrated graphics processors (IGPs) have traditionally been available only in the lower segment of the market. NVIDIA has now decided that all of its new chipsets, low-end and high-end alike, will come with an IGP.

Of course, most core logic includes an integrated graphics processor, albeit disabled.

Two fundamental components make up Hybrid SLI: HybridPower and GeForce Boost. HybridPower, as the name indicates, is the power-consumption-reducing side of the technology. It allows a system to turn off its discrete graphics card completely when the card's full capability is not needed; the chipset's integrated graphics takes over instead.

To use HybridPower, the system must include an NVIDIA IGP and a discrete NVIDIA video card. Under HybridPower, users connect their display to the motherboard's graphics outputs. When the discrete GPU is needed, its frame buffer contents are copied over to the integrated graphics processor's frame buffer. NVIDIA asserts that the second-generation PCI Express specification provides enough bandwidth for these transfers.
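
A back-of-the-envelope check supports that claim. The figures below are illustrative assumptions rather than NVIDIA's numbers, but they show how little of a PCIe 2.0 x16 link a steady frame buffer copy actually needs:

/* Rough bandwidth check for the HybridPower frame buffer copy.
   Display figures are illustrative assumptions; this is plain
   host-side arithmetic, compilable with nvcc as a .cu file. */
#include <stdio.h>

int main(void) {
    const double width  = 1920.0;  /* assumed display width (pixels)  */
    const double height = 1200.0;  /* assumed display height (pixels) */
    const double bpp    = 4.0;     /* 32-bit color: 4 bytes per pixel */
    const double fps    = 60.0;    /* one copy per displayed frame    */
    const double pcie2  = 8e9;     /* PCIe 2.0 x16: ~8 GB/s each way  */

    double traffic = width * height * bpp * fps;  /* bytes per second */
    printf("copy traffic: %.0f MB/s\n", traffic / 1e6);        /* ~553 MB/s */
    printf("share of PCIe 2.0 x16 link: %.1f%%\n",
           100.0 * traffic / pcie2);                            /* ~6.9%    */
    return 0;
}

Even a 1920x1200 desktop at 60 Hz consumes well under a tenth of the link, so the bandwidth claim is plausible.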

Latency, according to an NVIDIA spokesman, is a "non-issue."

GeForce Boost combines the power of the IGP (which NVIDIA calls the mGPU) and the discrete GPU (dGPU) to improve performance. NVIDIA told the press that this technology is meant for low-end or mid-range PCs. In fact, the company states that this feature could be detrimental to the performance of high-end PCs.
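
NVIDIA has not published how GeForce Boost schedules work across the two GPUs, but a toy alternate-frame-rendering (AFR) model, which is strictly an assumption here, shows why a mismatched pair can hurt: if frames are presented in turn, the slower GPU sets the pace.

/* Toy AFR model of GeForce Boost (an assumption, not NVIDIA's
   documented scheduler): strict alternation, no frame buffering. */
#include <stdio.h>

/* Frames alternate between the GPUs, so each round trip yields two
   frames and is paced by the slower GPU's frame time. */
static double afr_fps(double fps_a, double fps_b) {
    double slower = (fps_a < fps_b) ? fps_a : fps_b;
    return 2.0 * slower;
}

int main(void) {
    /* Low-end pairing: IGP at 15 fps plus a budget card at 20 fps. */
    printf("low-end pair:  %.0f fps (vs 20 fps alone)\n", afr_fps(15, 20));
    /* High-end pairing: IGP at 10 fps plus a fast card at 100 fps. */
    printf("high-end pair: %.0f fps (vs 100 fps alone)\n", afr_fps(10, 100));
    return 0;
}

Under these assumed numbers the budget pairing gains (30 fps versus 20), while the high-end pairing collapses to 20 fps, which matches NVIDIA's guidance to keep Boost away from high-end cards.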

NVIDIA Hybrid SLI is currently a Windows Vista exclusive.

NVIDIA will incorporate Hybrid SLI into a wide variety of graphics and motherboard products that the company is rolling out for both AMD and Intel desktop and notebook platforms throughout 2008.

In addition to announcing Hybrid SLI, NVIDIA also announced its new nForce 780a chipset. Naturally, one of its newest features is Hybrid SLI support, and all versions of the chipset come with an embedded GPU. The nForce 780a is currently launching for AMD processors.

The new chipset supports AMD’s newest HyperTransport 3 link interconnect, and offers 32 PCI Express lanes via an NVIDIA nForce 200 chip.

The nForce 200 comes with a couple of notable features. One of them is the Posted Write Shortcut, which NVIDIA says allows data from one graphics card to be passed directly to other graphics cards without having to route the data back through the CPU. The feature is said to improve SLI scaling performance.
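
NVIDIA has not published transfer-path timings, so the following is only a sketch of why skipping the CPU round trip matters: moving a payload GPU-to-GPU through system memory crosses the PCIe links twice, while the shortcut crosses once. The payload size and link speed below are assumptions.

/* Illustrative model of the Posted Write Shortcut's saving.
   Payload size and link speed are assumptions, not NVIDIA figures. */
#include <stdio.h>

int main(void) {
    const double payload = 64e6;  /* assumed inter-GPU data per frame, bytes */
    const double link_bw = 8e9;   /* PCIe 2.0 x16: ~8 GB/s per direction     */

    /* Without the shortcut: GPU A -> system memory -> GPU B. */
    double via_memory = 2.0 * payload / link_bw;
    /* With the shortcut: GPU A -> nForce 200 -> GPU B. */
    double direct     = payload / link_bw;

    printf("via CPU/memory: %.1f ms\n", via_memory * 1e3);  /* 16.0 ms */
    printf("direct:         %.1f ms\n", direct * 1e3);      /*  8.0 ms */
    return 0;
}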

As can be expected, the chipset is also ESA certified and supports 3-way SLI.


Comments

usefulness
By xsilver on 1/9/2008 3:01:25 AM , Rating: 3
If they can make HybridPower available for high-end cards, I can see this being useful.
A saving of 200W when you're not gaming would pay for itself (the extra motherboard cost) in around a year, I suspect.

Until then - yawn.
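
That payback estimate roughly checks out. A quick sketch, where the idle hours, electricity price, and motherboard premium are all assumed figures:

/* Rough payback check for a ~200 W idle saving. Idle hours,
   electricity price, and board premium are assumptions. */
#include <stdio.h>

int main(void) {
    const double watts_saved   = 200.0;
    const double idle_hours    = 12.0;   /* assumed non-gaming hours per day */
    const double usd_per_kwh   = 0.10;   /* assumed 2008-era US average rate */
    const double board_premium = 75.0;   /* assumed extra motherboard cost   */

    double kwh_per_year = watts_saved * idle_hours * 365.0 / 1000.0;
    double usd_per_year = kwh_per_year * usd_per_kwh;
    printf("saved: %.0f kWh/year (about $%.0f)\n", kwh_per_year, usd_per_year);
    printf("payback: %.1f years\n", board_premium / usd_per_year);  /* ~0.9 */
    return 0;
}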




RE: usefulness
By Blight AC on 1/9/2008 8:11:27 AM , Rating: 2
Yeah, that's exactly why I was most looking forward to this. I wonder if they have a list of approved cards for this. If an 8800GT is okay, then that'd work for me.


RE: usefulness
By Blight AC on 1/9/2008 8:17:05 AM , Rating: 2
Just checked the nVidia site:
http://www.nvidia.com/object/hybrid_sli.html

Only the 8500GT and 8400GS are GeForce Boost cards... disappointing, and no Intel chipsets yet either.


RE: usefulness
By Blight AC on 1/9/2008 8:43:21 AM , Rating: 2
There's also a more detailed article on Anandtech.com:
http://www.anandtech.com/tradeshows/showdoc.aspx?i...

quote:
Newer GPUs coming in this quarter will support the full Hybrid SLI feature set, and it’s sounding like the G92 GPUs from the 8800GT/GTS may also have HybridPower support once the software is ready.


Well, really, all I'm interested in is HybridPower. If you can get that on an 8800GT with a quad-core Intel CPU, that would be fantastic: an excellent-performing setup that's amazingly low-cost and energy-efficient. I've just been waiting for Hybrid SLI to put it all together (I initially heard about it around July/August).


RE: usefulness
By jonrem on 1/9/2008 9:18:36 AM , Rating: 2
Definitely. I feel really wasteful leaving my PC on all day to finish up downloads when it's sucking almost 250 watts just sitting there. Anything that can reduce power consumption is fine by me.


RE: usefulness
By myrealname on 1/9/2008 10:19:09 AM , Rating: 2
What else can I say but 'no kidding'. There's no reason I should be able to fry an egg on my PC case while browsing through Dailytech and lord-knows-how-many other forums. When I'm gaming, sure--I've got to eat sometime.


RE: usefulness
By phusg on 1/14/2008 10:06:14 AM , Rating: 2
There are 2 things you can do:
1) Install BOINC and attach to one or more scientific projects so that your CPU cycles and watts are not wasted; see http://boinc.berkeley.edu/projects.php.
2) Supply your PC with 'green' electricity, generated renewably from, for example, wind or solar.


RE: usefulness
By Blight AC on 1/31/2008 9:12:16 AM , Rating: 2
The problem with these distributed computing projects is that they use 100% CPU while your PC is idle, so with this software running you'll be using more electricity than without it. It also doesn't help your PC be more efficient when you're using it for simple tasks like email, web browsing, or solitaire.

I want more efficiency on my PC, not another way to make it use more electricity.


Sweet
By Master Kenobi (blog) on 1/9/2008 1:26:29 AM , Rating: 2
Now make it work with my 2 monitors. Yea... didn't think so.




RE: Sweet
By mxzrider2 on 1/9/2008 2:28:10 AM , Rating: 1
Since when doesn't nvidia support multiple monitors? I just set up a PC to run dual monitors the other day. It works with their IGP and dedicated cards together as well.


RE: Sweet
By Master Kenobi (blog) on 1/9/2008 7:52:36 AM , Rating: 5
SLI does not allow the use of dual monitors.


RE: Sweet
By Spuke on 1/9/08, Rating: -1
RE: Sweet
By kkwst2 on 1/9/2008 8:20:43 PM , Rating: 2
People have gotten around this by writing scripts to enable and disable SLI. They just enable SLI for gaming, since very few games support multiple monitors anyway.

Others use a third cheap non-nVidia PCI video card to run the other monitors off of.

Have you tried any of these "solutions"? Not ideal, but better than not being able to use a second monitor, which I couldn't live without. SLI just doesn't seem worth its headaches to me.


RE: Sweet
By winterspan on 1/10/2008 1:22:53 AM , Rating: 3
I'm not sure, but just because it says "Hybrid SLI" doesn't mean it's related to their "real" SLI technology; it might be completely separate, especially since HybridPower only uses one GPU or the other.


A better choice...
By qwerty1 on 1/9/2008 2:10:56 AM , Rating: 3
... would be to put an IGP on a discrete card and enable the power saving function. I for one don't want to be tied down by compatibility issues.




RE: A better choice...
By sweetsauce on 1/9/2008 10:14:36 AM , Rating: 2
Actually, a better choice is whatever the company decided would be best for that technology, not what some random John Doe thinks would be best in his own personal situation.


RE: A better choice...
By BMFPitt on 1/9/2008 10:15:37 AM , Rating: 2
Power savings wouldn't be as big if you're introducing another card, and cost would (obviously) be a bit higher that way.

And at that point, why have a discrete card? Just make your primary card underclock well. My 7900GT is easily the loudest piece of my system, and I'd love to have it go to like 10% power/passive cooling when not gaming.


RE: A better choice...
By jajig on 1/10/2008 9:09:32 AM , Rating: 2
Use RivaTuner to lower the clock speed. That's what I do when I browse the internet.


RE: A better choice...
By qwerty1 on 1/12/2008 1:52:04 AM , Rating: 2
Oops, I should have clarified. I meant combining a low-power IGP-esque core and a high-powered GPU together on one add-on card. I don't know whether making a card underclock well is hard or not, but if it works, that'd be a good idea as well.


The most interesting part of this post
By SigmundEXactos on 1/9/2008 1:46:39 AM , Rating: 2
The most interesting part of this post is "Of course, most core logic includes an integrated graphics processor, albeit disabled."




RE: The most interesting part of this post
By MonkeyPaw on 1/9/2008 7:54:20 AM , Rating: 2
I saw that too. I was always under the impression that if the board has no video connections, then there is no IGP. This article makes it sound like almost every chipset out there has an IGP. The funny part is how it's stated like it's common knowledge.


By KristopherKubicki (blog) on 1/9/2008 10:15:24 AM , Rating: 2
But it is. P35 contains the G35 graphics core. It's just disabled. The same could be said for NVIDIA 780i or AMD 790X. The vendor doesn't have anything particularly useful to do with the IGP on these higher-end chipsets, though, since it's clear a discrete GPU would be used.


Fusion
By Treckin on 1/9/2008 2:38:11 AM , Rating: 2
This is clearly a decision mainly targeted at the AMD Fusion project, which, if successful, would accomplish just this (especially the power-saving function).

Hopefully for AMD there are unforeseen, or at least less obvious, benefits to GPU-on-die configurations.

Good move for Nvidia though.




RE: Fusion
By initialised on 1/9/2008 7:23:39 AM , Rating: 2
AMD has Fusion and Intel is looking into real-time ray tracing. Where is this going to leave nVidia when heterogeneous multicore takes off? Getting into bed with VIA?


RE: Fusion
By The Jedi on 1/10/2008 2:49:19 PM , Rating: 2
NVIDIA won't be hurting. They have their own assembly language for their chips. It's how any ForceWare driver can be forced to work with any NVIDIA GPU and still "work".

If software is compiled with a GPGPU-aware compiler, any PC with a cheap CPU can get a big boost just by adding an NVIDIA-based graphics card. I read an article about this a couple of years ago. They've been building towards this strategy for a while now. Imagine if a cheap-o PC could run lots of computations considerably faster just because it was built on an NVIDIA IGP motherboard.

So, having the right compiler and getting new software using it is the big deal.
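
For the curious, NVIDIA's CUDA toolkit (already shipping at the time of this article) is exactly this kind of compiler. A minimal sketch of the idea, offloading a trivial array operation to any CUDA-capable NVIDIA GPU:

/* Minimal CUDA sketch of the GPGPU idea described above: one
   compiled kernel runs on any CUDA-capable NVIDIA GPU.
   Build with: nvcc scale.cu -o scale */
#include <stdio.h>

__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;   /* one GPU thread per element */
}

int main(void) {
    const int n = 1024;
    float host[1024];
    for (int i = 0; i < n; ++i) host[i] = (float)i;

    float *dev;
    cudaMalloc((void **)&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    /* Launch 4 blocks of 256 threads to cover all 1024 elements. */
    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);

    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);
    printf("host[10] = %.1f (expected 20.0)\n", host[10]);
    return 0;
}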


By Creig on 1/9/2008 7:58:13 AM , Rating: 2
Before you know it, future generations of 3D accelerators won't have any 2D components at all. Any bets that the name for this new "3D only" line of Nvidia cards will be "Voodoo"? :)




By Chris Peredun on 1/9/2008 8:49:08 AM , Rating: 2
And before you know it, they'll be telling us that "64-bit colour doesn't matter." ;)


Vista EULA
By wordsworm on 1/9/2008 8:31:52 AM , Rating: 2
Since this went into Vista, I thought I'd add my rant.

I think the worst change in Vista is the new EULA. XP retail used to be worth it because you might go through 3-4 computers using the same OS (consecutively, not simultaneously, in accordance with the spirit of the document). Now, Vista retail can only go on a maximum of two machines. So why pay $200ish for a retail copy for 2 machines when $110 gets you OEM for one? It's just not enough of a savings for me. Check this out:

http://wendy.seltzer.org/blog/archives/2006/10/19/...

I've noticed that there are some good games out for Linux. I've tried it, and it wasn't bad at all. Maybe one day we'll all be using it instead of this EULA hell.

As far as Nvidia only supporting Vista, I can't say I blame them. It's far superior to XP; it's clearly the best OS Microsoft has ever made.




RE: Vista EULA
By jbizzler on 1/9/2008 9:26:14 AM , Rating: 2
Being a developer, I've had Vista since before it was released to the public. As soon as release candidates were rolling out, it was already ready to be my main operating system. I'd personally never buy it retail for a computer with XP on it, but when I get new computers, I order them with Vista. And when I build my own computers, I order them with an OEM Vista disk. It's really not worth upgrading to, but it's nice to get for a new machine.

Yeah, a lot of people don't realize how different Vista is for developers. The graphics model in Vista is vastly different from XP's, especially for driver programmers. I'm sure they could have this tech supported on XP, but it would either be difficult or simply take way too much time to do from scratch.

Vista's like that with everything. It's pretty for regular consumers, but it's the programmers who benefit most from it. DX10 changes little in the way of features, and I've yet to see a commercial product benefit from it, but I must say it is far easier and more efficient to program for. And, in my own tests, it's faster than DX9, though I don't know why that doesn't carry over to commercial games.

I've never had problems with Microsoft products any more than I have with any other software.

I think this HybridSLI tech is most useful for notebooks. In a world with an increasing number of performance notebooks, it would be nice to have one with a great deal more battery life, and possibly a little performance boost as well.


So will the output be craptastic?
By ElFenix on 1/9/2008 11:23:51 AM , Rating: 2
The last two integrated-video motherboards I've purchased have had absolutely awful output quality. If the discrete graphics card is outputting through the integrated video, will that make it crappy too?




By mindless1 on 1/10/2008 1:59:56 PM , Rating: 2
Then I take it they weren't DVI outputs, since DVI makes the output visually identical to a discrete card's, unless you're only talking about the amount of eye candy that had to be disabled for gaming.

So yes, if you run high resolutions with an analog output it may still look crappy, but it will be fine with an IGP's DVI output.


vista my dear
By thartist on 1/9/08, Rating: -1
RE: vista my dear
By mxzrider2 on 1/9/2008 2:36:42 AM , Rating: 2
oh shut up. just because u cant handle the change to vista doesnt mean that microsoft, who is trying to make money, shouldnt make deals with other companies. i dont think its their deal anyway. nvidia is using tech designed for dx10 and dont want to bother making it for anything less.

There is very little to dislike about vista that cant be either removed or disabled. and you just a fool that doesn't know his way out of a trash bag


RE: vista my dear
By phaxmohdem on 1/9/2008 3:35:45 AM , Rating: 5
Win98 SE 4 Lyfe


RE: vista my dear
By StevoLincolnite on 1/9/2008 12:08:48 PM , Rating: 2
I have it dual-booted with Vista, and with the shiny KernelEx and Windows 98 Revolutions Pack I can play all those classic games and still have support for those XP-only programs!
(Seriously, I missed playing Stupid Invaders and Dungeon Keeper 2, which ran flawlessly on Windows 98 and would not work right on XP and Vista.)


RE: vista my dear
By tjr508 on 1/9/2008 1:20:59 PM , Rating: 2
You are completely missing the point here. MS didn't ditch the Win2k folk as soon as XP was released. It's very disappointing to know you have paid for an extremely competent and capable product only to find out that your supplier is INTENTIONALLY devaluing it.


RE: vista my dear
By ApostolicFire on 1/9/2008 6:39:59 PM , Rating: 2
tjr,
Which is exactly why Microsoft isn't releasing an SP3 for XP.... oh wait.


RE: vista my dear
By kenji4life on 1/9/2008 4:29:24 AM , Rating: 3
You know, your argument wasn't actually that horrible until you resorted to using fallacious insults in place of more sound logic.

I agree that I personally haven't had any problem with Vista that I wasn't able to solve, but there are always problems with any new product.

My personal favorite platform is Ubuntu, which conveniently updates to newest versions free of charge (with your consent of course). I would be lying if I told you that my efforts with Ubuntu have gone much further than setting up simple servers, but I never did more than this with Windows anyhow.

I have no major gripe with Vista or any of its predecessors, and it really doesn't seem like a whole lot has changed in a really useful way since Windows 98se. Of course there will always be evolutionary changes, but overall it seems that UI has gone as far as the [I]nterface will allow. Perhaps a throwback to the Nintendo power glove or voice commands. Maybe even *gulp* Johnny Mnemonic implants and VR computer interfaces...

If Nvidia were to move forward in interface that is viable and innovative, a new way to work with computers.


RE: vista my dear
By kenji4life on 1/9/2008 4:32:09 AM , Rating: 2
Err.. then they would find a new way to move the computer market.


RE: vista my dear
By tjr508 on 1/9/2008 8:43:35 AM , Rating: 1
quote:
nvidia is using tech designed for dx10


Yet another MS-manufactured reason to force an upgrade.


RE: vista my dear
By StevoLincolnite on 1/9/2008 12:11:00 PM , Rating: 2
I miss the Days of Glide... *Sigh* Nostalgia.


RE: vista my dear
By monitorjbl on 1/9/2008 9:06:18 AM , Rating: 4
I've noticed that very rarely do people who forgo punctuation, grammar, and use "u" and "ur" instead of spelling out "you" or "your" have anything of note to say.


RE: vista my dear
By myrealname on 1/9/2008 10:12:09 AM , Rating: 2
If this forum would let me vote yet, I'd rate you up a +1. It gets very old reading kiddie lingo on public forums.


RE: vista my dear
By sweetsauce on 1/9/2008 10:16:47 AM , Rating: 1
Whether you like it or not, kiddie lingo, as you put it, is here to stay. I suggest you either accept and adapt to it, or continuously bang your head against a wall every time you see it.


RE: vista my dear
By SavagePotato on 1/9/2008 10:28:27 AM , Rating: 2
It's here to be ignored. When I see someone communicate like that, I assume I am talking to either a 14-year-old or a mongoloid gorilla that communicates through a series of grunts.

Automatically I have lost interest in anything the person has to say.

Someday it's time to grow up and learn to type. It's easier and faster to use proper words than to try to come up with that crud.

The other four fingers on your hands are for typing too, not just for picking your nose while you peck out indecipherable bullshit.


RE: vista my dear
By Spuke on 1/9/2008 12:58:09 PM , Rating: 2
If the first sentence is too difficult to understand then I'll ignore the rest. But I don't automatically discard any of the posts. I might learn something. I'm not at the point in life where I'm comfortable with what I know. I always look for more. So if someone wants to use "u" and "ur" and has something interesting to say then I'll listen.


RE: vista my dear
By smaddox on 1/9/2008 4:43:45 PM , Rating: 2
Yeah, but how often does that happen?

Like the guy said, no one above the age of 16 uses "u" and "ur" except people with an IQ below 80. I find it hard to believe I could learn anything from them, other than to avoid them.

I probably sound like an elitist, and maybe I am, but if you are too lazy to type out "you", you aren't going far in life.


RE: vista my dear
By agentjka03 on 1/9/2008 6:50:38 PM , Rating: 2
wuteva, u dunno wut ur talkn about


RE: vista my dear
By VisionxOrb on 1/9/2008 11:11:29 PM , Rating: 2
I'm 26, have an IQ of 147, and use "u" and "ur" all the time. I started out on the internet in '93 on Prodigy, and typing that way was/is much faster and easier (consider it internet shorthand). Once you get into the habit of that, it's pretty hard to shake.

Hate to break this to you, but language, be it spoken or written, evolves over time (for example, the term "google" is now part of the dictionary). You may not like it, but that's how it is. Attacking someone based solely on their use of grammar and not the content of their comment doesn't make you an elitist; it makes you a horse's ass.


RE: vista my dear
By mindless1 on 1/10/2008 2:07:22 PM , Rating: 2
Hmm. No, if you type an entire set of paragraphs, avoiding two characters at most, which is less than 1% of the text typed, certainly does not make typing much faster or easier. Since you obviously can't get away with abbreviating these words in real-life use, it's doubtful it's even any faster unless you have a physical disability or are posting from a phone.

On the other hand, people who butt into a conversation just to whine about grammar ARE the horse's ass, and are far worse than the minor distraction of how someone spells a word. I'd call them genuine jackasses; in real life, stopping the person you were conversing with to whine that they had mispronounced a word would never be considered good etiquette.


RE: vista my dear
By SavagePotato on 1/10/2008 2:49:32 PM , Rating: 2
I guarantee I can type "How are you today, your abbreviations hurt my eyes" just as fast as you could type:

How r u ur abreviations hurt my eyes.

Language will not be evolving to include "u" or "ur" in the dictionary at any time, ever.


RE: vista my dear
By VisionxOrb on 1/10/2008 4:36:19 PM , Rating: 2
That's unfortunate, that reading words typed like that hurts your eyes; I guess it's possible that your brain processes words differently than most. For most people, the brain doesn't actually read every word letter by letter. For example, I can read the text on this page just as easily as if it were spelled correctly. http://joi.ito.com/archives/2003/09/14/ordering_of...

In fact, the first time I ever read one of these examples I got through half of it before realizing it was misspelled. For me, I would never even notice that someone used "u" and "ur" and other abbreviations if it wasn't for those who chime in to attack the grammar.


RE: vista my dear
By kenji4life on 1/10/2008 9:19:25 PM , Rating: 2
This is a very good point. Another example of this is reading a Japanese newspaper, where you can very quickly read an article by looking at the main characters in the text. Most Japanese people are able to read a newspaper very quickly because irrelevant "space waste" words and characters are overlooked and ignored; in the greater context, it's easy to read without them.

I <3 U

Most people understand this as an abbreviation for I love you.

Just like I could say "whatcha doin", which means the same as "What are you doing?" The difference in English speech may be that the former is not grammatically acceptable, but unless you are at a yuppie party or a job interview, the casual form is perfectly acceptable.

That being said, when trying to convey an argument using typed text, it's much easier for an opponent to 'knock you down' using an ad hominem attack.


RE: vista my dear
By Aikouka on 1/9/2008 8:26:55 AM , Rating: 3
I actually wonder if this is because of the new WDDM (Windows Display Driver Model) that was introduced in Windows Vista.


RE: vista my dear
By ET on 1/9/2008 8:35:12 AM , Rating: 3
I'd imagine that the reason it's Vista-exclusive has to do with Vista having a much more advanced core than XP. The Vista driver model probably enables this functionality, and it'd be a kludge in XP, if it's possible to implement at all.


RE: vista my dear
By Master Kenobi (blog) on 1/9/2008 9:46:28 AM , Rating: 2
But the anti-Vista crowd doesn't want to listen to such a reasonable and straightforward answer. DX10 has the ability to do it, and the driver model in Vista supports it. It's also not like Microsoft pulled DX10 out of a black project and poof, it appeared; they were working with nVidia and ATI for years on it and on what they wanted to see in DX10. This was probably on the "we want" list from both nVidia and ATI since SLI came about.


RE: vista my dear
By mindless1 on 1/10/2008 2:08:52 PM , Rating: 2
Bull, it could be done in XP with the driver, but it would raise development and support costs.


"If they're going to pirate somebody, we want it to be us rather than somebody else." -- Microsoft Business Group President Jeff Raikes











botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki