


SLI for low power instead of high performance

Laptop Logic has scored some insider information on one of the projects brewing in NVIDIA's labs. According to sources close to the company, NVIDIA is working on a new kind of SLI technology for notebook designs. Instead of pairing two identical GPUs for increased performance, NVIDIA is pairing an integrated and a discrete GPU to balance power consumption against performance.

The concept of pairing a low-power integrated GPU with a high-power discrete GPU is nothing new. Sony has already waded into these waters with its VAIO SZ lineup, which pairs Intel's GMA 950 with an NVIDIA GeForce Go 7400. That setup, however, requires an external switch to make the transition and a system reboot for it to take effect. NVIDIA's "SLI Power," on the other hand, will be handled through a combination of hardware and software and will not require a reboot.
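The difference between the two approaches boils down to when the GPU choice can change. The sketch below is purely illustrative — the class and method names are hypothetical and do not represent NVIDIA's or Sony's actual driver interfaces — but it captures the policy: pick the integrated part for light work or battery power, and hand off to the discrete part at runtime when a demanding 3D load appears.

```python
# Illustrative sketch of a runtime GPU handoff policy.
# All names here are hypothetical; no real vendor API is shown.

class HybridGraphics:
    def __init__(self):
        self.active_gpu = "integrated"   # low-power default at the desktop

    def select_gpu(self, on_battery, demanding_3d):
        """Pick the GPU that fits the current workload and power state."""
        if demanding_3d and not on_battery:
            return "discrete"
        return "integrated"

    def update(self, on_battery, demanding_3d):
        target = self.select_gpu(on_battery, demanding_3d)
        if target != self.active_gpu:
            # Rumored NVIDIA approach: switch at runtime, no reboot;
            # the driver would migrate display output between GPUs here.
            self.active_gpu = target
        return self.active_gpu

hg = HybridGraphics()
print(hg.update(on_battery=True, demanding_3d=False))   # integrated
print(hg.update(on_battery=False, demanding_3d=True))   # discrete
```

In the Sony model, the `update` step effectively cannot run while the OS is up — the choice is latched at boot by a hardware switch, which is exactly the limitation the rumored NVIDIA design removes.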

Considering the complexities of Windows XP graphics drivers and the additional complication added by Vista's new Windows Display Driver Model (WDDM), NVIDIA surely has its work cut out in making the transition a smooth one. There's no telling how much such a feature would add to the cost of a notebook, but it probably won't come cheap.






As I asked in the thread on Anandtech ...
By MercenaryForHire on 7/19/2006 3:33:15 PM , Rating: 2
... Why aren't they using pipeline gating?

The "unified shader" architecture of DX10/WDDM would seem to make this easier, as they can simply shut down unnecessary pipelines at the desktop, turn them on to assist with video, and go up to full power for 3D mode.

- M4H
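The gating scheme M4H describes amounts to scaling the number of powered shader pipes with the workload. A toy sketch of that idea — the tier names and pipe counts are assumptions for illustration, not any real driver's behavior:

```python
# Toy model of pipeline gating: power only the shader pipes the
# current workload needs. Tiers and counts are hypothetical.

WORKLOAD_PIPES = {"desktop": 2, "video": 8, "3d": 16}

def pipes_enabled(workload, total_pipes=16):
    """Return how many pipes stay powered for a given workload;
    unknown workloads conservatively keep everything on."""
    return min(WORKLOAD_PIPES.get(workload, total_pipes), total_pipes)

print(pipes_enabled("desktop"))  # 2
print(pipes_enabled("3d"))       # 16
```

As rrsurfer1 notes below, the open question is whether idle pipes can actually be fully power-gated in silicon, or whether they still leak enough to make a separate low-power chip worthwhile.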




By rrsurfer1 on 7/19/2006 3:46:11 PM , Rating: 2
I dunno. That seems like the best path. Less fooling with drivers and such. Perhaps they are using another integrated solution to cut down on power below even that of a reduced pipeline chip. It's quite possible they can't turn off power completely to the idle pipes resulting in increased power consumption even with pipes not in use.


By ogreslayer on 7/19/2006 4:45:29 PM , Rating: 2
A) DX10 does not require unified shaders... yet, the card just has to be able to accept the calls. Thus you can implement shaders however you want, as NVIDIA looks to be doing with G80.

B) If G80 is as it appears, then NVIDIA's unified architecture is not gonna be mature enough, and this becomes a great stopgap to allow for some competition with ATI's R600 derivatives.

C) Last I checked Vista is not coming out anytime soon.
DX10 is Vista-only, so looking at it from an XP point of view would be a bit more prudent.


RE: As I asked in the thread on Anandtech ...
By bunnyfubbles on 7/20/2006 1:54:11 AM , Rating: 2
because then you can't market it with an "SLI" or "dual chip" tag line ;)


By Garreye on 7/20/2006 12:51:11 PM , Rating: 1
haha good thinking


By FITCamaro on 7/20/2006 8:55:53 AM , Rating: 2
Nvidia's current cards and their first DX10 cards don't/won't have unified shaders. Their first DX10 card is going to be a hybrid design.


By Trisped on 7/21/2006 11:27:02 AM , Rating: 2
Someone already stated the PR reason (they want SLI on the box). There is also the fact that ATI has a major lead in the notebook sector. Releasing notebooks with two NVIDIA GPUs will artificially inflate NVIDIA's numbers, making it look like they are doing better than they really are.

I wouldn't worry about driver problems though. I guarantee they will have problems.












