


SLI for low power instead of high performance

Laptop Logic has scored some insider information on one of the projects brewing in NVIDIA's secret labs. According to sources close to the company, NVIDIA is working on a new kind of SLI technology for notebook designs. Instead of pairing two identical GPUs for increased performance, NVIDIA is pairing an integrated GPU with a discrete one for a balance of power and performance.

The concept of pairing a low-power integrated GPU with a high-power discrete GPU is nothing new. Sony has already waded into these waters with its VAIO SZ lineup, which features Intel GMA 950 and NVIDIA GeForce Go 7400 GPUs. That setup requires an external switch to make the transition and also requires a system reboot. NVIDIA's "SLI Power," on the other hand, will be handled through a combination of hardware and software and will not require a reboot.
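Purely as illustration, the arbitration such a scheme implies might look like the C sketch below: choose a GPU from the power source and workload, then hand off at runtime instead of at boot. Every name and function here is invented for the example; NVIDIA has not published an interface for this.

#include <stdbool.h>
#include <stdio.h>

typedef enum { GPU_INTEGRATED, GPU_DISCRETE } gpu_t;

static gpu_t active = GPU_INTEGRATED;

/* Pick a GPU based on power source and workload demand. */
static gpu_t choose_gpu(bool on_battery, bool heavy_3d)
{
    if (heavy_3d && !on_battery)
        return GPU_DISCRETE;   /* plugged in and gaming: full power */
    return GPU_INTEGRATED;     /* otherwise favor battery life      */
}

/* Hand off to the target GPU without a reboot (stubbed). */
static void switch_gpu(gpu_t target)
{
    if (target == active)
        return;
    /* A real driver would migrate the framebuffer and re-route the
       display here; this stub only records the state change. */
    active = target;
    printf("switched to %s GPU\n",
           target == GPU_DISCRETE ? "discrete" : "integrated");
}

int main(void)
{
    switch_gpu(choose_gpu(true, false));   /* on battery, desktop work */
    switch_gpu(choose_gpu(false, true));   /* plugged in, 3D game      */
    return 0;
}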

Considering the complexities involved with Windows XP graphics drivers and the additional layer of complication added by Vista's new Windows Display Driver Model (WDDM), NVIDIA surely has its work cut out for it in making the transition a smooth one. There's no telling how much such a feature would add to the cost of a notebook, but it probably won't come cheap.



Comments



As I asked in the thread on Anandtech ...
By MercenaryForHire on 7/19/2006 3:33:15 PM , Rating: 2
... Why aren't they using pipeline gating?

The "unified shader" architechture of DX10/WDDM would seem to make this easier, as they can simply shut down unnecessary pipelines at the desktop, turn them on to assist with video, and go up to full power for 3D mode.

- M4H
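As a hypothetical illustration of the gating M4H describes, a driver could scale the number of powered shader pipelines to the workload. The pipe counts and modes below are invented, and no shipping GPU exposes gating this way.

#include <stdio.h>

#define TOTAL_PIPES 24

typedef enum { MODE_DESKTOP, MODE_VIDEO, MODE_3D } workload_t;

/* Decide how many shader pipelines stay powered for each workload. */
static int pipes_for(workload_t mode)
{
    switch (mode) {
    case MODE_DESKTOP: return 2;           /* 2D desktop: bare minimum */
    case MODE_VIDEO:   return 8;           /* video decode assist      */
    case MODE_3D:      return TOTAL_PIPES; /* full power for games     */
    }
    return TOTAL_PIPES;
}

int main(void)
{
    workload_t modes[] = { MODE_DESKTOP, MODE_VIDEO, MODE_3D };
    for (int i = 0; i < 3; i++)
        printf("mode %d -> %d of %d pipes powered\n",
               (int)modes[i], pipes_for(modes[i]), TOTAL_PIPES);
    return 0;
}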




By rrsurfer1 on 7/19/2006 3:46:11 PM , Rating: 2
I dunno. That seems like the best path. Less fooling with drivers and such. Perhaps they're using another integrated solution to cut power below even that of a reduced-pipeline chip. It's quite possible they can't cut power completely to the idle pipes, resulting in increased power consumption even when the pipes aren't in use.


By ogreslayer on 7/19/2006 4:45:29 PM , Rating: 2
A) DX10 does not require unified shaders... yet; the card just has to be able to accept the calls. Thus you can implement shaders however you want, as Nvidia looks to be doing with G80.

B) If G80 is as it appears, then Nvidia's unified architecture is not gonna be mature enough, and this becomes a great stopgap to allow for some competition with ATI's R600 derivatives.

C) Last I checked Vista is not coming out anytime soon.
DX10 is Vista-only... so looking at it from an XP point of view would be a bit more prudent.


RE: As I asked in the thread on Anandtech ...
By bunnyfubbles on 7/20/2006 1:54:11 AM , Rating: 2
because then you can't market it with an "SLI" or "dual chip" tag line ;)


By Garreye on 7/20/2006 12:51:11 PM , Rating: 1
haha good thinking


By FITCamaro on 7/20/2006 8:55:53 AM , Rating: 2
Nvidia's current cards and their first DX10 cards don't/won't have unified shaders. Their first DX10 card is going to be a hybrid design.


By Trisped on 7/21/2006 11:27:02 AM , Rating: 2
Someone already stated the PR reason (they want SLI on the box). There is also the fact that ATI has a major lead in the notebook sector. By releasing notebooks with 2 NVIDIA GPUs, it will artificially inflate the NVIDIA numbers, making it look like they are better than they really are.

I wouldn't worry about driver problems though. I guarantee they will have problems.


And the benefit is????
By Dfere on 7/20/2006 8:01:08 AM , Rating: 2
What, exactly? Most notebooks are for business use and do not have high-end CPUs. What about that limitation? As one of the posts put it... "I waited for a while to LOAD games." SLI will not fix that.

Nor will it fix the shortcomings of most notebooks; there are still units shipping with XP and 256MB of RAM. I know a smart consumer will add more graphics, but a smart consumer who adds memory and SLI will be going for a top-end battle rig, not a low-power wonder.




RE: And the benefit is????
By Master Kenobi (blog) on 7/20/2006 8:18:05 AM , Rating: 2
Yeah, it's a pretty stupid idea; rather than making laptops more power efficient, these (desktop replacements) are becoming a new trend. Here's an idea: maybe I don't want to play F.E.A.R. on a 14-inch laptop LCD.

From a business standpoint, all our laptops sport 2GB of RAM and 2GB processors, but that's it: no fancy graphics, just the integrated whatever. If we need graphic design and whatnot, we deploy a tower. Seriously, when will companies learn? Laptops are to be MOBILE; desktops and workstations are for heavy gaming.

Seems to me they are looking to build these desktop replacement laptops as a jack of all trades. It's mobile, it games nicely, it's got a big screen, it weighs in at like 7 or 8 lbs... and doesn't last longer than an hour, two at best, on battery...

/rantoff


By Master Kenobi (blog) on 7/20/2006 8:18:47 AM , Rating: 2
And that should be 2 GHz processors....... I just woke up, coffee hasn't kicked in yet.


RE: And the benefit is????
By phatboye on 7/20/2006 8:42:32 AM , Rating: 2
What are you talking about? This isn't a stupid idea. Obviously they aren't targeting this technology at people like you who don't need SLI in their laptops. If you don't like or don't need SLI in your notebook, then simply don't buy a notebook with SLI in it. I don't see what the big deal is. You will have the option of not paying for a laptop with SLI in it.

I personally would never spend the money on SLI in a notebook either but there might be a few gamers who would like SLI in their notebooks so that they don't have to deal with bulky desktops and have a multipurpose notebook that they could easily bring to LAN parties.


RE: And the benefit is????
By bobsmith1492 on 7/20/2006 8:29:12 AM , Rating: 2
"And the benefit is????"

Personally, I would love it. Right now, I use my laptop for gaming while it is set up in my room. I also take it to school to use for projects and programming and whatnot. Therefore, when I bought it, I got one that wasn't very big, had a low-power processor (P-M), and had a decent video card.

Something like this would be perfect. You don't need a power-hungry processor like a P4 for gaming; a P-M is great for both the low-power end and the gaming end, while not making for a huge laptop either.

Now, pair that with a video card that sucks virtually no power, and you can have either a mobile machine or a desktop gaming system all-in-one. That sounds like a good idea to me.


RE: And the benefit is????
By fbrdphreak on 7/20/2006 9:34:33 AM , Rating: 2
The benefit is people can have high-end graphics with great battery life. You can't get that right now.

Just because SOME people don't need or want it (kinda like desktop SLI graphics) doesn't mean it's a stupid idea.


Voodoo!
By Lord Evermore on 7/19/2006 5:20:46 PM , Rating: 2
This sounds pretty much like what the old Voodoos did, sort of.

As far as price goes, it'll end up being more expensive than any board that uses just one or the other; you'll simply be paying for the ability to choose between using less power when you don't need 3D performance and using more power in order to get some 3D.

Seems like maybe they ought to just work on making their chipsets use less power no matter what you're doing, but I suppose this is a way to give people the option of the highest-performance chipset, which just can't ever actually be low-power.




RE: Voodoo!
By jazzboy on 7/20/2006 5:29:14 AM , Rating: 2
So true. If I ever build or spec a custom-built computer for someone, I will always avoid NVIDIA chipsets due to the heat/power they create, unless of course SLI is essential.


RE: Voodoo!
By abhaxus on 7/20/2006 10:23:19 PM , Rating: 2
???

The X1900 XT uses quite a bit more power than the current NVIDIA cards. Hence, you don't see any laptops with an X1900 chip, yet there are lots with the GeForce Go 7900 GTX.

sorry to burst your bubble.


By UNCjigga on 7/20/2006 11:06:10 AM , Rating: 2
None! Seriously, what does Nvidia plan to do with the "added" horsepower of a GMA950?? :)

I may be reading too much into the article, but here's how I understand it: Intel has significant market power and a lot of clout with manufacturers. They've been very successful in convincing notebook makers to include GMA950 integrated graphics and not even offer a discrete option. There are cases (think IBM T-series) where previous models came with discrete ATI or Nvidia graphics and now the only option is Intel. This trend will only get stronger once Intel starts shipping their DX10 part with the upcoming Merom platform.

I think what Nvidia is doing is creating a new "SLI"-type technology where a GeForce Go part can work in tandem with GMA X3000 and offload certain processing tasks. I wouldn't expect this technology to be ready before Merom/X3000 ship, and it might be Vista-only.




By rrsurfer1 on 7/20/2006 12:35:58 PM , Rating: 2
Not at all. What they said they're doing is using a low-power integrated solution along with a high-power gaming chip. If you want long battery life, you "enable" the lower-power chip; if you want great performance, you use the gaming chip. It's really not SLI in the traditional sense, and the name doesn't fit the marketing well, since you're not using both GPUs at once.


By Garreye on 7/20/2006 1:05:51 PM , Rating: 2
I totally agree with this. I'm sure they'll come up with another name for this rather than SLI, and it will probably play up the "there's TWO GPUs" angle. After a while they might even pair this technology with regular SLI or quad SLI for laptops, so you can turn on the SLI GPUs to play games at high quality when needed, or otherwise use the integrated graphics processor for long battery life. After that you'll need another GPU for physics in your laptop too, so everyone start saving your money...


Now, this folks...
By killerroach on 7/19/2006 3:33:42 PM , Rating: 3
...is what we've all been wanting to see. The best of both worlds: the power efficiency of the integrated Intel chipsets and the performance of a discrete NVIDIA GPU. It probably won't come cheap (at first), but this may be the happy medium that people have wanted for quite some time.




RE: Now, this folks...
By shabodah on 7/19/2006 5:00:13 PM , Rating: 2
Intel's integrated GPUs are VERY comparable to NVIDIA's. The NVIDIA 6100/410 chipset uses less power than MOST Intel northbridge/southbridge combos do (yeah, most of that is due to not having to deal with system memory much).


Another "SLI"
By Thmstec on 7/19/2006 4:33:56 PM , Rating: 2
Anyone else getting the feeling that NVIDIA just wants to stick "SLI" on everything they do? SLI memory was starting to push it, but now this!




RE: Another "SLI"
By phatboye on 7/19/2006 6:02:03 PM , Rating: 2
DUH. Of course NVIDIA wants to find new ways of incorporating SLI; they are a business, and their primary goal is to sell as much equipment as possible. I for one don't mind this move by NVIDIA if 1) it is actually beneficial to the consumer and 2) it has a reasonable cost. From what I see in this article, 1) seems to be satisfied. We will have to wait and see if 2) will ever be satisfied.


We need smarter CPU's and GPU's.
By akugami on 7/20/2006 12:46:03 PM , Rating: 2
What they need is to make better CPUs and GPUs designed to ramp up in power when needed, but with ultra-low power draw when idle or lightly used. For instance, if it's sitting idle for 5 minutes while I go get a coffee or use the restroom, it should drop down to a near-hibernation state in terms of power draw. When I'm browsing the web, it should be smart enough to ramp up the power draw, but not to full power. When I'm rendering 3D models, playing a game, or doing something else intensive, then ramp it up to full power.

I see CPUs moving in this direction, but GPUs seem to be getting more and more power hungry. I don't believe we'll see truly power-conscious GPUs until after the R600 and G80 cores; presumably the R700 and G90, if NVIDIA and ATI keep the same naming conventions.
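The load-proportional behavior described above resembles a CPU frequency governor. A rough sketch in C, with the thresholds and power states invented purely for illustration:

#include <stdio.h>

typedef enum { P_SLEEP, P_LOW, P_MID, P_FULL } pstate_t;

/* Map recent utilization (0-100%) and idle time to a power state. */
static pstate_t pick_state(int util_pct, int idle_minutes)
{
    if (idle_minutes >= 5) return P_SLEEP; /* near-hibernation when idle */
    if (util_pct < 20)     return P_LOW;   /* web browsing and the like  */
    if (util_pct < 70)     return P_MID;
    return P_FULL;                         /* gaming or 3D rendering     */
}

int main(void)
{
    printf("%d\n", (int)pick_state(5, 10)); /* long idle  -> 0 (P_SLEEP) */
    printf("%d\n", (int)pick_state(15, 0)); /* light load -> 1 (P_LOW)   */
    printf("%d\n", (int)pick_state(95, 0)); /* heavy load -> 3 (P_FULL)  */
    return 0;
}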




By Garreye on 7/20/2006 1:14:48 PM , Rating: 2
Yeah, this would be a much better idea... the GPU sector definitely seems to be moving away from it, especially when you hear about how DX10 GPUs will consume up to 300W...

http://www.anandtech.com/tradeshows/showdoc.aspx?i...


Great news.
By kuyaglen on 7/19/2006 3:50:40 PM , Rating: 2
I've owned a Toshiba sub-notebook with the Transmeta Crusoe CPU, and except for its computational performance, I loved it. Loading up games and playing them was an exercise in patience, though. I'm sure this will be the way to go in my case.




"If they're going to pirate somebody, we want it to be us rather than somebody else." -- Microsoft Business Group President Jeff Raikes










