

NVIDIA may have proclaimed its love of Apple, but Cupertino is rumored to be preparing to dump it.  (Source: Engadget)

Apple is rumored to be reuniting with Intel for its latest graphics hookup.  (Source: Engadget)
And to think that just a few months ago NVIDIA's CEO said he loved Apple computers...

The relationship between Apple's notebook division and Intel's graphics division is looking more and more like a bad soap opera.  Many following the story will recall Apple's 2008 decision to dump Intel's integrated graphics processors in favor of the more attractive NVIDIA graphics.  Now the tables have turned, with sources indicating that Apple notebooks and NVIDIA are “on a break”.  

Reportedly, Apple will be saddling back up with Intel iGPUs for the next generation of MacBooks, and it will also be getting some action on the side from AMD for its pricier MacBook Pros.  That must be pretty painful for NVIDIA, considering its CEO recently proclaimed his love of Apple.

The new MacBook models will reportedly feature Intel's Sandy Bridge, the company's first notebook-aimed system-on-a-chip, which features an iGPU clocked between 650 MHz and 850 MHz, with higher clock speeds available via Turbo Boost.  The MacBook Pros will reportedly get one of the Radeon HD 63xx/65xx discrete GPUs that launched late last month.

One key reason why Apple may be kicking NVIDIA to the curb is Intel's promise to change.  More precisely, Intel has pledged to push OpenCL -- a GPU computing language -- out for Sandy Bridge in the near future.  Apple's Snow Leopard's performance is boosted by OpenCL, so many had thought NVIDIA -- long the only producer of OpenCL products -- was a lock for future Mac notebooks.
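
For readers who haven't run into OpenCL before, the sketch below shows roughly what "a GPU computing language" means in practice: a small C-like kernel that the driver compiles for whatever device is present, plus host code that copies data over and launches it. This is a generic vector-add illustration written against the standard OpenCL 1.x C API, not anything taken from Apple's or Intel's code; the GPU-then-CPU device fallback and the omission of error checking are simplifications for brevity.

    /* Minimal OpenCL "vector add" sketch (illustrative only, error checks omitted). */
    #include <stdio.h>
    #ifdef __APPLE__
    #include <OpenCL/opencl.h>   /* header location on Mac OS X */
    #else
    #include <CL/cl.h>
    #endif

    /* The kernel: C-like code that runs once per element, in parallel, on the device. */
    static const char *src =
        "__kernel void vadd(__global const float *a, __global const float *b,"
        "                   __global float *c) {"
        "    size_t i = get_global_id(0);"
        "    c[i] = a[i] + b[i];"
        "}";

    int main(void) {
        enum { N = 1024 };
        float a[N], b[N], c[N];
        for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

        cl_platform_id plat; cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        /* Prefer a GPU; fall back to the CPU if no OpenCL-capable GPU is found. */
        if (clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL) != CL_SUCCESS)
            clGetDeviceIDs(plat, CL_DEVICE_TYPE_CPU, 1, &dev, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        /* Compile the kernel source for this device and create buffers on it. */
        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "vadd", NULL);

        cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
        cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
        cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);

        clSetKernelArg(k, 0, sizeof da, &da);
        clSetKernelArg(k, 1, sizeof db, &db);
        clSetKernelArg(k, 2, sizeof dc, &dc);

        /* Launch N work-items, then read the result back to host memory. */
        size_t global = N;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

        printf("c[42] = %.1f\n", c[42]);   /* expect 126.0 */

        clReleaseMemObject(da); clReleaseMemObject(db); clReleaseMemObject(dc);
        clReleaseKernel(k); clReleaseProgram(prog);
        clReleaseCommandQueue(q); clReleaseContext(ctx);
        return 0;
    }

Because the same kernel source is recompiled at runtime for whichever driver is installed, OpenCL applications care less about which vendor's silicon is underneath than about whether the driver exposes OpenCL at all -- which is why Intel's pledge matters.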

Another reason may be economics.  
AnandTech chief Anand Shimpi is quoted in CNET as stating, "I'd say...we can expect (about) 2x the performance of [Sandy Bridge's] graphics. At that level of performance, I don't see a need for discrete [standalone Nvidia or Advanced Micro Devices] graphics at the very low end."

NVIDIA presumably will still have a place in Apple's desktops, though.

If true, the transition would mark the latest chapter in Intel's long and volatile history with Apple in which rival suitors oft played a part.  While Apple long tried to resist Intel's CPUs, it found itself irresistibly attracted to the company's superior performance and the pair hooked up for the first time back in 2005.  Now with Intel reportedly preparing to give Apple love on both the CPU and graphics front, the pair look to be more committed than ever before.



Comments



smells political.
By Smilin on 12/9/2010 3:41:59 PM , Rating: 4
Something about this smells political rather than technical. Some sales team made a special deal, or some execs set up a partnership. Something like that.

Having "twice as fast" integrated graphics still won't compare to discrete.

Plus we all know intel graphics are crap. Mind you they've put a lot of candy sprinkles on their crap recently but it's still crap.




RE: smells political.
By borismkv on 12/9/2010 3:45:50 PM , Rating: 5
"Apples are the best graphics processing systems!"

Yeah. Not after they get Intel GPUs they won't be. Unless Intel manages to gobble up nVidia, it's always going to be dead last when it comes to graphical performance. But then, nobody buys Apple products for their performance capabilities.


RE: smells political.
By Da W on 12/9/2010 3:56:54 PM , Rating: 4
The Sandy Bridge GPU looks to perform like a Radeon 5450, enough for a 13-inch MacBook with few games available.
They will have discrete AMD graphics for their 15-17 inch models. It's not unusual for Apple to switch between AMD and Nvidia.
Can we see a Bulldozer MacBook in 2012???


RE: smells political.
By Samus on 12/9/2010 4:13:09 PM , Rating: 3
If nVidia is screwing Apple like they screwed HP with 230,000+ defective chipsets, firing off a recall spanning many model years, costing hundreds of millions for both parties, and damaging HP's reputation, then it's no surprise Apple is ditching them.

I can't tell you how many nForce 6150 series motherboards I've come across dead because of cooked chipsets, in Acers, HPs, eMachines, and even Dells.


RE: smells political.
By FXi on 12/10/2010 9:32:48 AM , Rating: 2
I'd bet this is 3 things:

1) Fallout from the defective chips, which I'm betting had to be covered mostly by the PC manufacturers. I'd bet Nvidia only gave the companies pennies on the dollar for defective repair/replacement costs.

2) Price. Nvidia has always put itself in the premium price position. Intel's built-in performance is now good enough to get rid of that cost in some machines, and AMD's pricing is vastly better, so AMD is probably catching some of the discrete market.

3) Attitude. You don't need much explanation here. Nvidia's holier-than-thou way of dealing with end users and customers has long been a thorn to all said parties.


RE: smells political.
By silverblue on 12/10/2010 1:47:33 PM , Rating: 2
I've always lamented how expensive nVidia cards are. If you don't want 3DVision, CUDA or PhysX, you're paying more to play games, though one might argue that these technologies are gaining traction and providing added value that AMD cards lack (with the exception of Eyefinity, of course).

I still don't think PhysX and 3DVision are helpful for low-end nVidia GPUs in gaming terms, but what do I know? :)


RE: smells political.
By mindless1 on 12/11/2010 6:11:23 PM , Rating: 3
Remember two things:

1) nVidia chips were designed with materials that could not withstand excessive operating temperatures. All I mean by "excessive" is hotter than they needed to run; a prudent cooling subsystem design would not have allowed them to heat up as much.

2) Laptops were designed with inadequate heatsinking on the GPU, and fan RPM tied only to CPU temperature instead of whichever of the two chips was hotter at any given moment.

If it were not for #2, #1 would not matter. #2 was inevitable as we keep shrinking process size and packing more computational ability into GPUs, unless laptop manufacturers quit trying to barely get by with marginal cooling.

nVidia paid a fair amount of money and companies chose which models/submodels would be covered. They (system manufacturers) essentially mitigated losses by leaving lots of customers with no recourse because they had bought a different model of laptop/etc.


RE: smells political.
By semo on 12/10/2010 1:34:07 PM , Rating: 2
I came across 3 HP laptops with one 6 and two 7 series nvidia GPUs. I was able to confirm that video was the problem in all 3 cases.

These companies managed to get away too easy from that mess.


RE: smells political.
By MonkeyPaw on 12/9/2010 5:23:58 PM , Rating: 2
quote:
The Sandy Bridge GPU looks to perform like a Radeon 5450, enough for a 13-inch MacBook with few games available

The problem with Intel has always been the drivers. Judging by the frequent updates from nVidia and AMD, drivers are not an easy thing to do. Even with such regular updates, there are still issues. Intel seemingly never releases new drivers, but that's probably because it just doesn't matter. I just have my doubts that Intel can deliver on the software side.


RE: smells political.
By SPOOFE on 12/9/10, Rating: 0
RE: smells political.
By Wierdo on 12/9/2010 8:51:13 PM , Rating: 5
The only way Intel's cards are more stable is if you're not actually running stuff on them lol.


RE: smells political.
By Wierdo on 12/9/2010 8:52:07 PM , Rating: 2
edit: meant IGPs.


RE: smells political.
By VooDooAddict on 12/12/2010 12:36:08 PM , Rating: 2
"Sandy bridge gpu looks to perform like a radeon 5450"

Let me fix that for you so it works for the last decade:

Intel's up-and-coming GPU looks to perform like the upper range of the current low-end GPUs from Nvidia or ATI/AMD. In actuality it comes out performing like a generation or two behind the low-end cards from Nvidia or ATI/AMD. Then Intel promises that software updates will improve performance over the life of the chipset/GPU.

Intel needs to put out a serious gpu before I start considering them really viable. That, or get on with it and merge/buy/license nvidia.

Will it be good enough for most Mac users? Probably. But that doesn't mean it's a good gpu.


RE: smells political.
By Smilin on 12/9/2010 4:52:11 PM , Rating: 2
Apple's reputation is pretty well deserved. While they like to falsely claim they are God's gift to graphics, the fact remains they are pretty good. So good, in fact, that it makes me wonder if they know something we don't.

Did Intel manage to not make something that sucks this time?

I guess my faith in Intel making crap graphics is stronger than my faith in Apple making good hardware decisions. I had a doubt for a moment though.


RE: smells political.
By rudy on 12/12/2010 4:03:26 PM , Rating: 2
What are you talking about? Apple has been running garbage GPUs for the vast majority of its existence. Their very recent switch to nVidia was actually a glimmer of hope. The reputation exists only because Adobe originally made its products for Apple, and back then Adobe did not bother to use GPU acceleration. Therefore the GPU was of little use in Apples, which had few games anyway, and of little to no use in PCs running Adobe CS without graphics acceleration.

The bigger deal here is that Apple knows its customers have no clue, and the ones who do have a clue can be forced to buy Pros anyway.


RE: smells political.
By Smilin on 12/13/2010 9:58:58 AM , Rating: 2
I can't believe you're making me defend apple. You suck.

You can't find a bad graphics card on any MBP ever made:
http://en.wikipedia.org/wiki/MacBook_Pro

The only crappy graphics on the iMac is the 17" model from 2006.
http://en.wikipedia.org/wiki/IMac_(Intel-based)#Un...

Now Apple would have you believe that they have the greatest graphics known to man. This is BS. But they certainly don't suck. I'll rip them up on a thousand other topics, but if I can't admit when they have a strength then I'm nothing more than a fanboy/hater.


RE: smells political.
By rudy on 12/12/2010 4:00:06 PM , Rating: 2
They had Intel integrated GPUs for most of the time people were saying that. Basically, people do not know what they are talking about.


RE: smells political.
By Breathless on 12/9/2010 3:46:23 PM , Rating: 3
but... it has candy sprinkles - right?


RE: smells political.
By stimudent on 12/9/2010 5:09:35 PM , Rating: 2
Since Intel is involved, it's very probable that some form of ethical misconduct is going on here. Apple is probably getting a huge bonus of some kind similar to what Packard Dell was getting.


RE: smells political.
By Exodite on 12/9/2010 5:20:35 PM , Rating: 3
Considering the kind of discrete graphics Apple usually graces its products with, it's not unreasonable to expect next-gen integrated graphics to compare favorably.

Don't get me wrong, Apple makes solid products but their top-tier PCs use mainstream hardware.

The 2010 MBP 13" uses a C2D and an integrated Nvidia 320M, it shouldn't be hard for Sandy Bridge chips to be competitive with that.


RE: smells political.
By chick0n on 12/10/2010 12:37:11 AM , Rating: 2
To most "Apple smart users", as long as Steve Jobs said "Intel's graphic are revolutionary & magical ...", it does not matter if Nvidia or anything else performs 100x better with proofs and everything, these Apple Zealots will still be able to find 1000 stupid ass excuses to counter you or simply say "you just don't know shit" hahaha


RE: smells political.
By SPOOFE on 12/10/2010 5:55:56 AM , Rating: 3
The fact of the matter is most people just plain don't have any use for anything other than a very basic graphics processor. Gamers know who they are and know what they want; the people that don't care don't have any reason to care.


RE: smells political.
By chick0n on 12/10/2010 9:00:54 AM , Rating: 3
It might be true on the "Windows/PC" side,

but not on the Apple side.

I know how Apple zealots think because I'm pretty much surrounded by a load of them. I always hear stuff like "Apple is the best", and when I ask them why, the most common answer I get is "Well, it's from Apple, Steve Jobs designed it, so it's the best".


RE: smells political.
By michael67 on 12/10/2010 7:50:48 AM , Rating: 2
quote:
Something about this smells political rather than technical. Some sales team made a special deal, or some execs set up a partnership. Something like that.

Nah, Apple was done with nVidia about 1.5~2 years ago; it has only taken this long because the development cycle is about 2 years.

After the bump-gate(*) fiasco Steve was not happy, especially because nVidia blamed everyone except the face in the mirror.
And guess what, Steve is a vindictive person, and he did blame NV for a lot of the problems with broken deadlines and RMAs that Apple had to pay for.

nVidia is just too arrogant for its own good, and keeps on burning bridges it still needs.

Their dear leader thinks that he is the center of the tech industry; guess what, most of the tech industry disagrees with him ;-)

Look at Dell's lineup, it's mainly ATi: http://www.dell.com/us/p/desktops#facets=52538~0~8...
And it's not much different with Acer: http://us.acer.com/acer/product_detail.do?slot30e_...

(*) I myself am also a victim of bump-gate, as my 2000 euro SLI laptop died on me because of it, and it was just outside the 2-year warranty.
Toshiba did not want to RMA it, even though it was a well-documented production fault; only after an e-mail from my sister (a lawyer) did they accept the laptop for repair.
So yes, I am also one of the people that does not like nVidia's business model, and they will be my second pick from now on if I have to get a graphics product. Not that I would refuse to buy from them if they have the better product, they're just not my first pick anymore ;-)


RE: smells political.
By bfellow on 12/13/2010 11:54:24 AM , Rating: 2
Stop whining! You bought an Apple!


RE: smells political.
By BSMonitor on 12/11/2010 3:57:23 AM , Rating: 1
Wrong. Try reading Anand's article on Sandy Bridge. The new MacBooks will be getting better graphics than the discrete mobile GPUs currently available in MacBooks today. And on top of that, all on one piece of CPU/GPU silicon.

The power/space/heat savings alone make the switch the perfect fit for MacBook Pros. Not to mention that 32nm Sandy Bridge draws less power, so more battery life.

Crap is your ignorance.


RE: smells political.
By Smilin on 12/13/2010 1:09:54 PM , Rating: 2
Ah well. When you don't have an argument... insult!

Per AnandTech, Sandy Bridge will run on par with a $50 Radeon 5450, which puts its performance BELOW the current nVidia 330M.


RE: smells political.
By AstroGuardian on 12/11/2010 5:32:08 PM , Rating: 3
You say Intel GPUs are crap? So is Apple.
What's the big deal then?


RE: smells political.
By maroon1 on 12/13/2010 10:37:19 AM , Rating: 3
quote:
Plus we all know intel graphics are crap.


Really? I don't know that.

What I know is that Sandy Bridge is way faster than any of the current IGPs, and early reviews show that it performs on par with the HD 5450 (please read the AnandTech review).

I think the Sandy Bridge GPU offers more power than most MacBook users need.

You and many others say that Intel graphics are crap just upon hearing the words "Intel graphics". Please look at the actual facts before you spread lies, and mislead other people.


RE: smells political.
By Smilin on 12/13/2010 2:59:58 PM , Rating: 2
quote:
Please look at the actual facts before you spread lies, and mislead other people.


Look who's talking.

*IF* Sandy Bridge pulls off performance on par with the 5450, then that gives it the awe-inspiring performance of a $50 graphics card. That is supposed to impress me?

Benches comparing the 5450 and the current MBP's 330M indicate that the Intel solution (which releases in the future) would be slower than the current solution today.

So like I said in the first place: we all know intel graphics are crap.

(it's not my fault that YOU don't know this)


Judging from all the comments
By amanojaku on 12/9/2010 5:45:18 PM , Rating: 4
It looks like no one has considered Apple is executing a shrewd strategy. The integrated graphics in Sandy Bridge has been shown to be good enough for certain games. I don't think it would be able to fully replace even a Radeon 5450 without modified game code (I could be wrong), but it will be good enough to compete based on price.

You see, the integrated graphics will likely make the CPUs slightly more expensive, but the overall cost of the motherboard without a GPU will mean cheaper manufacturing costs. No socket, no fan, no circuitry, etc... Apple will pocket the savings, because a cheap GPU doesn't make a lot of profit, but it still costs the same to stick it on the motherboard.

If a user hates the integrated graphics (and what gamer wouldn't?) the Radeon series will fit the bill. That means buying the more expensive notebook.

Either way, Apple makes more money.




RE: Judging from all the comments
By StevoLincolnite on 12/9/2010 9:53:27 PM , Rating: 3
PCs are currently several generations ahead of the consoles, which seem to be holding us back a little graphically in games.
These IGPs are still IGPs; other than casual games, don't expect to run Crysis, 3DMark 11 and such with any semblance of speed.

If you buy a Mac for gaming, you bought the wrong platform.

Personally I will stick to PC, where I, the consumer, not the manufacturer choose the hardware.


RE: Judging from all the comments
By Pirks on 12/10/10, Rating: 0
RE: Judging from all the comments
By Maximalist on 12/10/2010 3:13:54 AM , Rating: 1
I am sure that the poster meant desktop PCs. Having said that, I have replaced several components in PC notebooks over the years, most notably in business-grade notebooks such as the Dell Latitude D- and E-series, and most recently the HP EliteBook 8540w. In the HP unit, the graphics card is modular (so are the CPU, memory, optical drive, wireless, and battery). I have just upgraded from an nVidia Quadro FX 880m to an 1800m. For someone who posts so frequently and with such strong opinions, it seems that you need to educate yourself a bit on this subject.


RE: Judging from all the comments
By jimbo2779 on 12/10/2010 5:02:15 AM , Rating: 2
Wow Pirks, you really love to be shown to be a totally ignorant and arrogant piece of ***. Why do you froth at the mouth at anything Apple? You have clearly been shown to be an absolute idiot yet again.


RE: Judging from all the comments
By wordsworm on 12/10/2010 8:17:50 AM , Rating: 2
Actually, I found the opinion stated before Pirks' reply to be pretty lame. And Pirks is right: you are very limited when it comes to purchasing a laptop. It's too bad that hasn't changed yet...

Pirks is a Mac enthusiast who has lived for too long amongst the anti-Mac crowd.


RE: Judging from all the comments
By Pirks on 12/10/2010 11:52:45 AM , Rating: 2
quote:
Pirks is a Mac enthusiast
Liar! I don't own ANY Apple products!


RE: Judging from all the comments
By Maximalist on 12/10/2010 5:58:40 PM , Rating: 2
If you mean the post by StevoLincolnite, then I find all his/her points to be reasonable and correct. The poster referred to PC as a platform, not limited to notebooks. However, in his reply Pirks attempted to ridicule nearly all points to make the poster appear unintelligent.

This includes Pirks' assertion that most gamers do not care about a possible penalty that modern PC titles may incur at the expense of much older generation graphics still used in consoles. The original point actually makes sense because it may be more attractive (cheaper, quicker, easier) to develop titles with eye candy and performance limited to the common denominator, which currently happens to be the console graphics. And to assert that consumers do not care for visual enhancements and performance in gaming is ludicrous.

The original poster mentioned lack of (relative) speed *running* Crysis, 3DMark, etc., on IGPs as opposed to discrete GPUs. Again, this is true, but Pirks clearly bent the meaning implying that the poster plays benchmarks.

Finally, there is a seemingly intentional twist by Pirks from the PC platform in general to notebooks in particular. It is well known that notebook computers are designed to be portable and mobile by reducing size and weight. This main requirement historically limits notebook hardware choices and upgradeability, which are commonly accepted trade-offs. Users are not expected (or supposed) to change major notebook components such as the enclosure, processor, and video card.

However, even with these inherently universal notebook limitations, the PC platform gives more choices to the end-user. As I indicated in my previous post, many PC notebooks can be relatively easily upgraded with a different CPU, memory, storage, wireless, battery, optical drive, and even GPU. I personally have upgraded PC notebook LCDs several times for higher resolution.

I do not see where StevoLincolnite went lame in his/her post.


RE: Judging from all the comments
By Iketh on 12/11/2010 2:47:15 PM , Rating: 2
ughhh the troll ownz you...


RE: Judging from all the comments
By StevoLincolnite on 12/10/2010 8:21:50 AM , Rating: 2
Not your most intellectual post ever Pirks.
Then again I can't expect miracles from someone who would rather sleep with a Mac than their actual partner.


RE: Judging from all the comments
By Pirks on 12/10/2010 12:07:24 PM , Rating: 2
Out of arguments, just as I expected. wordsworm in his post above was 100% correct - you did a lame post, I laughed at it and now you're pouting, poor baby :)))


Snow Leopard uses OpenCL???
By name99 on 12/9/2010 9:12:04 PM , Rating: 2
"Apple's Snow Leopard's performance is boosted by OpenCL"

This is an interesting claim. Please provide a single example of an operation in standard Snow Leopard that is boosted by OpenCL.

There are many indications that Apple cares about OpenCL for the future, and for all I know Final Cut and Aperture use it today. Mathematica has some very experimental support.
I am, however, unaware of anything in the base OS or in any consumer app (outside maybe games) that uses it today.




RE: Snow Leopard uses OpenCL???
By amanojaku on 12/9/2010 9:57:03 PM , Rating: 2
Of course Apple cares about OpenCL: Apple created it. It wants EVERYONE to use OpenCL. Snow Leopard has OpenCL built in, and OpenCL uses the GPU for computations (GPGPU). Thus, the graphics performance won't be any better, but certain computational tasks will be.

Think CUDA and FireStream.

http://www.apple.com/macosx/technology/#opencl


RE: Snow Leopard uses OpenCL???
By name99 on 12/9/2010 10:32:53 PM , Rating: 2
NONE of which answers my question about the claim that Snow Leopard USES OpenCL (as opposed to providing the technology for developers).

Do you not understand English?


RE: Snow Leopard uses OpenCL???
By amanojaku on 12/9/2010 11:33:42 PM , Rating: 1
If you wanted a better explanation you didn't have to be a dick about it. I already gave you what you needed to learn more about OpenCL and how it's being used, so fuck off. It sounds like you're too lazy and stupid to understand it, anyway.


RE: Snow Leopard uses OpenCL???
By Pirks on 12/10/2010 12:01:13 AM , Rating: 1
Chill dude, name99 is 100% right so stop picking on him. There is ZERO trace of ACTUAL USE of OpenCL in OS X 10.6 so whatever dumb bullshit Mick posts here about "Snow Leopard's performance is boosted by OpenBlahBlah" is just marketing paid by Jobs or something. Pass along people, just another usual Mick's dumbness, nothing more. Guy does not know what he writes about, hehe ;)


RE: Snow Leopard uses OpenCL???
By silverblue on 12/10/2010 8:03:22 AM , Rating: 2
"Snow Leopard delivers unrivaled support for multi-core processors with a new technology code-named “Grand Central,” making it easy for developers to create programs that take full advantage of the power of multi-core Macs. Snow Leopard further extends support for modern hardware with Open Computing Language (OpenCL), which lets any application tap into the vast gigaflops of GPU computing power previously available only to graphics applications. OpenCL is based on the C programming language and has been proposed as an open standard. Furthering OS X’s lead in 64-bit technology, Snow Leopard raises the software limit on system memory up to a theoretical 16TB of RAM."

Taken from http://www.apple.com/pr/library/2008/06/09snowleop...

I have to say it's somewhat ironic that Apple pioneered a language containing the word "Open".


RE: Snow Leopard uses OpenCL???
By Pirks on 12/10/2010 12:49:53 PM , Rating: 2
quote:
which lets any application tap into the vast gigaflops of GPU computing power
Where are those applications inside OS X 10.6? What? None? There you go, thanks again for proving my point, much appreciated.


RE: Snow Leopard uses OpenCL???
By silverblue on 12/10/2010 1:37:51 PM , Rating: 2
I agree, it would've made more sense for Apple to properly push this standard by including some apps coded for it, and I expect we're more likely to see this with Lion. However, whilst we're not having OpenCL software come out of our ears, AMD and nVidia are still playing around with it so they're at least serious about it, and Intel say Sandy Bridge will support OpenCL. In any case, a new standard takes time to gain traction, but again, Apple should've done more to promote it if it's that good.

By having future systems with Sandy Bridge CPUs, even without discrete graphics you'll have a situation where each model shipped will inherently support OpenCL instead of just ones with specific nVidia GPUs. If Apple can push this standard then they're going to gain a lot of support courtesy of having a common standard. How far this goes remains to be seen.

I do support what you're saying, though my reply was more to clarify what Apple's view of OpenCL was as opposed to saying you were wrong.

I'm more concerned with this statement:

"One key reason why Apple may be kicking NVIDIA to the curb is Intel's promise to change. More precisely, Intel has pledged to push OpenCL -- a GPU computing language -- out for Sandy Bridge in the near future. Apple's Snow Leopard's performance is boosted by OpenCL, so many had thought NVIDIA -- long the only producer of OpenCL products -- was a lock for future Mac notebooks."

If this was the case, then ATi/AMD cards wouldn't be on the supported GPUs list.


RE: Snow Leopard uses OpenCL???
By name99 on 12/10/2010 4:22:41 PM , Rating: 2
"If this was the case, then ATi/AMD cards wouldn't be on the supported GPUs list."

Oh for god's sake. ATI can provide OpenCL support for their chips if they want, just like they provide OpenGL support. Or they can not. It's their choice. But it's not like Apple controls OpenCL and won't let other people play. And the fact that an Intel integrated GPU supports OpenCL doesn't mean that a vastly superior discrete GPU isn't a better choice in every situation (which means basically everything but the MacBook Air) that allows it.

And what exactly would be the win for Apple in discouraging ATI from making better products for Mac? I honestly don't understand the minds of some of our more paranoid web citizens.

The future is that separate GPUs are dead --- like the memory controller, they WILL move on-die. If nVidia refuses to face that fact and make a deal with Intel to be bought soon, they will be irrelevant in ten years. Ask Weitek how great it feels to ignore an obvious trend in engineering. Like all transitions, there will be a period of confusion and tumult, but the advantages of being on-die and tightly coupled to the CPU are just too large to ignore.


RE: Snow Leopard uses OpenCL???
By Alexstarfire on 12/10/2010 7:27:58 PM , Rating: 2
Separate GPUs will never die. They might disappear in the mobile sector, but desktops will ALWAYS have a separate GPU option. There is simply no reason to not have one on a desktop.


RE: Snow Leopard uses OpenCL???
By Iketh on 12/11/2010 2:54:44 PM , Rating: 2
lol what? I have needs for low-power desktops...


Discrete graphics
By CZroe on 12/9/2010 4:09:56 PM , Rating: 4
Discrete graphics on even the lowest-end MacBook Air is the only reason I've even been interested in an Apple notebook. They can kiss my interest goodbye.




RE: Discrete graphics
By nafhan on 12/9/2010 4:28:23 PM , Rating: 3
The MBA GPU is an integrated part (part of the north bridge, I think). The Sandy Bridge GPU will probably be slightly faster than the Geforce 320M in the MBA.


RE: Discrete graphics
By CZroe on 12/9/2010 5:57:17 PM , Rating: 2
You're right. I was under the impression that there was an Optimus-like implementation specific to Apple products. Even so, it is much closer in performance to budget discrete parts than to Intel's current solutions.


RE: Discrete graphics
By Exodite on 12/9/2010 5:23:29 PM , Rating: 2
When integrated graphics are good enough to do hardware-decoded HD video and everything up to mainstream gaming, what's the real need for discrete graphics on the thin/light/low end of things?

I mean it's not like the 13" MBP is a viable AAA title gaming platform anyway.


RE: Discrete graphics
By CZroe on 12/9/2010 5:54:54 PM , Rating: 2
The 1st-gen Alienware M11x isn't "a viable AAA title gaming platform" either, but I got one and I play games on it. It was the only game-capable notebook that could fit in my motorcycle tank bag. I would expect the next-gen nVidia integrated graphics to close the gap between 220/320 and 335 graphics capabilities and I would then consider an 11.6" MBA+BootCamp Win7 instead.


By jbwhite99 on 12/9/2010 6:48:37 PM , Rating: 2
Please tell me which nVIDIA chipset supports the new Core i-series processors. There is none - Intel will not allow nVIDIA to develop a chipset that supports the Core i3/i5/i7. The lawyers at both companies are frothing at the mouth on this one - but as of now, nVIDIA doesn't have a single-chip solution that will cover both video and chipset.

Apple has a few choices: stick with n-2 technology and stay with old Core chips, go with Intel integrated graphics on new Core i3 processors, or go with an Intel integrated chipset and add AMD/nVIDIA graphics. The problem with the last choice is that if you do that, you have to give up the super-thin notebooks.




By Maximalist on 12/10/2010 3:37:31 AM , Rating: 1
Nice analysis. Recently I heard that Intel and nVidia might be reconciling on this licensing issue.


By Smilin on 12/13/2010 1:12:22 PM , Rating: 2
Why would you have to give up the super thin notebooks just to have discrete graphics?


Should run Adobe Flash 2 just fine......huh?
By kilkennycat on 12/9/2010 4:02:14 PM , Rating: 2
Seems as if Steve Jobs' hatred for Adobe will have reached self-immolation level if this rumour is true. So we are going to have crippled (Intel) integrated GPUs in Mac laptops just in time for the latest version of Adobe Flash, which uses all the available power in the GPU? I expect to soon see pictures of SJ with his nose cut off and the bloody knife in his hand.




RE: Should run Adobe Flash 2 just fine......huh?
By Tony Swash on 12/10/10, Rating: 0
By Luticus on 12/10/2010 9:25:41 AM , Rating: 2
Progress is not killing Flash; Steve Jobs is attempting to kill Flash artificially by disallowing it on his platforms.

Personally I still run Flash on all of my devices (including my phone). Granted, I block Flash unless I'm trying to look at content that uses it, but I enjoy my option to run it.

I don't buy Apple products because I don't like having my decisions made for me.


Proly
By burnstagger on 12/10/2010 12:04:04 PM , Rating: 2
those greedy Chinese engineers at NVIDIA were stealing Apple's IP. It happens everywhere. Racist bastards.




RE: Proly
By Maximalist on 12/10/2010 10:22:28 PM , Rating: 2
No, it's because Humpty Dumpty double-crossed Alice in Wonderland.


By CrazyBernie on 12/9/2010 3:43:27 PM , Rating: 2
If you're easy, your partner will get bored and leave you.




Dude... Really?
By Smartless on 12/9/2010 3:59:49 PM , Rating: 2
quote:
...with Intel iGPUs...


Sorry that ruffled the Apple-hating feathers on my back for a bit. And man this article reads like it came off MSN's wonderwall. I know that's the point but it kinda felt like Desperate Housewives instead of corporate politics and design choices.

Well either way, I almost hope Intel's lawsuit backfires on them so there will be more "drama".




Sandy Bridge IGP performance
By KoolAidMan1 on 12/10/2010 4:06:25 AM , Rating: 2
Ok, now I am very curious to see how the Sandy Bridge IGP compares with the NVIDIA 320M. For an integrated GPU, the 320M is surprisingly good. There are videos of people playing Starcraft 2, Left 4 Dead 2, and even Crysis on reduced settings, with the 11" and 13" Macbook Air.

The preliminary SB video benchmarks I saw several months back didn't lead me to think that they matched up with NVIDIA's integrated GPUs. Maybe they have improved since then, or more likely they are "good enough" where it finally makes sense to go with Intel's newer CPUs instead of continuing with faster NVIDIA graphics on an old Core 2 Duo. We'll see.




Frankly...
By damianrobertjones on 12/10/2010 10:43:09 AM , Rating: 2
...I want Apple to dump Nvidia. The CEO comes across as a smarmy dwick, and after making bundles over the years with cards, he turns and states that he loves Apple.

He is for sale.




"Can anyone tell me what MobileMe is supposed to do?... So why the f*** doesn't it do that?" -- Steve Jobs














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki