


Apple is looking to make its ultrathin product line the models to beat amid tougher competition

It's that time of year again.  Love it or hate it, no industry player is quite the showman that Apple, Inc. (AAPL) is.  P.T. Barnum once remarked, "Without promotion something terrible happens... Nothing!"

Apple tirelessly promotes itself year-round.  But of all the events on Apple's yearly calendar, one has come to dominate in recent years -- the Worldwide Developers Conference (WWDC).

While much of the attention, as always, is fixated on the company's iPhone/iOS announcements, at this year's WWDC 2012 keynote address Apple also announced a slew of additions to its personal computer line, including a new version of OS X -- OS X 10.8 "Mountain Lion" -- and new MacBook Pros and Airs (new iMacs may yet follow).

Here's a taste of what new Apple CEO Tim Cook offered up:

I. Laptops

i. MacBook Airs (11-/13-inch)

Of all its product categories, laptops are where Apple is most competitive with its rivals in terms of cost versus deliverables.  While Apple has delivered on certain fronts (e.g. small form factors with the Mac Mini), it's hard to deny that there's a big "Apple Tax" on Mac Pro desktops.

Laptops are a place where that "Apple Tax" is largely a case of getting what you pay for.  Apple has long competed with another pricey player -- Sony Corp. (TYO:6758) -- to deliver the thinnest, lightest, most full-featured ultrathins.  Now, with ultrabooks coming from a slew of other companies, Apple has to bring its 'A' game.

At the 2012 WWDC keynote Apple announced new 11- and 13-inch MacBook Airs.  The new designs pack a new dual-core 1.7 GHz (11-inch) or 1.8 GHz (13-inch) Ivy Bridge third-gen Core i-Series CPU from Intel Corp. (INTC), up to 8 GB of DRAM, and up to 512 GB of NAND flash storage in the SSD-driven designs.  

The base configuration comes with a 64 GB (11-inch)/128 GB (13-inch) SSD and 4 GB DRAM (both).

The new Airs also add USB 3.0 support to the two onboard ports that grace either side of the laptop (legacy USB 2.0 support is also maintained).  The USB 3.0 inclusion isn't exactly glamorous, but it at least fills a long-criticized gap in Apple's line.  A 720p FaceTime camera is also added, for those who use Apple's video-chat service.

The bad news for those Apple fans who have been blasting laptop makers for their "garbage" "low-resolution" displays is that Apple is sticking with its low-resolution 1440x900 pixel LCD units in the MBA line -- not even bleeding edge by its own standards (but wait, there is a silver lining; read on).  Unlike many upcoming Windows 8 designs, there's still no touch on the screen -- for better or worse.

There's also no discrete graphics; MBA owners will have to make do with Intel's integrated HD 4000 graphics.  The 11-inch starts at $999 USD (filling the slot once occupied by the defunct MacBook) and the 13-inch starts at $1199 USD.

ii. MacBook Pros (13-/15-inch)

Next up is the MacBook Pro refresh.  

Apple first unveiled 13- and 15-inch models -- relatively ho-hum designs, with 1280x800 and 1440x900 pixel (respectively) displays and new Ivy Bridge CPUs.  The pair start at $1199 and $1799 a pop, respectively.  The 13-inch has 2.5 and 2.9 GHz dual-core CPU options, while the 15-inch model's processor options are bumped to 2.3 and 2.6 GHz quad-core chips.

The base configurations come with 4 GB DRAM and a 500 GB HDD.  The new Pros are 0.95-in. thick and weigh 4.5 and 5.6 lb, respectively.

An upgraded 17-inch model was not mentioned; it's possible Apple is eliminating that SKU.

iii. "Next Generation" MacBook Pro (15.4-inch)

But wait -- Apple packed a surprise -- a much more impressive single new entrant into the MacBook Pro line.  Tim Cook teased, "With the MBA, the team did something bold. They were aggressive in embracing new tech. They also got rid of stuff that was trending out. That enabled them to do something bold. So we've been asking the team to think about what would make the next gen MBP?"

"Want to know the answer?  You want it to have a killer new display. You want an architecture built for the future, you want it to be light. You want it unlike anything else.  Want to see it? Let's show it now.  The most beautfiul computer we have ever made."

Remember those dashed "Retina Display" hopes with the Air?  Well, Apple is including an incredible 15.4-inch 2880x1800 pixel display on its high-end laptops.  So the MBP gets double the resolution in each dimension, while the MBA gets a minuscule bump.


MacBook Pro's flagship model indeed received a Retina display. 

Tim Cook remarks of the new screen on the 'Pro, "The pixels are so small that your retina cannot discern them."

Among the apps promised to make good use of that impressive resolution are Apple's own Mail, Safari (browser), iMovie, iPhoto, Aperture, and Final Cut Pro.  Apple's frenemy Adobe Systems Inc. (ADBE) is also offering HD Photoshop, while Autodesk, Inc. (ADSK) is giving the high-resolution treatment to its AutoCAD app.

For the gamers out there, Activision Blizzard, Inc.'s (ATVI) Diablo III was briefly demoed on the Retina display.

The new "Next Generation" 15-inch MacBook Pros are also as thin as the Air (0.71 in.) and only weigh 4.46 lb.  What's more they also feature GeForce GT 650M graphics (1 GB GDDR5) from NVIDIA Corp. (NVDA) (Kepler chip).


This new super-ultrathin packs up to a 768 GB SSD (yes, you read that right).  It gets the same 7-hour battery life as its lesser 'Pro brethren, despite its 220 ppi screen.  The discrete graphics remain unchanged, but it offers an upgrade to a 2.7 GHz CPU and support for up to 16 GB of DRAM.  Bluetooth 4.0 is onboard.

The base configuration comes with 8 GB of DRAM, a 256 GB NAND SSD, and a 2.3 GHz quad-core chip.  That variant costs $2199 USD, a cost Apple attributes to all its custom components like "asymmetrical fans" and other ultrathin oddities.

All the new laptops are available to ship immediately, according to Tim Cook.

II. OS X 10.8 "Mountain Lion"

Microsoft Corp. (MSFT) has received much criticism (including from Apple) for boldly importing pieces of its mobile operating system du jour (namely the Metro UI bits) into its upcoming Windows 8.  In many ways Apple is following a similar approach, bringing onboard more iOS-like features after first extending its App Store model to the desktop with the Mac App Store.

That said, Mountain Lion's new features arguably mark less of an extreme makeover than Microsoft's, and thus should be a bit less of a system shock to veteran users (though on the flip side, Apple potentially passes on the benefits of a more extreme redesign).

Mountain Lion
[Image Source: HD Wallpapers]

i. Yay Cloud

Craig Federighi previewed the new OS.

Apple claims that there are 65 million Macs in the wild, with 26 million of those on OS X 10.7 Lion.  Humorously, Apple bragged that its own OS outsold Windows 7.  Of course it's talking about percent adoption within its drastically smaller user base, but in Apple's world it's the "fastest"* selling operating system in history (*=some restrictions may apply).  So take that, reports of slowing OS X Lion adoption.

Mountain Lion brings iCloud integration.  Apple has added "Documents in the Cloud" to iCloud, which lets you use Pages, Numbers, Keynote, Preview, and TextEdit to present or edit your content on the go.

The new OS also supports cloud data backup while in "sleep" mode, as well as AirPlay mirroring.  The backup process is done silently and power-efficiently, according to Apple.

iCloud backup
Airplay mirroring [Image Source: The Verge]

ii. New Apps

The new OS introduces three new apps -- Messages, Reminders, and Notes -- whose purposes are pretty self-explanatory.  Apple has also integrated dictation, with a Siri-like icon, into Mountain Lion.  It even works, as Apple humorously notes, in Microsoft Word.

Mountain Lion new apps
OS X 10.8 Mountain Lion also introduces 3 new core apps. [Image Source: The Verge]

Then there are the notifications -- a feature some disliked in preview builds (Apple has at least added the ability to turn them off).  Sharing has also been made easier, with Apple's GUI offering many options such as Twitter, Facebook, AirDrop, and Messages.

A new build of the Safari browser is also onboard, with unified search (like Google Inc.'s (GOOG) Chrome).  Apple claims Safari is faster than Firefox 13, Chrome 19, and Internet Explorer 9 in JavaScript.  There are a couple of new additions to the browser, like iCloud Tabs (which syncs your mobile tabs) and Tab View, which allows easy zooming in and out.

Other new features include the "Gatekeeper" security app, offline reading lists, Mail VIPs, LaunchPad Search, and more -- 200 in all by Apple's estimation.  Apple is also looking to woo Chinese buyers with freshly added Baidu.com, Inc. (BIDU) support and an improved Chinese dictionary.

iii. Availability

Apple is releasing Mountain Lion next month for $19.99 USD.  The license is good for installing on any supported existing (Apple) system.  Those buying the aforementioned fancy new laptops will receive a free bump to Mountain Lion, so early adopters won't be burned (not that $20 USD is a big deal after you've ponied up $2200 USD for that new MBP).

Sources: Apple, The Verge



Comments

By CubicleDilbert on 6/11/2012 6:08:57 PM , Rating: 0
Now this is something I don't understand. The resolution of 2880x1800 on 15.4" is just complete nonsense! Its only purpose is marketing and boasting to have the highest resolution screen in the world.

I have been using a Thinkpad T61p with ultrasharp WUXGA (1920x1200) on 15.4" for years. And that is "Retina" as well, according to Apple's own calculations. The pixels are not visible anymore because on a laptop your eyes are at least 40-50cm away from the screen. You'd have to rub your nose on the LCD to see individual pixels.
I have to enlarge everything to make it readable, for a full working day. Now having 2880x1800 improves exactly nothing. The invisible pixels become even more invisible, but the graphics processor has to move around a whole lot more of those invisible pixels.

It is just pure marketing hype without any useful meaning. According to physiological calculations the eye resolution with 20/20 can distinguish around 140 dpi at 40cm (notebook working distance). Anything beyond that is overkill. WUXGA is ca. 147dpi or a little bit more.

I tried to use my WUXGA screen in 100% size (Win7), but everything is soooo tiny it makes my eyes hurt after 15 minutes. Now imagine 2880 pixels, which will require a microscope. It just doesn't make sense anymore.

On the iPad Retina was fine, because you are much, much closer to the screen. But on laptops it is nonsense. I need to have ergonomics for a full 10h working day and not a fancy gadget.




By TakinYourPoints on 6/11/2012 6:32:35 PM , Rating: 5
quote:
I tried to use my WUXGA screen in 100% size (Win7), but everything is soooo tiny it makes my eyes hurt after 15 minutes. Now imagine 2880 pixels, which will require a microscope. It just doesn't make sense anymore.


Everything scales up properly, just like with an iPad. UI and text scale up, and OS X has had 2x res assets for months now. The latter is why rumors of a MBP with a 2x res display appeared in the first place.

None of what you're worrying about is a concern, things will be the same size as before, just with no visible pixels or anti-aliasing in text.
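
To make the scaling point concrete, here is a minimal sketch (hypothetical function names, not Apple's actual API) of how a 2x "HiDPI" mode keeps on-screen elements the same physical size while quadrupling the pixels behind them. The ~110 ppi and ~220 ppi figures are the old and new 15.4-inch panel densities:

```python
# Minimal sketch of 2x HiDPI scaling: logical "points" map to device pixels
# via a backing scale factor, so UI elements keep their physical size.

def device_pixels(logical_points, scale_factor):
    """Convert a logical size in points to physical device pixels."""
    return logical_points * scale_factor

def physical_size_mm(logical_points, scale_factor, panel_ppi):
    """Physical size of the drawn element on the panel, in millimetres."""
    return device_pixels(logical_points, scale_factor) / panel_ppi * 25.4

# 13-point text on a ~110 ppi standard panel vs. a ~220 ppi Retina panel
standard = physical_size_mm(13, scale_factor=1, panel_ppi=110)
retina = physical_size_mm(13, scale_factor=2, panel_ppi=220)
print(f"standard: {standard:.2f} mm tall ({device_pixels(13, 1)} px)")
print(f"retina:   {retina:.2f} mm tall ({device_pixels(13, 2)} px)")
# Both come out to ~3.0 mm: same on-screen size, twice the pixels per dimension.
```

The extra pixels go into rendering detail (sharper glyph edges), not into shrinking the UI.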


By retrospooty on 6/11/2012 6:57:38 PM , Rating: 2
"I tried to use my WUXGA screen in 100% size (Win7), but everything is soooo tiny it makes my eyes hurt after 15 minutes. Now imagine 2880 pixels, which will require a microscope. It just doesn't make sense anymore."

Windows was never set up to scale very well. You have the dpi adjustment, but you just lose some screen real-estate. Windows 8 corrects that, as do iOS, Mac OS, Android, etc.


By Rand on 6/11/2012 8:11:54 PM , Rating: 2
Win8's DPI scaling hasn't changed at all except in Metro. The desktop's DPI scaling is unchanged from Win7.


By goatfajitas on 6/11/2012 8:28:07 PM , Rating: 2
Windows 8 will have high-res screens like this; they have already announced several different hi-res options. You can bet that they have a solution for it.


By ritualm on 6/12/2012 10:04:34 PM , Rating: 2
Not now, they don't.

Below is what happens when you run Windows 8 Release Preview at the full 2880 x 1800 screen resolution:

http://weekly.ascii.jp/elem/000/000/093/93342/1206...

http://weekly.ascii.jp/elem/000/000/093/93341/1206...

Whatever Microsoft is using for DPI scaling on W8, it's clear this isn't going to work.

source: http://weekly.ascii.jp/elem/000/000/093/93360/


By retrospooty on 6/13/2012 8:56:48 AM , Rating: 2
Good thing it isn't out yet. When it comes out, then take a look.


By FaaR on 6/11/2012 7:58:45 PM , Rating: 3
quote:
Its only purpose is marketing and boasting to have the highest resolution screen in the world.

No, the purpose is twofold:

One, to offer smooth text and graphics, without any visible stair-stepping in characters - maybe not quite so important for us Latin alphabet users, but Asian pictograms are vastly helped by it. And Asia is a rapidly growing market.

Anyway, even bog-standard regular characters look much nicer at better-than-print resolution, as this screen allows when viewed at a regular viewing distance.

Reason two, screen real-estate. If you watch the presentation, you see a demo of a video editing application where you have one full-HD, 1080P stream, with lots and lots of room on the sides for user interface to manipulate said video. It's definitely an advantage to have the resolution to fit all of this on the same screen without the need for multiple screens, or scaling down the video. You can see everything all at once, at 1:1 pixel mapping without scaling distortion or scrolling around.

Using Windows as an example to portray this tech as merely a marketing ploy is pure fail from the start. Windows isn't currently geared to handle high DPI screens properly.

Please broaden your horizons a bit. Maybe you should go seek out a store demo of a retina iPad and play with it for a while. I'm sure you too will see the light eventually. :)


By CubicleDilbert on 6/11/2012 8:23:41 PM , Rating: 3
It seems you just didn't understand my arguments.
I know the iPad 3 very well and I did mention the calculations.
The ultra-high resolution does not make sense, because the human eye can only distinguish a certain number of different pixels per arcsecond. It is pure physics. The human eye limit with 20/20 vision at 40cm distance is around 140dpi. Even Apple has this in their white paper when they explain retina. Physicists know this better and have different names for it. The limiting factor is the human retina and its low resolution. Eagles have much better vision than humans. They would look at the new Macbook and see an ocean of individual pixels. But not humans.

Physically speaking, any resolution higher than the human eye can distinguish is a complete waste.

And yes, the more pixels, the better and sharper the image. Think of a laser printer versus a dot-matrix printer. Big difference.

But you will definitely not see a difference between a laser printer with 1200dpi and 2400dpi when looking at the printout.

So in conclusion, anything beyond 145dpi on notebooks is a waste and adds only to marketing hype.
Working at 145dpi on Windows 7 or Ubuntu with standard text at 100% scaling is next to impossible for a full 10h working day. You would have to enlarge fonts, icons and everything. I do this all the time and Windows does it almost perfectly.

And your argument that with ultra-high resolution you can edit 3 videos at the same time is not valid, because then you would have to run tiny fonts and menus, which is not feasible for a 10h/7d working professional. That's why you have the 30" Apple display.


By integr8d on 6/11/2012 9:03:10 PM , Rating: 4
Agreed. Res is overhyped. Give me 100% AdobeRGB, 10-bit (wishful thinking) and decent viewing angles and then we'll talk.


By CyCl0n3 on 6/13/2012 4:58:20 AM , Rating: 2
Agreed. Totally unnecessary to have anything above 140dpi.
The eye can't see the difference, it needs more graphics power and resources, thus it needs more energy, which means worse battery life. Also, Full HD movies lose quality, and gaming will be close to impossible in modern games at that resolution. What a waste on a mobile (notebook) device.
So basically the usage gets hugely limited.


By inperfectdarkness on 6/13/2012 7:09:25 AM , Rating: 2
Again: 1080p, 15.6" screen. I can see the aliasing on my desktop icons. Guess what? That's > 140 ppi.

Your logic is fatally flawed.


By inperfectdarkness on 6/13/2012 6:50:06 AM , Rating: 2
http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_mon...

You want to move backwards in technology. Thanks for playing.


By inperfectdarkness on 6/13/2012 7:18:32 AM , Rating: 2
Has it ever occurred to you that how the human eye perceives an image may be different from how a display renders it? That is to say (assuming 40cm viewing distance) that the human eye may not be able to detect higher ppi, but a display at 140ppi may still render anomalies visible to the human eye (aliasing, etc).

Perhaps this is a function of rendering abnormalities... but it still doesn't change the fact that if these anomalies can be perceived by human eyes -- ANY human eyes -- it is sufficient cause to increase the resolution.

There MUST be a reason why printed pictures have different effective "dpi" than a computer monitor--while displaying the same image at equal quality. It is a function of the media. 140ppi for a human eye != 140ppi from a TN panel display.

As far as I'm concerned, until games can be played in resolutions where AA is a feature no longer requiring support--then I'll take any resolution boost I can get.


By testerguy on 6/14/2012 7:46:26 AM , Rating: 2
Guys, let's try and stop the complete disinformation on this site, OK?

Let me take you through some maths/science:

The accepted capability of an eye with 20/20 vision, is denoted as an ANGLE. That angle is the smallest viewing angle between two objects (in our case, pixels) at which the human eye can discern the two separate pixels. If the angle is too small, the pixels blur into one when processed by our brains.

That angle has a commonly accepted scientific value. That value is one arcminute, or 1/60th of a degree.

Due to the laws of trigonometry, as an object becomes closer, the top and the bottom of that object, as perceived by our eyes, become separated by a greater angle. Thus, our ability to perceive individual objects (or pixels) is inversely proportional to the distance. Or, in plain English, if things are closer we can see the detail more easily.

It is possible, therefore, using the above two facts, to calculate the necessary PPI a device would need for the pixels to not be discernible, taking a single parameter of the distance the device is held away. Since PPI is measured in inches, the distance must be too.

Let's take an example of the iPhone 4S - which Apple claims is held 12 inches away (or 'around 1 foot').

Applying an angle of 1/60 degrees over a distance of 12 inches gives you 1/(Tan(1/60)*12), which is 286 PPI.

If we assume a distance of 11 inches, this gives you 1/(Tan(1/60)*11) which is 312.5 PPI.

Thus, if you agree that the iPhone 4S is held 11 inches away or more, it qualifies as a 'retina' display - where 'retina' display means a device that someone with 20/20 eyesight can't distinguish pixels on from normal viewing distance.

Now, let's apply this to two other examples. Firstly - the new iPad. The new iPad has a PPI of 264. Let's reverse engineer the distance at which this qualifies as a 'retina' display:

Distance = (1/264)/Tan(1/60) which gives us 13.0 (to 1 dp). Therefore, if you believe that the new iPad is held (on average) at 13 inches or more away from your eyes, it also qualifies as a retina display.

Now, finally, let's address the claim in the post to which I'm replying that a 40cm distance requires 140dpi. 40cm is 15.75 inches.

Plugging 15.75 inches into the above formula gives us 218.27 PPI, not the 140 DPI that is claimed.
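
For anyone who wants to check the arithmetic, here is a small Python sketch (my own, assuming the one-arcminute figure for 20/20 vision used above) that reproduces these numbers:

```python
import math

def retina_ppi_threshold(distance_inches, arcminutes=1.0):
    """PPI above which adjacent pixels subtend less than the given visual
    angle (default one arcminute, the common 20/20 acuity figure)."""
    angle_rad = math.radians(arcminutes / 60.0)
    return 1.0 / (math.tan(angle_rad) * distance_inches)

def retina_distance_inches(ppi, arcminutes=1.0):
    """Viewing distance beyond which a panel of the given PPI is 'retina'."""
    angle_rad = math.radians(arcminutes / 60.0)
    return 1.0 / (ppi * math.tan(angle_rad))

for d in (12, 11, 15.75):  # 12 in, 11 in, and 40 cm (~15.75 in)
    print(f"{d:>5} in  ->  {retina_ppi_threshold(d):6.1f} ppi")
print(f"264 ppi  ->  {retina_distance_inches(264):.1f} in")
# Output: ~286.5, ~312.5, ~218.3 ppi and ~13.0 in, matching the figures above.
```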


By inperfectdarkness on 6/15/2012 5:03:30 AM , Rating: 2
THANK YOU!

It's been "clear" to me from the get-go that 200ppi+ is where the 15" laptop segment needs to have its screens. I applaud you for mathematically proving it.


By shyhh on 6/13/2012 9:58:57 AM , Rating: 2
Reason number 2 is not right. Anandtech reviewed the screen and the maximum resolution you can use for the desktop is 1920 x 1200. The extra pixels go to waste.

However, judging on how people were rushing to buy the new iPad, introducing the retina macbook makes absolute business sense.


By CubicleDilbert on 6/12/2012 4:16:41 AM , Rating: 6
Thanks for the flowers, calling me a moron.

Maybe you should take a 101 course in physics and physiology before shouting out such drastic insults.

You just don't get it and I am giving up, arguing probably with an Apple fanatic.

The required resolution of a device always depends on the maximum capabilities of the human eye. Beyond that it is a waste of technology and money.

You don't seem to understand this basic concept: why an iPhone 4 needs 300dpi because it is held right before your eyes, why an iPad 3 needs 225dpi because it is further away, and why a laptop only needs 150dpi because it is an arm's length away.

In the football stadium you need only 0.001 dpi for a perfect huge LED panel, because you are 100-200 yards away.

Calling someone a moron because you don't understand the fundamental basics of physics just makes everyone else understand what you are: an idiot.

Wish you a nice day... thanks. >:-(


By Mitch101 on 6/12/2012 5:18:25 PM , Rating: 2
Apple needs the retina screen to hide the subliminal messages it feeds to its users.


By mead drinker on 6/12/2012 6:35:36 PM , Rating: 2
Ummm... I use my laptop at an arm's length away very few times, but use it at distances that you would define as "iPad" or "iPhone" proximity. I can hardly think of the occasions when my laptop is sitting neatly on a desk and I am roughly 16" from it. Lying in bed watching streaming content or surfing the web: closer. Perched on a media cart while I stand and hover almost over it: closer. On one hand acting as a stand: closer.

I shoot and edit footage for a living, so opening a 2K file at 1:1 and having all of the panes in an editing suite on the bottom and sides of the program window is pretty priceless. Also, the ability to show my client what a ~3K vs 2K output would look like, and their best efforts to locate pixels as their noses crush my screen, sells the services of my Red Epic even more. Then I get to tell them that the camera shoots 5K. For me it's a difference maker.


By TakinYourPoints on 6/12/2012 7:32:25 PM , Rating: 2
You'll be able to run full 1080p video in video editing software while also keeping the full editing and media management UI on screen. That is pretty great.


By TakinYourPoints on 6/12/2012 7:30:08 PM , Rating: 2
+6 for a post justifying the argument that going over 96 DPI doesn't matter on a desktop display.

It's nice to see the editors here also have their heads in the sand.


By inperfectdarkness on 6/13/2012 1:45:20 AM , Rating: 2
The fact that this gets a 6 is probably the most hilariously outrageous thing I've ever seen on DT.

He's making ARBITRARY judgments about the lengths you view said screens from. I sit no more than 18" from my 15.4" laptop screen, and sometimes closer. I can clearly see aliasing in my desktop icons on a 1920x1080p screen.

Is he correct that "retina dpi" varies with viewing distance? Yes -- but he's hardly correct in making arbitrary judgments on what that DPI should be, based on some random figure for viewer distance.

This guy should be DOWN rated, not UP rated for this comment.


By Shadowself on 6/13/2012 5:31:45 PM , Rating: 2
Obviously YOU (and the person who gave you a 6) NEVER heard of things like "edge effects" and "vernier resolution" and "sensor [human eye] motion affecting perceived resolution" and "super resolution effects of multiple frames".

Don't be so naive. Go look things up. The "one arc minute resolving power of the human eye" has been debunked time and time again.

One very simple example can prove this --- and it is purely digital and not at all an analog system like the human vision system:

At a sensor resolution worse than 1 km, the old GOES satellites easily create an image of the bridge crossing Lake Pontchartrain (and have for many, many years). AND NO ONE will ever try to claim that bridge is even close to 1 km wide. I believe at its widest point it is 150 feet or less across the spans -- WAY under that 1 km (about 5%).

By your reasoning, the old GOES satellites should never be able to show that bridge, but they do all the time.

And that is only one effect that shows up and proves simple systems can perceive (and record) higher resolution than that stupid yardstick of 1 arc minute on which you are relying. In some tests by the U.S. military they have shown that some people can perceive effects as small as 1/30th of that 1 arc minute resolution.

You may not be a moron, but you do need to think beyond the simplest of terms. You need to understand more than just the fundamentals and basic concepts -- and so does that person who gave you the 6.

Oh, and as a physicist who has worked on everything from designs for our nuclear fleet to designs for imaging systems for satellites, I DO KNOW THE FUNDAMENTAL BASICS OF PHYSICS!


By inperfectdarkness on 6/14/2012 2:18:42 AM , Rating: 2
THIS deserves a 6.


By inperfectdarkness on 6/12/2012 4:04:01 AM , Rating: 1
Fornicate thyself with a whetted instrument.

1. FINALLY someone is moving past 2MP resolution on a laptop. This should have happened years ago, as WUXGA was the standard for high-resolution back in ~2008.

2. I'm ecstatic that someone is putting 16:10 resolution back into laptops, rather than the abhorrent 16:9 resolution. All that BS about letterboxing is a load of hooey when 22:9 is what all new movies are being released in today -- and I for one do NOT want a 22:9 display.

3. People who complain about font size & the like are obviously not tech-savvy enough to figure out how to adjust their computer's display settings. Worst-case, you set your monitor to display at 1/2 native resolution and never worry about your PEBKAC-induced eyestrain. The rest of us in the real world will enjoy the additional real estate, sharper picture, and higher-resolution gaming.

4. Funny, but you don't hear people complaining about 120Hz or 240Hz TVs as being "complete nonsense". The established limit for the human eye is 60FPS, so everything above that must be stupid and wasted -- by the same flawed logic applied to denigrating "retina" displays.

5. I despise Apple. With a passion. And yet, this is perhaps the first time in history that I would seriously consider buying from them -- since they've proven willing to tread where others have been content to sit on their fat asses and let progress stagnate. If I could get better hardware in it, I'd buy an MBP, format it, run Win 7, and live happily ever after.

I absolutely hate people like you. People who stand in the way of technical progress because of some antiquated, illogical concerns stemming from a lack of user knowledge.


By TakinYourPoints on 6/13/2012 5:16:18 AM , Rating: 2
2. 16:9 is standard throughout most of the Macbook line. The only exception is the 11" MBA, and that's the result of reducing the dimensions without shrinking the full size keyboard. Otherwise they've all been 16:9 for over a decade.

5. Internals are dictated by thermals and a target for battery life. Getting something like a 680M in there would result in a 2" thick notebook with two hours of battery life, not something just over half an inch thick with seven hours. Either way, a quad i7 and a 650M (about as fast as a 560M) in that slim a chassis was better than I was expecting.

I completely agree though, 16:9 has no place in anything smaller than a 1440p 27"


By inperfectdarkness on 6/13/2012 7:06:30 AM , Rating: 2
I was mainly referring to the Nvidia GPU. I'd much rather have ATI in there. Radeon 6970M would have been more attractive to me. I'd also be worried about sub-par cooling--a notorious problem for apple designs--as I game extensively on my laptops.

I'm glad I'm not the only one who hates 16:9 on my PC.


By chemist1 on 6/15/2012 11:56:56 PM , Rating: 2
Though for me, the NVIDIA GPU is appealing because of their CUDA programming platform, which enables the GPU to be used for highly-parallel scientific computation. ATI has something similar, but I don't believe it's as well-developed.


By chemist1 on 6/15/2012 11:50:18 PM , Rating: 2
quote:
16:9 is standard throughout most of the Macbook line. The only exception is the 11" MBA, and that's the result of reducing the dimensions without shrinking the full size keyboard. Otherwise they've all been 16:9 for over a decade.


I'm afraid it's the reverse of what you stated (maybe it was a typo). Going back to their introduction in 2006, 16:10 has been the standard for the MacBook line (both MacBook and MacBook Pro) (see http://en.wikipedia.org/wiki/MacBook and http://en.wikipedia.org/wiki/MacBook_Pro). That is, to my mind, one important part of their appeal. The 11" MBA, which is 16:9, is the exception.


By WalksTheWalk on 6/14/2012 11:52:37 AM , Rating: 2
No one's stopping you from dropping $2,200 for the MacBook Pro with Retina display. Be my guest...

(BTW - That's the base price)


By Flunk on 6/12/2012 9:03:01 AM , Rating: 2
You do have to consider that there are other people and that everyone's eyesight is different. I'm quite nearsighted and because of that I actually see things that are close to me much better than someone with "average" sight. There are also people with above average vision. So what may be good for you may not necessarily be right for everyone.

I do see your point that the resolution seems excessive, I just have a bone to pick with your reasoning because there is a wide range of human sight to account for.

Also, you won't need a microscope for the new MacBook. Apple will just blow up the size of all the text and images so all the onscreen objects will be the same size, just more detailed. But you're right, that probably doesn't matter and even if it does it doesn't matter much.


By CubicleDilbert on 6/12/2012 9:49:58 AM , Rating: 2
I heard that the ultra-high resolution is because it is just 4x the MacBook Pro's (1440x900) resolution, so old programs can simply be scaled up. Just like with the step from the regular iPhone to the iPhone 4.

The human eye has only a limited number of light receptors on its retina, much lower than e.g. eagles. And the lens is suboptimal too. The very best human eyes (beyond 20/20 vision) can distinguish about 20-30 arcsec. Regular humans (like me) have maybe 45-60 arcsec resolution.

Now building a notebook LCD that satisfies a 5-10 arcsec human eye is nonsense, because no such human exists. But it increases production costs dramatically.

I guess Apple marketing just wanted this high resolution for boasting, and Apple engineers wanted the simple 2x scaling horizontally and vertically. And who cares, there are enough Apple fanatics who have more than enough money to spend on a USD 2300-2800 notebook. I don't mind.


By aliasfox on 6/12/2012 2:38:12 PM , Rating: 2
Ah, but building a display with 5 arcsec resolution would allow you to pick any resolution between 5 and 45 arcsec and still have perfectly legible (ie non-fuzzy) screens.

If (for example) 1440 x 900 equates to 60 arcsec, then 2880 x 1800 equates to 30 arcsec. Even if you pick a resolution in the middle such as 1920 x 1200, you'll *still* have a display around 45 arcsec, or 1680 x 1050 would give you around 52 arcsec - still perfectly crisp to the normal eye. If you were to start with a 1920 x 1200 resolution, scaling and aliasing artifacts would be much easier for your eye to see.

In other words, if you start with 1920 x 1200, only 1920 x 1200 and 960 x 600 would look like 'native' resolutions on an LCD. If you were to start with 2880 x 1800, the pixels are small enough that pretty much every resolution down to 1440 x 900 would look 'native,' including 1680, 1920, 2560, etc.

I could see resolutions maxing out one more step above 2880 x 1800 (5760 x 3600) in a few years - with resolution independence and dynamic scaling (along with appropriately fast hardware), the screen would be able to scale from 1440 x 900 up to 4k resolutions and you'd never actually be able to tell - everything will simply always be crisply rendered at ~300dpi - a point where your computer screen will look just as sharp as a National Geographic (printed at 300 dpi, I believe).
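
As a rough illustration of that point, here is a short Python sketch (my own numbers, assuming the 15.4-inch panel from the article and a ~20-inch viewing distance) comparing the visual angle of one grid cell for several effective resolutions against the native 2880x1800 grid:

```python
import math

DIAGONAL_IN = 15.4                        # panel diagonal from the article
ARCSEC_PER_RAD = 180.0 * 3600.0 / math.pi

def arcsec_per_cell(h, v, distance_in):
    """Visual angle (arcseconds) of one cell of an h x v grid spread
    across the 15.4-inch panel, seen from the given distance."""
    ppi = math.hypot(h, v) / DIAGONAL_IN
    return math.atan(1.0 / (ppi * distance_in)) * ARCSEC_PER_RAD

# Effective (scaled) resolutions vs. the native 2880x1800 grid, at ~20 inches
for res in ((1440, 900), (1680, 1050), (1920, 1200), (2880, 1800)):
    print(f"{res[0]}x{res[1]}: {arcsec_per_cell(*res, 20):5.1f} arcsec per cell")
# The native physical pixel (~47 arcsec here) sits below the ~60 arcsec
# (one-arcminute) threshold, which is why intermediate scaled modes can be
# rendered on it without visible fuzziness.
```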


By TakinYourPoints on 6/13/2012 5:06:52 AM , Rating: 2
Text that requires no anti-aliasing and is sharper than physical print by itself should be enough to sell anyone on the benefits of high resolution displays. I can't believe this argument is still happening when both the iPhone and iPad have made the advantages so clear.

I can't wait to hear the argument again when 27" monitors go to 5120x2880.

"But you don't need all those pixels, herp derp"

Please


By shyhh on 6/13/2012 2:54:45 AM , Rating: 2
Now would that be just great for watching porn...


By WalksTheWalk on 6/14/2012 11:53:39 AM , Rating: 2
Way to go, my friend!


"And boy have we patented it!" -- Steve Jobs, Macworld 2007














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki