
Memory protections in Snow Leopard are still too weak, though it shows other improvements

Apple has been bragging about the security of its new operating system, OS X 10.6 "Snow Leopard".  In the leap from Leopard to Snow Leopard, Apple gives its users limited antivirus/anti-malware protection, though the feature currently detects only two of the handful of known OS X malware signatures.

Still, security experts aren't so hot on Snow Leopard, criticizing the operating system's default firewall setting of "off", its lack of fully automatic updates, and weak anti-phishing efforts for Safari.  They also weren't impressed that Apple shipped Snow Leopard with a vulnerable version of Flash, which downgrades users from the safer current version.

Now one prominent Mac hacker has pointed out a significant difference that makes Snow Leopard less secure than the upcoming Microsoft OS, Windows 7. 

Charlie Miller of Baltimore-based Independent Security Evaluators, co-author of The Mac Hacker's Handbook and winner of two consecutive "Pwn2own" hacking contests, is about as experienced as OS X hackers come.  He recently criticized Snow Leopard, stating, "Apple didn't change anything.  It's the exact same ASLR as in Leopard, which means it's not very good."

ASLR, or address space layout randomization, is a security technique that randomizes the locations of key data regions in memory, making it harder for attackers to find the operating system functions they need to hijack.  According to Mr. Miller, unlike Windows 7, which features robust ASLR, Snow Leopard's implementation is half-baked: it does not properly randomize the heap, the stack, or the dynamic linker, the component of Snow Leopard that links shared libraries into an executable.  This makes it much easier for hackers to attack Snow Leopard via memory injection than Windows 7.
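The core idea is easy to sketch. The toy simulation below (illustrative only, not Apple's or Microsoft's actual implementation; the address range and page size are arbitrary assumptions) picks a random, page-aligned base address for a "library" each run, the way an ASLR implementation might. An exploit that hard-codes an address only works when its guess happens to match the randomized base.

```python
import random

PAGE = 0x1000  # 4 KiB pages, typical of x86 memory mappings

def randomized_base(rng, low=0x10000000, high=0x70000000):
    """Pick a page-aligned load address in [low, high), as an ASLR
    implementation might choose for a shared library's base."""
    page_index = rng.randrange(low // PAGE, high // PAGE)
    return page_index * PAGE

# Across "reboots", the base moves; a hard-coded exploit address
# that worked once will almost never line up again.
rng = random.Random()
bases = {randomized_base(rng) for _ in range(5)}
for b in bases:
    assert b % PAGE == 0                  # still page-aligned
    assert 0x10000000 <= b < 0x70000000   # within the mapping window
print(len(bases), "distinct base address(es) across 5 samples")
```

Without this randomization (or with only partial randomization, as Mr. Miller describes for Snow Leopard's heap, stack, and dynamic linker), attackers can rely on fixed, predictable addresses.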

Still, Mr. Miller offered some praise for Apple.  The company rewrote QuickTime X, its video player, largely from scratch, fixing many holes and insecurities in the process -- including an exploit Mr. Miller had been saving.  He states, "Apple rewrote a bunch of QuickTime, which was really smart, since it's been the source of lots of bugs in the past.  They've shaken out hundreds of bugs in QuickTime over the years, but it was still really smart of them to rewrite it.  [Still] I'd reduce the number of file formats from 200 or so to 50, and reduce the attack surface. I don't think anyone would miss them."

He also praises Apple's relatively effective implementation of DEP (data execution prevention), another memory-protection scheme that Windows 7 also has; DEP is likewise present in Windows XP Service Pack 2 (SP2) and Windows Vista.  Still, without ASLR, DEP is only so effective, he says.  He states, "Snow Leopard's more secure than Leopard, but it's not as secure as Vista or Windows 7.  When Apple has both [in place], that's when I'll stop complaining about Apple's security."

So why aren't Macs being exploited left and right and why can Apple still air commercials claiming superior security?  Mr. Miller states, "It's harder to write exploits for Windows than the Mac, but all you see are Windows exploits. That's because if [the hacker] can hit 90% of the machines out there, that's all he's gonna do. It's not worth him nearly doubling his work just to get that last 10%."


RE: As a Windows user,
By gstrickler on 9/18/2009 5:45:13 PM , Rating: 2
> I challenge you to find a large group of people using it in mission critical applications.
Go ask the company who did the survey, because they had enough reports from "Enterprise IT" departments that Mac OS X Server showed up in their report.
> Banks and ATMs, vending machines, science labs, the military, factories and production facilities, inventory systems all use Windows. (and Linux as well, as someone will no doubt point out). The list is vast. Are these people stupid?
No, and please stop trying to put words in my mouth. Windows or Linux may be the better option for them. Undoubtedly, some of it is because it's what they already know. Some may be because Mac OS is definitely user focused, and for a server, you don't need all the user-focused stuff (it's nice, but unnecessary). However, since Mac OS X is based upon BSD, almost anything that will run on BSD can be ported to OS X pretty easily (not counting a Mac-like UI), so anywhere that Linux or BSD is appropriate, a Mac OS machine can usually work as well. Whether or not a Mac is beneficial or "better" depends upon the environment, software, and needs.
> Temps in the 50s will definitely shorten the life of your drive, you can count on it.
Your evidence? If you have a link to any studies, please give them.

I agree that cooler is preferable, but I have yet to see any evidence that temperatures below 60°C have any effect on the reliability or durability of HDs. The manufacturers warranty the drives for 3-5 years as long as you keep them in the specified operating range (0°C-60°C), which they couldn't afford to do if operating at the edges of that range significantly shortened their life. And 51°C is not at the edge of the range anyway.

On the other hand, I do have anecdotal evidence that drives can operate at very high temps for extended periods without causing problems. One of my clients had a whole bunch of servers with arrays of drives operating 24/7, too hot to touch, for 10 years, with under 10% drive failure over that period. The drives were too old to have built-in thermal sensors, so I don't have an exact temp, but 60°C is 140°F. At 130°F it takes about 30 seconds to actually burn skin, and 160°F will burn in 1 second, so a surface too hot to hold for 1-2 seconds without getting burned is between 130°F and 160°F, likely in the 145°F-155°F range.
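The temperature figures above are easy to verify with the standard Celsius-to-Fahrenheit conversion; a quick check:

```python
def c_to_f(c):
    """Standard conversion: F = C * 9/5 + 32."""
    return c * 9 / 5 + 32

assert c_to_f(60) == 140.0   # the drive-spec ceiling cited above
assert c_to_f(0) == 32.0     # the spec floor
print(c_to_f(51))            # the 51°C reading under discussion
```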
> That's a good one! It's a perfect example of an apple fanboy. If I haven't heard about it, it's nonsense!
Again, not what I said. I can't tell if you have trouble with comprehension or you are deliberately misinterpreting what I write.

I didn't state that I haven't heard of problems, I certainly have. But to date, every time I've encountered a problematic machine, I was able to identify a piece of defective hardware or was able to demonstrate that the problem does not exist with a clean install of the OS. 80% of the time, it's caused by a third party extension (and most of those are corrected by installing an updated version), 10% by hardware problems, 10% by a corrupted installation. Which brings me right back to what I DID say, if you're frequently having those types of problems, it's a problem with your system.
> Your definition of less maintenance is time to apply patches?! Do you do any actual work?
Time to apply patches is part of the time to maintain systems. Backups, disk checks, software updates, anti-malware scans, installing/removing software, tracking down compatibility issues, etc. are also part of that time. On average, I spend about half as much time doing those things on a Mac as on a Windows machine.

I do lots of work; that's why I choose a Mac. I'm not telling you that you should use a Mac, nor do I tell anyone whose specific needs I haven't discussed that they should use one. In fact, I tell most of my clients they need Windows -- not because Windows is "better", but because they need to run Windows-specific applications most of the time. However, for clients who don't have Windows-only applications (or who only need to use Windows on occasion) I often recommend a Mac. Sometimes they go Mac, sometimes they don't.

What I do recommend is that anyone make a list of the software they need to use and how often they need to use it. Find out whether Mac-equivalent programs are available for those. Then, if you will spend more than about 30% of your time using software that is Windows-only, get a Windows machine. If you'll spend less than 30% using Windows-only software, consider a Mac for your primary tasks and run Windows under VMware Fusion for any Windows-only software you need.
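That rule of thumb can be written out as a one-line decision function. This is just an illustrative encoding of the commenter's heuristic (the function name and 30% threshold default are assumptions taken from the paragraph above, not an established guideline):

```python
def recommend_platform(windows_only_fraction, threshold=0.30):
    """Rough heuristic: if more than ~30% of your time is in
    Windows-only software, stay on Windows; otherwise a Mac plus a
    Windows VM (e.g. VMware Fusion) can cover the remainder."""
    if windows_only_fraction > threshold:
        return "Windows"
    return "Mac + Windows VM"

assert recommend_platform(0.50) == "Windows"
assert recommend_platform(0.10) == "Mac + Windows VM"
```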

I can tell you this for certain. I make about 2x more money (per machine) supporting Windows computers than I do Macs. I charge by the hour, and since a Windows machine typically takes about 2x as much time to maintain, it costs the client 2x as much to maintain. It doesn't bother me one bit when my clients need or choose Windows, it means more money in my pocket.

It's mostly irrelevant on this site, most of the readers prefer to build their own machines and/or are hardcore gamers, and for most of those people, a Mac is not a good option.

What I am trying to do is to educate people and get them to quit telling everyone else that "Macs suck", "are unreliable", or any of the other "myths", "misunderstandings", or "lies". If a Mac doesn't meet your needs, use Windows, or Linux, or AIX, or Solaris, or a Commodore 64 for all I care. I use Mac OS X, Windows XP (looking forward to Win7), and Linux, and I've worked on Solaris, Xenix, SCO Unix, and a variety of mini-computer and mainframe systems; I don't hold anyone's choice of OS against them. However, you won't know what a Mac can do until you try it, preferably with the guidance of someone who knows both Macs and Windows, because not everything you know from Windows will help you on a Mac.

Ok, I do hold it against you if you willingly choose Win3x/Win9x/ME. :)

"Well, there may be a reason why they call them 'Mac' trucks! Windows machines will not be trucks." -- Microsoft CEO Steve Ballmer
