The Asus U2E is among the products the hackers were easily able to log on to by spoofing the facial recognition software. The hackers broke into Lenovo, Toshiba, and ASUSTek systems with ease.  (Source: ASUSTek)
At a major hacking conference, participants showed that yet another supposedly secure technology just isn't very secure

The problem with any hot technology in the security world is that the desire to raise a product above the competition seems to invariably lead to boastful claims.  Such claims make the technology a high-profile target for hackers, and with the bright minds in the field, it takes little time to bring many supposedly "unbeatable" countermeasures down.  Thus was the case with RFID, recently shown to be extremely insecure, and now it appears that at least some types of biometrics are headed down the same path.

Nguyen Minh Duc, manager of the application security department at the Bach Khoa Internetwork Security Center at Hanoi University of Technology, is scheduled to demonstrate at Black Hat DC this week how he and his colleagues used multiple methods to hack top biometric facial recognition products and gain easy access to systems.

He and his colleagues hacked Lenovo's Veriface III, ASUS' SmartLogon V1.0.0005, and Toshiba's Face Recognition 2.0.2.32 systems, which ship on the companies' webcam-equipped laptops.  These Windows XP and Windows Vista laptops use the webcam to scan the user's face; if an algorithm judges the scan to match the stored image, the system logs the user on.  Facial recognition is considered by many in the security world to be less of a hassle than fingerprints and more secure than passwords.

The Vietnamese researchers showed that the tech might not be such a good idea, though, by cracking it in multiple ways.  The simplest was to hold a picture of the user up to the webcam, spoofing the software into accepting it as the real person.  Given the ready availability of images on sites like MySpace and Facebook, this is an easy route to access.
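To see why a flat photo can defeat this kind of check, consider a deliberately simplified matcher (not any vendor's actual algorithm; the feature scheme and threshold below are invented for illustration): it reduces the camera frame to a small feature vector and accepts when the distance to the enrolled template falls below a threshold. A printed photo of the face produces nearly identical features, so it passes as easily as the live user.

```python
# Toy sketch of threshold-based face verification (illustrative only).
# A 2-D reproduction of the face -- e.g. a printed photo held to the
# webcam -- yields nearly the same features as the live face and passes.
import math

def features(image):
    """Crude descriptor: average intensity of each row of the image grid."""
    return [sum(row) / len(row) for row in image]

def verify(template, probe, threshold=10.0):
    """Accept when the probe's features are close enough to the template's."""
    return math.dist(features(template), features(probe)) < threshold

enrolled = [[120, 130, 125], [90, 95, 100], [60, 70, 65]]   # "live" face scan
photo    = [[118, 131, 124], [92, 94, 101], [61, 69, 66]]   # printed photo of it

print(verify(enrolled, photo))  # the flat photo is accepted: True
```

Nothing in the pipeline distinguishes light reflected off a face from light reflected off paper, which is the core weakness the researchers exploited.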

The researchers also showed that, when no picture was available for the easier route, they could use a brute force attack, generating multiple random fake faces to eventually gain access.  States Professor Duc in his paper on the hack, "The mechanisms used by those three vendors haven't met the security requirements needed by an authentication system, and they cannot wholly protect their users from being tampered."
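The brute-force idea can be sketched against the same kind of toy matcher. Everything here is invented for illustration (the matcher, the grid of brightness offsets, the thresholds); the point is only that an attacker who can submit unlimited candidates will eventually land one inside the acceptance threshold without ever seeing the enrolled face.

```python
# Hedged sketch of a "fake face bruteforce": submit systematically varied
# candidate images until one falls inside the matcher's acceptance threshold.
import itertools
import math

def features(image):
    return [sum(row) / len(row) for row in image]

def verify(template, probe, threshold=10.0):
    return math.dist(features(template), features(probe)) < threshold

template = [[120, 130, 125], [90, 95, 100], [60, 70, 65]]  # hidden from attacker
base = [[100] * 3 for _ in range(3)]                       # attacker's generic face

def fake_face_bruteforce(base, offsets=range(-60, 61, 5)):
    # Sweep brightness offsets per image region; the attacker only observes
    # accept/reject, never the template itself.
    for deltas in itertools.product(offsets, repeat=3):
        candidate = [[p + d for p in row] for row, d in zip(base, deltas)]
        if verify(template, candidate):
            return candidate
    return None

hit = fake_face_bruteforce(base)
print(hit is not None)  # the sweep finds an accepted fake: True
```

A real attack faces a much larger search space, which is why the researchers narrowed it by starting from a plausible photo and tweaking lighting and angle.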

He continues, "There is no way to fix this vulnerability.  ASUS, Lenovo, and Toshiba have to remove this function from all the models of their laptops ... [they] must give an advisory to users all over the world: Stop using this [biometric] function."

He and his colleagues will be releasing a suite of tools for hacking facial recognition software at the Black Hat DC conference.  The key to using spoofed images, he and his team found, was simply tweaking the lighting and angle of the photo until the system accepts it.  Describes Professor Duc, "Due to the fact that a hacker doesn't know exactly how the face learnt by the system looks like, he has to create a large number of images...let us call this method of attack 'Fake Face Bruteforce.' It is just easy to do that with a wide range of image editing programs at the moment."

He breaks down the weakness further, stating, "One special point we found out when studying those algorithms is that all of them work with images that have already been digitalized and gone through image processing. Consequently, we think that this is the weakest security spot in face recognition systems, generally, and access control system of the three vendors, particularly."

Many government efforts in the U.S. and elsewhere are looking to use facial recognition software as a means to identify citizens in motor vehicles or at sensitive public locations like airports.



Comments

no surprise
By Moishe on 2/18/2009 9:06:49 AM , Rating: 2
If a camera compares a picture to a picture and then bases access on the similarities... It should be easy to fake. Plenty of people look very similar and the threshold can't be too high or you would be locked out of your own PC if you shaved your mustache or forgot to shave for a few days.
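The tradeoff this commenter describes can be put in concrete (hypothetical) numbers: a threshold strict enough to reject day-to-day variation also rejects the legitimate owner, while one loose enough to be usable also admits a sufficiently similar impostor. The distances below are made up for illustration.

```python
# Hypothetical match distances (lower = more similar to the enrolled template).
owner_today   = 4.0   # owner's fresh scan
owner_stubble = 9.0   # same owner after a few unshaven days
lookalike     = 11.0  # similar-looking impostor, or a tweaked photo

strict, loose = 6.0, 12.0  # two candidate acceptance thresholds

def accepts(distance, threshold):
    return distance < threshold

# Strict threshold: secure against the lookalike, but locks the owner out.
print(accepts(owner_stubble, strict), accepts(lookalike, strict))  # False False
# Loose threshold: owner always gets in, but so does the lookalike.
print(accepts(owner_stubble, loose), accepts(lookalike, loose))    # True True
```

There is no threshold in this toy example that accepts both owner scans while rejecting the impostor, which is the commenter's point about "hassle free" versus secure.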




RE: no surprise
By tmouse on 2/18/2009 9:24:30 AM , Rating: 2
It could be made stronger for better security, but as the article stated, the manufacturers wanted the authentication to be "hassle free," which is diametrically opposed to secure. ALL biometric security I have seen allows some form of password-regulated access at some level to overcome the obvious problems of temporary damage (shaving, bandages, etc.), although most use things that are not easily changed (multiple ratios of facial landmarks). The thing I do not get is why do these dweebs have to offer a "complete hacking suite" when they expose the vulnerability? This crosses the line from being a concerned intelligent person to being a bone head who encourages criminal activity.
The last paragraph was, in my mind, completely unnecessary. What do facial recognition systems have to do with laptop security? Their goal is simply to narrow the number of people that need to be personally identified. This just makes security more efficient: instead of a person scanning a severely limited field of vision trying to remember a few faces, a computer can cover a much larger area and have access to a much larger database of suspects. Of course the false discovery rate will be high, but it will be far less than the other problem of missing a true target. Security is then free to check the limited number of potential targets (instead of checking out the really cute non-targets).


"If you look at the last five years, if you look at what major innovations have occurred in computing technology, every single one of them came from AMD. Not a single innovation came from Intel." -- AMD CEO Hector Ruiz in 2007













