
The Hawaii facility didn't have modern security updates to prevent this from happening

One would think that accessing confidential agency files would not have been easy for former NSA contractor Edward Snowden, but it reportedly didn't take much. 
 
According to The New York Times, Snowden used a cheap Web crawler to delve deep into the NSA's classified networks and collect files.
 
A Web crawler is software used to index and back up websites. It can be programmed with various search phrases, and then jumps automatically from Web page to Web page by following links, traveling far and wide in search of relevant documents. 
 
Some examples of Web crawlers are Googlebot and wget.  
 
The Web crawler Snowden used has not been named, but he reportedly programmed it with search terms for certain subjects and let it run to see how deep it would take him into the NSA's internal networks.
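 
To make this concrete, below is a minimal sketch of such a keyword-driven crawler, written in Python with only the standard library. The seed URL, search terms, and page cap are illustrative assumptions; the actual tool Snowden ran has not been identified.

# A minimal sketch of a keyword-driven Web crawler, standard library only.
# The seed URL and search terms below are illustrative placeholders.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, keywords, max_pages=100):
    """Breadth-first crawl from seed_url; return pages whose HTML
    contains any of the given keywords."""
    seen = {seed_url}          # URLs already discovered
    queue = deque([seed_url])  # URLs waiting to be fetched
    hits, fetched = [], 0
    host = urlparse(seed_url).netloc
    while queue and fetched < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # unreachable or forbidden page; move on
        fetched += 1
        if any(kw.lower() in html.lower() for kw in keywords):
            hits.append(url)
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            # Stay on the same host, as an internal crawler would.
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return hits

if __name__ == "__main__":
    # Hypothetical seed and terms, purely for illustration.
    for page in crawl("https://example.com/", ["report", "memo"]):
        print("match:", page)

Pointed at an internal network whose pages require nothing more than the login the operator already holds, a script like this will mechanically follow every link and flag every matching document, which is essentially the behavior the Times describes.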
 
This raises a major question: why was a simple Web crawler able to return such information from supposedly tightly protected government networks? 
 
The answer lies in Snowden's location. Back when the WikiLeaks incident occurred in 2010, government facilities were required to install updated anti-leak software. But a facility in Hawaii was unable to receive the update because the outpost's network didn't have enough power to run it.


Edward Snowden [SOURCE: Wired]

When Snowden downloaded the 1.7 million NSA files, he was working at that government facility in Hawaii. 
 
It's currently unclear whether Snowden just happened to be placed at that facility or whether he requested the posting, according to reports. 
 
Nevertheless, this is just one more example of how Snowden outwitted the NSA. During his month at the NSA regional operations center in Hawaii last spring, Snowden conned between 20 and 25 NSA employees into giving him their logins and passwords. Snowden reportedly told the employees that he needed their passwords in order to do his job, and after downloading secret NSA documents, he leaked the information to the media.
 
Since the leaks, the floodgates have been opened. In August 2013, reports said the NSA admitted to touching 1.6 percent of total global Web traffic. Its technique was to filter data after harvesting it, which led to over-collection on a major scale.
 
Many top tech leaders, like Facebook CEO Mark Zuckerberg and Google Executive Chairman Eric Schmidt, have spoken out against the NSA's programs, along with civil-liberties advocates, U.S. citizens and even other countries that had the NSA peeping in their windows. 
 
Last month, a presidential review panel made 46 recommendations urging greater restraint in the NSA's surveillance programs, including an end to the bulk collection of data.
 
Snowden's thoughts on his role in the NSA revelations?
 
"Mission accomplished."

Source: The New York Times



Comments

By cfaalm on 2/10/2014 5:38:38 PM , Rating: 1
In the end it's all people working there. That, however, is a lame excuse for an organisation like the NSA. Snowden shouldn't have been able to get at that information. Now that he's done it, it proves two points: the NSA's own security lacks sophistication, and the NSA was gathering more information than people were comfortable with. They should thank him and get their shit together.
I have worked in a classified environment, and most people up the corporate ladder have to a certain degree. You NEVER give your password to anyone. Not the CEO, CFO, head of IT, no one.

