Yayyyyy cloud, Inc.'s (AMZN) S3 cloud storage solution is once again under the spotlight of scrutiny following a report by security firm Rapid 7.  The report by the firm's Senior Security Consultant Will Vandevanter complains that poor security in the cloud may be endangering scores of customers worldwide.

But before you blame Amazon, Mr. Vandevanter makes it clear that it is the businesses that are to blame for the exposure of sensitive customer data.

The service provider, Amazon, gives its customers the ability to make their data private.  But according to the report, approximately one in six Amazon S3 buckets are left public, and many of those contain sensitive data.  It's hard to tell whether this is mere incompetence on the part of firms using the service, or whether there is some justification for the dangerous practice, but it's clearly bad news for customers -- and perhaps unfair bad publicity for the service provider, Amazon.

In his post "There's a Hole in 1,951 Amazon S3 Buckets", Mr. Vandevanter blogs:

The worst case scenario is that a bucket has been marked as "public", exposes a list of sensitive files, and no access controls have been placed on those files. In situations where the bucket is public, but the files are locked down, sensitive information can still be exposed through the file names themselves, such as the names of customers or how frequently a particular application is backed up.
It should be emphasized that a public bucket is not a risk created by Amazon but rather a misconfiguration caused by the owner of the bucket. And although a file might be listed in a bucket it does not necessarily mean that it can be downloaded. Buckets and objects have their own access control lists (ACLs).  Amazon provides information on managing access controls for buckets here. Furthermore, Amazon helps their users by publishing a best practices document on public access considerations around S3 buckets. The default configuration of an S3 bucket is private.

While Amazon is not directly to blame, the structure of S3 URLs does exacerbate the problem, according to Mr. Vandevanter.  As Amazon expects its users to make use of its robust privacy options when necessary, it gears its public S3 URLs for ease of use.  As a result, it's very easy to guess the URL names needed to get access to public records.

[Image: Amazon S3 bucket -- Mr. Vandevanter demonstrates how companies are goofing on cloud security. Source: BrightCove/Rapid7]

The URL of a bucket takes one of two forms -- http://s3.amazonaws.com/[bucket_name]/ or http://[bucket_name].s3.amazonaws.com/; private buckets return an "Access Denied" error, while public buckets list their first 1,000 records.
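The public-versus-private distinction can be read straight off the listing response: S3 answers a listing request on a public bucket with a ListBucketResult XML document, and a private one with a 403 AccessDenied error. A minimal sketch of that classification (the function name and the canned XML sample are illustrative, not from the report):

```python
import xml.etree.ElementTree as ET

def classify_bucket_listing(status_code: int, body: str) -> str:
    """Classify an S3 bucket from the response to GET http://s3.amazonaws.com/[bucket_name]/."""
    if status_code == 404:
        return "nonexistent"
    if status_code == 403:
        return "private"  # AccessDenied: the bucket exists but listing is blocked
    root = ET.fromstring(body)
    # Public buckets answer with a ListBucketResult document (up to 1,000 keys per page)
    if root.tag.endswith("ListBucketResult"):
        keys = [el.text for el in root.iter() if el.tag.endswith("Key")]
        return f"public ({len(keys)} keys listed)"
    return "unknown"

# Canned example of a public bucket's listing response
sample = (
    '<?xml version="1.0"?>'
    '<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">'
    '<Name>example-bucket</Name>'
    '<Contents><Key>backups/db-2013-03-01.sql</Key></Contents>'
    '</ListBucketResult>'
)
print(classify_bucket_listing(200, sample))  # public (1 keys listed)
```

Note how even a "locked down" file still leaks its name through the listing, which is exactly the file-name exposure the report describes.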

To guess the bucket names, the researcher used several sources:
  1. Guessing names through a few different dictionaries:
    • List of Fortune 1000 company names with permutations on .com, -backup, and -media. For example, walmart becomes walmart, walmart.com, walmart-backup, and walmart-media.
    • List of the top Alexa 100,000 sites with permutations on the TLD and www. For example, walmart.com becomes walmart.com, www.walmart.com, www.walmart, and walmart.
  2. Extracting S3 links from the HTTP responses identified by the Critical.IO project. This enabled the identification of s3.amazonaws.com and amazonaws.com addresses "in the wild". It is very common for an amazonaws.com address to point to an S3 bucket.
  3. The Bing Search API was queried to gather a list of potential bucket names.
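The dictionary step above amounts to a simple permutation generator. A sketch, using the suffixes and the walmart example given in the report (the function name is an illustrative choice):

```python
def candidate_bucket_names(company: str) -> list[str]:
    """Generate candidate S3 bucket names from a company name,
    mirroring the permutations described in the Rapid7 report."""
    suffixes = ["", ".com", "-backup", "-media"]
    return [company + s for s in suffixes]

print(candidate_bucket_names("walmart"))
# ['walmart', 'walmart.com', 'walmart-backup', 'walmart-media']
```

Each candidate would then be tried against the S3 URL forms above to see whether it resolves to a real bucket.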
The results allowed the discovery of 12,328 unique buckets -- 1,951 of which were public, and 10,377 of which were private.  In total he catalogued 126 billion file names from the open buckets -- including files that appeared to contain private user data.

Robin Wood aka "DigiNinja" has published a public web tool that uses similar sources to generate a list of S3 bucket names.

An exasperated Amazon has told Mr. Vandevanter that it is "currently putting measures in place to proactively identify misconfigured files and buckets moving forward."

In other words, Amazon realizes that some of its clients are too incompetent to manage their own security settings, so it's trying to take on the responsibility of double-checking their work.

But fixing the settings retroactively may not protect users fully.  Mr. Vandevanter writes:

Also, the WayBackMachine is a great resource to identify previously open buckets. Using a modified version of @mubix’s Metasploit module, I also quickly identified a few hundred buckets that are currently private that previously weren’t.
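Checking whether a now-private bucket was once open can be done through the Internet Archive's CDX API, which returns the archive's record of past snapshots for a URL. A sketch that only builds the query (no request is sent; the bucket name is illustrative, and this is not the Metasploit module the post used):

```python
from urllib.parse import urlencode

def wayback_cdx_query(bucket_name: str) -> str:
    """Build an Internet Archive CDX API query for archived
    snapshots of an S3 bucket's listing URL."""
    params = {
        "url": f"s3.amazonaws.com/{bucket_name}/",
        "output": "json",
        "filter": "statuscode:200",  # only snapshots where the listing succeeded
    }
    return "http://web.archive.org/cdx/search/cdx?" + urlencode(params)

print(wayback_cdx_query("example-bucket"))
```

Any 200-status snapshot of the listing URL means the bucket was publicly listable at the time of capture, even if it is locked down today.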

Looks like the only real solution is to not move to the cloud if you can't handle simple security of the hand-holding variety (which appears to be the case for some firms).

Side Note:

This security "study" is similar to the hack done by Goatse Security researcher Andrew Auernheimer.  Showcasing the ambiguity of computer security laws, Mr. Vandevanter will likely be praised for his work (and has his security firm's backing to protect him if he gets charged or sued), whereas Mr. Auernheimer's similar dump of exposed AT&T, Inc. (T) customer data earned him a 41-month prison sentence.  The message seems to be that if you work for corporate security, feel free to probe away, but if you work as an independent security researcher, prepare to be harassed and sent to prison due to the U.S.'s poorly written computer crime laws.

Sources: Rapid7 [blog], [video]

"I'm an Internet expert too. It's all right to wire the industrial zone only, but there are many problems if other regions of the North are wired." -- North Korean Supreme Commander Kim Jong-il
