Social network offers a veritable blueprint for anyone who wants to build an energy efficient data center

When one in six people in the world is subscribed to your service and half of those access it daily, you're in a pretty unique position.  Or, more aptly, your name is probably Facebook.  The mega social network has been the topic of much scrutiny -- and even a Hollywood movie -- but it's doing something now that might surprise critics and supporters alike: it's sharing its secrets.

When a business cooks up a model for success, it typically keeps it close to its chest.  Take Google.  For all its talk of "not being evil" and of being open, Google has released precious little about how it achieves massive energy savings at its data centers.

But Facebook today announced [press release] a massive contribution to the Open Compute Project -- a wealth of information on how it has designed industry-leading servers and data centers capable of handling the hundreds of millions of users who ping its site daily.

I. The Servers

The announcement starts with the servers themselves.  They're 50 percent taller than typical rack-mounted servers (1.5U vs. 1U).  That extra space allows for bigger fans, but fewer of them.  The fan system, built by Rackable (now SGI), requires approximately 2 to 4 percent of the server's power budget, versus normal fans, which require 10 to 20 percent.
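The arithmetic behind that fan claim is easy to sketch.  The per-server power budget below is an assumed figure for illustration; only the percentages come from Facebook's release:

```python
# Rough sketch of the fan power savings described above.
# SERVER_BUDGET_W is a hypothetical per-server budget, not a
# figure from Facebook; the percentages are the quoted ones.

SERVER_BUDGET_W = 450  # assumed total power budget per server, in watts

conventional_fans_w = SERVER_BUDGET_W * 0.15  # typical fans: 10-20% of budget
custom_fans_w = SERVER_BUDGET_W * 0.03        # larger 1.5U fans: 2-4% of budget

freed_for_compute_w = conventional_fans_w - custom_fans_w
print(f"Power freed per server: {freed_for_compute_w:.0f} W")
```

Multiply that per-server figure by tens of thousands of machines and the motivation for the taller chassis becomes obvious.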

The extra energy is applied directly to the data center's intended purpose -- processing data from internet requests.  The super-servers come in two flavors.  The first is built around the 8- and 12-core Magny-Cours CPUs from Advanced Micro Devices, Inc. (AMD).  That build features the AMD SR5650 chipset for I/O and supports up to a maximum of 192GB of memory.  The second uses two Intel Corp. (INTC) Xeon 5500 or 5600 series processors, up to 144GB of memory, and an Intel 5500 I/O Hub chip.

The systems were designed by the world's two top computer makers, Hewlett-Packard Co. (HPQ) and Dell, Inc. (DELL).

The servers contain specially wired power supplies that allow them to use DC power directly, as well as AC power.  This helps prevent service outages in the event of a brownout.

Employees can build a server in three minutes, thanks to snap-in components.  Much like in today's top PC cases, hard drives slide into handy racks, side panels pop on and off easily, and more.

Speaking of the case, it ditches the front panel for easy maintenance (given the heavy airflow and air filtering, dust isn't a major concern).  It also ditches the decorations and logos that companies often plaster on servers.  The exterior is composed of 1.2mm zinc pre-plated, corrosion-resistant steel panels.

These changes could save 120 tons (!) of waste during the life of an average data center.

Facebook has shared the exact specs and CAD diagrams of the servers and battery backup systems.  Given this treasure trove of information, it shouldn't be too hard to duplicate the design.

II. The Farm

Facebook's green server super-farm is located in Prineville, Oregon.   

The farm took just two years to build -- pretty incredible, given the innovations packed into it.  The site does not use traditional air conditioning (!).  It uses direct evaporative cooling.  Thanks to the server design, a maintained temperature of 85°F at 65 percent relative humidity is sufficient for stable operation.
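Direct evaporative cooling is commonly modeled with a saturation-effectiveness formula: the supply air lands between the outdoor dry-bulb and wet-bulb temperatures.  The effectiveness and weather figures below are illustrative assumptions, not numbers from Facebook's release:

```python
# Minimal sketch of direct evaporative cooling, using the standard
# saturation-effectiveness model.  The effectiveness (0.85) and the
# outdoor conditions are assumed values for illustration only.

def evap_supply_temp_f(dry_bulb_f, wet_bulb_f, effectiveness=0.85):
    """Supply air temperature after a direct evaporative cooling stage."""
    return dry_bulb_f - effectiveness * (dry_bulb_f - wet_bulb_f)

# A hot, dry high-desert summer day (assumed): 95 F dry bulb, 63 F wet bulb.
supply = evap_supply_temp_f(95, 63)
print(f"Supply air: {supply:.1f} F")  # comfortably under the 85 F setpoint
```

The dry central-Oregon climate keeps the wet-bulb temperature low, which is exactly why evaporative cooling alone can hold the 85°F setpoint there.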

The site's electrical distribution operates at 277 volts, rather than the 208 volts typical for a data center.  This eliminates a transformer -- a source of efficiency loss.

The plant redirects the hot air from the servers in the winter to heat employee offices (the site houses a great number of dedicated IT and computer engineers).  In the summer the air intake sprays water to absorb some of the incoming air heat.

Lighting in the building is motion-sensitive LED lighting, saving even more on power costs.

The server farm is a curious, but ultra-efficient beast.  The number of innovations Facebook packs in is incredible.  What is even more incredible is that it's sharing such detailed information with the public and competitors.

The Prineville server farm was 24 percent cheaper than an "average" server farm of its size, according to Facebook (counting operational, maintenance, and deployment costs).  And it is 38 percent more power efficient.

In fact, the U.S. Environmental Protection Agency's voluntary efficiency guidelines suggest a power usage effectiveness (PUE) ratio (a measure defined [PDF] by The Green Grid) of 1.5.  Facebook's farm achieves an incredible PUE of 1.07.
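PUE is simply total facility power divided by the power reaching the IT equipment, so the gap between 1.5 and 1.07 translates directly into overhead watts.  The 10 MW IT load below is an assumed figure used only to make the comparison concrete:

```python
# PUE = total facility power / IT equipment power, per The Green
# Grid's definition.  The 10 MW IT load is an assumed illustration,
# not Prineville's actual load.

def pue(total_facility_kw, it_equipment_kw):
    """Power usage effectiveness: total facility power over IT power."""
    return total_facility_kw / it_equipment_kw

IT_LOAD_KW = 10_000      # assumed IT load for illustration
EPA_TARGET = 1.5
FACEBOOK_PUE = 1.07

overhead_epa = (EPA_TARGET - 1.0) * IT_LOAD_KW   # cooling, lighting, losses
overhead_fb = (FACEBOOK_PUE - 1.0) * IT_LOAD_KW

# Sanity check: reconstructing total power recovers the PUE.
assert abs(pue(IT_LOAD_KW + overhead_fb, IT_LOAD_KW) - FACEBOOK_PUE) < 1e-9

print(f"Overhead power saved: {overhead_epa - overhead_fb:,.0f} kW")
```

In other words, at a PUE of 1.07 only about 7 cents of every power dollar goes to anything other than computation.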

III. Conclusions

Digging deeper into Facebook's release absolutely seems a worthwhile endeavor for anyone in charge of setting up a large server farm.  The decision to release this information is a measure of Facebook's confidence that it is untouchable in the social networking business.

Some other companies have made similar disclosures.  Microsoft published information on how it rapidly sets up data centers.  Yahoo published information about "green" data centers that came complete with chicken coops.  But these publications lacked the detail and depth of Facebook's.

In that sense, Facebook is truly doing a service to the environment, national security, and large tech businesses.  Could the move backfire?  Most definitely.  Could this attempt to earn good karma be designed to counter the site's rampant privacy concerns?  Sure.  But who is the tech community to look such an appealing gift horse in the mouth?

"Game reviewers fought each other to write the most glowing coverage possible for the powerhouse Sony, MS systems. Reviewers flipped coins to see who would review the Nintendo Wii. The losers got stuck with the job." -- Andy Marken

Copyright 2017 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki