Facebook ekes IT efficiency from sleek data centre design

Facebook's Marco Magarelli.

Facebook data centre designer Marco Magarelli has given the Australian industry a detailed insight into how careful site selection, new construction methodologies, free air cooling and custom servers have enabled the social network to deliver industry-leading power efficiency at its three main data centres.

Magarelli deconstructed the master plans for Facebook's three campuses — Prineville, Forest City and the first stage of Luleå in Sweden — for attendees of the Australian Data Centre Strategy Summit.

Facebook has taken a greenfields view of every aspect of the data centre build.

Running a single application at huge scale allowed its engineers to design their own server and rack configuration to minimise the need for mechanical cooling.

Its servers are fitted with a custom power supply that accepts 480/277VAC directly, skipping separate power conversion and UPS systems and cutting power losses on the way to the server by 13 percent.
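
The gain comes from stripping conversion stages out of the power path, since each stage wastes a slice of whatever passes through it. The sketch below is illustrative only, not Facebook's published electrical design: the stage efficiencies are placeholder assumptions chosen simply to show how chained conversions compound.

    # Illustrative only: placeholder stage efficiencies, not figures published
    # by Facebook or the Open Compute Project.

    def delivered_power(grid_draw_w, stage_efficiencies):
        """Chain the efficiency of each conversion stage between grid and server."""
        power = grid_draw_w
        for efficiency in stage_efficiencies:
            power *= efficiency
        return power

    GRID_DRAW_W = 1000.0

    # Conventional path: double-conversion UPS, PDU transformer, server power supply.
    conventional_w = delivered_power(GRID_DRAW_W, [0.93, 0.97, 0.90])

    # Direct path: 277VAC fed straight into a custom server power supply.
    direct_w = delivered_power(GRID_DRAW_W, [0.945])

    print(f"Conventional chain: {conventional_w:.0f} W of every 1,000 W reaches the server")
    print(f"Direct 277VAC feed: {direct_w:.0f} W of every 1,000 W reaches the server")
    print(f"Losses trimmed by about {(direct_w - conventional_w) / GRID_DRAW_W:.0%} of the grid draw")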

The Prineville data centre has achieved a PUE (power usage effectiveness) of a staggering 1.08.
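
PUE is total facility power divided by the power that actually reaches the IT equipment, so 1.08 means only around 0.08 watts of cooling, power distribution and other overhead for every watt the servers draw. A minimal worked example, using made-up round-number loads chosen to land on that ratio:

    # PUE = total facility power / IT equipment power.
    # Load figures are made-up round numbers chosen to illustrate a 1.08 ratio.

    it_load_kw = 1000.0    # servers, storage and network gear
    overhead_kw = 80.0     # cooling, power distribution losses, lighting, etc.

    total_facility_kw = it_load_kw + overhead_kw
    pue = total_facility_kw / it_load_kw

    print(f"PUE = {pue:.2f}")                                                 # 1.08
    print(f"Overhead per watt of IT load: {overhead_kw / it_load_kw:.2f} W")  # 0.08 W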

At its new Luleå site, close to the Arctic Circle, the team has also reduced the number of emergency generators it needs, and holds 70 percent less fuel on site as a result.

Magarelli said the company took an iterative approach to site and facility design, gathering key lessons from its own work and from industry via the Open Compute Project and applying them to newer builds.

Facebook makes many of the data centre design specifications it has come up with publicly available via the project.

"Growth is all about iteration," he said. "We're constantly evolving, exploring new ideas."

Site selection

Magarelli said Facebook has a team responsible for sourcing potential locations for data centre infrastructure, with candidate sites qualified against criteria such as the availability of utilities and of fibre with protected routes.

"We looked at finding the most regular shaped parcel of land possible, with the idea here that we would be deploying not just one building but potentially a campus environment," Magarelli said.

"We want a site that's large enough that we can locate not just our current needs but make sure we also have some availability for buildings ... or services that we might not yet know [are] on our horizon."

Selecting a large site also helps Facebook maximise the return on its capital investment.

"When we spend money bringing [utility and fibre] services to the site, we want to be able to leverage all that capital investment that we've put into the location," he said.

The ability to use outside air to cool proposed facilities is also a key site selection consideration.

"What was important to us was to know the prevailing wind direction, because we wanted to be able to capture the wind - like you would in a boat sail," Magarelli said.

"We would [also] construct our projects from upwind down so that  during construction we're not throwing dust into the [air] intakes of our operating data centres.

"The other reason why [prevailing wind direction is] important is that when we do test the generators, or we need to go on generators due to a utility interruption, we minimise the chances of recirculating any of that exhaust heat and exhaust flue into the data centres themselves."

Magarelli said the initial Prineville campus design focused on minimising the impact of putting a data centre on the site.

"The footprint of the building was [as it is] to minimise the amount of disruption to the soil, to the siting area," he said, noting Facebook sought to replant displaced vegetation and reuse excavation rock for the site landscaping.

Materials for the actual buildings that house the data halls had to be "rugged, durable, [and] have a certain harmony with the surroundings of the site".

At Prineville, pre-cast concrete panels, corrugated metal and insulated glass panels were among the materials used.

The choice of materials at Prineville was also in part driven by a desire to "minimise the amount of truck traffic" that the site required, thus adding to its environmental credentials.

Facebook's iterative approach to data centre design is evident in the changes in building materials between Prineville, Forest City and Luleå.

"Both Prineville and Forest City used precast concrete panels. In Forest City we also introduced insulated precast panels," Magarelli said.

"In Luleå, the material of choice is insulated metal panels.

"We've kind of adopted that in our subsequent projects, so much so that we are now using it as interior partitions within the actual data centre penthouse spaces.

"It goes up very quickly, again reducing the amount of material that one would generate in the construction of one of our centres."
