What do you get if you cross 60 trillion unique URLs, 100 billion monthly search queries, 4.8 million active applications on Google Cloud Platform, and 6.3 trillion monthly operations in Cloud Datastore? One massively scalable global infrastructure, and yet a zero carbon footprint. That is, if you do it the Google way.

This year’s report from Greenpeace, Clicking Clean: How Companies are Creating the Green Internet, takes a look at how the world’s largest data center operators are contributing to building a green internet. In this report, Google is recognized for innovating new ways to power our data centers. Greenpeace commends Google for investing over $1 billion in 16 renewable energy projects, resulting in 2 GW of clean power. They also recognize us for innovation in energy efficiency in our data centers, as well as the work we are doing to change US energy policy to make renewable energy more accessible.
“Google maintains its leadership in building a renewably powered internet, as it significantly expands its renewable energy purchasing and investment both independently and through collaboration with its utility vendors.”
-- Greenpeace Clicking Clean Report, page 6
Across our data center infrastructure, we work hard to minimize the environmental impact of our services. Through intense energy efficiency efforts, renewable power purchasing, and carbon offset procurement, the net carbon emissions of our global operations are zero.

So how did we get here? From day 1, the Google infrastructure was designed for scalability, security, and efficiency to deliver global search. Today, Google delivers hundreds of global services on our infrastructure, including YouTube, Gmail and Google Cloud Platform. By building a single infrastructure on which we run all our services, we increase the speed at which we can drive energy savings across our global fleet of data centers. The result is a steadily falling PUE (power usage effectiveness), a measure of how efficiently a data center uses energy; the lower the PUE, the better.
With a fleet-wide PUE of 1.12, and real-time PUEs below 1.09, we use far less energy to host services in our data centers than typical facilities require. To put this into context, the Uptime Institute estimates that in 2013, PUE averaged 1.65 across the 1000 data center users it surveyed.
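For readers unfamiliar with the metric: PUE is simply total facility energy divided by the energy delivered to IT equipment. A minimal sketch of that calculation, using illustrative numbers (not actual Google measurements) chosen to match the figures above:

```python
def pue(total_facility_energy_kwh: float, it_equipment_energy_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy (servers plus
    cooling, power distribution, lighting, etc.) divided by the energy
    delivered to IT equipment. A PUE of 1.0 would mean every watt goes
    to computing; lower is better."""
    return total_facility_energy_kwh / it_equipment_energy_kwh

# Hypothetical facility drawing 1,120 kWh overall to power 1,000 kWh of IT load:
print(round(pue(1120, 1000), 2))  # 1.12
# And one matching the 2013 industry average cited above:
print(round(pue(1650, 1000), 2))  # 1.65
```

In other words, a facility at PUE 1.12 spends 12 cents of overhead energy for every dollar's worth of computing energy, versus 65 cents at the surveyed average.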

We never stop searching for new, cutting-edge ways to reduce PUE. Today, we announced that a small group within the data center operations team has come up with a way to use machine learning, an approach similar to those used in speech recognition, self-driving cars, and other innovative technology, to improve performance, opening up many new opportunities to save energy.
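To give a flavor of the general idea, the sketch below fits a simple linear model that predicts PUE from operating conditions, then queries it for candidate settings. This is purely illustrative: the features (outside temperature, IT load fraction), the toy relationship generating the data, and the model itself are all hypothetical stand-ins, not Google's actual system, which uses far richer sensor data.

```python
import random

random.seed(0)

def synth_pue(temp_c: float, load_frac: float) -> float:
    # Assumed toy relationship for demonstration: hotter weather and
    # lighter IT load both push PUE up, plus a little sensor noise.
    return 1.05 + 0.004 * temp_c + 0.08 * (1 - load_frac) + random.gauss(0, 0.002)

# Hypothetical historical observations: (temperature, load) -> observed PUE.
obs = [(t, l) for t in range(5, 35, 3) for l in (0.5, 0.7, 0.9)]
X = [(1.0, t, l) for t, l in obs]  # leading 1.0 is the bias term
y = [synth_pue(t, l) for t, l in obs]

def fit(X, y):
    """Least-squares fit via the normal equations, solved with
    Gaussian elimination -- a stand-in for a real ML library."""
    n = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
    for i in range(n):  # forward elimination with partial pivoting
        p = max(range(i, n), key=lambda k: abs(A[k][i]))
        A[i], A[p], b[i], b[p] = A[p], A[i], b[p], b[i]
        for k in range(i + 1, n):
            f = A[k][i] / A[i][i]
            A[k] = [a - f * c for a, c in zip(A[k], A[i])]
            b[k] -= f * b[i]
    w = [0.0] * n
    for i in reversed(range(n)):  # back substitution
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, n))) / A[i][i]
    return w

w = fit(X, y)

def predict(temp_c: float, load_frac: float) -> float:
    return w[0] + w[1] * temp_c + w[2] * load_frac

# An operator could compare predicted PUE across candidate operating
# points and steer toward the settings the model says are cheapest.
print(f"predicted PUE at 10C, 90% load: {predict(10, 0.9):.3f}")
```

The payoff of this kind of model is that once it predicts PUE accurately from conditions you can control, you can search those conditions for settings that lower it, rather than tuning by trial and error.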

Another benefit of designing, building, and running our own facilities is that we control the whole system, so we can optimize the cooling choice for each site’s location. We always choose ‘free cooling’ when we can, whether by using outside air or alternative water sources such as seawater, industrial canal water, or municipal wastewater. When we use water, we ‘recycle’ it by reusing it multiple times, then clean it and return it to the environment in better shape than we received it. In either case, we rely on natural processes, air flow or evaporation, to cool our data centers rather than mechanical chillers.

So whether you measure ‘greenness’ by how little energy it takes to run our infrastructure, how much renewable energy we’ve enabled, our net carbon emissions, or our rating in the Greenpeace report, you can rest assured that you minimize your environmental impact by choosing Google.

(More information on Google data centers, the engine that hosts Search, Drive, Cloud Platform, YouTube and more can be found on the Google Data Centers and The Big Picture web pages.)

-Posted by Joyce Dickerson, Data Center Sustainability