Posted by Jeannette Beltran | Posted in Uptime Institute Symposium | Posted on 30-07-2012
According to the 2012 Uptime Institute Data Center Industry Survey, 30% of data centers are rapidly running out of capacity. Many companies are seeking creative solutions to meet demand and contain costs, but most of the strategies deployed involve significant up-front capital, requiring investment in new hardware and server virtualization.
Uptime Institute was interested in what kind of efficiency gains and cost savings organizations could achieve by decommissioning existing hardware in their data centers.
The companies that participated in the first annual Uptime Institute Server Roundup documented the decommissioning of outdated, unused and power-draining equipment and shared their energy- and cost-saving accomplishments. In March 2012, the Uptime Institute announced that AOL had come out on top after clearing out 9,484 servers, for a total savings of $5.5 million.
Starting in January 2011, AOL involved its entire TechOps team in three separate but related initiatives, with the objectives of eliminating inefficient and abandoned servers; shutting down or merging extraneous applications; and increasing utilization with an internal cloud. This project spanned three data centers in the United States, one small domestic colocation facility, and a leased colo space overseas.
One major component of AOL’s strategy to consolidate and decommission servers was its transition to cloud computing. Like many organizations today, AOL was running out of capacity in its data centers, so the company sought to migrate data to a scalable private cloud that would allow for quicker deployment times and reduce the capex and opex required for IT expansion. The AOL Cloud project kicked off in July 2010, and by October 2011, the company had launched its cloud data center, ATC — a 100% lights-out facility with no full-time employees — dedicated to hosting its private cloud.
Sister initiatives Project Absurdity and Power Hog involved extensive reviews of the organization’s existing products, applications and data center assets. Between the two projects, AOL had to look at nearly every individual server in its facilities and discuss each server’s future with its owners. As its name suggests, Project Absurdity sought to identify “absurd” server applications and products; specifically, products that had been neglected, abandoned or replaced with newer technology. Once a product was identified, the AOL team determined whether it deserved re-investment, should be transitioned to another project, or should be shut down altogether.
The corresponding Power Hog initiative analyzed AOL’s hardware and assets, reviewing power consumption, CPU and memory utilization, and maintenance costs. Employees knew they and their servers had been “marked” as part of Power Hog when a trophy of a bronze pig appeared on their desks. (As AOL’s CTO Mike Manos said in a blog post, “I guess we were not above shame as a tactic.”) Of the 40,000 production hosts analyzed during the Power Hog project, the team decommissioned about 5,000, with an additional 2,400 hosts set to be decommissioned in the future. The investigation into AOL’s existing assets resulted in five outcomes: applications migrated to the cloud; applications migrated to a new non-cloud host; applications retired altogether; hosts consolidated; or hardware refreshed.
The 9,484 servers that were ultimately decommissioned equaled a 26% turnover of AOL’s IT assets. With each server operating at an average electrical cost of $174 per year, the decommissioning carved a savings of $1.4 million out of the company’s $13-million annual electric bill. Additionally, the consolidation saved $2.2 million in licensing costs and $62,400 in maintenance costs, and AOL recouped $1.2 million by recycling, scrapping and reselling old equipment, not to mention the avoidance of nearly 20 tons of carbon emissions. While the company did ultimately invest in 8,376 newer, more efficient servers, its net savings was still an impressive $4 million.
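As a quick sanity check, the itemized figures above can be tallied in a few lines. This is a back-of-the-envelope sketch using only the numbers reported in the article; the implied fleet-size estimate is my own extrapolation from the stated 26% turnover, not a figure AOL published.

```python
# Figures as reported in the article (USD).
decommissioned = 9_484            # servers removed, a 26% turnover of IT assets
electric_savings = 1_400_000      # out of a $13M annual electric bill
licensing_savings = 2_200_000
maintenance_savings = 62_400
recycling_proceeds = 1_200_000    # from recycling, scrapping and reselling gear

# Gross benefit before the cost of the 8,376 replacement servers.
gross_savings = (electric_savings + licensing_savings
                 + maintenance_savings + recycling_proceeds)
print(f"Itemized gross savings: ${gross_savings:,}")  # $4,862,400

# If 9,484 servers were 26% of the estate, the total fleet was roughly:
fleet_estimate = round(decommissioned / 0.26)
print(f"Implied fleet size: ~{fleet_estimate:,} assets")
```

Note that the itemized lines sum to roughly $4.9 million gross; the article's $4 million figure is the net after reinvestment in new hardware.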
At AOL, it was all part of what Manos calls clearing the “cruft.” In his blog, Manos describes cruft as “years of build-up of technology, processes, politics, fiscal orphaning and poor operational hygiene” that can be a huge barrier to an organization’s agility in its online and IT operations. With the lessons learned through the course of the three projects, the AOL team offers these recommendations to other organizations aiming to “clear the cruft”: Keep communication open with a broad audience; seek and secure ongoing executive sponsorship; stay committed to continuing progress; and, importantly, keep it fun to avoid employee burnout.
Knowing how common it is for old, inefficient, and even completely abandoned servers to drain power in today’s data centers, it shouldn’t be surprising that so many organizations are running into capacity constraints in their facilities. And while it’s arguably “more fun” to focus on the deployment of new products and technologies, the cost savings and carbon-emission reductions that AOL saw after only one year demonstrate just how much organizations stand to gain by zeroing in on inefficiencies in their existing infrastructures.