Posted by mstansberry | Posted in Data center infrastructure management, Uptime Institute Symposium | Posted on 07-03-2013
From Kevin Heslin, Research Manager with 451 Group.
The sessions at Uptime Institute Symposium 2013 will help you make better decisions regarding DCIM investment and implementation.
451 Research senior analyst Rhonda Ascierto will address the special considerations for assessing and procuring DCIM software. Many of the issues she will cover are crucial and worth underscoring, and they may not be part of a routine DCIM assessment or procurement process. Her presentation is based on a 56-page 451 Research report released in December 2012 titled Beyond the Basics: A Guide to Procuring DCIM Software.
The 451 Research team identified more than 100 questions that you should ask your DCIM vendor before purchasing a DCIM product. Asking the right questions reduces the risk associated with DCIM: it helps data center operators purchase the right level of service, avoid potential incompatibilities, and negotiate a suitable pricing model.
451 Research Vice President Andy Lawrence will deliver a keynote entitled The Disrupted Data Center, during which he will discuss advanced DCIM, among other technologies. In my view, Andy is the world’s foremost expert on the DCIM market, with a fantastic track record anticipating how DCIM will affect data center operations. He’s simply “the right guy” to ask about DCIM and a host of other data center issues.
Uptime Institute Director of Content Matt Stansberry will present the 2013 Data Center Industry Survey results, which delve into DCIM adoption statistics, pricing, key features and barriers to adoption. This data will provide you with an in-depth analysis of what kinds of systems your peers are buying, and how much they're paying for them.
I will also contribute to the program, moderating a panel of end users who have experience with DCIM implementation. I’ll ask them to describe the results they got and how those results compared with the benefits they expected. We’ll also talk about product selection, pricing, and lessons learned. What’s more, I expect to explore any difficulties the panelists had during the procurement and application processes and find out how they resolved these issues.
Register today for Uptime Institute Symposium. I look forward to seeing you this May in Santa Clara!
Posted by mstansberry | Posted in Data center infrastructure management, Uptime Institute Symposium, Uptime Symposium | Posted on 12-04-2012
There’s no question that the cost and capacity demands of today’s high-density computing environments require a greater level of visibility and control than ever before. And data center infrastructure management, or DCIM, is a critical integrating technology that fills the need for real-time information. But just as data centers are evolving, DCIM technology is evolving even faster. Simple tools with monitoring capabilities are developing into sophisticated control systems, and asset systems are evolving into auto-populating configuration management systems. How much can DCIM do? And where does it go next?
Even once-skeptical organizations are buying into what DCIM has to offer, but face challenges selecting and deploying these tools. As adoption of DCIM technologies continues to grow, organizations struggle with varying vendor pricing models that make comparisons difficult. There are also issues with scalability and integration, as well as the question of DCIM coexisting with legacy systems.
In this session, Andy Lawrence, Research Director at 451 Research, will explain the current capabilities and limitations of DCIM software; explore the emerging IT convergence trend; and offer insight into the leading vendors and available toolsets.
Posted by mstansberry | Posted in Data center design, Data center energy efficiency, Data center infrastructure management, Uptime Institute Symposium, Uptime Symposium | Posted on 05-04-2012
In 2009, Deutsche Bank announced its commitment to achieving an unprecedented level of environmental sustainability in its IT operations. Among the objectives in the bank's eight-step program were overhauling its computing infrastructure with the latest hardware, neutralizing its carbon footprint, and quadrupling the energy efficiency of its data centers. As part of Deutsche Bank's green data center strategy, the organization took a step that was considered both innovative and remarkably risky when it chose to use outside air for free cooling in a mission-critical data center, a move that was largely uncharted territory for the traditionally conservative banking industry.
But fast forward to the present day, and the efforts are already paying off. Last year, the bank met its data center energy-efficiency goal one year ahead of schedule. And in January 2011, Deutsche Bank opened its Eco Data Center in the New York City area, which uses air-side economization to provide free cooling. In a video from last year’s Uptime Institute Symposium, Deutsche Bank chief scientist Andrew Stokes describes how the organization had to contend with the risks of running mission-critical workloads during major infrastructure changes that had never before been attempted in the field.
Stokes will return to this year’s Uptime Symposium to share key lessons Deutsche Bank learned in its first year operating the Eco Data Center. Find out how air-side economization and evaporative cooling, often considered appropriate only for test and dev, effectively handle mission-critical workloads and significantly reduce energy costs for one of the world’s largest financial institutions.
Some of the topic areas Stokes will cover include the following:
- IT equipment performance under varying temperatures
- data center air flow and room design
- measuring system-wide efficiency instead of power usage effectiveness
- system tuning and optimization
- the realities of air-side economization and free air cooling in the New York City climate
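To illustrate the distinction behind "system-wide efficiency instead of power usage effectiveness," here is a generic sketch (not Deutsche Bank's actual methodology, and all figures are hypothetical): two sites can have identical PUE while delivering very different amounts of useful work per facility kilowatt.

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power."""
    return total_facility_kw / it_kw

# Two hypothetical sites with identical PUE of 1.5...
site_a = pue(1500.0, 1000.0)
site_b = pue(750.0, 500.0)

# ...but if both complete the same workload (say, 10,000 transactions/hr),
# a system-wide metric such as useful work per facility kilowatt
# separates them where PUE cannot.
work = 10_000  # hypothetical transactions per hour at each site
efficiency_a = work / 1500.0  # roughly 6.7 transactions per facility kW
efficiency_b = work / 750.0   # roughly 13.3 transactions per facility kW
print(site_a, site_b, efficiency_a < efficiency_b)
```

Site B does the same work on half the facility power, so a work-per-kilowatt metric ranks it higher even though PUE alone rates the two sites as equal.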
Posted by mstansberry | Posted in Data center consolidation, Data center infrastructure management, Data center operations, IT and Facilities Management Integration, Uptime Institute Symposium | Posted on 09-12-2011
Response to the Inaugural Uptime Institute Server Roundup Contest has been great so far, with teams signing up from around the world to participate. The goal of the event is to remove obsolete servers, save energy, and save money. Decommissioning a single 1U rack server can save $500 per year in energy, another $500 in operating system licenses, and $1,500 in hardware maintenance costs. That's not chump change.
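The arithmetic behind those figures is worth spelling out; this short sketch uses only the per-server numbers quoted above (the 40-server fleet is a hypothetical example):

```python
# Annual savings from decommissioning one obsolete 1U rack server,
# using the per-server figures quoted above.
energy_savings = 500        # $/year in energy
os_license_savings = 500    # $/year in operating system licenses
maintenance_savings = 1500  # $/year in hardware maintenance

per_server = energy_savings + os_license_savings + maintenance_savings
print(per_server)        # 2500

# A hypothetical roundup of 40 comatose servers:
fleet = 40
print(per_server * fleet)  # 100000
```

At $2,500 per server per year, even a modest decommissioning effort adds up quickly.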
Winners of the contest will receive one of these beautiful rodeo belt buckles, just finished by cowboy artisans in Texas:
Winners will also be honored at Uptime Institute Symposium 2012, and their case studies will be featured as sessions.
Posted by mstansberry | Posted in Data center colocation, Data center energy efficiency, Data center infrastructure management, Data Center Metrics, IT and Facilities Management Integration, Uptime Tier Certification Awards | Posted on 08-11-2011
Family-owned Central Oregon cable company BendBroadband only recently decided to get into the data center business, but in that short time the organization has quickly proven it wants to be a leader in the industry.
The company earned Uptime Institute Tier III Facility Certification, and was the first site to be certified with a Kyoto Cooling system. In fact, the BendBroadband Vault is one of the biggest Kyoto Cooling installations in North America.
BendBroadband was also awarded the U.S. Green Building Council's LEED Gold status, under the LEED 2009 Building Design and Construction (LEED BD+C) rating system. From the BendBroadband blog: "This is a monumental achievement and one that we have been working toward since the initial concept stage of the Vault. This certification puts us in the upper echelon of data centers and makes us only the 5th data center in the world to attain this level."
BendBroadband is also an early adopter of Data Center Infrastructure Management (DCIM) software. The company recently announced it is using nlyte software for capacity planning.
I spoke with Steven Hall, Data Center Director at BendBroadband, about the company's DCIM use. Hall uses nlyte to help the organization bring new customers into the data center. Clients provide BendBroadband with a list of the servers they want to deploy; Hall plugs those models into nlyte to plan how much space, power, and cooling the workload will need, and to offer various options for deploying the equipment.
“We use it as part of our on-boarding process,” Hall said. “We help customers take a quick look at different cabinet layouts. Do they need a high density cabinet, or should we spread it out over two cabinets? The tool was perfect for that.”
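The on-boarding calculation Hall describes can be sketched generically. This is not nlyte's actual API or algorithm; the model catalog, cabinet limits, and server order below are all hypothetical, and the sketch only shows the kind of space-and-power check such a tool performs.

```python
# Hypothetical catalog mapping server model names to (rack units, watts).
SERVER_SPECS = {
    "1U-web": (1, 350),
    "2U-db": (2, 750),
}

def cabinet_demand(servers):
    """Total rack units and watts needed for a list of server model names."""
    rack_units = sum(SERVER_SPECS[model][0] for model in servers)
    watts = sum(SERVER_SPECS[model][1] for model in servers)
    return rack_units, watts

def layout_options(servers, ru_per_cabinet=42, watts_per_cabinet=5000):
    """Check whether the order fits one cabinet under assumed limits."""
    rack_units, watts = cabinet_demand(servers)
    fits_one = rack_units <= ru_per_cabinet and watts <= watts_per_cabinet
    return {"rack_units": rack_units, "watts": watts,
            "fits_one_cabinet": fits_one}

# A hypothetical client order: ten 1U web servers and four 2U databases.
order = ["1U-web"] * 10 + ["2U-db"] * 4
print(layout_options(order))
```

Here the order fits in 18 rack units but draws 6,500 W, exceeding the assumed 5 kW cabinet budget, so the tool would recommend spreading the equipment over two cabinets, the same high-density-versus-split decision Hall describes.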