Posted by mstansberry | Posted in Cloud Computing, Uptime Institute Symposium | Posted on 19-03-2013
Tags: #Uptime13, Microsoft, Rostelecom, Twitter
In this blog post, we’re focusing on the invited Symposium Keynote speakers – three individuals who are responsible for delivering data center services around the globe.
Twitter’s data center evolution
Anoop Mavath, Director of Global Data Center Services at Twitter, will explain how his company has deployed data center infrastructure to keep pace with the constraints and changing demands of one of the most dynamic workloads on the Web. Mavath will discuss design specifications, financing models, vendor management, dynamic growth, geographic presence, energy efficiency and his company’s green programs.
The cloud will change the data center … Forever
Christian Belady, General Manager of Data Center Services at Microsoft, will present on how the latest trends in cloud computing may impact your data centers: cloud-optimized servers, fail-in-place and self-healing IT, and a shift to OpEx-driven accounting. Belady will also share his organization’s experience building out cloud infrastructure to support more than 200 cloud services for over a billion customers.
Building Russia’s cloud computing platform
Rostelecom is Russia’s largest national telecommunications operator, with a presence in all Russian regions. In 2011, the organization launched a project to develop a national cloud computing platform. Rostelecom has started building its main data center in Moscow, which at around 37 megawatts and 40,000 square meters will become the largest in Russia. The organization will also build smaller, 3MW data centers in regional locations. The success of any data center build-out is defined by how efficiently the various project teams collaborate. Rostelecom’s Senior Project Director Alexander Martynyuk will discuss how his organization is meeting priorities, deadlines and budget on this complex project.
Register today for Uptime Institute Symposium. We look forward to seeing you this May in Santa Clara!
Posted by mstansberry | Posted in Cloud Computing, Data center colocation, Uptime Institute Symposium | Posted on 13-03-2013
A 2012 Uptime Institute survey of global data center owners revealed that 85% use some form of colocation or cloud computing. Yet 54% had no confidence in their ability to compare cost and performance of outsourcing alternatives dependably.
Currently, the vast majority of enterprises deploy a hybrid computing environment. The decision is no longer binary—whether or not to outsource. It’s multi-faceted—how many and how much of each alternative to deploy.
The sessions at Uptime Institute Symposium will instill confidence in your decisions when working with a third party service provider.
FORCSS: A framework for effective communication and decision making
The Special Focus of Symposium 2013 is Uptime Institute’s FORCSS™ Methodology, a means to capture, compare, and prioritize the financial, risk, and performance factors that impact IT-dependent business decisions.
FORCSS recognizes and weighs the benefits and exposures of IT or applications deployment alternatives: internal data center(s), colocation, hosting, or public cloud.
Uptime Institute will present the methodology, and host a series of executive FORCSS panels that will explore how these decisions are made at large IT organizations:
• Methods for Determining Comparative Cost of IT Service Delivery
• Quantifying the Cost of an IT Service Outage
• How to Conduct Service Quality Analysis for Internal and Third-Party Data Centers
In addition to the FORCSS programming, Uptime Institute Symposium will provide even more expert advice on how to navigate the third-party service provider landscape.
The why and how of third-party data center due diligence
Many recent media reports describe enterprise IT staff’s “shock” at a third-party data center outage. But upfront due diligence could have provided appropriate insight into the performance potential of a specific solution. Investigating and accepting the specific capabilities of third-party data center services should be baked into your organization’s outsourced digital infrastructure strategy. Keith Klesner of Uptime Institute will share recent downtime anecdotes and explain how to be a better customer of third-party data center services.
Large enterprises and the cloud: Time to make the move?
In this session, Ken Male will discuss how IT organizations in Global 2000 enterprises are evolving toward hybrid cloud architectures. Relying on findings from TheInfoPro’s full portfolio of research, Ken will share information about barriers to adoption and enabling technologies in cloud environments, including software-defined networking, storage and network automation, converged infrastructure platforms, and orchestration frameworks. TheInfoPro’s surveys also track the performance of cloud vendors such as Amazon, Verizon, Rackspace, and HP.
Register today for Uptime Institute Symposium. I look forward to seeing you this May in Santa Clara!
Posted by mstansberry | Posted in Cloud Computing, Data center availability, Data center energy efficiency, Data center media | Posted on 04-10-2012
This is the second blog post in a series that features continuing discussion with various senior staff of the Uptime Institute in response to the New York Times feature on data center energy use. The following was drawn from a discussion with Julian Kudritzki, Senior Vice President of Uptime Institute.
Many data centers are inherently wasteful due to the governing (mis)economics of cheap, high-availability computing. As long as IT reaps economic benefits divorced from true costs, discipline in server procurement, utilization, and data center management remains a ways off.
These favorable economic conditions delay the motivation to change, but once they dissipate, our industry will be transformed.
Hydro power west of the 100th meridian in the US is fundamentally government subsidized (see Marc Reisner’s Cadillac Desert). Federal investment in the Grand Coulee Dam generated massive amounts of inexpensive and reliable power. Today’s result is that Quincy, Washington is a desirable data center location. Power commissions seek data centers as the ideal customers for high concentrations of power. Now those bulk power deals are unavailable in the Pacific Northwest.
Costs are no longer sub-3 cents, but susceptible to market fluctuations. Arguably, Quincy’s data centers are the last IT beneficiaries of a federal spend made over 50 years ago. If new data centers flock to available bulk power, which is dwindling nationally, the cost repercussions will compel a new data center mindset that looks at IT provisioning, utilization, and data center capacity deployment with a far more clinical and harsh eye. And if we look beyond western hydro, there will be cost and lifecycle consequences of the existing carbon-intensive power generation.
The commodity server model allows for cheap, short-lived deployments. But commodity servers are viable only because leveraged global labor pools and variable environmental regulations make it cheap to dispose of the troublesome contents of a decommissioned server. As with hybrid cars, the sticker price does not reflect the cost of throwing one away. If server recycling (i.e., teardown), now outsourced overseas, were performed onshore at the scale of disposal, what would the total cost of a throwaway server become?
The consumer’s approach would evolve toward extending the life of that server rather than replacing it. The irregular, need-based operation of diesel engine generators pales in environmental comparison to the toxicity of server disposal. The Uptime Institute survey shows that only 20% of IT departments pay their own power bill. Is it safe to assume that the same percentage pay their own garbage bill?
For the enterprise, data centers and their contents are often treated as a cost center. For the IT and data center teams, the mission of uninterrupted uptime has been paramount. Thus, the prevailing management mode has been to hold one’s nose and sign the check. Budget reductions will threaten headcount in operations teams, but overall data center budgets continue to grow.
The unintended result of these favorable economic conditions is an unrealistic and unsustainable end-user mindset: all IT functions available all the time. Many applications whose business value is not of the highest order have luxuriated in ‘buy new now’ servers and an enterprise-grade data center platform (i.e., power, cooling, monitoring, and automation infrastructure). Forward-thinking enterprises have been analyzing applications and distributing them so that data center infrastructure-level support matches business value. But these leading companies are not representative of prevalent industry behavior.
Debating whether server utilization is 7% or 12% or even 20% is a distraction. The real issue is the economic factors that allow those low numbers to persist. There is a move afoot to compress IT, such as through virtual server instances. But the fact that this is only an emergent trend shows how far, and how fast, we must travel.
OUTCOME: A BOON FOR OUTSOURCERS
A disruption in the economic conditions that IT has been enjoying will compel a new level of discipline and consequence in IT decision making. And, outsource alternatives will rush to propose a host of solutions to this economic crisis.
Many of these options will be so complex, or have such efficiencies of scale, that more enterprises will continue to divest themselves of data center facilities or entire IT assets.
A 2012 Uptime Institute survey of global owners revealed that 85% utilize colocation or hosting. Yet 54% had no confidence in their ability to compare outsourcing alternatives dependably. This is to the detriment of both enterprises and providers, as it calls into question the basis and viability of such commitments.
Previous IT decisions were based upon a narrower set of competitive offerings. Currently, the vast majority of enterprises deploy a hybrid computing environment. The decision is no longer binary—whether or not to outsource. It’s multi-faceted—how many and how much of each alternative to deploy.
Uptime Institute has been intensively developing the FORCSS methodology to weigh deployment alternatives with a full look at the major benefits and constraints of these options. FORCSS is the theme of Uptime Institute Symposium 2013. For more information on FORCSS click here.
Posted by mstansberry | Posted in Cloud Computing | Posted on 06-09-2012
451 Research’s 8th Annual Hosting & Cloud Transformation Summit North America (HCTS NA), North America’s go-to convergence event for the Internet infrastructure, colocation and third-party services industry, is just two weeks away, and space is limited. More than 700 IT executives, cloud decision-makers, vendors and investors are expected to attend HCTS NA this year. Don’t miss your opportunity to attend the networking event of the industry. Register today before registration closes.
Topics to be discussed at this year’s Hosting & Cloud Transformation Summit include:
-451 Research Analysts Carl Brooks and Jim Davis will lead a panel discussion, ‘Managed Hosting – Staying in the Game,’ with executives from Rackspace and DreamHost.
-Executives from Digital Realty, Fortrust, QTS, and ViaWest will participate in a panel discussion on ‘The Evolving Multi-Tenant Datacenter Market – Beyond Colo and Wholesale.’
-451 Research VP William Fellows and a panel of industry experts from Cloudsoft, HP, Nimbula, and Solidfire will discuss best practices for a multi-cloud world – migrating, developing, deploying and orchestrating applications on private and hosted clouds.
-Peter Hopper, co-founder and CEO of DH Capital will provide a leadership perspective on valuation and M&A in the data center and managed hosting sectors.
-451 Research VP Rachel Chalmers will deliver a keynote presentation on ‘The Internet of Everything – The Impact of BYOD and Other Smart Devices on IT.’
-451 Research Managing Director Ken Male will provide a reality check on enterprise cloud adoption.
Don’t miss your opportunity to attend the networking event of the year. Join 451 Research Analysts and key industry leaders at the 8th Annual Hosting & Cloud Transformation Summit as we discuss, debate and interact. The stakes couldn’t be higher.
Posted by mstansberry | Posted in Cloud Computing, Data center colocation, Uptime Institute Tier Standard, Uptime Tier Certification Awards | Posted on 29-11-2011
The fastest-growing market for Uptime Institute Tier Certification is multi-tenant data center service providers in the colocation and cloud computing business. Over 50% of Uptime Institute’s ongoing Tier Certifications are for third-party data center service providers.
In this video, Uptime Institute VP Julian Kudritzki outlines Tier Certification’s value proposition for data center service providers.
Uptime Institute Tier Certification provides assurances to data center owners and their clients that each and every aspect of the design meets the objective, down to the level of breaker and valve positioning. It is insurance for data center owners that they don’t pay for a Tier IV and get a Tier II.
The second benefit is external-facing value. Uptime Institute Tier Certification provides marketing value for third-party data center service providers. Having an unbiased third party perform due diligence on the design and the constructed facility can also shorten customers’ time to contract.