Mixed storage too much to handle?

For some time now, growing organizations have responded to exponential data growth and the need for more storage capacity across their applications by simply buying new disk storage systems: an additional server here, a storage area network there. This worked well enough in the short term, but because organizations – particularly those in the healthcare, retail, and government sectors – were essentially building separate, dedicated storage islands, data management complexity increased and so did costs. Not to mention the resource constraints – both physical storage resources and human resources – that IT organizations face if this complexity and growth remain unchecked.

The pitfalls and challenges of these heterogeneous storage environments include:

1. Poor storage utilization rates across different SANs
2. Increased time spent administering storage
3. Expensive snapshot/replication licenses purchased for multiple SANs
4. Difficult data migration between storage systems
5. Increased energy and floor-space costs

That’s where storage virtualization comes in – a relatively new approach that pools these dedicated, heterogeneous storage environments so they work together more efficiently across applications and users. Done right, it reduces costs, improves efficiency, gives IT pros more vendor flexibility, and keeps existing storage assets in service longer.

In short, virtualization lets you more easily scale capacity and performance to meet growing data infrastructure needs, simplify management, and reduce the business risk of system failure.

Or does it?

The truth is that while many storage industry vendors have held up virtualization as the cure-all for storage woes, only a few have delivered anything close. The goals and benefits of storage virtualization are achievable – but only when it is part of a well-planned, all-encompassing storage management strategy.

Says VP and Gartner Fellow Brian Gammage:

“Organizations right now who are thinking of virtualization cannot view it as a simple transaction. Virtualization is giving them choice in something they have not had choice in before. You are going to make multiple steps to get to the point you need to be and you’re not going to be able to do that, as you need to do it, unless you start by building a roadmap.”


About Andy Thomas

With over 16 years of experience as an IBM Business Partner, Andy has worked in the open systems, server virtualization, and storage consolidation areas.